At Stripe, you need a tiny training routine for a linear pricing model. Implement batch gradient descent for linear regression with an intercept (bias).
Write train_linear_regression(X, y, lr, epochs) where:
- X is a 2D list of floats with shape (n_samples, n_features)
- y is a list of floats with length n_samples
- lr is a positive float learning rate
- epochs is a positive integer

Return a dictionary:
{"weights": [bias, w1, ..., wd], "preds": [p0, ..., p(n-1)]}
where preds[i] is the prediction for X[i] using the final weights.

Raise ValueError if:
- X or y is empty, or len(X) != len(y)
- X is ragged (rows have different lengths) or has zero features

Example 1
Input: X = [[1.0],[2.0],[3.0]], y = [2.0,4.0,6.0], lr = 0.1, epochs = 2000
Output: weights ≈ [0.0, 2.0], preds ≈ [2.0, 4.0, 6.0]
Explanation: Perfect line y = 2x.
Example 2
Input: X = [[0.0],[1.0]], y = [1.0,3.0], lr = 0.1, epochs = 2000
Output: weights ≈ [1.0, 2.0], preds ≈ [1.0, 3.0]
Explanation: Line y = 1 + 2x.
Constraints:
- 1 <= n_samples <= 2 * 10^4
- 1 <= n_features <= 50
- 1 <= epochs <= 10^4
- 0 < lr <= 1
- X and y are finite floats in [-1e6, 1e6]
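A minimal reference sketch of one way to satisfy the spec, using plain Python lists and full-batch gradient descent on the mean squared error (the factor-of-2 in the gradient is a convention choice; other conventions just rescale the effective learning rate):

```python
def train_linear_regression(X, y, lr, epochs):
    """Batch gradient descent for linear regression with an intercept (bias)."""
    # Input validation per the problem statement
    if not X or not y or len(X) != len(y):
        raise ValueError("X and y must be non-empty and the same length")
    d = len(X[0])
    if d == 0 or any(len(row) != d for row in X):
        raise ValueError("X must be non-ragged with at least one feature")

    n = len(X)
    w = [0.0] * (d + 1)  # w[0] is the bias, w[1:] are the feature weights

    def predict(i):
        return w[0] + sum(w[j + 1] * X[i][j] for j in range(d))

    for _ in range(epochs):
        errors = [predict(i) - y[i] for i in range(n)]
        # Gradient of MSE = (1/n) * sum((pred - y)^2) w.r.t. bias and each weight
        grad_b = 2.0 * sum(errors) / n
        grad_w = [2.0 * sum(errors[i] * X[i][j] for i in range(n)) / n
                  for j in range(d)]
        # Simultaneous update of all parameters
        w[0] -= lr * grad_b
        for j in range(d):
            w[j + 1] -= lr * grad_w[j]

    return {"weights": w, "preds": [predict(i) for i in range(n)]}
```

On Example 1 this converges to weights close to [0.0, 2.0]; with n_samples up to 2 * 10^4 and epochs up to 10^4, a vectorized (e.g. NumPy) variant of the same update would be noticeably faster, but the loop form above matches the pure-list signature in the statement.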