Polynomial Regression from Scratch
Difficulty: Medium
Tags: Regression, Polynomial Features, Model Fitting, Visualization, Overfitting, Feature Engineering
Problem:
Implement polynomial regression from scratch and visualize how different polynomial degrees affect the model's performance. Learn about overfitting, underfitting, and the bias-variance tradeoff.
Examples:
Example 1 — generate polynomial features of degree 2:
Input: X = np.array([[1], [2], [3]])
       degree = 2
Output: array([[1, 1, 1],
               [1, 2, 4],
               [1, 3, 9]])
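The feature-expansion step in this example can be sketched as follows. The function name `polynomial_features` is an illustrative choice, not mandated by the problem statement:

```python
import numpy as np

def polynomial_features(X, degree):
    """Build the design matrix [1, x, x^2, ..., x^degree] column by column."""
    X = np.asarray(X, dtype=float)
    # Column k holds x**k, so column 0 is the all-ones bias column.
    return np.hstack([X ** k for k in range(degree + 1)])

print(polynomial_features(np.array([[1], [2], [3]]), 2))
# Rows correspond to samples; columns to powers 0, 1, 2.
```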
Example 2 — fit polynomial regression and calculate metrics:
Input: X = np.linspace(0, 1, 5).reshape(-1, 1)
       y = np.sin(2 * np.pi * X)
Output: MSE: 0.0123
        R-squared: 0.9876
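One way to produce such metrics is ordinary least squares on the expanded features. This is a sketch: `fit_polynomial` and `evaluate` are illustrative names, and the exact MSE and R² values depend on the degree chosen (the numbers in the example above are only representative):

```python
import numpy as np

def fit_polynomial(X, y, degree):
    # Expand features, then solve least squares; lstsq (SVD-based) is more
    # numerically stable than inverting Phi.T @ Phi explicitly.
    Phi = np.hstack([X ** k for k in range(degree + 1)])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def evaluate(X, y, w):
    Phi = np.hstack([X ** k for k in range(len(w))])
    y_pred = Phi @ w
    mse = float(np.mean((y - y_pred) ** 2))
    ss_res = float(np.sum((y - y_pred) ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return mse, 1.0 - ss_res / ss_tot

X = np.linspace(0, 1, 5).reshape(-1, 1)
y = np.sin(2 * np.pi * X)
w = fit_polynomial(X, y, degree=3)
mse, r2 = evaluate(X, y, w)
print(f"MSE: {mse:.4f}")
print(f"R-squared: {r2:.4f}")
```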
Example 3 — a quadratic relationship requiring at least a degree-2 polynomial:
Input: X = np.array([[0], [0.5], [1]])
       y = np.array([0, 1, 0])
Output: Degree 1: R² = 0.0
        Degree 2: R² = 1.0
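The degree comparison in this example can be reproduced with a plain least-squares fit per degree (a sketch, with R² computed as 1 − SS_res/SS_tot):

```python
import numpy as np

X = np.array([[0.0], [0.5], [1.0]])
y = np.array([0.0, 1.0, 0.0])

r2 = {}
for degree in (1, 2):
    # Expand to [1, x, ..., x^degree] and solve least squares.
    Phi = np.hstack([X ** k for k in range(degree + 1)])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    residual = y - Phi @ w
    r2[degree] = 1.0 - (residual @ residual) / np.sum((y - y.mean()) ** 2)
    print(f"Degree {degree}: R² = {r2[degree]:.1f}")
```

A straight line cannot capture the symmetric rise and fall (its best fit is the flat line y = 1/3), so its R² is zero, while the quadratic interpolates all three points exactly.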
Constraints:
- Input X must be a numpy array with shape (n_samples, 1)
- Polynomial degree must be a positive integer
- Must implement all required functions
- Must handle numerical stability issues
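On the last constraint: one common pitfall worth checking is that forming the normal equations squares the condition number of the design matrix, which is why an SVD/QR-based solver such as `np.linalg.lstsq` is the safer choice at higher degrees. A small sketch of the effect (the 50-point grid and degree 10 are arbitrary illustration choices):

```python
import numpy as np

X = np.linspace(0, 1, 50).reshape(-1, 1)
Phi = np.hstack([X ** k for k in range(11)])  # degree-10 expansion

# cond(Phi.T @ Phi) is roughly cond(Phi) squared, so solving the normal
# equations loses about twice as many digits as a least-squares solve on Phi.
print(f"cond(Phi)         = {np.linalg.cond(Phi):.2e}")
print(f"cond(Phi.T @ Phi) = {np.linalg.cond(Phi.T @ Phi):.2e}")
```

Mean-centering or rescaling x before expansion also reduces the condition number substantially.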