Warm-up: numpy
Created On: Dec 03, 2020 | Last Updated: Sep 29, 2025 | Last Verified: Nov 05, 2024
In this example, a third-order polynomial is trained to predict \(y=\sin(x)\) on \(-\pi\) to \(\pi\) by minimizing the squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it knows nothing about deep learning, gradients, or computational graphs, and is simply a way to perform generic numeric computations.
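Concretely, with prediction \(\hat{y} = a + b x + c x^2 + d x^3\) and the loss summed over the sample points \(x_i\), the gradients that the backward pass computes by hand are:

\[
L = \sum_i (\hat{y}_i - y_i)^2, \qquad
\frac{\partial L}{\partial a} = \sum_i 2(\hat{y}_i - y_i), \qquad
\frac{\partial L}{\partial b} = \sum_i 2(\hat{y}_i - y_i)\, x_i,
\]
\[
\frac{\partial L}{\partial c} = \sum_i 2(\hat{y}_i - y_i)\, x_i^2, \qquad
\frac{\partial L}{\partial d} = \sum_i 2(\hat{y}_i - y_i)\, x_i^3.
\]

These are exactly the `grad_a` through `grad_d` expressions in the code.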
Running the script below prints the loss every 100 iterations, then the fitted coefficients:
99 2028.2529236775276
199 1376.2243627235878
299 935.6992922477887
399 637.7704733586546
499 436.0721547665856
599 299.3782026898492
699 206.63925194677424
799 143.65258123002124
899 100.82578880885674
999 71.67376522760051
1099 51.80765035639834
1199 38.254133565008196
1299 28.99674462105816
1399 22.66644906705791
1499 18.332749595503675
1599 15.362505513768387
1699 13.324421376475886
1799 11.924364577249616
1899 10.961514568895378
1999 10.298602596542514
Result: y = -0.0329347571695189 + 0.8347170887150409 x + 0.005681793648466868 x^2 + -0.09019770176760789 x^3
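For reference (this check is not part of the tutorial), the same least-squares fit can be obtained in closed form with `np.polyfit`; gradient descent is converging toward these coefficients. A minimal sketch:

```python
import numpy as np
import math

x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Closed-form least-squares fit of a degree-3 polynomial.
# np.polyfit returns coefficients with the highest power first: [d, c, b, a].
d, c, b, a = np.polyfit(x, y, 3)
print(f'polyfit: y = {a} + {b} x + {c} x^2 + {d} x^3')
```

Because \(\sin\) is odd and the sample grid is symmetric about zero, the even coefficients `a` and `c` come out essentially zero, while `b` and `d` are slightly different from the gradient-descent result above, which has not fully converged after 2000 steps.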
import numpy as np
import math

# Create input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
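The hand-derived gradients can be spot-checked against central finite differences. A minimal sketch, assuming an arbitrary test point for the weights; the helper name `numeric_grad` and the step size `eps` are illustrative choices, not part of the tutorial:

```python
import numpy as np
import math

x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)
a, b, c, d = 0.1, 0.2, 0.3, 0.4  # arbitrary test point for the weights

def loss(a, b, c, d):
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    return np.square(y_pred - y).sum()

def numeric_grad(f, args, i, eps=1e-6):
    # Central difference with respect to argument i.
    lo, hi = list(args), list(args)
    lo[i] -= eps
    hi[i] += eps
    return (f(*hi) - f(*lo)) / (2 * eps)

# Analytic gradients, exactly as in the training loop above.
y_pred = a + b * x + c * x ** 2 + d * x ** 3
grad_y_pred = 2.0 * (y_pred - y)
grads = [grad_y_pred.sum(),
         (grad_y_pred * x).sum(),
         (grad_y_pred * x ** 2).sum(),
         (grad_y_pred * x ** 3).sum()]

for i, g in enumerate(grads):
    g_num = numeric_grad(loss, (a, b, c, d), i)
    assert abs(g - g_num) / max(abs(g), 1.0) < 1e-4
```

Since the loss is quadratic in each weight, the central difference is exact up to floating-point rounding, so the analytic and numeric gradients agree to many digits.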
Total running time of the script: (0 minutes 0.235 seconds)