Warm-up: numpy
A third-order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it does not know anything about deep learning or gradients or computational graphs, and is just a way to perform generic numeric computations.
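For reference, the gradients computed in the backward pass below follow directly from differentiating the squared-error loss; this is just the math behind the grad_a, grad_b, grad_c, and grad_d lines, not an extra step. With predictions \(\hat{y}_i = a + b x_i + c x_i^2 + d x_i^3\) and loss \(L = \sum_i (\hat{y}_i - y_i)^2\):

\[
\frac{\partial L}{\partial \hat{y}_i} = 2(\hat{y}_i - y_i), \quad
\frac{\partial L}{\partial a} = \sum_i \frac{\partial L}{\partial \hat{y}_i}, \quad
\frac{\partial L}{\partial b} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i, \quad
\frac{\partial L}{\partial c} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i^2, \quad
\frac{\partial L}{\partial d} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i^3.
\]

Each weight is then updated by plain gradient descent, e.g. \(a \leftarrow a - \eta \, \partial L / \partial a\), with learning rate \(\eta = 10^{-6}\).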
Out (the loss every 100 iterations, followed by the fitted polynomial):
99 3052.3285581389755
199 2101.231105689561
299 1448.962938502716
399 1001.1222526545394
499 693.2912335926904
599 481.46098829148957
699 335.53099466366257
799 234.88988191993673
899 165.40778612173372
999 117.38708654519182
1099 84.16454622581762
1199 61.15670120831415
1299 45.207199119984885
1399 34.14006526328805
1499 26.45357069231725
1599 21.110209965762127
1699 17.39245106188341
1799 14.803547872126408
1899 12.99926186323586
1999 11.740809011942455
Result: y = -0.051995161302446156 + 0.8347736541045064 x + 0.008970030527890624 x^2 + -0.0902057477016836 x^3
import numpy as np
import math

# Create input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
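As a quick sanity check (not part of the original script, just a minimal sketch), one can evaluate the fitted polynomial on the same grid and compare it against \(\sin(x)\) and against numpy's own least-squares fit. This assumes it runs right after the loop above, so that a, b, c, d, x, and y are still in scope:

# Illustrative check only; assumes a, b, c, d, x, y from the script above are in scope.
y_fit = a + b * x + c * x ** 2 + d * x ** 3

# Worst-case and average deviation of the fitted polynomial from sin(x)
print(f'max |error| = {np.abs(y_fit - y).max():.4f}')
print(f'mean |error| = {np.abs(y_fit - y).mean():.4f}')

# For comparison: numpy's closed-form least-squares cubic fit
# (coefficients returned in increasing order: constant, x, x^2, x^3)
coeffs = np.polynomial.polynomial.polyfit(x, y, deg=3)
print('polyfit coefficients:', coeffs)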
Total running time of the script: (0 minutes 0.236 seconds)