Warm-up: numpy
Created On: Dec 03, 2020 | Last Updated: Dec 03, 2020 | Last Verified: Nov 05, 2024
A third order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it does not know anything about deep learning or gradients or computational graphs, and is just a way to perform generic numeric computations.
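For reference, the backward pass in the script below is just the chain rule applied by hand to the squared-error loss; written out, the gradients it accumulates are

\[
\hat{y} = a + b x + c x^2 + d x^3, \qquad
L = \sum_{i} (\hat{y}_i - y_i)^2 ,
\]

\[
\frac{\partial L}{\partial \hat{y}_i} = 2(\hat{y}_i - y_i), \qquad
\frac{\partial L}{\partial a} = \sum_i 2(\hat{y}_i - y_i), \qquad
\frac{\partial L}{\partial b} = \sum_i 2(\hat{y}_i - y_i)\,x_i ,
\]

\[
\frac{\partial L}{\partial c} = \sum_i 2(\hat{y}_i - y_i)\,x_i^2, \qquad
\frac{\partial L}{\partial d} = \sum_i 2(\hat{y}_i - y_i)\,x_i^3 .
\]

Because each gradient is a sum over all 2000 sample points, its magnitude can be large, which is one reason the script uses a small learning rate (1e-6).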
Running the script below prints the loss every 100 iterations, followed by the learned coefficients:
99 4106.283516683774
199 2718.76337576286
299 1801.121546428586
399 1194.2277799528533
499 792.846336517627
599 527.381256119497
699 351.80596642102404
799 235.68097981699566
899 158.8750608555447
999 108.07421997335143
1099 74.4730180888146
1199 52.24775254674347
1299 37.546717483216995
1399 27.822418818013027
1499 21.389934448031365
1599 17.13483255057122
1699 14.319999047895134
1799 12.457878559737779
1899 11.225976960751606
1999 10.41097558084346
Result: y = -0.004096530795289633 + 0.8181154742572432 x + 0.000706719728754132 x^2 + -0.08783626667911676 x^3
import numpy as np
import math
# Create random input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)
# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()
learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d
print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
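As a quick sanity check (not part of the original script), the learned coefficients could be compared against an ordinary least-squares cubic fit, for example with numpy's polyfit; a minimal sketch:

import numpy as np
import math

x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# np.polyfit returns coefficients from the highest degree down: [d, c, b, a]
d_ls, c_ls, b_ls, a_ls = np.polyfit(x, y, 3)
print(f'Least-squares fit: y = {a_ls} + {b_ls} x + {c_ls} x^2 + {d_ls} x^3')

Because the squared-error loss is convex in the coefficients, the gradient-descent result above should approach this least-squares solution as the number of iterations grows.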
Total running time of the script: (0 minutes 0.236 seconds)