Warm-up: numpy
Created On: Dec 03, 2020 | Last Updated: Dec 03, 2020 | Last Verified: Nov 05, 2024
A third order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it does not know anything about deep learning or gradients or computational graphs, and is just a way to perform generic numeric computations.
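Since the backward pass below is written out by hand, it may help to spell out the gradients being computed. With the prediction \(\hat{y} = a + bx + cx^2 + dx^3\) and loss \(L = \sum_i (\hat{y}_i - y_i)^2\), the chain rule gives
\[
\frac{\partial L}{\partial a} = \sum_i 2(\hat{y}_i - y_i), \qquad
\frac{\partial L}{\partial b} = \sum_i 2(\hat{y}_i - y_i)\,x_i,
\]
\[
\frac{\partial L}{\partial c} = \sum_i 2(\hat{y}_i - y_i)\,x_i^2, \qquad
\frac{\partial L}{\partial d} = \sum_i 2(\hat{y}_i - y_i)\,x_i^3,
\]
which correspond line for line to grad_a, grad_b, grad_c, and grad_d in the code below.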
Running the script below prints the loss every 100 iterations, followed by the fitted polynomial:
99 1198.7043659170026
199 795.633147354381
299 529.1013843735973
399 352.85660693815896
499 236.3142953710245
599 159.25037620220178
699 108.29163929952436
799 74.59502058323648
899 52.3130238396512
999 37.57897820882353
1099 27.83603783678338
1199 21.393481604541172
1299 17.13331542764694
1399 14.316262404023695
1499 12.453473366119924
1599 11.221695182054242
1699 10.407175584732201
1799 9.868570093677405
1899 9.512414114926202
1999 9.276903795745245
Result: y = 0.00028613693875776465 + 0.8358995074935835 x + -4.936338327492509e-05 x^2 + -0.09036589054226225 x^3
import numpy as np
import math
# Create input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)
# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()
learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d
print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
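As an optional sanity check (not part of the original example), one could compare the coefficients learned by gradient descent against numpy's closed-form least-squares cubic fit. A minimal sketch, using np.polyfit, which returns coefficients ordered from highest to lowest degree:

import numpy as np
import math

x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# np.polyfit solves the same squared-error problem in closed form;
# coefficients come back ordered from highest degree to lowest.
d_ls, c_ls, b_ls, a_ls = np.polyfit(x, y, 3)
print(f'Least-squares fit: y = {a_ls} + {b_ls} x + {c_ls} x^2 + {d_ls} x^3')

After 2000 gradient-descent steps, the Result printed above should be in the same ballpark as this fit, though not identical, since the loop has not fully converged.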
Total running time of the script: (0 minutes 0.233 seconds)