.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "beginner/examples_tensor/polynomial_numpy.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_beginner_examples_tensor_polynomial_numpy.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_beginner_examples_tensor_polynomial_numpy.py:


Warm-up: numpy
--------------

A third order polynomial, trained to predict :math:`y=\sin(x)` from :math:`-\pi`
to :math:`\pi` by minimizing squared Euclidean distance.

This implementation uses numpy to manually compute the forward pass, loss, and
backward pass.

A numpy array is a generic n-dimensional array; it does not know anything about
deep learning or gradients or computational graphs, and is just a way to perform
generic numeric computations.

.. GENERATED FROM PYTHON SOURCE LINES 16-54

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    99 1198.7043659170026
    199 795.633147354381
    299 529.1013843735973
    399 352.85660693815896
    499 236.3142953710245
    599 159.25037620220178
    699 108.29163929952436
    799 74.59502058323648
    899 52.3130238396512
    999 37.57897820882353
    1099 27.83603783678338
    1199 21.393481604541172
    1299 17.13331542764694
    1399 14.316262404023695
    1499 12.453473366119924
    1599 11.221695182054242
    1699 10.407175584732201
    1799 9.868570093677405
    1899 9.512414114926202
    1999 9.276903795745245
    Result: y = 0.00028613693875776465 + 0.8358995074935835 x + -4.936338327492509e-05 x^2 + -0.09036589054226225 x^3


|

.. code-block:: default


    import numpy as np
    import math

    # Create input and output data
    x = np.linspace(-math.pi, math.pi, 2000)
    y = np.sin(x)

    # Randomly initialize weights
    a = np.random.randn()
    b = np.random.randn()
    c = np.random.randn()
    d = np.random.randn()

    learning_rate = 1e-6
    for t in range(2000):
        # Forward pass: compute predicted y
        # y = a + b x + c x^2 + d x^3
        y_pred = a + b * x + c * x ** 2 + d * x ** 3

        # Compute and print loss
        loss = np.square(y_pred - y).sum()
        if t % 100 == 99:
            print(t, loss)

        # Backprop to compute gradients of a, b, c, d with respect to loss
        grad_y_pred = 2.0 * (y_pred - y)
        grad_a = grad_y_pred.sum()
        grad_b = (grad_y_pred * x).sum()
        grad_c = (grad_y_pred * x ** 2).sum()
        grad_d = (grad_y_pred * x ** 3).sum()

        # Update weights
        a -= learning_rate * grad_a
        b -= learning_rate * grad_b
        c -= learning_rate * grad_c
        d -= learning_rate * grad_d

    print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.233 seconds)


.. _sphx_glr_download_beginner_examples_tensor_polynomial_numpy.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: polynomial_numpy.py <polynomial_numpy.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: polynomial_numpy.ipynb <polynomial_numpy.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_
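
For clarity, here is a brief sketch of the calculus the backward pass implements:
differentiating the squared-Euclidean loss with respect to each coefficient gives
the sums computed as ``grad_a`` through ``grad_d`` in the code above.

.. math::

    L = \sum_i \left(\hat{y}_i - y_i\right)^2,
    \qquad
    \hat{y}_i = a + b x_i + c x_i^2 + d x_i^3

.. math::

    \frac{\partial L}{\partial a} = \sum_i 2\left(\hat{y}_i - y_i\right),
    \qquad
    \frac{\partial L}{\partial b} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i,
    \qquad
    \frac{\partial L}{\partial c} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i^2,
    \qquad
    \frac{\partial L}{\partial d} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i^3

Each coefficient is then moved a small step against its gradient, e.g.
``a -= learning_rate * grad_a``, which is plain gradient descent on :math:`L`.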