Friday, 11 October 2019

PyTorch: Single Neuron Linear Regression with Gradient Descent

PyTorch has automatic differentiation, just like TensorFlow. The following code shows a single neuron with a linear activation (identity activation) that learns a simple regression task through gradient descent.
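Concretely, for an input Inp, expected output Exp, and weights W, the training loop below implements:

Out = W·Inp                    (forward pass, identity activation)
Loss = (Out − Exp)²            (squared-error loss)
∂Loss/∂Out = 2·(Out − Exp)     (the Grad passed to backward())
∂Loss/∂W = 2·(Out − Exp)·Inp   (what autograd stores in W1.grad)
W ← W − 0.01·∂Loss/∂W          (gradient descent step)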

Source code:
%reset -f

#libs
import torch as t;

#data: one training sample, input [1,2] with expected output 3
X = t.tensor([[1.,2.]]);
Y = t.tensor([ 3.    ]);

#model
W1 = t.rand(2, requires_grad=True); #2 random weights, tracked by autograd

def feedforward(Inp):
  return t.dot(Inp,W1); #weighted sum only, identity activation

#before train
print("Before train:");
print(float(feedforward(X[0])));

#train
Steps = 50;
print("\nTraining...");

for I in range(Steps):
  for J in range(len(X)):
    #forward
    Inp = X[J];
    Exp = Y[J];
    Out = feedforward(Inp);

    #backward: compute dLoss/dOut by hand, then let autograd
    #chain it back through the graph into W1.grad
    Delta = Out-Exp;
    Loss  = Delta**2;   #squared-error loss of delta (error)
    Grad  = 2*Delta;    #dLoss/dOut, gradient of the loss function
    Out.backward(Grad); #accumulates dLoss/dW1 = Grad*Inp into W1.grad

    #apply grads: update through .data so autograd does not track it
    W1.data -= 0.01*W1.grad.data; #gradient descent step, learning rate 0.01
    W1.grad.data.zero_();         #reset the accumulated gradient

  if I%(Steps//10)==0:
    print("Loss:",float(Loss)); #log the loss 10 times over the run

print("\nAfter train:");
print(float(feedforward(X[0])));

print("\nDone.");
#eof
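As a side note, the manual Grad above is not strictly required: Loss is already part of the autograd graph, so calling Loss.backward() fills W1.grad with the same values. Here is a minimal sketch of the same training done that way, also using torch.optim.SGD for the update step (same data and learning rate as above):

#alternative: let autograd differentiate the loss directly
import torch as t;

X  = t.tensor([[1.,2.]]);
Y  = t.tensor([ 3.    ]);
W1 = t.rand(2, requires_grad=True);
Opt = t.optim.SGD([W1], lr=0.01); #same learning rate as above

for I in range(50):
  for J in range(len(X)):
    Out  = t.dot(X[J],W1);
    Loss = (Out-Y[J])**2;
    Opt.zero_grad(); #clear accumulated gradients
    Loss.backward(); #autograd fills W1.grad with 2*(Out-Exp)*Inp
    Opt.step();      #W1 -= 0.01*W1.grad

print(float(t.dot(X[0],W1))); #should approach the target 3.0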
