Thursday, 26 September 2019

TensorFlow: Get Gradients of Any Function Using Auto-Differentiation

Gradient-based optimisers need the derivatives of activation functions and the partial derivatives of the loss function with respect to the model parameters in order to minimise the loss. TensorFlow provides tf.GradientTape, which records operations performed on watched tensors so that a gradient such as dy/dx can be computed afterwards by reverse-mode auto-differentiation.

Source code:
#ipython
%reset -f

#libs
import tensorflow as tf;
import numpy      as np;

#init
#enable_eager_execution is needed in TensorFlow 1.x only;
#in TensorFlow 2.x eager execution is on by default.
tf.enable_eager_execution();

#code
#example function: y = x^2 + x + 1, whose derivative is dy/dx = 2x + 1
X = tf.convert_to_tensor([-3,-2,-1,0,1,2,3],dtype=tf.float32);

with tf.GradientTape() as T:
  T.watch(X); #X is not a tf.Variable, so the tape must be told to watch it
  Y = X**2 + X + 1;

#ask the tape for dy/dx; expected: 2X+1 = [-5,-3,-1,1,3,5,7]
Dy_Dx = T.gradient(Y,X);
print(Dy_Dx);

print("\nDone.");
#eof
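Since the tape returns a tensor of per-element gradients, you can sanity-check it against the analytic derivative dy/dx = 2x + 1, or against a numerical finite-difference approximation. Here is a sketch of that check in plain NumPy (no TensorFlow required), using the same function and the same input values as the listing above:

```python
#libs
import numpy as np;

#the same function the tape differentiates above
def f(x):
  return x**2 + x + 1;

x = np.array([-3.,-2.,-1.,0.,1.,2.,3.]);

#analytic derivative, which the tape should reproduce
analytic = 2*x + 1;

#central finite difference: (f(x+h) - f(x-h)) / 2h
h       = 1e-5;
numeric = (f(x+h) - f(x-h)) / (2*h);

print(analytic);                      #[-5. -3. -1.  1.  3.  5.  7.]
print(np.allclose(analytic,numeric)); #True
```

If the tape's output matches this vector, the recorded gradient is correct; the same pattern works for any differentiable function you trace inside tf.GradientTape.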
