Training data often comes in the form [x1...xN] => class index, i.e. a single integer label per sample. The following code uses tf.one_hot to transform those class indices into one-hot probability vectors, as expected by a multi-neuron output layer.
Source code:
#libs
import tensorflow as tf
tf.enable_eager_execution()  #TF 1.x API; eager execution is on by default in TF 2.x
tf.executing_eagerly()

#for single-neuron class-index output layer
Y_Labels = [0, 1, 2, 3, 2, 3]

#for multi-neuron probabilities output layer
#depth is the number of classes
Y_Probs = tf.one_hot(Y_Labels, depth=4)
print(Y_Probs)
#eof
Result:
tf.Tensor(
[[1. 0. 0. 0.]
 [0. 1. 0. 0.]
 [0. 0. 1. 0.]
 [0. 0. 0. 1.]
 [0. 0. 1. 0.]
 [0. 0. 0. 1.]], shape=(6, 4), dtype=float32)
Colab link:
https://colab.research.google.com/drive/1jgBlqQLS4j5pMP5W6Ae3uInrxT2DNQUz
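For reference, the mapping can also be reversed: taking tf.argmax over each one-hot row recovers the original class index. A minimal sketch, assuming TF 2.x where eager execution is the default:

#libs
import tensorflow as tf

#same labels as above
Y_Labels = [0, 1, 2, 3, 2, 3]
Y_Probs = tf.one_hot(Y_Labels, depth=4)

#argmax along each row turns the one-hot vectors back into class indices
Y_Back = tf.argmax(Y_Probs, axis=1)
print(Y_Back)  #tf.Tensor([0 1 2 3 2 3], shape=(6,), dtype=int64)

Alternatively, tf.keras.losses.SparseCategoricalCrossentropy accepts the integer labels directly, so the explicit one_hot step can be skipped when that loss is used.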