Tuesday, 8 October 2019

Sample: TensorFlow 2.x Estimator 'DNNRegressor'

TensorFlow ships with ready-to-use estimator networks such as DNNRegressor, DNNClassifier, and RNNClassifier (still experimental as of TF 2.0). These networks are easy to use, but they are hard to customise. For example, DNNRegressor has only one constructor parameter for activations, `activation_fn`, which is applied to every layer of the network, whereas in practice it is common to use different activation functions for the hidden layers and the output layer.
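As a hedged sketch of that alternative (not part of the sample below): with the Keras API you can choose a different activation per layer, for example ReLU in the hidden layer and a linear (identity) output, which DNNRegressor's single `activation_fn` cannot express.

```python
import numpy as np
import tensorflow as tf

#hidden layer uses ReLU; the output layer stays linear (identity),
#which is the usual choice for regression
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(1, activation="linear"),
])
model.compile(optimizer="adam", loss="mse")

#same toy data as the Estimator sample: one point, X=[1,2] -> Y=3
X = np.array([[1., 2.]], dtype=np.float32)
Y = np.array([3.], dtype=np.float32)
model.fit(X, Y, epochs=100, verbose=0)
print(model.predict(X, verbose=0)[0][0])
```

Note the per-layer control: each `Dense` layer takes its own `activation` argument, instead of one function shared across the whole network.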

Source code:
%tensorflow_version 2.x
%reset -f

#libs
import tensorflow              as tf;
from   tensorflow.keras.layers import *;

import numpy             as np;
import matplotlib.pyplot as pp;
import logging;

#disable tf log
tf.get_logger().setLevel(logging.ERROR)

#input shape
Input = tf.feature_column.numeric_column("X",shape=[2]);

#ready-to-use network
Model = tf.estimator.DNNRegressor(
  hidden_units    = [20,1],
  feature_columns = [Input],
  activation_fn   = tf.identity
);

#data
def input_fn():
  return {"X":tf.convert_to_tensor([[1.,2.]])},tf.convert_to_tensor([3.]);

#train
print("Training...");
for I in range(10):
  Model.train(input_fn,steps=1000);      #train first so a checkpoint exists
  R = Model.evaluate(input_fn,steps=1);  #evaluate needs a saved checkpoint
  print("Loss:",R["loss"]);

#test
print("\nPredict:");
print(next(Model.predict(input_fn))["predictions"][0]);

print("\nDone.");
#eof
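If you want different activations per layer but still want the Estimator train/evaluate/predict interface, one possible workaround (my own sketch, not from the sample above; it assumes the `tf.keras.estimator.model_to_estimator` API available in TF 2.x) is to build a Keras model and convert it. The input layer is named "X" so that it matches the feature key returned by input_fn.

```python
import tensorflow as tf

#Keras model with a different activation per layer; the input layer is
#named "X" to match the feature dict produced by input_fn below
inp = tf.keras.Input(shape=(2,), name="X")
hid = tf.keras.layers.Dense(20, activation="relu")(inp)
out = tf.keras.layers.Dense(1, activation="linear")(hid)
model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="mse")

#convert the compiled Keras model into an Estimator
Model = tf.keras.estimator.model_to_estimator(keras_model=model)

def input_fn():
  return {"X": tf.convert_to_tensor([[1., 2.]])}, tf.convert_to_tensor([3.])

Model.train(input_fn, steps=100)
R = Model.evaluate(input_fn, steps=1)
print("Loss:", R["loss"])
```

This keeps the same train/evaluate loop as the DNNRegressor sample while sidestepping the single `activation_fn` limitation.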
