Example: Prepare a TensorFlow Model for Deployments
Train and output a TensorFlow Model
This example code demonstrates how to use TensorFlow to export a trained model so that it is compatible with TensorFlow Serving and Gradient Deployments. The code below should be incorporated into your experiment, and assumes you are using TensorFlow 1.x with Python.
Note: you must specify --modelPath /artifacts when running the experiment in order to have the model parsed and uploaded by Gradient. Learn more about model path options here.
Example with TensorFlow
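Below is a minimal sketch of such an export, assuming TensorFlow 1.x. The tiny y = Wx + b graph, the tensor names x and y, and the placeholder training step are hypothetical stand-ins for your own model.

```python
import os
import tensorflow as tf

# Hypothetical stand-in for a trained model: y = W*x + b.
x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
W = tf.Variable(tf.ones([1, 1]), name='W')
b = tf.Variable(tf.zeros([1]), name='b')
y = tf.add(tf.matmul(x, W), b, name='y')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training would happen here ...

    # Gradient exposes the --modelPath value through the PS_MODEL_PATH
    # environment variable; default to ./models if it was not specified.
    # Note: the export directory must not already contain a SavedModel.
    export_dir = os.path.abspath(os.environ.get('PS_MODEL_PATH', './models'))

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

    # Describe the model's inputs and outputs so TensorFlow Serving
    # knows how to feed and fetch tensors at inference time.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'x': x}, outputs={'y': y})

    # Save a "snapshot" of the graph and variables under the SERVING tag.
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        })
    builder.save()
```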
The code above first gets the specified --modelPath from the parameters used to run the experiment, via the environment variable PS_MODEL_PATH, which is available to the experiment while it is running. (Note: if --modelPath was not specified, the model export directory defaults to ./models.)
It then uses TensorFlow's SavedModelBuilder module to export the model. SavedModelBuilder saves a "snapshot" of the trained model to the export directory so that it can be loaded later for deployments.
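As a rough illustration of what loading the snapshot later looks like (essentially what a serving process does on startup), here is a sketch that restores the export from the example above; the tensor names x:0 and y:0 come from that hypothetical graph.

```python
import os
import tensorflow as tf

# Path used by the export sketch above.
export_dir = os.path.abspath(os.environ.get('PS_MODEL_PATH', './models'))

with tf.Session(graph=tf.Graph()) as sess:
    # Restore the graph and variables that were saved under the SERVING tag.
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], export_dir)

    # Look up the input/output tensors by the names used at export time.
    x = sess.graph.get_tensor_by_name('x:0')
    y = sess.graph.get_tensor_by_name('y:0')
    print(sess.run(y, feed_dict={x: [[2.0]]}))
```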
SavedModel format
For details on the SavedModel format, please see the documentation at SavedModel README.md.
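In short, an exported SavedModel is a directory containing a saved_model.pb file (the serialized graph and its signatures) and a variables/ subdirectory holding the trained weights.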