Serving
TensorFlow Serving
To create a TensorFlow Serving server, the recommended approach is to use Docker and let the official image deploy the model for you, with the following docker command:
docker run -p <External port>:8501 --mount type=bind,source=<Path to deployable model>,target=/models/<model name> -e MODEL_NAME=<model name> --name <model name> -d tensorflow/serving
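As a concrete sketch of the command above — the model path, model name (my_model), and host port 9000 are hypothetical, chosen only for illustration:

```shell
# Serve the SavedModel at /tmp/saved_models/my_model (hypothetical path).
# The directory should contain the numbered version subdirectories
# (e.g. /tmp/saved_models/my_model/1/saved_model.pb).
# The REST API becomes available on host port 9000 (container port 8501).
docker run -p 9000:8501 \
  --mount type=bind,source=/tmp/saved_models/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model \
  --name my_model \
  -d tensorflow/serving
```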
After that the model is available as a JSON prediction server. For estimators with 1D inputs, use (example):
curl -d '{"instances": [[1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0 ,8.0 ,9.0 ,10.0, 11.0, 12.0, 13.0, 14.0, 15.0]]}' -X POST http://192.168.0.10:9000/v1/models/teste:predict
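The same request can be issued from Python; a minimal standard-library sketch of building the `{"instances": [...]}` payload (the endpoint and model name `teste` are taken from the curl example above, and the request itself is commented out since it needs a running server):

```python
import json
import urllib.request


def make_predict_payload(values):
    """Wrap a single 1D input in the {"instances": [[...]]} format
    expected by the TensorFlow Serving REST predict API."""
    return json.dumps({"instances": [list(values)]})


payload = make_predict_payload([1.0, 2.0, 3.0, 4.0, 5.0])
print(payload)  # {"instances": [[1.0, 2.0, 3.0, 4.0, 5.0]]}

# Sending it (requires a running server; same endpoint as the curl example):
# req = urllib.request.Request(
#     "http://192.168.0.10:9000/v1/models/teste:predict",
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read())
```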
For Conv1D-based networks (e.g. time series classification problems) use (example):
curl -d '{"instances": [[[1383.2],[1409.55959596],[1434.25151515],[1453.42727273],[1470.16060606],[1484.07575758],[1496.49393939],[1508.25959596],[1518.44949495],[1527.86363636],[1533.85959596],[1539.18888889],[1543.48787879],[1547.68080808],[1551.81010101],[1554.0969697],[1556.16262626],[1559.1040404],[1561.82727273],[1564.08989899],[1565.88282828],[1567.2969697],[1568.66666667],[1570.01616162],[1570.92121212],[1571.72727273],[1572.23636364],[1572.8],[1573.42222222],[1573.96363636],[1574.47272727],[1575.03535354],[1575.58080808],[1576.03333333],[1576.39292929],[1576.61919192],[1576.80909091],[1576.97878788],[1576.8020202],[1576.57575758],[1576.34949495],[1576.31515152],[1576.65454545],[1576.8969697],[1577.06666667],[1577.14545455],[1577.2040404],[1577.31717172],[1577.4],[1577.4],[1577.28686869],[1577.06060606],[1576.83434343],[1576.60808081],[1576.65454545],[1576.72222222],[1576.83535354],[1576.77878788],[1576.4959596],[1576.4],[1576.4],[1576.4],[1576.39292929],[1576.33636364],[1576.11818182],[1575.60909091],[1575.53333333],[1575.75959596],[1575.75353535],[1575.68787879],[1575.46161616],[1575.2030303],[1574.86363636],[1574.7],[1574.7],[1574.27575758],[1573.71010101],[1572.81111111],[1571.78484848],[1570.31414141],[1568.54040404],[1566.39090909],[1563.7040404],[1560.76262626],[1556.58484848],[1552.32626263],[1548.31010101],[1544.03939394],[1539.34444444],[1534.44343434],[1529.40909091],[1524.27979798],[1519.12020202],[1513.8030303],[1508.4],[1502.8],[1497.26060606],[1491.77373737],[1500]]]}' -X POST http://127.0.0.1:9001/v1/models/rede:predict
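In the Conv1D case each time step is itself a one-element list, so the payload has shape (batch, timesteps, channels). A small sketch of building that nesting from a flat series (the values here are just the first few from the example above):

```python
import json


def make_conv1d_payload(series):
    """Nest a flat time series as [[[x0], [x1], ...]] — one batch entry,
    len(series) timesteps, one channel — for the REST predict API."""
    return json.dumps({"instances": [[[float(x)] for x in series]]})


payload = make_conv1d_payload([1383.2, 1409.56, 1434.25])
print(payload)  # {"instances": [[[1383.2], [1409.56], [1434.25]]]}
```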
A flag that may come in handy tells curl to label the POST body as a JSON object, by setting the Content-Type header:
... -H "Content-Type: application/json" -X POST...