Model deployment with Flask, Tensorflow Serving and Docker

Data Science Asked by happynomad on February 18, 2021

I am puzzled over this example: https://towardsdatascience.com/deploying-deep-learning-models-using-tensorflow-serving-with-docker-and-flask-3b9a76ffbbda

In the write-up, only TF Serving runs inside the Docker container, while both the model and the Flask app reside outside it. In that case, how is isolation achieved? For portability, I would expect all components (model + Flask app) to be "encapsulated" within the same container as well.

I don’t quite understand what was actually achieved in this example.
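For context on what full encapsulation could look like: one common way to containerize every component is a Compose setup where the model directory is mounted into (or baked into) the TF Serving image and the Flask app runs in its own container on the same network. This is a minimal sketch under assumed names; the paths, ports, and service names here are hypothetical and not taken from the linked article.

```yaml
# docker-compose.yml — hypothetical sketch; image names, ports,
# and paths are assumptions, not from the linked write-up.
services:
  tf-serving:
    image: tensorflow/serving:latest
    volumes:
      # Model lives in the project directory, mounted into the container
      - ./models/my_model:/models/my_model
    environment:
      - MODEL_NAME=my_model
    ports:
      - "8501:8501"   # TF Serving REST API

  flask-app:
    build: ./app       # directory containing the Flask app's Dockerfile
    environment:
      # Flask reaches TF Serving over the Compose network, not localhost
      - TF_SERVING_URL=http://tf-serving:8501/v1/models/my_model:predict
    ports:
      - "5000:5000"
    depends_on:
      - tf-serving
```

With a setup like this, both the serving layer and the Flask front end run in containers, so the whole stack is portable; in the article's setup, by contrast, only TF Serving is containerized and the Flask app depends on the host environment.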
