Service Context Understanding with KFood DB
Reference repositories:
https://github.com/imadelh/ML-web-app
https://github.com/aai4r/aai4r-ServiceContextUnderstanding
- Clone the Repo
git clone https://github.com/yunsujeon/MLserving_ServiceContextUnderstanding.git
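After cloning, move into the repository root before running the next steps (the directory name below is assumed to match the clone URL):
cd MLserving_ServiceContextUnderstanding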
- Download the models and place them in app/output/
model_best.pth.tar
faster_rcnn_1_7_9999.pth
class_info_Kfood.pkl
class_info_FoodX251.pkl
class_info_Food101.pkl
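For example, assuming the files above were downloaded to the current directory, they can be moved into place like this:
# create the output directory and move the downloaded model/class files into it
mkdir -p app/output
mv model_best.pth.tar faster_rcnn_1_7_9999.pth class_info_Kfood.pkl class_info_FoodX251.pkl class_info_Food101.pkl app/output/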
- Create and activate an Anaconda environment
conda create -n <name> python==3.6.2
conda activate <name>
- Install requirements
pip install -r requirements.txt
- Run
python app.py
Go to http://0.0.0.0:8888, then you can see the web page and the usage explanation.
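To quickly check that the server is up without a browser, you can probe the port with curl (a minimal sketch, assuming the app serves plain HTTP on port 8888):
# should print an HTTP status line once the app has started
curl -I http://0.0.0.0:8888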
- Install Docker yourself
- Build the Docker image from the Dockerfile
sudo docker build -t <image name> .
or
docker build -t <image name> .
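For example, using a hypothetical image name kfood-serving (any name works, just reuse it in the run step):
# build the image from the Dockerfile in the repo root, then confirm it exists
docker build -t kfood-serving .
docker images kfood-serving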
- Run a Docker container from the image
docker run -i -t --rm -p 8888:8888 -v <your path>:/<docker path> --shm-size=2GB --gpus all <image name>
Example:
docker run -i -t --rm -p 8888:8888 -v /home/intern/MLserving/app:/app --shm-size=2GB --gpus all <image name>
If you need more shared memory in the Docker environment, or want to select specific GPUs, adjust these flags (a combined example follows below):
--shm-size=8G
--gpus '"device=0,1"'
Go to http://0.0.0.0:8888, then you can see the web page and explanation.
You can also run this code on a remote SSH server; the steps are the same as the local and Docker examples in this repo.
You have to match the CUDA version in the Dockerfile to the one available on the server, so you may need to change the Dockerfile (see the version check sketch below).
The only other difference is the access URL:
http://0.0.0.0:8888 -> http://[your remote server ip]:8888
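A minimal way to check the server's CUDA setup before editing the Dockerfile (which base image line to change depends on the actual Dockerfile, so treat this as a sketch):
# shows the driver and the CUDA version it supports
nvidia-smi
# prints the installed CUDA toolkit version, if the toolkit is present
nvcc --version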
Enjoy this repo. Thank you!