From 425f6ebb5d4eded02d77f81fd76c385b871f6665 Mon Sep 17 00:00:00 2001
From: Ettrig
Date: Thu, 8 Feb 2024 15:46:48 +0100
Subject: [PATCH 1/4] Instruction for installation in Windows

---
 docs/Docker-install-windows.md | 206 +++++++++++++++++++++++++++++++++
 1 file changed, 206 insertions(+)
 create mode 100644 docs/Docker-install-windows.md

diff --git a/docs/Docker-install-windows.md b/docs/Docker-install-windows.md
new file mode 100644
index 00000000..a6941245
--- /dev/null
+++ b/docs/Docker-install-windows.md
@@ -0,0 +1,206 @@
+Docker Compose is used to create a full installation of fAIr (with redis, worker, postgis database, api and frontend), each running in its own Docker container. This setup is suitable for development; it is not recommended for production. ***TODO*** What is production? Is it the version that executes on the web site?

## Installation of fAIr using Docker

In the following, four directories are created: ```fAIr```, ```ramp-code-fAIr```, ```ramp-variables``` and ```trainings```, for the fAIr code, the model code, the model variables, and temporary data used during fine-tuning training of the (RAMP) model. It is a good idea to create these directories inside a single new directory, created for this installation of fAIr and its companions.

1. Check your graphics card

   It is highly recommended to use a graphics card to run fAIr. It might not work with CPU only. (You can set up and test from bottom of this document). ***TODO*** What does this mean? NVIDIA graphics cards are tested. The bottom of this document is about a tile server, whatever that is. But it is not a GPU test?! ***TODO*** Does this mean that it works for NVIDIA cards but not for cards from other brands?

   For the local fAIr installation to work the necessary drivers for the graphics card need to be installed.

   By the following command you can see your graphics card and the graphics driver details and the NVIDIA container toolkit that is installed. There is more information on the [NVIDIA site](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html).

   ```
   nvidia-smi
   ```

   You need to see your graphics card details here. ***TODO*** Does this mean that if this command shows details about my GPU, then it is OK for fAIr?

2. Install Docker Compose

   If Docker is not installed, install it from the [Docker site](https://docs.docker.com/engine/install/).

To check whether Docker is installed, type

   ```
   docker compose version
   ```

3. Clone the repository for fAIr.

   In a Windows Command Prompt window, type

   ```
   git clone https://github.com/hotosm/fAIr.git
   ```

   This creates directory ```./fAIr```.

4. Clone the Base Model and create environment variable RAMP_HOME

   - Clone the RAMP code.

   ```
   git clone https://github.com/kshitijrajsharma/ramp-code-fAIr.git
   ```

   This creates directory ```./ramp-code-fAIr```.

   - Create environment variable RAMP_HOME

   Set RAMP_HOME to be the file path to the directory ```ramp-code-fAIr```.

   ```
   set RAMP_HOME=C:\Users\kshitij\fAIr_install\ramp-code-fAIr
   ```

   - Download pre-trained variable values for the RAMP model from [here](https://drive.google.com/file/d/1YQsY61S_rGfJ_f6kLQq4ouYE2l3iRe1k/view).

   If that doesn't work, you can alternatively use the variables in the original RAMP model, [RAMP Baseline](https://github.com/radiantearth/model_ramp_baseline/tree/main/data/input/checkpoint.tf).

   - Create a new folder called ```ramp-variables```.
+ + ``` + mkdir ramp-variables + ``` + + - Unzip the downloaded base model variables in subdirectory ```ramp-variables```. + + Move the compressed file to directory ```ramp-variables```. Right-click on it in File Explorer and choose "Extract All...". + + - Create environment variable TRAINING_WORKSPACE + + Training workspace is the directory where fAIr will store its training files + for example. ***TODO*** Is this about a new checkpoint after fine-tuning? The name implies that this is for temporary files. + + ``` + mkdir trainings + set TRAINING_WORKSPACE=C:\Users\kshitij\fAIr_install\trainings + ``` + +5. Register your Local setup to OSM ***TODO*** Settings in OSM? "your Local setup" = my local fAIr installation? + + - Go to [OpenStreetMap](https://www.openstreetmap.org/), Log in (If needed: Create an account first.). + - Click on your Profile and Hit ```My Settings``` + - Navigate to ```Oauth2 Applications``` + - Register a new application + - Check permissions for ```Read user preferences``` and set Redirect URI to be ```http://127.0.0.1:3000/authenticate/```. Give it the name ```fAIr Dev Local``` + - You will get ```OSM_CLIENT_ID``` and ```OSM_CLIENT_SECRET```. Copy them. + +6. Create environment files + + - Create a file ```.env``` in directory backend with a copy of the content in [docker_sample_env](../backend/docker_sample_env). + + ``` + cd backend + copy docker_sample_env .env + ``` + + - Fill in the details of ```OSM_CLIENT_ID``` and ```OSM_CLIENT_SECRET``` in the .env file and generate a unique key and paste it to ```OSM_SECRET_KEY``` (It can be any value in a setup for development). + + Leave the other items as they are, unless you have a specific, well-defined need. + + - Create ```.env``` in directory frontend + + ``` + cd frontend + copy .env_sample .env + ``` + + There is no need to modify this file in a setup for development. + +7. Build and Run containers + + In directory fAIr, run the following commands: + + ``` + docker compose build + docker compose up + ``` + +8. Run Migrations + + ***TODO*** What does "migration" mean in this context? + + Grab API container and Open Bash: In another command window, go to directory fAIr and execute: + + ``` + docker exec -it api bash + ``` + + Once bash is promoted run the following commands, one at a time: ***TODO*** How do we know when bash is promoted? What does "promoted" mean in this context? + + python manage.py makemigrations + python manage.py makemigrations login + python manage.py makemigrations core + python manage.py migrate + +9. Play + + Restart containers + + ``` + docker compose restart + ``` + + Frontend will be available on port 5000, Backend on 8000, and Flower on 5500. + + To use your local fAIr installation, go to [Local fAIr](http://127.0.0.1:3000) with your web browser. + ***TODO*** There is a list of port numbers, but it doesn't include 3000, which seems to be the most important one. Is this a bug or a feature? + +Extra. Do you want to run your local tiles? + + You can use [TiTiler](https://github.com/developmentseed/titiler), [gdals2tiles](https://gdal.org/programs/gdal2tiles.html) or nginx to run your own TMS server and add the following to docker compose in order to access your localhost through docker containers. Add this to API and Worker. 
Also update the .env variable accordingly + + ``` + network_mode: "host" + ``` + + Example docker compose : + + ``` + backend-api: + build: + context: ./backend + dockerfile: Dockerfile_CPU + container_name: api + command: python manage.py runserver 0.0.0.0:8000 + + ports: + - 8000:8000 + volumes: + - ./backend:/app + - ${RAMP_HOME}:/RAMP_HOME + - ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE + depends_on: + - redis + - postgres + network_mode: "host" + + backend-worker: + build: + context: ./backend + dockerfile: Dockerfile_CPU + container_name: worker + command: celery -A aiproject worker --loglevel=INFO --concurrency=1 + + volumes: + - ./backend:/app + - ${RAMP_HOME}:/RAMP_HOME + - ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE + depends_on: + - backend-api + - redis + - postgres + network_mode: "host" + ``` + + Example .env after this host change : + + ``` + DATABASE_URL=postgis://postgres:admin@localhost:5434/ai + CELERY_BROKER_URL="redis://localhost:6379/0" + CELERY_RESULT_BACKEND="redis://localhost:6379/0" + ``` \ No newline at end of file From 50d7b8aa9caea086dabc36e50b7fb21c3a804c3a Mon Sep 17 00:00:00 2001 From: Ettrig Date: Thu, 8 Feb 2024 19:47:56 +0100 Subject: [PATCH 2/4] New value for EXPORT_TOOL_API_URL --- docs/Docker-install-windows.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/Docker-install-windows.md b/docs/Docker-install-windows.md index a6941245..031c5d5b 100644 --- a/docs/Docker-install-windows.md +++ b/docs/Docker-install-windows.md @@ -99,6 +99,8 @@ To check whether Docker is installed, type ``` - Fill in the details of ```OSM_CLIENT_ID``` and ```OSM_CLIENT_SECRET``` in the .env file and generate a unique key and paste it to ```OSM_SECRET_KEY``` (It can be any value in a setup for development). + + - Change EXPORT_TOOL_API_URL to be https://api-prod.raw-data.hotosm.org/v1 Leave the other items as they are, unless you have a specific, well-defined need. From 59de3109aa1e5330bae71b5d88c02e51328188ec Mon Sep 17 00:00:00 2001 From: Ettrig Date: Sun, 11 Feb 2024 06:38:06 +0100 Subject: [PATCH 3/4] Resolved some of the TODOs --- docs/Docker-install-windows.md | 16 +++++++--------- 1 file changed, 7 insertions(+), 9 deletions(-) diff --git a/docs/Docker-install-windows.md b/docs/Docker-install-windows.md index 031c5d5b..f6892135 100644 --- a/docs/Docker-install-windows.md +++ b/docs/Docker-install-windows.md @@ -6,9 +6,9 @@ In the following, four directories are created: ```fair, ramp-code, ramp``` and 1. Check your graphics card - It is highly recommended to use a graphics card to run fAIr. It might not work with CPU only. (You can set up and test from bottom of this document). ***TODO*** What does this mean? NVIDIA graphics cards are tested. The bottom of this document is about a tile server, whatever that is. But it is not a GPU test?! ***TODO*** Does this mean that it works for NVIDIA cards but not for cards from other brands? + It is highly recommended to use a graphics card to run fAIr. It might not work with CPU only. (You can set up and test from bottom of this document). ***TODO*** What does this mean? The bottom of this document is about a tile server. But it is not a GPU test?! - For the local fAIr installation to work the necessary drivers for the graphics card need to be installed. + For the local fAIr installation to work, the necessary drivers for the graphics card need to be installed. By the following command you can see your graphics card and the graphics driver details and the NVIDIA container toolkit that is installed. 
There is more information on the [NVIDIA site](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html). @@ -16,7 +16,7 @@ In the following, four directories are created: ```fair, ramp-code, ramp``` and nvidia-smi ``` - You need to see your graphics card details here. ***TODO*** Does this mean that if this command shows details about my GPU, then it is OK for fAIr? + You need to see your graphics card details here. If you do, then fAIr is able to use your graphics card for model training. (The graphics card is orders of magnitude more powerful then the CPU for the heavy computations involved in model training.) We have tested fAIr on several different NVIDIA cards. It may be difficult to get it to work on graphics cards from other brands. 2. Install Docker Compose @@ -72,8 +72,7 @@ To check whether Docker is installed, type - Create environment variable TRAINING_WORKSPACE - Training workspace is the directory where fAIr will store its training files - for example. ***TODO*** Is this about a new checkpoint after fine-tuning? The name implies that this is for temporary files. + Training workspace is the directory where fAIr will store its training files, including permanent storage of training results. ``` mkdir trainings @@ -126,13 +125,13 @@ To check whether Docker is installed, type ***TODO*** What does "migration" mean in this context? - Grab API container and Open Bash: In another command window, go to directory fAIr and execute: + Grab API container and Open Bash: In another Command Prompt window, go to directory fAIr and execute: ``` docker exec -it api bash ``` - Once bash is promoted run the following commands, one at a time: ***TODO*** How do we know when bash is promoted? What does "promoted" mean in this context? + It will say "TensorFlow" with big fancy letters. On the new prompt line, run the following commands, one at a time: python manage.py makemigrations python manage.py makemigrations login @@ -141,7 +140,7 @@ To check whether Docker is installed, type 9. Play - Restart containers + In the Command Prompt window where Docker is up, stop it (^C) and restart the containers: ``` docker compose restart @@ -150,7 +149,6 @@ To check whether Docker is installed, type Frontend will be available on port 5000, Backend on 8000, and Flower on 5500. To use your local fAIr installation, go to [Local fAIr](http://127.0.0.1:3000) with your web browser. - ***TODO*** There is a list of port numbers, but it doesn't include 3000, which seems to be the most important one. Is this a bug or a feature? Extra. Do you want to run your local tiles? From a23422273973e533176b31be5c7898244afd32a5 Mon Sep 17 00:00:00 2001 From: Ettrig Date: Sun, 11 Feb 2024 21:32:29 +0100 Subject: [PATCH 4/4] Resolved one more TODO' --- docs/Docker-install-windows.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/Docker-install-windows.md b/docs/Docker-install-windows.md index f6892135..425e93e3 100644 --- a/docs/Docker-install-windows.md +++ b/docs/Docker-install-windows.md @@ -6,17 +6,17 @@ In the following, four directories are created: ```fair, ramp-code, ramp``` and 1. Check your graphics card - It is highly recommended to use a graphics card to run fAIr. It might not work with CPU only. (You can set up and test from bottom of this document). ***TODO*** What does this mean? The bottom of this document is about a tile server. But it is not a GPU test?! + It is highly recommended to use a graphics card to run fAIr. It might not work with CPU only. 
The graphics card is orders of magnitude more powerful than the CPU for the heavy computations involved in model training.

   For the local fAIr installation to work, the necessary drivers for the graphics card need to be installed. The following command shows your graphics card, the graphics driver details and the installed NVIDIA container toolkit. If it returns your graphics card details, your drivers are set up correctly. If not, either you don't have a graphics card or the drivers are not set up correctly. There is more information on the [NVIDIA site](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html) and [another NVIDIA page](https://www.nvidia.com/Download/index.aspx).

   ```
   nvidia-smi
   ```

   You need to see your graphics card details here. If you do, then fAIr is able to use your graphics card for model training.

We have tested fAIr on several different NVIDIA cards. It may be difficult to get it to work on graphics cards from other brands.

2. Install Docker Compose
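Related to the graphics-card check in step 1: an optional, quick way to confirm that Docker containers can actually reach the GPU through the NVIDIA Container Toolkit before building fAIr. This is only a hedged sketch and not part of the patches above; the ```nvidia/cuda``` image tag is just an example, and any recent tag should work.

```
REM Optional check: run nvidia-smi inside a throwaway CUDA container.
REM Assumes Docker Desktop (WSL 2 backend) and the NVIDIA Container Toolkit are installed.
REM The image tag is an example; substitute any recent nvidia/cuda tag.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If this prints the same table as running ```nvidia-smi``` directly in the Command Prompt, the NVIDIA Container Toolkit is working and containers can see the GPU.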