This project helps maintainers of the Union of Concerned Scientists (UCS) Satellite Database keep track of satellites and their data. It scrapes various websites for satellite data and uses that data to create proposed changes to the UCS Satellite Database. The maintainers can then review the proposed changes and accept or reject them.
View the full demo on YouTube (sorry I was a little sick recording this)
This was a project created as part of Amber Field's CS 639 Capstone class at The University of Wisconsin - Madison. Here's the team that worked on it!
| Adam Schmidt | David Teather | Gulinazi Julati |
|---|---|---|
| Georgia Li | Rudy Banerjee | Steven Lai |
The `orbit-iq` project comprises several services, which are defined in `docker-compose.yml`:
(Note: we have a more detailed readme at ./detailed-readme.pdf; unfortunately it's not in markdown format due to project requirements, but it's pretty detailed.)
- API: This is the backend service, built from the context in `./api`. It runs on port `8080`.
- Website: This is the frontend service, built from the context in `./website`. It runs on port `3000`.
- Database (DB): A PostgreSQL database service running on port `5432`.
- init-db: A service to initialize the database.
- crawler: A service that runs every 24 hours to crawl various websites for satellite data and dump it into the `crawler_dump` table.
- validator: A service that creates proposed changes to the `official_satellites` table based on the data in `crawler_dump`, after doing some validation and data integrity checks.
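If you want to double-check which services your local compose file actually defines, `docker-compose config --services` prints one service name per line:

```
# List every service defined in docker-compose.yml
docker-compose config --services
# The output should roughly match the list above (order may vary)
```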
- Install Docker: Official Docker Installation Guide
  - Docker Desktop is recommended; it should auto-install Docker Compose too, if I remember correctly.
- Install Docker Compose: Official Docker Compose Installation Guide
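Once installed, you can sanity-check both tools from a terminal (on newer Docker Desktop installs, Compose may also be available as `docker compose` without the hyphen):

```
# Verify Docker and Docker Compose are installed and on your PATH
docker --version
docker-compose --version
```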
- Starting all services:

  ```
  docker-compose up
  ```

  We should probably run the following command instead, which forces Docker to rebuild the images so we don't run into weird caching issues where Docker reuses old images even after we've made changes:

  ```
  docker-compose up --build --force-recreate
  ```

  If you want to run the services in the background, you can use the `-d` flag:

  ```
  docker-compose up -d
  ```

  You can then view the logs using:

  ```
  docker-compose logs -f
  ```

  This command will start the `api`, `website`, `db`, and `init-db` services. However, if you are working on the website, you might want to run the website locally without Docker (details in the next section).

- Stopping the services:

  ```
  docker-compose down
  ```
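If you also want to wipe the database volume while stopping (for example, to start over from a fresh database), `docker-compose down` accepts a `-v` flag; note this deletes the persisted data described in the notes further down:

```
# Stop all services AND remove named volumes (destroys persisted DB data)
docker-compose down -v
```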
If you're developing for the website, you might not want to run the website service using Docker. Here's how you can run the other services using Docker and the website locally:
- Start only the required services:

  ```
  docker-compose up api db init-db
  ```

  This will start only the `api`, `db`, and `init-db` services.

- Navigate to the `website` directory:

  ```
  cd website
  ```

- Start the website (assuming you have Node.js installed):

  ```
  # Install dependencies
  npm install
  # Start the website
  npm run dev
  ```

  This will start the website on port 3000 (or the port specified in your configuration).

- Once done, remember to stop the Docker services:

  ```
  docker-compose down
  ```
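Once the Docker services and the local dev server are both running, a quick smoke test from another terminal can confirm each is reachable. The exact paths depend on the API's routes, so treat the URLs below as assumptions:

```
# Check that the API and website respond (root paths are assumptions; adjust as needed)
curl -I http://localhost:8080
curl -I http://localhost:3000
```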
- Build and Run the Testing Services:

  ```
  docker-compose -f docker-compose.test.yml up --build integration_api_test
  ```

- Or directly run the Testing Services (without rebuilding):

  ```
  docker-compose -f docker-compose.test.yml up integration_api_test
  ```
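If you want the command itself to fail when the tests fail (useful in CI), docker-compose's `--exit-code-from` flag makes the command exit with the chosen service's exit code; a sketch, assuming `integration_api_test` exits non-zero on failure:

```
# Propagate the test container's exit code (implies --abort-on-container-exit)
docker-compose -f docker-compose.test.yml up --build \
  --exit-code-from integration_api_test integration_api_test
```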
- Ensure that ports 8080 (API), 3000 (Website), and 5432 (PostgreSQL) are available on your machine. If the Docker logs show an error about a port being in use, stop whatever service is using that port (see the snippet after these notes for one way to find the culprit).
- The database data is persisted using Docker volumes. If you want to reset the database, you can remove the volume using `docker volume rm orbit-iq_db-data`.
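To find out which process currently holds a port, `lsof` works on macOS and Linux (Windows users can use `netstat -ano` instead); for example:

```
# Show the process listening on port 8080 (repeat for 3000 and 5432)
lsof -i :8080
```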
- `official_satellites` - The list of confirmed satellites. The goal of this table is to be a single source of truth for all confirmed satellites; this is what should be exported when UCS publishes a new list of satellites.
- `official_satellite_changelog` - A log of all changes to the `official_satellites` table. This is used to see historical changes and trends in the list of satellites.
- `crawler_dump` - Fairly unstructured raw data of whatever the web crawler finds. This raw data is used by the `validator` to create proposed changes.
- `proposed_changes` - Proposed changes that the `validator` creates by combining data from `crawler_dump` and running data integrity and validation checks, since the data the web crawler has detected may be incorrect.
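To poke at these tables directly, you can open a `psql` shell inside the running `db` container. The username and database name below are assumptions; check the `db` service's environment in `docker-compose.yml` for the real values:

```
# Open a psql shell in the db container (credentials/db name are assumptions)
docker-compose exec db psql -U postgres -d orbit_iq
# Then, inside psql: \dt lists tables, e.g. official_satellites, crawler_dump
```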