Commit

Removed git-lfs, onboarding docs, onboarding scripts
Finished up all copy for onboarding

Updated onboarding docs/script/docker files

Added WIP front end docker file

Added engineering getting started callout

Removed git-lfs check and instructions for existing installations

Finished up all copy for onboarding

Updated onboarding docs/script/docker files

Added WIP front end docker file
sellnat77 committed Dec 29, 2019
1 parent 9197555 commit c4f2d39
Showing 11 changed files with 10,183 additions and 33 deletions.
20 changes: 7 additions & 13 deletions Dockerfile
@@ -1,20 +1,14 @@
# base image
FROM node:12.2.0-alpine
RUN npm install webpack webpack-cli -g

RUN npm install webpack -g
WORKDIR /tmp
COPY package.json /tmp/
RUN npm config set registry http://registry.npmjs.org/ && npm install

WORKDIR /app
COPY . /app/
RUN cp -a /tmp/node_modules /app/

WORKDIR /usr/src/app
COPY . /usr/src/app/
RUN cp -a /tmp/node_modules /usr/src/app/
RUN webpack

ENV NODE_ENV=production
ENV PORT=8080
ENV PORT=3000
RUN ls
CMD [ "npm", "run-script", "dev" ]

EXPOSE 8080
CMD [ "node", "server.js" ]
EXPOSE 3000
60 changes: 60 additions & 0 deletions GETTING_STARTED.md
@@ -51,7 +51,67 @@ Running this command does not mean you are _in_ the virtual environment yet. in


## Flask
For back end work we are using Sanic, a variant of Python's Flask; you can read more about the specific differences [here](https://www.fullstackpython.com/sanic.html). Yes, the naming is a bit immature, but it gives us asynchronous capabilities. The other reason we chose Python for the back end is that we will be doing a lot of data analysis, and Python is the ideal language for those operations.

## React
The front end will be written in React/Redux, since the application is pitched as a reporting dashboard with several visualizations driven by a single set of filters. If you are unfamiliar, it is recommended to start [here](https://hackernoon.com/getting-started-with-react-redux-1baae4dcb99b).

## API Secrets
Throughout the dev process we will inevitably run into secret keys or usernames/passwords. These are protected from being pushed to GitHub by the ignore file. You'll notice the entries for the following files:
```
docker-compose.yml
.env
settings.cfg
```
When cloning the repo, you will find `.example.env`, `server/src/settings.example.cfg`, and `docker-compose-example.yml`. These have dummy or default values. In order to run either the front or back end, you will need to copy the files into `.env`, `settings.cfg`, and `docker-compose.yml` respectively, then swap out the REDACTED entries with the correct values.

Now, when you commit code, the real values will never get checked in.
### Just make sure never to populate the `example` configs with secrets!
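That one-time setup can be sketched as the three `cp` commands below. To keep the snippet self-contained it first stages empty placeholder "example" files in a scratch directory; in the real repo you would run only the three `cp` lines from the repo root:

```shell
# Stage placeholder example files purely for illustration.
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p server/src Orchestration
touch .example.env server/src/settings.example.cfg Orchestration/docker-compose-example.yml

# The actual setup: copy each example config to its live counterpart,
# then edit the REDACTED values in the copies (never in the examples).
cp .example.env .env
cp server/src/settings.example.cfg server/src/settings.cfg
cp Orchestration/docker-compose-example.yml Orchestration/docker-compose.yml
```

Because only the copies are edited, the ignore file keeps the real values out of every commit.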

## Docker Compose
For ease of use when getting started on the front or back end, there is a docker compose file that will start up all of the required services. After running `docker-compose up` in the `Orchestration` folder, the following will be running:
```
postgresql: port -> 5432
adminer dashboard: port -> 8080
front end react app: port -> 3000
backend flask server: port -> 5000
```
The adminer dashboard serves as a web UI for the postgres db; if you are unfamiliar, please see [here](https://documentation.online.net/en/web/web-hosting/sql-management/configure-postgresql-adminer).

## Onboarding Script
In an attempt to alleviate as many onboarding woes as possible, we have written an onboarding script that should take care of most of the setup.
You can find the script in the root of the repo as `onboard.sh`.
If you are on Windows, please use Git Bash or an equivalent to run the script.
Once in your terminal, you can execute the script with `sh onboard.sh`.


# Advanced operations
This section details engineering tasks beyond onboarding that we perform often.

## Seeding the database
At the moment, database seeding is handled through the `(POST) /ingest` endpoint, which takes an array of strings as the body parameter. The body looks like this:
```json
{
"sets":[
"2018_MINI"
]
}
```
The assumptions prior to calling that endpoint are:
```
Running postgres db listening on 5432
csv file with data from data.lacity.org
[YearMapping] entry in settings.cfg that maps a year to the filename in server/src/static
* See 2018_MINI for reference
```
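As a sketch, the seeding call can be built with nothing but the Python standard library. The host and port are assumptions based on the docker compose defaults listed earlier (backend on 5000); `curl` works equally well:

```python
import json
import urllib.request

# Body matches the (POST) /ingest contract described above.
payload = {"sets": ["2018_MINI"]}

req = urllib.request.Request(
    "http://localhost:5000/ingest",  # assumes the compose backend port
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the stack, csv file, and settings.cfg entry are in place:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```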

## Socrata API
The raw data will often be referred to as "Socrata" (maybe I'm the only one who does that [Russell]), but Socrata is the mechanism for pulling data from data.lacity.org via an API. The cool thing about this API is that you can send SQL-style requests, including aggregates. The even cooler thing is that we can ask for a substantial amount of information by using the `$limit` parameter.
An example request looks like this:
```
https://data.lacity.org/resource/pvft-t768.csv
?$select=Address,count(*)+AS+CallVolume
&$where=date_extract_m(CreatedDate)+between+1+and+12+and+RequestType=%27Bulky%20Items%27+and+NCName=%27ARLETA%20NC%27
&$group=Address&$order=CallVolume%20DESC
&$limit=50000000
```
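The same query can be assembled programmatically. The sketch below uses only the Python standard library and reproduces the parameters above; note that `urlencode` percent-encodes the `$`, spaces, and quotes slightly differently from the hand-escaped form, which the API accepts either way:

```python
from urllib.parse import urlencode

BASE = "https://data.lacity.org/resource/pvft-t768.csv"

params = {
    "$select": "Address,count(*) AS CallVolume",
    "$where": "date_extract_m(CreatedDate) between 1 and 12 "
              "and RequestType='Bulky Items' and NCName='ARLETA NC'",
    "$group": "Address",
    "$order": "CallVolume DESC",
    "$limit": 50000000,
}

url = BASE + "?" + urlencode(params)
# Fetch with any HTTP client, e.g.:
# urllib.request.urlretrieve(url, "bulky_items.csv")
```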
File renamed without changes.
4 changes: 3 additions & 1 deletion README.md
@@ -8,7 +8,7 @@ Empower Neighborhood Associations to Improve analysis of their initiatives using
* Docker + Docker Compose

## Backend Technologies
* Python
* Python
* Sanic(Asynchronous Flask)
* Pandas
* PostgreSql
@@ -21,6 +21,8 @@ Empower Neighborhood Associations to Improve analysis of their initiatives using
* PostgreSql
* Socrata API

## 🎉🎉 Engineers start [here](https://github.com/hackforla/311-data/blob/master/GETTING_STARTED.md)! 🎉🎉🎉

## Resources
[Empower LA - HfLA Initial Meeting](https://docs.google.com/document/d/19jrYWjq_FfQbuqTfnwJFruWEo9pPF0R0qh4njDZsuzM)

98 changes: 86 additions & 12 deletions onboard.sh
@@ -41,26 +41,49 @@ if ! [ -x "$(command -v node)" ]; then
fi
echo "👍 Found node installation"

# Check for npm
if ! [ -x "$(command -v npm)" ]; then
echo "👺Error: npm is not installed." >&2
echo "Please visit this link \n\n\t https://www.npmjs.com/get-npm"
echo "Once complete, rerun this script to find missing dependencies"
exit 1
fi
echo "👍 Found npm installation"

if [ -d node_modules ]; then
echo "👍 Found node modules folder, skipping installation"
else
echo "🕺 Installing node modules"
npm install
fi


# Check for node version

## BACKEND
echo "\n\n Checking back end components"

# Check if git-lfs is installed
# Check if git-lfs is *NOT* installed
if ! [ -x "$(command -v git-lfs)" ]; then
echo "👺Error: git-lfs is not installed." >&2
echo "Please visit this link \n\n\t https://git-lfs.github.com/"
echo "Once complete, rerun this script to find missing dependencies"
exit 1
# echo "Please visit this link \n\n\t https://git-lfs.github.com/"
# echo "Once complete, rerun this script to find missing dependencies"
# exit 1
else
echo "WARN Found git-lfs installation, if you have already initialized git-lfs"
echo "WARN Please run git-lfs uninstall, currently we are not using git-lfs"
echo "WARN If errors persist, follow this to remove lfs \n\n\thttps://github.com/git-lfs/git-lfs/issues/3026#issuecomment-451598434"
fi
echo "👍 Found git-lfs installation"

# Check for postgres tools
echo "🕺 Starting postgres install to support psycopg2 in the backend"
## Need to adjust for different OS's
#sudo apt-get -y update
#sudo apt-get -y install postgresql-devel
if ! [ -x "$(command -v pg_config)" ]; then
echo "👺Error: pg_config is not installed." >&2
echo "👺👺👺 The backend required psycopg2 which requires a manual install of pg_config and libpq headers."
echo "👺👺👺 To install those, please refer to \n\n\thttp://initd.org/psycopg/docs/install.html"
exit 1
fi
echo "👍 Found pg_config installation"


# Check for python3
if ! [ -x "$(command -v python3)" ]; then
@@ -80,14 +103,65 @@ if ! [ -x "$(command -v virtualenv)" ]; then
fi
echo "👍 Found virtualenv installation"

if [ -f .env ]; then
echo "👍 Found existing environment file"
else
echo "🕺 Copying environment file into .env"
cp .example.env .env
fi

if [ -f ./server/src/settings.cfg ]; then
echo "👍 Found python settings file"
else
echo "🕺 Copying python settings file into server/src/settings.cfg"
cp ./server/src/settings.example.cfg ./server/src/settings.cfg
fi

if [ -f ./Orchestration/docker-compose.yml ]; then
echo "👍 Found docker compose file"
else
echo "🕺 Copying docker compose file into Orchestration/docker-compose.yml"
cp ./Orchestration/docker-compose-example.yml ./Orchestration/docker-compose.yml
fi


# Create virtualenv
echo "🕺 Creating virtualenv to isolate python packages"
#virtualenv -p python3 ~/.envs/311-data
if [ -d ~/.envs/311-data ]; then
echo "👍 Found virtual environment"
else
echo "🕺 Creating virtualenv to isolate python packages"
virtualenv -p python3 ~/.envs/311-data
fi

# Source virtualenv if not already
echo "\n\n\t\tIf you've gotten to this point, congratulations! Dependencies are installed!"
echo "\t\tHappy Hacking! If you have any questions, please reach out to the #311-data-dev slack channel"
echo "\n\n"

echo "🕺 Sourcing virtualenv, this will terminate the onboarding script"
# echo "🕺 Sourcing virtualenv, this will terminate the onboarding script"
#source ~/.envs/311-data/bin/activate
if [ -f ./server/src/static/2018_mini.csv ]; then
echo "👍 Mini seed file found in server/src/static/2018_mini.csv"
echo "Once the docker compose environment is setup you can send a POST request"
echo 'To /ingest with body argument of {"sets":["2018_MINI"]} to seed the db'
echo "With the first 10000 rows of the 2018 dataset"

else
echo "🕺 Downloading the first 10000 rows of the 2018 dataset for seeding"

curl 'https://data.lacity.org/resource/h65r-yf5i.csv?$select=*&$limit=10000' -o server/src/static/2018_mini.csv

echo "Once the docker compose environment is setup you can send a POST request"
echo 'To /ingest with body argument of {"sets":["2018_MINI"]} to seed the db'
echo "With the first 10000 rows of the 2018 dataset"

fi

while true; do
read -p "Do you wish to run the docker environment? (postgres, frontend, backend, adminer)?" yn
case $yn in
[Yy]* ) docker-compose -f ./Orchestration/docker-compose.yml up -d && exit;;
[Nn]* ) exit;;
* ) echo "Please answer yes or no.";;
esac
done
2 changes: 1 addition & 1 deletion package.json
@@ -22,7 +22,7 @@
},
"scripts": {
"start": "npm run dev",
"dev": "webpack-dev-server --config webpack.dev.js",
"dev": "webpack-dev-server --config webpack.dev.js --host 0.0.0.0",
"build": "webpack --config webpack.prod.js",
"test": "jest",
"predeploy": "npm run build",
15 changes: 15 additions & 0 deletions server.js
@@ -0,0 +1,15 @@
const express = require('express');
const path = require('path');
const port = process.env.PORT || 3000;
const app = express();

// the __dirname is the current directory from where the script is running
app.use(express.static(__dirname + '/dist'));

// send the user to the index html page regardless of the url
app.get('*', (req, res) => {
res.sendFile(path.resolve(__dirname, 'public/index.html'));
});

console.log(`Listening on port ${port}`)
app.listen(port);
10 changes: 9 additions & 1 deletion server/Dockerfile
@@ -1,8 +1,16 @@
FROM python:3.7-alpine

RUN echo "http://dl-8.alpinelinux.org/alpine/edge/community" >> /etc/apk/repositories

RUN apk add --no-cache --allow-untrusted --repository http://dl-3.alpinelinux.org/alpine/edge/testing hdf5 hdf5-dev
RUN apk --no-cache --update-cache add gcc musl-dev gfortran python python-dev py-pip build-base wget freetype-dev libpng-dev postgresql-dev openblas-dev \
&& pip install --no-cache-dir cython numpy
RUN ln -s /usr/include/locale.h /usr/include/xlocale.h


COPY requirements.txt /

RUN pip install -r /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt

COPY src/ /app

2 changes: 1 addition & 1 deletion server/requirements.txt
@@ -13,7 +13,7 @@ itsdangerous==1.1.0
Jinja2==2.10.3
MarkupSafe==1.1.1
multidict==4.5.2
numpy==1.17.4
numpy==1.18.0
pandas==0.25.3
psycopg2==2.8.4
python-dateutil==2.8.1
1 change: 0 additions & 1 deletion server/src/.dockerignore
@@ -1 +0,0 @@
*.csv
