
Amazon Web Services

Phase 1

This phase involves creating the EC2 instance that will host your app.

  • Go to the Amazon Web Services console and select EC2.

  • Click the Launch Instance button to create an instance. Select Ubuntu 14.04 x64 as the Linux distribution.

  • Follow the remaining steps until you reach step 6, which is about configuring security groups. There, add a rule to accept all HTTP connections. See the screenshot for how it should look; an optional AWS CLI equivalent is sketched after it.

ec2_security_grp
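  • (Optional) If you have the AWS CLI installed and configured, the same HTTP rule can be added from the command line with something along these lines. The security group ID sg-xxxxxxxx is a placeholder for the group attached to your instance:
aws ec2 authorize-security-group-ingress --group-id sg-xxxxxxxx --protocol tcp --port 80 --cidr 0.0.0.0/0
# opens port 80 (HTTP) to all addresses; the SSH rule on port 22 is normally added by the launch wizard already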

  • Click Launch in step 7 and you will be presented with a dialog to create a key pair. Create a new key pair and give it a name; in this tutorial, I will use the name 'mykey'. Then download the key and keep it safe, because if you lose it, you will lose access to the server.

ec2_create_key

  • Once the instance is created, you will be forwarded to the instances list. Select the newly created instance and click the Connect button. You will see a dialog with instructions on how to connect to it using SSH.

connect_ssh_ec2

  • Open a terminal in your Downloads directory (which contains the downloaded key file) and run the commands you got from the previous step. In my case, they were -
chmod 400 mykey.pem
ssh -i "mykey.pem" [email protected]
  • You will now be in the server's shell. You will notice a message asking you to install the language pack, so run the following command.
sudo apt-get install language-pack-en
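  • (Optional) If you still see locale warnings after installing the language pack, the following commands usually fix them on Ubuntu. This is a general Ubuntu tip rather than an Open Event requirement:
sudo locale-gen en_US.UTF-8
sudo update-locale LANG=en_US.UTF-8
# log out and back in, then verify the settings with
locale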

Phase 2

The second phase is about installing Docker and Docker Compose on our cloud Ubuntu instance. First, install Docker Engine.

sudo apt-get update
sudo apt-get install apt-transport-https ca-certificates
sudo apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
echo "deb https://apt.dockerproject.org/repo ubuntu-trusty main" | sudo tee /etc/apt/sources.list.d/docker.list
sudo apt-get update
apt-cache policy docker-engine
sudo apt-get install -y docker-engine
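  • To verify that Docker Engine installed correctly, you can run a quick test with Docker's standard hello-world image (sudo is still needed at this point, since we have not joined the docker group yet):
sudo docker --version
sudo docker run hello-world
# should print "Hello from Docker!" if the installation works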
  • Then install Docker Compose.
sudo apt-get -y install python3-pip
sudo pip3 install docker-compose
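  • You can confirm the Compose installation with:
docker-compose --version
# prints something like "docker-compose version 1.x.x"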
  • Finally, add your user to the docker group so that you can run Docker commands without sudo. (Reference)
sudo usermod -a -G docker $(whoami)
  • After this step, you will have to log in again, so exit the connection and then connect with ssh once more.
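  • Once you have reconnected, you can check that the group change took effect and that Docker now works without sudo:
groups
# the output should include "docker"
docker ps
# should list containers (an empty list for now) without a permission error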

Phase 3

The third phase is about building Open Event on the server and running it. The steps are very similar to the Docker installation instructions, so it is highly recommended that you have a look at them and try to understand the process.

  • First, set the SERVER_NAME environment variable to the public DNS name of your instance.
export SERVER_NAME="ec2-52-41-207-116.us-west-2.compute.amazonaws.com"
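  • (Optional) The export only lasts for the current shell session. If you want SERVER_NAME to survive a re-login, you can append it to ~/.bashrc; this is a common shell convention, not something specific to Open Event:
echo 'export SERVER_NAME="ec2-52-41-207-116.us-west-2.compute.amazonaws.com"' >> ~/.bashrc
source ~/.bashrc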
  • Then we will build the server. This process can take some time.
git clone https://github.com/fossasia/open-event-server.git && cd open-event-server
docker-compose build
  • Then start the server.
docker-compose up
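  • (Optional) From another terminal on the server, inside the open-event-server directory, you can check which services Compose started:
docker-compose ps
# the postgres and web services (at least) should show as "Up"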
  • Now that the server is running, we have to create the database and tables. Open a new terminal on your local system in the same directory and connect to the server.
ssh -i "mykey.pem" [email protected]
# This is just an example. Use the command you got from the Connect step in Phase 1
  • In the new terminal, run the following command -
docker-compose run postgres psql -h postgres -p 5432 -U postgres --password
# Enter password as test
  • When psql opens, create the database and then quit using \q.
create database opev;
# CREATE DATABASE
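  • (Optional) Before quitting, you can confirm the database was created by listing all databases with the \l meta-command:
\l
# opev should appear in the list
\q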
  • Now we have to create the tables. In the same (2nd) terminal window, run -
docker-compose run web /bin/bash
  • When bash opens, run the following commands and exit.
python3 create_db.py
# ^^ enter the super_admin email and password when asked
python3 manage.py db stamp head
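  • (Optional) After exiting the container's bash shell, here is a quick sketch of how you could check that the tables were created (the password is test, as above):
docker-compose run postgres psql -h postgres -p 5432 -U postgres --password -d opev -c '\dt'
# should list the freshly created tables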

Notes

  • You may want to run the server in daemon mode so that it doesn't exit when the local terminal window is closed. For that, use docker-compose up -d as shown below.
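  • A few companion commands for daemon mode, run from the open-event-server directory (these are standard Compose commands, listed here for convenience):
docker-compose up -d
docker-compose logs -f     # follow the server logs
docker-compose down        # stop and remove the containers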