Commit

Merge pull request #1 from maximerenou50/lifi

Initial commit

Maxime Renou authored Oct 27, 2023
2 parents 01f07d4 + 23c3781 commit d35485c
Showing 21 changed files with 8,561 additions and 1 deletion.
51 changes: 51 additions & 0 deletions .github/workflows/build.yml
@@ -0,0 +1,51 @@
name: Deploy to ECR

on:
  push:
    branches:
      - main
  pull_request:

jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: app
    steps:
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-central-1

      - name: Checkout
        uses: actions/checkout@v4

      - uses: actions/setup-node@v2
        with:
          node-version: 18.x

      - name: Install
        run: npm ci

      - name: Build
        run: npm run build --if-present

      - name: Unit test
        run: npm test

      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push image to Amazon ECR
        env:
          ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
          ECR_REPOSITORY: lifi
          IMAGE_TAG: latest # TODO: change to use sem versioning
        run: |
          docker build -t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG .
          docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
49 changes: 49 additions & 0 deletions .github/workflows/terraform.yml
@@ -0,0 +1,49 @@
name: Terraform

on:
  push:
    branches:
      - main
  pull_request:

jobs:
  terraform:
    name: Terraform
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
      contents: read
      packages: write
    defaults:
      run:
        working-directory: terraform
    steps:
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-central-1

      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2
        with:
          terraform_version: 1.6.2

      - name: Terraform Format
        id: fmt
        run: terraform fmt -check

      - name: Terraform Init
        id: init
        run: terraform init

      - name: Terraform Plan
        id: plan
        run: terraform plan -no-color -input=false

      - name: Terraform Apply
        run: terraform apply -auto-approve -input=false
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
node_modules
terraform/.terraform
116 changes: 115 additions & 1 deletion README.md
@@ -1 +1,115 @@
# lifi
# LI.FI Coding Challenge

Welcome to my [coding challenge](https://lifi.notion.site/lifi/Senior-DevOps-Engineer-Technical-Assignment-10a07dea54304262a3fc81752abfa806) for the LI.FI Senior DevOps Engineer application. Happy reading!

# Purpose

Run a Node.js REST application in AWS.

# Functionalities

The following functionalities have been implemented (illustrated with `curl` right after this list):

* GET /status -> returns the application status.
* GET /data -> returns the data stored in DynamoDB.
* POST /data -> stores the data in DynamoDB.
* DELETE /data/:id -> deletes the item with the given id from DynamoDB.
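
A quick way to exercise these endpoints — a sketch assuming the service runs locally on port 3000 (the port `app.js` listens on) and that an item id is taken from a previous `GET /data` response:

```bash
# Check the application status
curl http://localhost:3000/status

# Store a JSON payload in DynamoDB
curl -X POST -H "Content-Type: application/json" \
  -d '{"message": "hello"}' http://localhost:3000/data

# List all stored items
curl http://localhost:3000/data

# Delete an item by id (the id below is a placeholder from GET /data)
curl -X DELETE http://localhost:3000/data/<id-from-get-data>
```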

# Technologies

The following technologies were used for this coding challenge:

* Node.js -> part of the requirements.
* GitHub Actions -> part of the requirements.
* Terraform -> part of the requirements.
* ECS Fargate -> I decided to use ECS Fargate to run the application. I went with this technology because it's fully serverless and managed, which reduces the TCO (Total Cost of Ownership) compared to alternatives such as EKS.
* DynamoDB -> I decided to use DynamoDB to store the application data for similar reasons: it's a managed serverless service, which makes it easy to use and low maintenance. It's also very cost-efficient, as I only pay for what is used.
* CloudWatch -> I used CloudWatch to monitor the application, as it's the logical choice when using Fargate since it comes built in. CloudWatch is used for metrics and application logs (a technology like Prometheus wouldn't add much in this setup).

I'm happy to discuss my technical choices in a call and be challenged!

# CI/CD

The CI/CD pipeline uses GitHub Actions, separated into two jobs:

* build -> builds the application, runs the unit tests and pushes the Docker image to ECR.
* terraform -> initializes Terraform, runs a plan and applies it.

Both jobs run in parallel, which creates a chicken-and-egg problem for the first deployment: the ECR repo needs to be provisioned before the build job can push to ECR. Also, the Fargate task uses the `latest` tag for simplicity, which is unreliable and shouldn't be used in production. In a normal scenario, the expectation would be to build and push the Docker image before setting up the Fargate task definition - doing it that way automatically rolls out new versions.
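
As a sketch of that improvement in `.github/workflows/build.yml`, the `IMAGE_TAG` variable could use the commit SHA that GitHub Actions exposes as `github.sha`, so every build produces a unique, traceable image:

```yaml
- name: Build, tag, and push image to Amazon ECR
  env:
    ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
    ECR_REPOSITORY: lifi
    IMAGE_TAG: ${{ github.sha }} # unique tag per commit instead of `latest`
  run: |
    docker build -t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG .
    docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
```

The tag would then have to be passed on to the Terraform task definition (e.g., via a variable) so that each pipeline run rolls out the new image.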

# Infrastructure

The infrastructure is provisioned with Terraform. For the sake of simplicity, there is only one entrypoint that deploys all components:

* ECR repo.
* Networking (VPC, subnets, route tables, IG).
* DynamoDB table.
* Fargate cluster, task definition, service and IAM role.

I used separate modules to keep the different components isolated; a sketch of how the entrypoint could wire them together is shown below. Ideally, core infrastructure like networking shouldn't be part of this repo but be managed centrally.
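
A minimal illustration of that layout — module names, paths and outputs here are assumptions for the sketch, not the actual code:

```hcl
# Single entrypoint composing the isolated modules (illustrative names)
module "networking" {
  source = "./modules/networking"
}

module "dynamodb" {
  source = "./modules/dynamodb"
}

module "fargate" {
  source       = "./modules/fargate"
  subnet_ids   = module.networking.subnet_ids
  ddb_table_id = module.dynamodb.table_id
}
```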

# Improvements

There are many improvements that would be needed before considering this application production-ready:

* Add a load balancer, DNS and a certificate (ALB, Route53 & ACM).
* Autoscale the ECS service based on metrics (e.g., CPU utilization) to ensure enough capacity, and run at least 2 replicas to ensure HA.
* Set up alarms and detailed monitoring.
* Add caching using Amazon ElastiCache for Redis.
* Use semantic versioning instead of `latest`, or take the unique commit SHA from the pipeline and propagate it to the Fargate task (see the CI/CD sketch above).
* Add real unit tests once the code has more logic in it.
* Update the AWS SDK to v3 using `import` instead of `require` (see the sketch after this list).
* Learn more about Node.js and structure the application according to best practices.
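
A minimal sketch of what the SDK v3 migration could look like for the scan call in `app.js` — assuming the app is converted to an ES module so `import` and top-level `await` are available:

```js
// AWS SDK v3 replaces the monolithic `aws-sdk` package with
// per-service clients and a command-based API.
import { DynamoDBClient, ScanCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({ region: "eu-central-1" });

// Equivalent of the v2 `DynamoDB.scan(...)` callback in app.js
const data = await client.send(
  new ScanCommand({ TableName: process.env.DDB_TABLE_ID })
);
console.log(data.Items);
```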

# Local development

## Application

To run the application locally, install Node.js and npm, then run the following:

```bash
npm install
node app.js
```

Versions used to build this application:

```bash
node --version
v16.15.1

npm --version
8.11.0
```

## Infrastructure

To deploy the infrastructure, install Terraform and configure AWS credentials locally, then run the following:

```bash
terraform init
terraform apply
```
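
For the very first deployment, the chicken-and-egg problem described in the CI/CD section can be worked around locally with a targeted apply, so the ECR repo exists before the pipeline pushes an image — a sketch assuming the repo is defined in a module named `ecr` (the actual resource address may differ):

```bash
# Provision only the ECR repository first, then everything else
terraform apply -target=module.ecr -input=false
terraform apply -input=false
```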

Terraform version used to deploy the application:

```bash
terraform --version
Terraform v1.6.2
```

# Demo

Click the following link to see a demo of the usage: [Demo](https://asciinema.org/a/iAuShd97LLAOlTsMtGNqsUpkq)

Additionally, here are some screenshots of the application running:

## Fargate task

![Task definition](docs/task-definition.png "Task definition")

![Task network](docs/task-network.png "Task network")

## Metrics

![Metrics](docs/metrics.png "Metrics")
7 changes: 7 additions & 0 deletions app/Dockerfile
@@ -0,0 +1,7 @@
FROM node:18
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 8080
CMD [ "node", "app.js" ]
74 changes: 74 additions & 0 deletions app/app.js
@@ -0,0 +1,74 @@
const express = require("express");
const bodyParser = require('body-parser');
const app = express();
const AWS = require("aws-sdk");
const { v4: uuidv4 } = require('uuid');

// App configuration
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.listen(3000, () => {
  console.log("Server running on port 3000");
});

// DynamoDB
AWS.config.update({ region: 'eu-central-1' });
const DynamoDB = new AWS.DynamoDB();
const ddbTableID = process.env.DDB_TABLE_ID;

// REST API
app.get("/status", (_, res) => {
  res.json({ "status": "running" });
});

// Return all items stored in the DynamoDB table
app.get("/data", (_, res) => {
  const ddbParams = {
    TableName: ddbTableID,
  };

  DynamoDB.scan(ddbParams, function(err, data) {
    if (err) {
      res.end(JSON.stringify("Unable to scan: " + err));
    } else {
      res.end(JSON.stringify(data.Items));
    }
  });
})

// Store the request body as a new item with a generated uuid
app.post("/data", (req, res) => {
  const ddbParams = {
    TableName: ddbTableID,
    Item: {
      id: { S: uuidv4() },
      data: { S: JSON.stringify(req.body) },
    },
  };

  DynamoDB.putItem(ddbParams, function(err) {
    if (err) {
      res.end(JSON.stringify("Unable to add a new item: " + err));
    } else {
      res.end(JSON.stringify("Data saved to DynamoDB"));
    }
  });
})

// Delete the item with the given id
app.delete("/data/:id", (req, res) => {
  const ddbParams = {
    TableName: ddbTableID,
    Key: {
      id: { S: req.params.id },
    },
  };

  DynamoDB.deleteItem(ddbParams, function(err) {
    if (err) {
      res.end(JSON.stringify("Unable to delete the item: " + err));
    } else {
      res.end(JSON.stringify("Data deleted from DynamoDB"));
    }
  });
})
4 changes: 4 additions & 0 deletions app/app.test.js
@@ -0,0 +1,4 @@
// TODO: Add real unit tests
test('fake test', () => {
  expect(true).toBe(true); // that's true
});