Commit

adds: initial service architecture for Elasticsearch port from main backend

fixes: missing variables from port

adds: documentation updates and code cleanup

adds: corrections based on feedback and improvements

adds: port of original tests (failing WIP)

fixes: rebase issues and updates docker env
raygervais committed Mar 16, 2021
1 parent 3ee2423 commit a222715
Showing 13 changed files with 478 additions and 5 deletions.
9 changes: 9 additions & 0 deletions config/env.development
@@ -84,6 +84,15 @@ IMAGE_PORT=4444
# Image Service URL
IMAGE_URL=http://localhost/v1/image

################################################################################
# Search Service
################################################################################

# Search Service Port (default is 4445)
SEARCH_PORT=4445

# Search Service URL
SEARCH_URL=http://api.telescope.localhost/v1/search

################################################################################
# Posts Service
47 changes: 47 additions & 0 deletions docker/docker-compose.yml
@@ -68,6 +68,34 @@ services:
- 'traefik.http.middlewares.strip_auth_prefix.stripprefix.forceSlash=true'
- 'traefik.http.routers.auth.middlewares=strip_auth_prefix'

search:
container_name: 'search'
build:
context: ../src/api/search
dockerfile: Dockerfile
environment:
- ELASTIC_APM_SERVICE_NAME=search
- SEARCH_PORT

- ELASTICSEARCH_HOSTS=http://elasticsearch:9200
- ELASTIC_APM_SERVER_URL=http://apm:8200
depends_on:
- elasticsearch
- traefik
ports:
- ${SEARCH_PORT}
labels:
# Enable Traefik
- 'traefik.enable=true'
# Traefik routing for the search service at /v1/search
- 'traefik.http.routers.search.rule=PathPrefix(`/${API_VERSION}/search`)'
# Specify the search service port
- 'traefik.http.services.search.loadbalancer.server.port=${SEARCH_PORT}'
# Add middleware to this route to strip the /v1/search prefix
- 'traefik.http.middlewares.strip_search_prefix.stripprefix.prefixes=/${API_VERSION}/search'
- 'traefik.http.middlewares.strip_search_prefix.stripprefix.forceSlash=true'
- 'traefik.http.routers.search.middlewares=strip_search_prefix'

# posts service
posts:
container_name: 'posts'
@@ -157,3 +185,22 @@ services:
hard: -1
ports:
- '9200'
labels:
# Enable Traefik in development mode
- 'traefik.enable=true'
# Traefik routing for elasticsearch at http://elasticsearch.localhost
- 'traefik.http.routers.elasticsearch.rule=Host(`elasticsearch.localhost`)'
# Specify the elasticsearch port
- 'traefik.http.services.elasticsearch.loadbalancer.server.port=9200'

apm:
image: docker.elastic.co/apm/apm-server:7.10.2
container_name: 'apm'
restart: unless-stopped
environment:
- ELASTICSEARCH_HOSTS=http://elasticsearch:9200
- KIBANA_HOST=http://kibana:5601
ports:
- '8200'
depends_on:
- elasticsearch
11 changes: 6 additions & 5 deletions src/api/readme.md
@@ -41,11 +41,12 @@ npm run services:stop

## API Lookup Table

| API    | Docker Tag           | URL                        | Description                                        |
| ------ | -------------------- | -------------------------- | ------------------------------------------------- |
| posts  | telescope_posts_svc  | http://localhost/v1/posts  | Provides access to cached user posts               |
| image  | telescope_img_svc    | http://localhost/v1/image  | Provides a dynamic image processing service        |
| auth   | telescope_auth_svc   | http://localhost/v1/auth   | Provides authentication and authorization service  |
| search | telescope_search_svc | http://localhost/v1/search | Provides an ELK query controller service           |

## Support Services Lookup Table (development only)

19 changes: 19 additions & 0 deletions src/api/search/Dockerfile
@@ -0,0 +1,19 @@
FROM node:lts-alpine as base

# https://snyk.io/blog/10-best-practices-to-containerize-nodejs-web-applications-with-docker/
RUN apk add dumb-init

# TODO: feed the port in from a .env file

# Force production env, regardless of what we get from .env
ENV NODE_ENV production

WORKDIR /app

COPY --chown=node:node . .

RUN npm ci --only=production

USER node

CMD ["dumb-init", "node", "server.js"]
37 changes: 37 additions & 0 deletions src/api/search/README.md
@@ -0,0 +1,37 @@
# Search Service

The Search Service provides a direct querying controller which interfaces with our ELK stack.

## Install

```
npm install
```

## Usage

```
# normal mode
npm start
```

By default, the server runs on http://localhost:4445/.

### Examples

- `GET /query?text=Telescope&filter=post&page=0` - Returns the search results of posts containing the keyword "Telescope"
- `GET /query?text=SenecaCDOT&filter=author&page=0` - Returns the search results of authors related to the keyword "SenecaCDOT"
- `GET /healthcheck` - returns `{ "status": "ok" }` if everything is running properly

## Docker / Docker-Compose

### Docker

- To build and tag: `docker build . -t telescope_search_svc:latest`
- To run locally: `docker run -p 4445:4445 telescope_search_svc:latest`

### Docker-Compose

_Commands to be run from the `~/src/api` directory_

- To build and run: `docker-compose -f docker-compose-api-production.yml up --build search`
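
As a quick illustration of the endpoint this README describes, here is a minimal Node sketch (not part of the commit) that queries the service directly on its default port. The expected response shape follows `src/search.js` below: a total hit count plus `{ id, url }` entries. When routed through Traefik, the same request would go to `/v1/search/query` instead.

```js
// Sketch only: query the search service running locally on the default port (4445).
const http = require('http');

const url = 'http://localhost:4445/query?text=Telescope&filter=post&page=0';

http
  .get(url, (res) => {
    let body = '';
    res.on('data', (chunk) => {
      body += chunk;
    });
    res.on('end', () => {
      // Expected shape (see src/search.js):
      // { results: <total>, values: [{ id, url: '/posts/<id>' }, ...] }
      console.log(res.statusCode, JSON.parse(body));
    });
  })
  .on('error', (err) => console.error(err));
```
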
34 changes: 34 additions & 0 deletions src/api/search/package.json
@@ -0,0 +1,34 @@
{
"name": "@senecacdot/search-service",
"private": true,
"version": "1.0.0",
"description": "A service for optimizing images",
"scripts": {
"dev": "nodemon server.js | pino-pretty -c -t",
"start": "node server.js",
"test": "jest test.js"
},
"repository": "Seneca-CDOT/telescope",
"license": "BSD-2-Clause",
"bugs": {
"url": "https://github.com/Seneca-CDOT/telescope/issues"
},
"homepage": "https://github.com/Seneca-CDOT/telescope#readme",
"dependencies": {
"@elastic/elasticsearch": "^7.11.0",
"@elastic/elasticsearch-mock": "^0.3.0",
"@senecacdot/satellite": "^1.x",
"express-validator": "^6.9.2"
},
"engines": {
"node": ">=12.0.0"
},
"devDependencies": {
"del-cli": "^3.0.1",
"eslint": "^7.20.0",
"jest": "^26.6.3",
"nodemon": "^2.0.7",
"pino-pretty": "^4.5.0",
"supertest": "^6.1.3"
}
}
9 changes: 9 additions & 0 deletions src/api/search/server.js
@@ -0,0 +1,9 @@
const { Satellite } = require('@senecacdot/satellite');

const query = require('./src/routes/query');

const service = new Satellite();
service.router.use('/query', query);

const port = parseInt(process.env.SEARCH_PORT || 4445, 10);
service.start(port);
48 changes: 48 additions & 0 deletions src/api/search/src/bin/validation.js
@@ -0,0 +1,48 @@
const { check, validationResult } = require('express-validator');

const queryValidationRules = [
// text must be between 1 and 256 and not empty
check('text')
.exists({ checkFalsy: true })
.withMessage('text should not be empty')
.bail()
.isLength({ max: 256, min: 1 })
.withMessage('text should be between 1 and 256 characters')
.bail(),
// filter must exist and have a valid value
check('filter')
.exists({ checkFalsy: true })
.withMessage('filter should exist')
.bail()
.isIn(['post', 'author'])
.withMessage('invalid filter value')
.bail(),

check('perPage')
.optional()
.isInt({ min: 1, max: 10 })
.withMessage('perPage should be empty or a number between 1 and 10')
.bail(),

check('page')
.optional()
.isInt({ min: 0, max: 999 })
.withMessage('page should be empty or a number between 0 and 999')
.bail(),
];

const validateQuery = (rules) => {
return async (req, res, next) => {
await Promise.all(rules.map((rule) => rule.run(req)));

const result = validationResult(req);
if (result.isEmpty()) {
return next();
}

const errors = result.array();
return res.status(400).send(errors);
};
};

module.exports.validateQuery = validateQuery(queryValidationRules);
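
For context, a small sketch (not from the commit) of what this middleware does with an invalid request. It assumes express-validator can run its checks against a plain request-like object, and that the error fields follow express-validator 6's `result.array()` format.

```js
// Sketch only: run the query validation against a hand-built request object.
const { validateQuery } = require('./src/bin/validation');

const req = { query: { filter: 'banana' } }; // missing `text`, invalid `filter`
const res = {
  status(code) {
    this.code = code;
    return this;
  },
  send(errors) {
    // Expect 400 with entries like { msg: 'text should not be empty', param: 'text', ... }
    console.log(this.code, errors);
  },
};

validateQuery(req, res, () => console.log('query is valid'));
```
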
49 changes: 49 additions & 0 deletions src/api/search/src/elastic.js
@@ -0,0 +1,49 @@
const { ELASTICSEARCH_HOSTS } = process.env;
const { Client } = require('@elastic/elasticsearch');
const Mock = require('@elastic/elasticsearch-mock');
const { logger } = require('@senecacdot/satellite');

function MockClient(options) {
const mock = new Mock();
options.Connection = mock.getConnection();
// Mock out various responses we'll need:
mock.add(
{
method: ['PUT', 'POST', 'GET', 'DELETE'],
path: '/posts/post/:post_id',
},
() => {
return { status: 'ok' };
}
);

const client = new Client(options);
// Provide a fake health check
client.cluster.health = () => Promise.resolve();

return client;
}

// Set MOCK_ELASTIC=1 to mock, MOCK_ELASTIC= to use real elastic
const useMockElastic = process.env.MOCK_ELASTIC;

// Use either a real Elastic Client or a Mock instance, depending on env setting
const ElasticConstructor = useMockElastic ? MockClient : Client;

function createElasticClient() {
try {
const elasticUrl = ELASTICSEARCH_HOSTS;
return new ElasticConstructor({ node: elasticUrl });
} catch (error) {
const message = `Unable to parse elastic URL:PORT "${ELASTICSEARCH_HOSTS}"`;
logger.error({ error }, message);
throw new Error(message);
}
}

module.exports = {
// In case callers need to create a new elastic client
createElasticClient,
// Otherwise they can use this shared instance (most should use this)
client: createElasticClient(),
};
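
A brief sketch (not part of the commit) of exercising the mock path: because both `MOCK_ELASTIC` and `ELASTICSEARCH_HOSTS` are read at require time, they have to be set before the module is loaded.

```js
// Sketch only: force the mocked Elasticsearch client instead of a real cluster.
process.env.MOCK_ELASTIC = '1';
process.env.ELASTICSEARCH_HOSTS = 'http://localhost:9200';

const { createElasticClient } = require('./src/elastic');

const client = createElasticClient();

// cluster.health() is stubbed by MockClient to resolve without a network call.
client.cluster.health().then(() => console.log('mock cluster reports healthy'));
```
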
17 changes: 17 additions & 0 deletions src/api/search/src/routes/query.js
@@ -0,0 +1,17 @@
const { Router, logger } = require('@senecacdot/satellite');
const { validateQuery } = require('../bin/validation');
const search = require('../search');

const router = Router();

router.get('/', validateQuery, async (req, res) => {
try {
const { text, filter, page, perPage } = req.query;
res.send(await search(text, filter, page, perPage));
} catch (error) {
res.status(500).send(`There was an error while executing your query: ${error}`);
logger.error({ error }, 'Something went wrong with search indexing');
}
});

module.exports = router;
79 changes: 79 additions & 0 deletions src/api/search/src/search.js
@@ -0,0 +1,79 @@
const { ELASTIC_MAX_RESULTS_PER_PAGE = 5 } = process.env;
const { client } = require('./elastic');

const index = 'posts';
const type = 'post';

/**
* Creates the Elasticsearch fields to search based on the given filter.
* Currently 'author' and 'post' are supported; more filters may be added later.
* @param {string} filter
*/
const createFieldsFromFilter = (filter) => {
switch (filter) {
case 'author':
return ['author'];
case 'post':
default:
return ['text', 'title'];
}
};

const sortFromFilter = (filter) => {
switch (filter) {
case 'author':
return [{ published: { order: 'desc' } }];
case 'post':
default:
return undefined;
}
};

const calculateFrom = (page, perPage) => {
const ES_MAX = 10000; // 10K is the upper limit of what ES will return without issue for searches
const wanted = page * perPage;
// Don't exceed 10K, and if we will, return an offset under it by one page size
return wanted + perPage <= ES_MAX ? wanted : ES_MAX - perPage;
};

/**
* Searches the given text in Elasticsearch
* @param textToSearch
* @param filter - 'post' (default) or 'author'
* @param page - zero-based page of results to return
* @param perPage - number of results per page (defaults to ELASTIC_MAX_RESULTS_PER_PAGE)
* @return the total number of matches and the matching post ids/urls
*/
const search = async (
textToSearch,
filter = 'post',
page = 0,
perPage = ELASTIC_MAX_RESULTS_PER_PAGE
) => {
const query = {
query: {
simple_query_string: {
query: textToSearch,
default_operator: 'and',
fields: createFieldsFromFilter(filter),
},
},
sort: sortFromFilter(filter),
};

const {
body: { hits },
} = await client.search({
from: calculateFrom(page, perPage),
size: perPage,
_source: ['id'],
index,
type,
body: query,
});

return {
results: hits.total.value,
values: hits.hits.map(({ _id }) => ({ id: _id, url: `/posts/${_id}` })),
};
};

module.exports = search;
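
A quick worked example (illustrative, not part of the commit) of the `calculateFrom` clamp: with the default page size of 5, offsets grow normally until a page would read past Elasticsearch's 10,000-result window, after which the offset is pinned to the last full page inside it.

```js
// Sketch only: same arithmetic as calculateFrom in src/search.js.
const calculateFrom = (page, perPage) => {
  const ES_MAX = 10000;
  const wanted = page * perPage;
  return wanted + perPage <= ES_MAX ? wanted : ES_MAX - perPage;
};

console.log(calculateFrom(0, 5)); // 0    -> results 1..5
console.log(calculateFrom(3, 5)); // 15   -> results 16..20
console.log(calculateFrom(1999, 5)); // 9995 -> last full page inside the 10K window
console.log(calculateFrom(2000, 5)); // 9995 -> clamped; ES is never asked to read past 10,000
```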