Commit
Merge pull request #31 from HugoByte/next
Merge : Next to Master v0.1.0
MuhammedIrfan authored Feb 6, 2021
2 parents b034684 + 56d6123 commit c8ac3e3
Showing 39 changed files with 2,734 additions and 103 deletions.
5 changes: 5 additions & 0 deletions .dockerignore
@@ -0,0 +1,5 @@
# Excluded folders
./node_modules
./bin
./coverage
.vscode
24 changes: 24 additions & 0 deletions .github/pull_request_template.md
@@ -0,0 +1,24 @@
# Pull Request Template

## Description
* Add description

Please select the options that are relevant.

- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] This change requires a documentation update

## Checklist:

- [ ] Pull Request follows a single responsibility
- [ ] Code follows the style guidelines of this project
- [ ] Performed a self-review of my own code
- [ ] Commented my code, particularly in hard-to-understand areas
- [ ] Made corresponding changes to the documentation
- [ ] Changes generate no new warnings
- [ ] Added tests that prove the fix is effective or that the feature works
- [ ] New and existing unit tests pass locally with the changes
- [ ] Any dependent changes have been merged and published in downstream modules
- [ ] Checked the code and corrected any misspellings
19 changes: 19 additions & 0 deletions Dockerfile
@@ -0,0 +1,19 @@
FROM node:12.20.1-stretch-slim

# Define working directory
WORKDIR /app

# Copy package.json file to working directory in container
COPY package.json /app

# Install dependencies
RUN npm install

# Copy other project files to container
COPY . .

# Build the project
RUN npm run build

# Run a startup command when container starts
CMD ["npm", "start"]
62 changes: 59 additions & 3 deletions README.md
@@ -16,10 +16,66 @@
# limitations under the License.
#
-->
# Aurras Event Feed package to Source Events from Substrate based Chains
# Event Feed - Substrate

[![License](https://img.shields.io/badge/license-Apache--2.0-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0)

Aurras is a middleware that acts as an event processor and a low-code workflow orchestration platform. This project is an event source for the Aurras system, sourcing events from the chain.
### Introduction

Aurras is a middleware that acts as an event processor and a low-code workflow orchestration platform. Aurras is being pitched as a next-generation system for enabling decentralized push notifications. This middleware solution listens to events from blockchain applications and propagates them to a registered pool of MQTT brokers. The broader architecture includes a parachain from which the middleware listens for events.

This Event Feed package facilitates sourcing events from Substrate-based chains. The events are posted to the OpenWhisk system. [polkadot-js/api](https://github.com/polkadot-js/api) is used under the hood to establish the connection to blockchain nodes and receive events.

### Prerequisites

1. [Substrate Based Chain](https://substrate.dev/docs/en/tutorials/create-your-first-substrate-chain/)
2. [Openwhisk](http://openwhisk.apache.org/)

### Installation

Assuming basic dependencies such as [git](https://git-scm.com/) and [yarn](https://yarnpkg.com/) are already installed.

1. Clone the repository

```text
git clone https://github.com/HugoByte/aurras-event-feed-substrate-js.git
```

2. Navigate to the cloned directory

```text
cd aurras-event-feed-substrate-js
```

3. Install dependencies

```text
yarn install
```

### Configuration

Configuration is passed through environment variables, which are documented [here](/docs/configuration.md).
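As a rough illustration, the environment might be set up as follows before starting the feed. All values below are placeholders; the variable names come from `config/custom-environment-variables.json` in this commit, and the `LOGGERS`, `EXCLUDES`, and `KAFKA_BROKERS` values follow the semicolon-separated formats parsed by `config/helper.js`:

```shell
# Placeholder values for local development -- adjust to your setup
export CHAIN_NAME="kusama"
export CHAIN_ENDPOINT="ws://127.0.0.1:9944"
# Format: type,level[,option];... (the third argument is the filename for the file logger)
export LOGGERS="console,info;file,debug,combined.log"
# Format: section[=method,method];...
export EXCLUDES="system=ExtrinsicSuccess,ExtrinsicFailed"
export TYPES_FILE="./types.json"
export KAFKA_BROKERS="localhost:9092"
export KAFKA_TOPIC="substrate-events"
export OPENWHISK_API_KEY="<your-api-key>"
export OPENWHISK_API_HOST="http://localhost:3233"
export OPENWHISK_NAMESPACE="guest"
export EVENT_RECEIVER="<receiver-action>"
```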

### Usage

Start the feed in development mode.

```text
yarn serve
```

### Testing

Run the unit test suites.

```text
yarn test
```

### Deployment

Deployment is done through either docker-compose or Kubernetes; instructions can be found [here](https://docs.aurras.hugobyte.com/components/event-feed/event-feed-substrate/deployment).

# License
### License
Licensed under [Apache-2.0](./LICENSE)
13 changes: 13 additions & 0 deletions config/custom-environment-variables.json
@@ -0,0 +1,13 @@
{
"chainName": "CHAIN_NAME",
"chainEndpoint": "CHAIN_ENDPOINT",
"loggerConfigurations": "LOGGERS",
"sectionMethodExcludes": "EXCLUDES",
"typesLocation": "TYPES_FILE",
"kafkaBrokerConfigurations": "KAFKA_BROKERS",
"kafkaTopic": "KAFKA_TOPIC",
"openwhiskApiKey": "OPENWHISK_API_KEY",
"openwhiskApiHost": "OPENWHISK_API_HOST",
"openwhiskNamespace": "OPENWHISK_NAMESPACE",
"eventReceiver": "EVENT_RECEIVER"
}
48 changes: 48 additions & 0 deletions config/default.js
@@ -0,0 +1,48 @@
const defer = require('config/defer').deferConfig;
const { loggersHelper, excludesHelper, typesHelper, kafkaBrokersHelper } = require('./helper');

module.exports = {
// Name of the chain
chainName: undefined,

chainEndpoint: undefined,

// Loggers config fetched through environment variable
loggerConfigurations: undefined,

/**
* TODO: Because schema-to-yup does not support the array validation we need,
* we are forced to use an object rather than an array. Once array handling
* is added to schema-to-yup, we can change the transformer below to support
* different loggers as an array of key-value pairs.
*/

// Transform the config fetched through environment variable
loggers: defer(function () {
return loggersHelper(this.loggerConfigurations)
}),

// Section Methods to exclude fetched through environment variable
sectionMethodExcludes: undefined,

excludes: defer(function () {
return excludesHelper(this.sectionMethodExcludes)
}),

typesLocation: undefined,

types: defer(function () {
return typesHelper(this.typesLocation);
}),

kafkaBrokerConfigurations: undefined,

kafkaBrokers: defer(function (){
return kafkaBrokersHelper(this.kafkaBrokerConfigurations);
}),

kafkaTopic: undefined,
openwhiskApiKey: undefined,
openwhiskApiHost: undefined,
openwhiskNamespace: undefined,
eventReceiver: undefined
}
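For readers unfamiliar with the config package's `defer`, the idea is that deferred properties are functions evaluated only after all configuration sources (files and environment variables) have been merged, so derived values like `loggers` see the final `loggerConfigurations`. A minimal dependency-free sketch of that behavior (hypothetical, not the package's real internals):

```javascript
// Minimal sketch of deferred-config resolution (hypothetical -- not the
// actual internals of the `config` package): deferred properties are
// functions resolved only after all config sources have been merged.
function resolveDeferred(config) {
  const resolved = { ...config };
  for (const [key, value] of Object.entries(config)) {
    if (typeof value === 'function') resolved[key] = value.call(resolved);
  }
  return resolved;
}

// `loggerConfigurations` stands in for the value merged from the LOGGERS
// environment variable; `loggers` derives its final value from it.
const merged = {
  loggerConfigurations: 'console,info',
  loggers: function () {
    const [type, level] = this.loggerConfigurations.split(',');
    return { [type]: { enabled: true, level } };
  },
};

console.log(resolveDeferred(merged).loggers);
// → { console: { enabled: true, level: 'info' } }
```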
3 changes: 0 additions & 3 deletions config/default.json

This file was deleted.

102 changes: 102 additions & 0 deletions config/helper.js
@@ -0,0 +1,102 @@
const _ = require('lodash');
const fs = require('fs');
const path = require('path');

module.exports = {
loggersHelper: function (loggerConfigurations) {
// Split the loggers config to get independent logger configurations
var loggers = _.split(_.trim(loggerConfigurations), ";");

loggers = _.reduce(loggers, function (object, loggerConfiguration) {
// Return the accumulator if the value is empty.
if (_.isEmpty(loggerConfiguration)) return object;

// Get logger options
const logger = _.reduce(_.split(loggerConfiguration, ","), function (object, item, index) {
// Return the accumulator if the value is empty.
if (_.isEmpty(item)) return object;

// Get Logger type as key of the first object
const loggerType = Object.keys(object)[0];

// Type of Logger from the 1st argument
if (index === 0) object[item] = { enabled: true };

// Level of the logger from the 2nd argument
if (index === 1) object[loggerType]["level"] = item;

// Optional Argument based on the logger from 3rd argument
if (index === 2) {
// Map to get the option based on type of logger
const keyMap = {
file: "filename",
};

// keyMap[loggerType] to get the option based on the type of the logger
object[loggerType][keyMap[loggerType]] = item;
}
return object;
}, {});

return _.assign(object, logger);
}, {});

return loggers;
},

excludesHelper: function (sectionMethodExcludes) {
// Split the excludes config to get independent section entries
var sections = _.split(_.trim(sectionMethodExcludes), ";");

sections = _.reduce(sections, function (object, sectionMethodExclude) {
// Return the accumulator if the value is empty.
if (_.isEmpty(sectionMethodExclude)) return object;

// Split to get the section and its methods: sectionMethodExcludeSplit[0] is the section; if only specific methods need to be excluded, sectionMethodExcludeSplit[1] is the method collection
const sectionMethodExcludeSplit = _.split(_.trim(sectionMethodExclude), "=");
const section = sectionMethodExcludeSplit[0];
const methods = sectionMethodExcludeSplit[1] ? _.filter(_.split(_.trim(sectionMethodExcludeSplit[1]), ","), function (value) {
return !_.isEmpty(value);
}) : undefined;

object.push({
section,
methods
});

return object;
}, []);

return sections;
},

typesHelper: function (typesLocation) {
if (typesLocation === undefined) return;

const location = path.resolve(typesLocation);

try {
if (fs.existsSync(location)) return JSON.parse(fs.readFileSync(location, { encoding: 'utf8' }));
} catch(error) {
throw new Error("Failed to parse the provided types JSON");
}

return undefined;
},

kafkaBrokersHelper: function (kafkaBrokerConfigurations) {
// Split the brokers config to get independent broker addresses
var kafkaBrokers = _.split(_.trim(kafkaBrokerConfigurations), ";");

kafkaBrokers = _.reduce(kafkaBrokers, function (object, kafkaBrokerConfiguration) {
// Return the accumulator if the value is empty.
if (_.isEmpty(kafkaBrokerConfiguration)) return object;

object.push(kafkaBrokerConfiguration);

return object;
}, []);

return kafkaBrokers;
}
}
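To make the expected input formats concrete, here is a dependency-free sketch of the same parsing (the real helpers above use lodash, and the section/method names in the examples are just illustrative):

```javascript
// Lodash-free sketch of loggersHelper: "type,level[,option];..."
// (for the file logger, the third argument is the filename).
function parseLoggers(raw) {
  return (raw || '').trim().split(';').reduce((acc, entry) => {
    if (!entry) return acc;
    const [type, level, option] = entry.split(',');
    acc[type] = { enabled: true, level };
    if (type === 'file' && option) acc[type].filename = option;
    return acc;
  }, {});
}

// Lodash-free sketch of excludesHelper: "section[=method,method];..."
function parseExcludes(raw) {
  return (raw || '').trim().split(';').reduce((acc, entry) => {
    if (!entry) return acc;
    const [section, methodsRaw] = entry.split('=');
    const methods = methodsRaw
      ? methodsRaw.split(',').filter((m) => m.length > 0)
      : undefined;
    acc.push({ section, methods });
    return acc;
  }, []);
}

console.log(parseLoggers('console,info;file,debug,combined.log'));
// → { console: { enabled: true, level: 'info' },
//     file: { enabled: true, level: 'debug', filename: 'combined.log' } }

console.log(parseExcludes('system=ExtrinsicSuccess;balances'));
// → [ { section: 'system', methods: ['ExtrinsicSuccess'] },
//     { section: 'balances', methods: undefined } ]
```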
61 changes: 58 additions & 3 deletions config/schema.json
@@ -4,13 +4,68 @@
"type": "object",
"title": "Substrate Event Feed Config",
"properties": {
"name": {
"chainName": {
"description": "Name of the chain",
"type": "string",
"required": true,
"matches": "[a-zA-Z]+",
"min": 3,
"maxLength": 40
},
"loggers": {
"type": "object",
"properties": {
"console": {
"type": "object",
"properties": {
"enabled": {
"type": "boolean"
},
"level": {
"type": "string",
"enum": [
"info",
"debug",
"error",
"warning"
]
}
},
"required": [
"level"
]
},
"file": {
"type": "object",
"properties": {
"level": {
"type": "string",
"enum": [
"info",
"debug",
"error",
"warning"
]
},
"filename": {
"type": "string"
}
},
"required": [
"level",
"filename"
]
}
}
},
"chainEndpoint" :{
"description": "WebSocket endpoint of the chain node",
"type": "string",
"pattern": "^(ws|wss)://"
}
}
},
"required": [
"chainName",
"loggers",
"chainEndpoint"
]
}
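For reference, a configuration object satisfying this schema might look like the following (all values are illustrative):

```json
{
  "chainName": "kusama",
  "chainEndpoint": "wss://kusama-rpc.polkadot.io",
  "loggers": {
    "console": { "enabled": true, "level": "info" },
    "file": { "level": "debug", "filename": "combined.log" }
  }
}
```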