Merge pull request #181 from godspeedsystems/corrections
events
sakshiarora386 authored Oct 29, 2024
2 parents 8c756b4 + f1bf6fc commit 85041f9
Showing 12 changed files with 196 additions and 283 deletions.
5 changes: 2 additions & 3 deletions docs/microservices-framework/CRUD_API.md
@@ -8,8 +8,7 @@ The gen-crud-api command in Godspeed is a powerful tool that automatically generates CRUD APIs.
The framework generates CRUD API using Prisma's database model files and ORM client. It uses Godspeed's [Prisma plugin](./datasources/datasource-plugins/Prisma%20Datasource.md) as the ORM and generates **http** eventsource based CRUD APIs by default.

**Currently supported eventsources:**
- Http eventsources: [Express](./event-sources/event-source-plugins/Express%20Http%20Eventsource.md), [Fastify](./event-sources/event-source-plugins/Fastify%20Eventsource.md)
- Graphql eventsource: [Apollo Graphql](./event-sources/event-source-plugins/Apollo%20GraphQl%20Eventsource.md)

### Steps to generate CRUD API over REST and Graphql
@@ -75,7 +74,7 @@ If your schema name is **lms.prisma**, your file content should look like this.

4.1 If you already have an existing database, you can [introspect it](https://www.prisma.io/docs/getting-started/setup-prisma/add-to-existing-project/relational-databases/introspection-typescript-postgresql) and generate the Prisma model file using `prisma db pull`. This will generate your .prisma file.

4.2 Copy the generated file to the `src/datasources` folder and rename it as per the name you want to keep for this datasource. If you don't have an existing database set up with a model, then create a Prisma model file from scratch.

4.3 Make sure to note the `output` parameter in the .prisma file, which should point to the location `src/datasources/prisma-clients/<name_of_your_prisma_file>`. Add `previewFeatures` in case you want to generate metrics for Prisma queries for telemetry.
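For reference, a hypothetical generator block for an `lms.prisma` file could look like the sketch below. The provider and connection URL are placeholders — use the values for your own database:

```prisma
generator client {
  provider        = "prisma-client-js"
  output          = "./prisma-clients/lms"   // resolves under src/datasources/prisma-clients/
  previewFeatures = ["metrics"]              // only needed for telemetry metrics on queries
}

datasource db {
  provider = "postgresql"                    // placeholder: match your database
  url      = env("DATABASE_URL")
}
```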

@@ -1,4 +1,4 @@
# Create a Custom Eventsource

## About Eventsources

107 changes: 20 additions & 87 deletions docs/microservices-framework/event-sources/event-schema.md
@@ -1,12 +1,9 @@
# Event Schema
In the meta-framework world, we refer to sync events (the different APIs) and async events (e.g. Kafka, Socket, Cron) collectively as events.

The event schema, for each eventsource, closely follows the OpenAPI specification. It includes:
- The name/topic/URL of the event
- The event handler workflow (fn)
- Input and output schemas with validation error handling
- [Authorization](/docs/microservices-framework/authorization/overview.md) checks

It outlines the specific fields, data types, and structure that an event must adhere to. The schema serves as a standardized template, ensuring consistency of implementation across projects in a company where many kinds of eventsources are used.

@@ -44,16 +41,24 @@ Let's understand the first line from the above snippet `http.get./greet`.

`http`: Protocol http eventsource (can be any)

`get`: method (depends on the eventsource used; can be a topic for Kafka, for example)

`/greet`: endpoint (in case of http and graphql sources; can be a groupId in case of Kafka, for example)

We are exposing an endpoint with a `get` method on the `http` protocol. This endpoint calls an event handler called `helloworld` in the second line. Event handlers can be functions written in TypeScript, JavaScript, or YAML workflows in Godspeed's DSL format. In the above example the `helloworld` function exists in the `src/functions` directory.
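As an illustration, a minimal `helloworld` handler could look like the sketch below. The `GSContext` and `GSStatus` interfaces here are simplified stand-ins for the framework's real types (which would normally be imported from the Godspeed core package), so treat the shapes as assumptions:

```typescript
// Simplified stand-ins for Godspeed's GSContext and GSStatus types (illustrative only).
interface GSContext {
  inputs: { data: { params?: Record<string, string>; body?: unknown } };
}

interface GSStatus {
  success: boolean;
  code: number;
  data: unknown;
}

// A handler like src/functions/helloworld.ts: it receives the event context,
// reads the validated inputs, and returns a status object for the response.
function helloworld(ctx: GSContext): GSStatus {
  const name = ctx.inputs.data.params?.name ?? "world";
  return { success: true, code: 200, data: { message: `Hello ${name}!` } };
}
```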

:::tip Note
When switching between eventsources, the event schema undergoes significant changes. In the case of HTTP events, the start line includes the eventsource name, method, and path. However, for Kafka events, the start line combines the datasource name, topic name, and group ID.
:::
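The start-line convention described in this note can be sketched as a small parser. This helper is purely illustrative (not part of the framework); it relies on the fact that the first two segments of an event key never contain dots:

```typescript
// Hypothetical helper: split an event key into its three segments.
// HTTP:  {eventsource}.{method}.{path}
// Kafka: {datasource}.{topic}.{group_id}
// Cron:  cron.{schedule}.{timezone}
function parseEventKey(key: string): { source: string; methodOrTopic: string; pathOrGroup: string } {
  const firstDot = key.indexOf(".");
  const secondDot = key.indexOf(".", firstDot + 1);
  if (firstDot < 0 || secondDot < 0) throw new Error(`Malformed event key: ${key}`);
  return {
    source: key.slice(0, firstDot),            // eventsource / datasource name
    methodOrTopic: key.slice(firstDot + 1, secondDot),
    pathOrGroup: key.slice(secondDot + 1),     // URL path, Kafka group id, or timezone
  };
}
```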

## Http
Points to note:
- The first line changes for each protocol.
- You can apply multiple compatible eventsource instances in a URI, for ex. `graphql & http.get./greeting`
- Async consumers like Kafka don't need authentication or authorization, and don't have a response.
- Async events like Cron do not have any input.

<details>
<summary> Example HTTP Schema </summary>

```yaml
http.get./greet: # The first line combines the eventsource (http), the method (get), and the path (/greet) of the event.
  fn: helloworld
  responses:
    400:
      content:
        application/json:
          schema:
            type: object
    200:
      content:
        application/json:
          schema:
            type: object
```
</details>

**To get a quick understanding of the event schema, please watch the video provided below…**

<div style={{ position: 'relative', paddingBottom: '56.25%', height: 0, overflow: 'hidden' }}>
<iframe style={{ position: 'absolute', top: 0, left: 0, width: '100%', height: '100%' }} src="https://www.youtube.com/embed/WsNwInEaWFw?si=2uEG_Tp5x36v9vAB" frameborder="0" allowfullscreen></iframe>
</div>

## Kafka

The structure of Kafka event schema

> A [Kafka](https://github.com/godspeedsystems/gs-plugins/tree/main/plugins/kafka-as-datasource-as-eventsource#godspeed-plugin-kafka-as-datasource-as-eventsource) event is specified as `{datasourceName}.{topic_name}.{group_id}` in [the Kafka event specification](#example-spec-for-kafka-event).

Within the Kafka event structure, the content of the message is captured and made accessible as `inputs.body`, facilitating its integration into the handler workflow for processing.
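To illustrate, a consumer handler might read the message from `inputs.body` as sketched below. The context shape here is a simplified assumption for illustration, not the framework's actual type:

```typescript
// Simplified sketch of an async consumer handler; shapes are illustrative assumptions.
interface KafkaContext {
  inputs: { body: { message: string } }; // the Kafka message payload surfaces as inputs.body
}

// A function like src/functions/kafka_consume.ts: processes the incoming message
// and returns a result. Note async consumers have no HTTP-style response to send.
function kafkaConsume(ctx: KafkaContext): { success: boolean; data: string } {
  const processed = ctx.inputs.body.message.trim().toUpperCase();
  return { success: true, data: processed };
}
```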

### Example spec for Kafka event

``` yaml
# event to consume data from a topic
kafka.publish-producer1.kafka_proj: # event key
  id: kafka_consumer
  fn: kafka_consume
  body:
    description: The body of the query
    content:
      application/json:
        schema:
          type: string
```

## Apollo Graphql

### GraphQL Configuration
The Apollo Graphql plugin is currently configured and functions exactly the same as the Express and Fastify eventsources, except that it doesn't have swagger config and doesn't support file upload as of now.

(src/eventsources/apollo.yaml)
```yaml
type: graphql
port: 4000
# eventsource level default settings (can be overridden at event level)
authn:
authz:
on_request_validation_error:
on_response_validation_error:
```

:::tip note
Ensure the event key prefix aligns with the name of the configuration YAML file. In this example, the prefix for the event key is `apollo`, as per the yaml file name (src/eventsources/apollo.yaml). The event schema follows REST standards, resembling HTTP events.
:::

### Apollo Graphql event schema

(src/events/create_category.yaml)
```yaml
apollo.post./mongo/category: # event key having prefix apollo
  summary: Create a new Category
  description: Create Category from the database
  fn: create
  body:
    content:
      application/json:
        schema:
          type: object
          properties:
            name:
              type: string
  responses:
    content:
      application/json:
        schema:
          type: object
```

:::tip note
- The first line changes for each protocol.
- You can apply multiple compatible eventsource instances in a URI, for ex. `graphql & http.get./greeting`
- Async consumers like Kafka don't need authentication or authorization, and don't have a response.
- Async Cron does not have any input either, unlike Kafka.
- Two types of events: sync ([http](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/express-as-http/README.md), [Apollo Graphql](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/graphql-as-eventsource/README.md)) and async ([cron](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/cron-as-eventsource/README.md), [kafka](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/kafka-as-datasource-as-eventsource/README.md))
:::
@@ -57,20 +57,11 @@ In the event, we establish an HTTP endpoint that accepts parameters such as the message.
            message:
              type: string
          required: ['message']
```
#### Kafka workflow for the producer (src/functions/kafka-publish.yaml)

In the workflow, we need to mention `datasource.kafka.producer` as the function (fn) to produce data.

```yaml
id: kafka-publish
tasks:
  - id: publish
    fn: datasource.kafka.producer
    args:
      topic: "publish-producer1"
      message: <% inputs.body.message %>
```

### Example usage of EventSource (Consumer):
@@ -103,7 +94,7 @@ To use the consumer, we need to follow the below event key format.
The consumer event is triggered whenever a new message arrives on the specified topic. Upon triggering, it retrieves the incoming message and forwards it to the `kafka_consume` function. Inside this function, the incoming message is processed, and the result is then returned.
``` yaml
# event to consume data from a topic
kafka.publish-producer1.kafka_proj: # event key
  id: kafka_consumer
  fn: kafka_consume
@@ -12,7 +12,7 @@
( src/events/every_minute_task.yaml )
```yaml
# event for scheduling a task every minute
cron.* * * * *.Asia/Kolkata: # event key
  fn: every_minute

```
@@ -1,35 +1,36 @@
# Apollo Graphql Event

The GraphQL event configuration in Godspeed allows seamless integration of GraphQL APIs, emphasizing simplicity and efficiency in API development. The configuration file (apollo.yaml) specifies the GraphQL type and port, ensuring alignment with the event key prefix.

### GraphQL Configuration
The Apollo Graphql plugin is currently configured exactly the same as the Express and Fastify eventsources, except that it doesn't have swagger config and doesn't support file upload as of now.

(src/eventsources/apollo.yaml)
```yaml

type: graphql
port: 4000
# eventsource level default settings (can be overridden at event level)
authn:
authz:
on_request_validation_error:
on_response_validation_error:

```
:::tip note
Ensure the event key prefix aligns with the name of the configuration YAML file. In this example, the prefix for the event key is `apollo`, as per the yaml file name (src/eventsources/apollo.yaml). The event schema follows REST standards, resembling HTTP events.
:::

### GraphQL Event

(src/events/create_category.yaml)
```yaml
apollo.post./mongo/category: # event key having prefix apollo
  summary: Create a new Category
  description: Create Category from the database
  fn: create
  body:
    content:
      application/json:
        schema:
          type: object
          properties:
            name:
              type: string
  responses:
    content:
      application/json:
        schema:
          type: object
```

This configuration emphasizes the simplicity of implementing GraphQL within the Godspeed framework, promoting efficiency and clarity in API development.

@@ -7,10 +7,18 @@ Within the Kafka event structure, the content of the message is captured and made accessible as `inputs.body`, facilitating its integration into the handler workflow for processing.

``` yaml
# event to consume data from a topic
kafka.publish-producer1.kafka_proj: # event key
  id: kafka_consumer
  fn: kafka_consume
  body:
    content:
      application/json:
        schema:
          type: object
          properties:
            message: # the content of the message is captured here
              type: string
          required: ['message']
```
#### Example workflow consuming a Kafka event
@@ -1,19 +1,4 @@
# Event types
Based on how processing is handled, events can be classified into two types: synchronous (sync) and asynchronous (async) events, each suited to various protocols.


Expand Down
7 changes: 7 additions & 0 deletions docs/microservices-framework/event-sources/events-overview.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,7 @@
# Introduction To Events
In the realm of microservices architecture, events serve as the lifeblood of communication and coordination. Microservices can be configured to consume events from various sources, such as HTTP endpoints and messaging systems like Kafka. These events are meticulously defined, following the OpenAPI specification, and encapsulate critical information, including event names, sources, and workflow details.

In the meta-framework world, we refer to all types of sync and async events (e.g. Kafka, Socket, Cron) simply as events.
- There are two types of events: sync ([http](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/express-as-http/README.md), [Apollo Graphql](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/graphql-as-eventsource/README.md)) and async ([cron](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/cron-as-eventsource/README.md), [kafka](https://github.com/godspeedsystems/gs-plugins/blob/main/plugins/kafka-as-datasource-as-eventsource/README.md))

**We closely follow the OpenAPI specification; this is a fundamental aspect of all events that adhere to a [standard structure](/docs/microservices-framework/introduction/design-principles.md#schema-driven-development), which is one of the core design principles of Godspeed, regardless of their source or protocol.**
4 changes: 3 additions & 1 deletion docs/microservices-framework/event-sources/overview.md
@@ -5,7 +5,9 @@ title: Eventsources

Eventsources in the Godspeed framework capture events and allow you to define the entry or trigger points of your application. For ex., the `type: express` eventsource lets you expose your application through REST APIs, while a `type: cron` eventsource lets you schedule recurring calls to a workflow.

The eventsources listen on the incoming events. They process each incoming event as per the middleware set by you, including [authentication](../authentication/overview.md). Finally, they transform it to Godspeed's standard `GSCloudEvent` object, which is then made available to the event handlers and subsequent child workflows.

To have a look at supported eventsources and understand their implementation, refer to [Godspeed's gs-plugins mono-repo](https://github.com/godspeedsystems/gs-plugins), for ex. [Kafka](https://github.com/godspeedsystems/gs-plugins/tree/main/plugins/kafka-as-datasource-as-eventsource#godspeed-plugin-kafka-as-datasource-as-eventsource).
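As a rough sketch of the normalization step described above — an eventsource turning a raw request into a cloud-event-like object — consider the following. The field names are illustrative assumptions, not the real `GSCloudEvent` API:

```typescript
// Illustrative sketch: how an eventsource might normalize a raw HTTP request
// into a GSCloudEvent-like object before handing it to event handlers.
interface RawHttpRequest {
  method: string;
  path: string;
  headers: Record<string, string>;
  body: unknown;
}

interface CloudEventLike {
  type: string; // e.g. "http.get./greet" — mirrors the event key convention
  time: string; // ISO timestamp of when the event was received
  data: { body: unknown; headers: Record<string, string> };
}

function toCloudEvent(req: RawHttpRequest): CloudEventLike {
  return {
    type: `http.${req.method.toLowerCase()}.${req.path}`,
    time: new Date().toISOString(),
    data: { body: req.body, headers: req.headers },
  };
}
```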


## Types of eventsources
11 changes: 6 additions & 5 deletions docs/microservices-framework/guide/get-started.md
@@ -19,8 +19,10 @@ Whether you're having trouble with setup, configurations or understanding the framework.
:::tip
To install prerequisites and Godspeed through our Easy Installation Script, Download it from the link provided below:
:::

- [setup.bat](../../../static/setup.bat) (for Windows)
- [setup.sh](../../../static/setup.sh) (for Ubuntu)

It simplifies the installation process by checking all required tools in one go.

@@ -33,14 +35,14 @@

3. Execute the script by writing its name.
```
setup.bat
```
</details>

<details>
<summary> See How to execute this script in Ubuntu </summary>

After downloading the setup.sh file, just execute it from the shell as:

```
sudo bash setup.sh
@@ -266,4 +268,3 @@ There is a longer and detailed introduction video as well, below on this page.
<iframe style={{ position: 'absolute', top: 0, left: 0, width: '100%', height: '100%' }} src="https://www.youtube.com/embed/BTPHPoI3dh0" frameborder="0" allowfullscreen></iframe>
</div>

