Serverless is becoming popular and widely accepted in the developer community. Going serverless requires a mindset shift: you need to think stateless.
The @supercharge/hapi-aws-lambda package lets you use your hapi.js HTTP server on AWS Lambda.
This package wraps your hapi server and transforms an incoming API Gateway event into an HTTP request. The request will be injected into your hapi server and the resulting response transformed back into an API-Gateway-compatible format.
It’s basically a “done for you” package to run your hapi server in a serverless function on AWS Lambda.
This plugin requires hapi v19 (or later) and Node.js v12 (or newer).
Major Release | hapi.js version | Node.js version
---|---|---
v2 | >=19 @hapi/hapi | >=12
v1 | >=18 hapi | >=8
npm i @supercharge/hapi-aws-lambda
Using @supercharge/hapi-aws-lambda is a straightforward two-step process: compose your hapi server, then create a Lambda handler for it:
'use strict'

const Hapi = require('@hapi/hapi')
const LambdaHandler = require('@supercharge/hapi-aws-lambda')

// this `handler` will be used as a cached instance:
// a warm Lambda function will reuse it for incoming events
let handler

module.exports.handler = async event => {
  if (!handler) {
    // First, compose your hapi server with all the plugins and dependencies
    const server = new Hapi.Server()

    await server.register({
      plugin: require('@hapi/vision')
    })

    // Second, create a handler instance for your server which will
    // transform the Lambda/API Gateway event to a request, send
    // the request through your hapi server and then create
    // an API-Gateway-compatible response
    handler = LambdaHandler.for(server)
  }

  return handler.proxy(event)
}
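Before deploying, you can smoke-test the exported handler locally by invoking it with a hand-crafted API Gateway proxy event. This is only a sketch: it assumes the handler above lives in server.js (matching the serverless.yml below), and the mock event contains just the fields a typical proxy integration sends, not a complete API Gateway payload.

'use strict'

const { handler } = require('./server')

// a minimal, hypothetical API Gateway proxy event — real events carry many
// more fields (requestContext, multiValueHeaders, …)
const event = {
  httpMethod: 'GET',
  path: '/',
  headers: { 'content-type': 'application/json' },
  queryStringParameters: null,
  body: null,
  isBase64Encoded: false
}

handler(event)
  .then(response => {
    // `response` is an API-Gateway-compatible object: statusCode, headers, body, …
    console.log(response.statusCode, response.body)
  })
  .catch(console.error)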
Serving images from an HTTP server running in a Lambda function won’t work out of the box. When necessary, @supercharge/hapi-aws-lambda Base64-encodes the response data so that AWS API Gateway can handle the response body. You also need to explicitly configure binary media types in the API Gateway that is responsible for your Lambda function. We configure */* as a binary media type, as shown in the sketch below.
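If you deploy with the Serverless Framework (like the example below), one way to declare the binary media types is directly in serverless.yml. A minimal sketch, assuming a Serverless Framework version that supports the provider.apiGateway.binaryMediaTypes setting:

provider:
  name: aws
  apiGateway:
    # treat every media type as binary so Base64-encoded bodies pass through
    binaryMediaTypes:
      - '*/*'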
There’s a deployment example in the superchargejs/playground-aws-lambda repository. We used the Serverless Framework to deploy the Supercharge app in that repository. The Serverless CLI is sweet. Here’s the sample serverless.yml used to deploy the app:
service: supercharge-aws-lambda

provider:
  name: aws
  runtime: nodejs12.x
  region: eu-central-1

functions:
  app:
    handler: server.handler
    memorySize: 384 # default is 1024 MB
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'

plugins:
  - serverless-offline

custom:
  serverless-offline:
    noStripTrailingSlashInUrl: true
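With that configuration in place, running and deploying the app boils down to a few Serverless CLI commands. A short sketch, assuming your AWS credentials are configured and serverless plus serverless-offline are installed as dev dependencies:

# install the CLI and the offline plugin used above
npm i -D serverless serverless-offline

# emulate API Gateway and Lambda locally
npx serverless offline

# deploy the function and API Gateway to AWS
npx serverless deploy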
Do you miss a feature? We very much appreciate your contribution! Please send in a pull request 😊
- Create a fork
- Create your feature branch:
git checkout -b my-feature
- Commit your changes:
git commit -am 'Add some feature'
- Push to the branch:
git push origin my-feature
- Submit a pull request 🚀
MIT © Supercharge
superchargejs.com · GitHub @supercharge · Twitter @superchargejs