sam is the AWS CLI tool for managing Serverless applications written with AWS Serverless Application Model (SAM). SAM Local can be used to test functions locally, start a local API Gateway from a SAM template, validate a SAM template, and generate sample payloads for various event sources.
SAM Local (Beta)
- Develop and test your Lambda functions locally with sam local and Docker
- Invoke functions from known event sources such as Amazon S3, Amazon DynamoDB, Amazon Kinesis, etc.
- Start a local API Gateway from a SAM template, and quickly iterate over your functions with hot-reloading
- Validate SAM templates
Running Serverless projects and functions locally with SAM Local requires Docker to be installed and running. SAM Local will use the DOCKER_HOST environment variable to contact the Docker daemon.
- macOS: Docker for Mac
- Windows: Docker Toolbox
- Linux: Check your distro's package manager (e.g. yum install docker)
For macOS and Windows users: SAM Local requires that the project directory (or any parent directory) be listed in the Docker file sharing options.
Verify that Docker is working, and that you can run Docker commands from the CLI (e.g. docker ps). You do not need to install/fetch/pull any containers; SAM Local will do it automatically as required.
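As a quick, optional sanity check, you can list containers from the same shell you will run SAM Local in; since SAM Local honours DOCKER_HOST, this exercises the same daemon it will use:

# Verify the Docker daemon is reachable from your shell
# (an empty table is fine; a connection error means Docker is not running or DOCKER_HOST is wrong)
$ docker ps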
The easiest way to install sam is to use NPM.
npm install -g aws-sam-local
Verify the installation worked:
sam --version
If you get a permission error when using npm (such as EACCES: permission denied), please see the instructions on this page of the NPM documentation: https://docs.npmjs.com/getting-started/fixing-npm-permissions.
We also release the CLI as binaries that you can download and instantly use. You can find them under Releases in this repo. If you cannot find the version or architecture you're looking for, refer to the Build From Source section for build details.
First, install Go (v1.8+) on your machine: https://golang.org/doc/install, then run the following:
$ go get github.com/awslabs/aws-sam-local
This will install sam to your $GOPATH/bin folder. Make sure this directory is in your $PATH (or %PATH% on Windows) and you should then be able to use SAM Local. Please note that due to the package name, the binary will be installed as aws-sam-local rather than sam.
aws-sam-local --help
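If you would still rather type sam, one purely optional convenience (not something the installer sets up for you) is a shell alias:

# Optional: let the Go-built binary answer to "sam" in your shell
$ alias sam=aws-sam-local
$ sam --version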
sam requires a SAM template in order to know how to invoke your function locally, and the same is true for spawning API Gateway locally. If no template is specified, template.yaml will be used instead.
You can find sample SAM templates either under samples in this repo or by visiting the official SAM repository.
You can invoke your function locally by passing its SAM logical ID and an event file. Alternatively, sam local invoke accepts stdin as an event too.
Resources:
  Ratings: # <-- Logical ID
    Type: 'AWS::Serverless::Function'
    ...
Syntax
# Invoking function with event file
$ sam local invoke "Ratings" -e event.json
# Invoking function with event via stdin
$ echo '{"message": "Hey, are you there?" }' | sam local invoke "Ratings"
# For more options
$ sam local invoke --help
To make local development and testing of Lambda functions easier, you can generate mock/sample event payloads for the following services:
- S3
- Kinesis
- DynamoDB
- CloudWatch Scheduled Event
- CloudTrail
- API Gateway
Syntax
sam local generate-event <service>
You can also invoke an individual Lambda function locally with a sample event payload. Here's an example using S3:
sam local generate-event s3 --bucket <bucket> --key <key> | sam local invoke <function logical id>
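You can also save a generated payload to a file and reuse it with the -e flag; the bucket, key, and logical ID below are illustrative placeholders for your own values:

# Save a sample S3 event and reuse it across invocations
$ sam local generate-event s3 --bucket my-bucket --key uploads/photo.jpg > event.json
$ sam local invoke "Ratings" -e event.json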
For more options, see sam local generate-event --help.
sam local start-api spawns a local API Gateway to test HTTP request/response functionality. It features hot-reloading to allow you to quickly develop and iterate over your functions.
Syntax
sam local start-api
sam will automatically find any functions within your SAM template that have Api event sources defined, and mount them at the defined HTTP paths. In the example below, the Ratings function would mount ratings.py:handler() at /ratings for GET requests.
Ratings:
  Type: AWS::Serverless::Function
  Properties:
    Handler: ratings.handler
    Runtime: python3.6
    Events:
      Api:
        Type: Api
        Properties:
          Path: /ratings
          Method: get
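Once the local API Gateway is running, you can exercise the mounted route with any HTTP client. The sketch below assumes the default address that sam local start-api prints on startup (typically http://127.0.0.1:3000); use whatever address your run actually reports:

# Start the local API Gateway, then call the mounted route from another terminal
$ sam local start-api
$ curl http://127.0.0.1:3000/ratings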
By default, SAM uses Proxy Integration and expects the response from your Lambda function to include one or more of the following: statusCode, headers and/or body.
For example:
// Example of a Proxy Integration response
exports.handler = (event, context, callback) => {
    callback(null, {
        statusCode: 200,
        headers: { "x-custom-header": "my custom header value" },
        body: "hello world"
    });
}
For examples in other AWS Lambda languages, see this page.
If your function does not return a valid Proxy Integration response, you will get an HTTP 500 (Internal Server Error) when accessing your function. SAM Local will also print the following error log message to help you diagnose the problem:
ERROR: Function ExampleFunction returned an invalid response (must include one of: body, headers or statusCode in the response object)
Both sam local invoke and sam local start-api support local debugging of your functions.
To run SAM Local with debugging support enabled, just specify --debug-port or -d on the command line.
# Invoke a function locally in debug mode on port 5858
$ sam local invoke -d 5858 <function logical id>
# Start local API Gateway in debug mode on port 5858
$ sam local start-api -d 5858
Note: If using sam local start-api, the local API Gateway will expose all of your Lambda functions, but since you can only specify a single debug port, you can only debug one function at a time.
Here is an example showing how to debug a Node.js function with Microsoft Visual Studio Code. To set up Visual Studio Code for debugging with AWS SAM Local, use the following launch configuration:
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Attach to SAM Local",
            "type": "node",
            "request": "attach",
            "address": "localhost",
            "port": 5858,
            "localRoot": "${workspaceRoot}",
            "remoteRoot": "/var/task"
        }
    ]
}
Unlike Node.js and Java, Python requires you to enable remote debugging in your Lambda function code. If you enable debugging with --debug-port or -d for a function that uses one of the Python runtimes, SAM Local will simply map that port from your host machine through to the Lambda runtime container; you still need to enable remote debugging in your function code. To do this, use a Python package such as remote-pdb. When configuring the host the debugger listens on in your code, make sure to use 0.0.0.0 rather than 127.0.0.1, so that Docker can map the port through to your host machine.
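As a rough sketch (remote-pdb is only one option, and the port and logical ID below are illustrative, not from this repo), the idea is that your function code opens a listener on 0.0.0.0 and you pass the same port to SAM Local:

# In your Python handler, before the code you want to inspect, add something like
# (sketch using the remote-pdb package; any debugger that can bind 0.0.0.0 works):
#   from remote_pdb import RemotePdb
#   RemotePdb('0.0.0.0', 5890).set_trace()
# Then invoke the function with the matching debug port mapped through
$ sam local invoke -d 5890 "PythonFunction"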
Both sam local invoke and sam local start-api support connecting the created Lambda Docker containers to an existing Docker network. To connect the containers to an existing Docker network, use the --docker-network command-line argument or the SAM_DOCKER_NETWORK environment variable, along with the name or ID of the Docker network you wish to connect to.
# Invoke a function locally and connect to a docker network
$ sam local invoke --docker-network my-custom-network <function logical id>
# Start local API Gateway and connect all containers to a docker network
$ sam local start-api --docker-network b91847306671 -d 5858
Validate your templates with $ sam validate.
This command will validate your template against the official AWS Serverless Application Model specification.
As with most SAM Local commands, it will look for a template.yaml file in your current working directory by default. You can specify a different template file/location with the -t or --template option.
Syntax
$ sam validate
ERROR: Resource "HelloWorld", property "Runtime": Invalid value node. Valid values are "nodejs", "nodejs4.3", "nodejs6.10", "java8", "python2.7", "python3.6", "dotnetcore1.0", "nodejs4.3-edge" (line: 11; col: 6)
# Let's fix that error...
$ sed -i 's/node/nodejs6.10/g' template.yaml
$ sam validate
Valid!
Once you have developed and tested your Serverless application locally, you can deploy it to Lambda using the sam package and sam deploy commands. The package command will zip your code artifacts, upload them to S3, and produce a SAM file that is ready to be deployed to Lambda using AWS CloudFormation. The deploy command will deploy the packaged SAM template to CloudFormation. Both sam package and sam deploy are identical to their AWS CLI equivalents, aws cloudformation package and aws cloudformation deploy respectively. Please consult the AWS CLI command documentation for usage.
Example:
# Package SAM template
$ sam package --template-file sam.yaml --s3-bucket mybucket --output-template-file packaged.yaml
# Deploy packaged SAM template
$ sam deploy --template-file ./packaged.yaml --stack-name mystack --capabilities CAPABILITY_IAM
- Check out the HOWTO Guide section for more details
To use SAM Local with compiled languages such as Java that require a packaged artifact (e.g. a JAR or ZIP), you can specify the location of the artifact with the AWS::Serverless::Function CodeUri property in your SAM template.
For example:
AWSTemplateFormatVersion: 2010-09-09
Transform: AWS::Serverless-2016-10-31
Resources:
  ExampleJavaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: com.example.HelloWorldHandler
      CodeUri: ./target/HelloWorld-1.0.jar
      Runtime: java8
You should then build your JAR file using your normal build process. Please note that a JAR file used with AWS Lambda should be a shaded JAR (also known as an uber JAR) containing all of the function's dependencies.
# Build the JAR file
$ mvn package shade:shade

# Invoke with SAM Local
$ echo '{ "some": "input" }' | sam local invoke

# Or start the local API Gateway simulator
$ sam local start-api
You can find a full Java example in the samples/java folder.
SAM Local will invoke functions with your locally configured IAM credentials.
As with the AWS CLI and SDKs, SAM Local will look for credentials in the following order:
- Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY).
- The AWS credentials file (located at ~/.aws/credentials on Linux, macOS, or Unix, or at C:\Users\USERNAME\.aws\credentials on Windows).
- Instance profile credentials (if running on Amazon EC2 with an assigned instance role).
See Configuring the AWS CLI for more details.
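If you want a single invocation to run with a specific set of credentials, one straightforward approach is to supply them through the environment variables listed above; the key values and event file below are placeholders, not real credentials:

# Supply credentials via environment variables for one invocation
# (placeholder values shown; never paste real keys into scripts or docs)
$ AWS_ACCESS_KEY_ID=AKIAEXAMPLE AWS_SECRET_ACCESS_KEY=examplesecret sam local invoke "Ratings" -e event.json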
If your Lambda function uses environment variables, you can provide values for them that will be passed to the Docker container. Here is how you would do it:
For example, consider the SAM template snippet:
Resources:
  MyFunction1:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs4.3
      Environment:
        Variables:
          TABLE_NAME: prodtable
          BUCKET_NAME: prodbucket
  MyFunction2:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: nodejs4.3
      Environment:
        Variables:
          STAGE: prod
          TABLE_NAME: prodtable
Use the --env-vars argument of the invoke or start-api commands to provide a JSON file that contains values for environment variables defined in your function. The file should be structured as follows:
{
    "MyFunction1": {
        "TABLE_NAME": "localtable",
        "BUCKET_NAME": "testBucket"
    },
    "MyFunction2": {
        "TABLE_NAME": "localtable",
        "STAGE": "dev"
    }
}
$ sam local start-api --env-vars env.json
Variables defined in your shell's environment will be passed to the Docker container if they map to a variable in your Lambda function. Shell variables apply globally to all functions, i.e. if two functions have a variable called TABLE_NAME, the value for TABLE_NAME provided through the shell's environment will be available to both functions.
The following command will make the value mytable available to both MyFunction1 and MyFunction2:
$ TABLE_NAME=mytable sam local start-api
For greater control, you can use a combination of shell variables and an external environment variable file, as shown in the example after this list. If a variable is defined in both places, the value from the file will override the shell. Here is the order of priority, from highest to lowest; higher-priority sources override lower ones.
- Environment Variable file
- Shell's environment
- Hard-coded values from the template
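For example, combining both mechanisms with the env.json file shown above: the file's value for TABLE_NAME takes precedence over the shell's, while shell variables not present in the file still apply (a sketch, reusing the same function and file names as above):

# env.json overrides the shell, which overrides hard-coded template values
$ TABLE_NAME=mytable sam local start-api --env-vars env.json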
When your Lambda function is invoked using SAM Local, it sets an environment variable AWS_SAM_LOCAL=true in the Docker container. Your Lambda function can use this property to enable or disable functionality that would not make sense in local development, for example disabling the emission of metrics to CloudWatch or enabling verbose logging.
Often, it's useful to serve up static assets (e.g. CSS/HTML/JavaScript etc.) when developing a Serverless application. On AWS, this would normally be done with CloudFront/S3. By default, SAM Local looks for a ./public/ directory in your SAM project directory and will serve up all files from it at the root of the HTTP server when using sam local start-api. You can override the default static asset directory with the -s or --static-dir command line flag. You can also disable this behaviour completely by setting --static-dir "".
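For example, to serve assets from a different directory or to turn the behaviour off entirely (the ./assets path is just an illustration):

# Serve static files from ./assets instead of the default ./public
$ sam local start-api --static-dir ./assets

# Disable static file serving completely
$ sam local start-api --static-dir ""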
Both the invoke and start-api commands allow you to pipe logs from the function's invocation into a file. This is useful if you are running automated tests against SAM Local and want to capture logs for analysis.
Example:
$ sam local invoke --log-file ./output.log
SAM Local loads function code by mounting the local filesystem into a Docker volume. As a result, when using a remote Docker host, the project directory must be pre-mounted on the remote host where Docker is running. Once mounted, you can use the remote Docker host as normal by specifying --docker-volume-basedir or the SAM_DOCKER_VOLUME_BASEDIR environment variable.
Example - Docker Toolbox (Windows):
When you install and run Docker Toolbox, a Linux VM running Docker is automatically created in VirtualBox.
The /c/ path for this Linux VM is automatically shared with C:\ on the host machine.
sam local invoke --docker-volume-basedir /c/Users/shlee322/projects/test "Ratings"
- Supported AWS Lambda Runtimes
  - nodejs
  - nodejs4.3
  - nodejs6.10
  - java8
  - python2.7
  - python3.6
  - dotnetcore1.0
- AWS credential support
- Debugging support
- Inline Swagger support within SAM templates
Contributions and feedback are welcome! Proposals and pull requests will be considered and responded to. For more information, see the CONTRIBUTING file.
SAM Local uses the open source docker-lambda Docker images created by @mhart.
You can find sample function code and the SAM template used in this README under the samples folder within this repo.