IBM Cloud Flow Logs for VPC to IBM Log Analysis

A Node.js function to send all log files generated by Flow Logs for VPC to IBM Log Analysis.

IBM Cloud® Flow Logs for VPC enables the collection, storage, and presentation of information about the Internet Protocol (IP) traffic going to and from network interfaces within your Virtual Private Cloud (VPC). The service stores collector output as log packages (.gz files) in a bucket on IBM Cloud Object Storage (COS). IBM Log Analysis can ingest those logs, along with logs from other sources (Kubernetes clusters, VMs, and so on), and display them in a single platform. To import the logs into your IBM Log Analysis instance, you set up a serverless function on IBM Cloud Functions together with a Trigger that calls your function automatically. The Trigger listens for write events on IBM Cloud Object Storage: whenever Flow Logs for VPC stores a new object in your bucket, the Trigger calls your function, which processes the log package and automatically sends it to your IBM Log Analysis instance.

Architecture Design

Before you follow the steps below, you need to install the IBM Cloud CLI and the IBM Cloud Functions plug-in on your local machine. Then, log in to your IBM Cloud account with the IBM Cloud CLI (if you haven't already done so, run ibmcloud login).
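For example, a minimal setup sketch (the region and resource group below are assumptions; substitute your own):

ibmcloud plugin install cloud-functions
ibmcloud login
ibmcloud target -r us-south -g default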

1. Clone this repository

Download the source code from GitHub and access the project folder.

git clone https://github.com/IBM/vpc-flowlogs-log-analysis.git
cd vpc-flowlogs-log-analysis

2. Create an IBM Cloud Object Storage service instance

Access the IBM Cloud Catalog and create an IBM Cloud Object Storage instance. After you create the instance, you have to create two buckets with the same Resiliency and Location (e.g., Regional and us-south):

  • One to receive the log files from Flow Logs for VPC;
  • One to store the log files after their content is sent to IBM Log Analysis.

Take note of each bucket's name, because you're going to use them in the next steps.

For your service instance, you need to create a service credential. You can find the option in the left menu of your COS instance. For the purposes of this project, you will use the apikey and resource_instance_id fields.

You can find the endpoint URL on the Endpoints tab. The correct endpoint for your use case depends on the Resiliency and Location you chose when you created your buckets. For more information, see the IBM Cloud Docs.
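If you prefer the CLI, here is a sketch of the equivalent steps (the instance and key names are hypothetical; the buckets themselves are created from the console as described above):

ibmcloud resource service-instance-create flowlogs-cos cloud-object-storage standard global
ibmcloud resource service-key-create flowlogs-cos-key Writer --instance-name flowlogs-cos
ibmcloud resource service-key flowlogs-cos-key    # prints apikey and resource_instance_id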

3. Create an IBM Log Analysis service instance

Access the IBM Cloud Catalog and create an IBM Log Analysis instance. After you create the instance, access the service by clicking the View IBM Log Analysis button.

Go to Settings -> ORGANIZATION -> API Keys to get your ingestion key.
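Alternatively, the instance can be created from the CLI; a sketch under assumed values (the instance name is hypothetical, logdna is the catalog name of the service, and 7-day is one of the available plans):

ibmcloud resource service-instance-create flowlogs-logging logdna 7-day us-south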

4. Set up the environment variables that will be deployed as the function's parameters

Run the following command with your IBM Cloud Object Storage credentials, the bucket name for long-term retention, and your IBM Log Analysis ingestion key:

  • LOG_ANALYSIS_HOSTNAME is the name reported as the source of the log lines.
  • LOG_ANALYSIS_INGESTION_KEY connects the Node.js function to the IBM Log Analysis instance.
  • LOG_ANALYSIS_REGION is the region where your IBM Log Analysis instance is running (e.g., us-south for the Dallas region).
  • COS_BUCKET_ARCHIVE is the bucket where the log package is saved after it is sent to IBM Log Analysis (consider it your long-term retention).
  • COS_APIKEY is the apikey field, generated in the service credentials of your COS instance.
  • COS_ENDPOINT is the endpoint from the Endpoints section of your COS instance. It depends on the resiliency and location of your bucket.
  • COS_INSTANCEID is the resource_instance_id field, generated in the service credentials of your COS instance.
export LOG_ANALYSIS_HOSTNAME="" \
  LOG_ANALYSIS_INGESTION_KEY="" \
  LOG_ANALYSIS_REGION="" \
  COS_BUCKET_ARCHIVE="" \
  COS_APIKEY="" \
  COS_ENDPOINT="" \
  COS_INSTANCEID=""
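
For illustration, here is the same command with hypothetical values filled in (the hostname, bucket name, and endpoint below are assumptions; use the values from your own instances):

export LOG_ANALYSIS_HOSTNAME="vpc-flowlogs" \
  LOG_ANALYSIS_INGESTION_KEY="<your ingestion key>" \
  LOG_ANALYSIS_REGION="us-south" \
  COS_BUCKET_ARCHIVE="flowlogs-archive" \
  COS_APIKEY="<apikey from your COS service credential>" \
  COS_ENDPOINT="s3.us-south.cloud-object-storage.appdomain.cloud" \
  COS_INSTANCEID="<resource_instance_id from your COS service credential>"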

5. Deploy the Action

Run the following command to deploy the handler.js function and to set up the Trigger with a cron job.

Because you are using IBM Cloud Functions, you don't need to install any packages or set up a package.json. The platform already provides all the libraries required to run the source code.

ibmcloud fn deploy --manifest manifest.yml
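
To verify the deployment, you can list the entities the manifest created (the exact action and trigger names come from the repository's manifest.yml):

ibmcloud fn action list
ibmcloud fn trigger list
ibmcloud fn rule list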

6. Set up the Cloud Object Storage Trigger on IBM Cloud Functions

Before you can create a trigger that listens for bucket change events, you must assign the Notifications Manager role to your IBM Cloud Functions namespace. Follow the instructions in the IBM Cloud Docs to assign the role in your service policy.

When you complete the previous step, you will be able to create a new COS Trigger. Go to Functions > Create > Trigger > Cloud Object Storage to create the trigger and connect your bucket to your Action.

Now your Action will be called every time a new object is uploaded to your bucket.
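
To watch the Action fire, you can poll the activation log while Flow Logs for VPC (or a manual upload) writes a new object to the bucket:

ibmcloud fn activation poll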

API Reference

Troubleshooting

  • The IBM Log Analysis ingestion API has a limit of 10 MB per request.
  • ESOCKETTIMEDOUT, ECONNRESET, and ETIMEDOUT are transient IBM Log Analysis Ingest API errors. The script automatically resends the logs, as in the sketch below.
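
The resend behavior can be illustrated with a minimal Node.js sketch (an illustration only, not the repository's actual handler code; sendFn stands for whatever function performs the ingestion request):

// Illustrative retry wrapper: resend on the transient socket errors above.
const RETRYABLE = ['ESOCKETTIMEDOUT', 'ECONNRESET', 'ETIMEDOUT'];

async function sendWithRetry(sendFn, payload, maxAttempts = 3) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await sendFn(payload);
    } catch (err) {
      // Give up on non-transient errors or after the last attempt.
      if (!RETRYABLE.includes(err.code) || attempt === maxAttempts) throw err;
      // Simple linear backoff before resending.
      await new Promise((resolve) => setTimeout(resolve, 1000 * attempt));
    }
  }
}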

LICENSE

Copyright 2021 IBM Corp.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
