falco-gpt

falco-gpt is an OpenAI-powered tool that generates remediation actions for Falco audit events

falco-gpt is an OpenAI-powered tool that generates remediation actions for Falco audit events. It is a simple HTTP server that listens for Falco audit events and pushes them to an internal NATS server acting as a queue. A goroutine drains the queue and sends the events to the OpenAI API, applying rate limiting and retries, and the generated remediation actions are then posted to Slack in a thread via a bot.
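
The flow can be pictured with a stripped-down sketch. This is illustrative only, not the project's actual code: the queue here is a plain Go channel rather than NATS, and analyzeAndNotify is a hypothetical placeholder for the OpenAI and Slack calls.

package main

import (
	"io"
	"log"
	"net/http"
	"time"
)

// analyzeAndNotify is a hypothetical placeholder: it would ask the OpenAI
// API for a remediation suggestion and post it to a Slack thread.
func analyzeAndNotify(event []byte) error { return nil }

func main() {
	queue := make(chan []byte, 1024) // stand-in for the internal NATS queue

	// Accept Falco http_output POSTs and buffer the raw JSON payload.
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		body, err := io.ReadAll(r.Body)
		if err != nil {
			http.Error(w, "cannot read body", http.StatusBadRequest)
			return
		}
		queue <- body
	})

	// Single worker: at most qph events per hour, a few attempts per event.
	const qph = 10
	go func() {
		ticker := time.NewTicker(time.Hour / qph)
		defer ticker.Stop()
		for event := range queue {
			<-ticker.C
			for attempt := 1; attempt <= 3; attempt++ {
				if err := analyzeAndNotify(event); err == nil {
					break
				} else if attempt == 3 {
					log.Printf("giving up on event after retries: %v", err)
				}
			}
		}
	}()

	log.Fatal(http.ListenAndServe(":8080", nil))
}

In the real project the buffer is an internal NATS server rather than a fixed-size channel, and the results are delivered to Slack in a thread.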

Screenshots

Screenshot: output-slack-4 (remediation suggestions posted to a Slack thread)

Features

  • OpenAI-powered
  • Async processing
  • Resilience with retries
  • Rate limiting (queries per hour)
  • Custom prompt template (see the hypothetical sketch after this list)
  • Limitation: only Slack is supported, for demo purposes
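
The format expected by the -template-file flag is not documented in this README; the following is only a hypothetical illustration, using Go's text/template with assumed field names (Rule, Priority, Output), of what a custom prompt built around a Falco alert might look like.

package main

import (
	"os"
	"text/template"
)

// promptTemplate and the Alert fields below are illustrative assumptions,
// not the format falco-gpt actually expects.
const promptTemplate = `You are a Kubernetes security assistant.
Explain the following Falco alert and suggest concrete remediation steps:

Rule: {{.Rule}}
Priority: {{.Priority}}
Output: {{.Output}}`

type Alert struct {
	Rule, Priority, Output string
}

func main() {
	t := template.Must(template.New("prompt").Parse(promptTemplate))
	alert := Alert{
		Rule:     "Terminal shell in container",
		Priority: "Warning",
		Output:   "A shell was spawned in a container with an attached terminal",
	}
	if err := t.Execute(os.Stdout, alert); err != nil {
		panic(err)
	}
}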

High Level Overview

                              +------------------------------------------------------+
                              |                                                      |
                              |                                        +----------+  |
                              |                             +--------->|          |  |
                              |                             |          |  OpenAI  |  |
+-------------+               | +-------------+       +-----+-------+  |    API   |  |
|             |               | |             |       |  Retryable  |<-+          |  |
|    falco    |  Send audits  | |  falco-gpt  |Push To| Rate-Limited|  +----------+  |
|  instances  +-------------->| | HTTP Server +------>|    Async    |                |
|(http_output)|    [POST]     | |   (:8080)   |Buffer |    Queue    |  +----------+  |
|             |               | |             |       |  Processor  |  |          |  |
+-------------+               | +-------------+       +------+------+  |  Slack   |  |
                              |                              |         |   BOT    |  |
                              |                              +-------->|          |  |
                              |                                        +----------+  |
                              |                                                      |
                              +------------------------------------------------------+

Installation

Prerequisites

  1. Export the required environment variables (your OpenAI API key and Slack bot token).
  2. Falco with http_output enabled:
helm upgrade --install falco falcosecurity/falco --namespace falco --create-namespace \
  --set falco.json_output=true \
  --set falco.http_output.enabled=true \
  --set falco.http_output.url=http://falco-gpt:8080

Build

Build with go

go build .

Build with ko

KO_DOCKER_REPO=<REGISTRY> LDFLAGS="-s -w" ko publish -B --platform=linux/amd64 --tags latest --push=true .

Deployment

Container Image:

furkanturkal/falco-gpt:latest

Kubernetes

envsubst < deployment.yaml | kubectl apply -n falco -f -

Usage

$ go run . <FLAGS>

  -channel string
        Slack channel
  -ignore-older int
        Ignore events in queue older than X hour(s) (default 1)
  -min-priority string
        minimum priority to analyse (default "warning")
  -model string
        Backend AI model (default "gpt-3.5-turbo")
  -port int
        port to listen on (default 8080)
  -qph int
        max queries per HOUR to OpenAI (default 10)
  -template-file string
        path to a custom prompt template file to use for ChatGPT
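
To sanity-check a locally running instance, a Falco-style alert can be posted by hand. The snippet below is only a sketch: the JSON fields mirror Falco's standard http_output payload, and it assumes the server is listening on localhost:8080 with no extra path.

package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Minimal Falco-style alert; real payloads carry more output_fields.
	event := []byte(`{
		"output": "A shell was spawned in a container with an attached terminal",
		"priority": "Warning",
		"rule": "Terminal shell in container",
		"time": "2023-05-01T10:00:00.000000000Z",
		"output_fields": {"container.name": "nginx", "k8s.ns.name": "default"}
	}`)

	resp, err := http.Post("http://localhost:8080", "application/json", bytes.NewReader(event))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}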

Disclaimer

Your audit log payloads will be sent to OpenAI to generate remediation actions. This project currently does not anonymize the audit log payloads, so keep that in mind if they may contain sensitive data. Use at your own risk. By using this tool, you agree that you are solely responsible for any and all consequences, and that the author(s) of this tool are not liable for any damages or losses of any kind.

License

falco-gpt was created by Furkan 'Dentrax' Türkal

The base project code is licensed under MIT unless otherwise specified. Please see the LICENSE file for more information.

Best Regards