feat: Add New Relic destination

fdmsantos committed Oct 17, 2022
1 parent d8ea509 commit 227d95e

Showing 9 changed files with 163 additions and 3 deletions.
24 changes: 24 additions & 0 deletions README.md
@@ -18,6 +18,8 @@ Supports all destinations and all Kinesis Firehose Features.
* [Splunk](#splunk)
* [HTTP Endpoint](#http-endpoint)
* [Datadog](#datadog)
* [New Relic](#new-relic)
* [Coralogix](#coralogix)
* [Server Side Encryption](#server-side-encryption)
* [Data Transformation with Lambda](#data-transformation-with-lambda)
* [Data Format Conversion](#data-format-conversion)
@@ -55,6 +57,7 @@ Supports all destinations and all Kinesis Firehose Features.
- Custom Http Endpoint
- DataDog
- Coralogix
- New Relic
- Data Transformation With Lambda
- Original Data Backup in S3
- Logging and Encryption
@@ -236,6 +239,25 @@ module "firehose" {
}
```

#### New Relic

**To Enabled It:** `destination = "newrelic"`

**Variables Prefix:** `http_endpoint_` and `newrelic_endpoint_type`

**Check [HTTP Endpoint](#http-endpoint) for more details and [Destinations Mapping](#destinations-mapping) to see the difference between the http_endpoint and newrelic destinations.**

```hcl
module "firehose" {
source = "fdmsantos/kinesis-firehose/aws"
version = "x.x.x"
name = "firehose-delivery-stream"
destination = "newrelic"
newrelic_endpoint_type = "metrics_eu"
http_endpoint_access_key = "<newrelic_api_key>"
}
```

#### Coralogix

**To Enabled It:** `destination = "coralogix"`
@@ -562,6 +584,7 @@ The destination variable configured in the module is mapped to a valid Firehose destination
| opensearch and elasticsearch | elasticsearch | There is no difference between the opensearch and elasticsearch destinations |
| http_endpoint | http_endpoint | |
| datadog | http_endpoint | The difference from http_endpoint is that the http_endpoint_url and http_endpoint_name variables aren't supported, and it's necessary to configure the datadog_endpoint_type variable |
| newrelic | http_endpoint | The difference from http_endpoint is that the http_endpoint_url and http_endpoint_name variables aren't supported, and it's necessary to configure the newrelic_endpoint_type variable |
| coralogix | http_endpoint | The difference from http_endpoint is that the http_endpoint_url and http_endpoint_name variables aren't supported, and it's necessary to configure the coralogix_endpoint_type variable |
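
For illustration, a minimal sketch of an equivalent New Relic stream configured through the generic `http_endpoint` destination, where the endpoint URL and name must be supplied explicitly (the URL shown is the EU metrics endpoint defined in `locals.tf`; the access key is a placeholder):

```hcl
module "firehose" {
  source  = "fdmsantos/kinesis-firehose/aws"
  version = "x.x.x"
  name    = "firehose-delivery-stream"

  # Generic HTTP endpoint destination: URL and name are set directly,
  # instead of being derived from newrelic_endpoint_type.
  destination              = "http_endpoint"
  http_endpoint_name       = "New Relic"
  http_endpoint_url        = "https://aws-api.eu01.nr-data.net/cloudwatch-metrics/v1"
  http_endpoint_access_key = "<newrelic_api_key>"
}
```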

## Examples
@@ -577,6 +600,7 @@ The destination variable configured in the module is mapped to a valid Firehose destination
- [Splunk In VPC](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/splunk/splunk-in-vpc) - Creates a Kinesis Firehose Stream with splunk in VPC as destination.
- [Custom Http Endpoint](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/custom-http-endpoint) - Creates a Kinesis Firehose Stream with custom http endpoint as destination.
- [Datadog](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/datadog) - Creates a Kinesis Firehose Stream with Datadog Europe metrics as destination.
- [New Relic](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/newrelic) - Creates a Kinesis Firehose Stream with New Relic Europe metrics as destination.
- [Coralogix](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/coralogix) - Creates a Kinesis Firehose Stream with Coralogix Ireland as destination.

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
2 changes: 1 addition & 1 deletion examples/http-endpoint/datadog/main.tf
@@ -19,7 +19,7 @@ module "firehose" {
buffer_interval = 60
datadog_endpoint_type = "metrics_eu"
http_endpoint_access_key = var.datadog_api_key
http_endpoint_retry_duration = 400
http_endpoint_retry_duration = 60
http_endpoint_enable_request_configuration = true
http_endpoint_request_configuration_content_encoding = "GZIP"
http_endpoint_request_configuration_common_attributes = [
24 changes: 24 additions & 0 deletions examples/http-endpoint/newrelic/README.md
@@ -0,0 +1,24 @@
# New Relic

Configuration in this directory creates a Kinesis Firehose stream with Direct PUT as source and New Relic as destination, using the Europe metrics URL.

This example can be tested with the demo data available in the Kinesis Firehose console.

## Usage

To run this example you need to execute:

```bash
$ terraform init
$ terraform plan
$ terraform apply
```

It's necessary to configure the following variables:

```hcl
newrelic_api_key = "<newrelic_api_key>"
```
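
For example, the API key can also be passed on the command line or through the environment instead of a `terraform.tfvars` file (the key value below is a placeholder):

```bash
$ terraform apply -var='newrelic_api_key=<newrelic_api_key>'
# or
$ export TF_VAR_newrelic_api_key='<newrelic_api_key>'
$ terraform apply
```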

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
43 changes: 43 additions & 0 deletions examples/http-endpoint/newrelic/main.tf
@@ -0,0 +1,43 @@
resource "random_pet" "this" {
length = 2
}

resource "aws_s3_bucket" "s3" {
bucket = "${var.name_prefix}-destination-bucket-${random_pet.this.id}"
force_destroy = true
}

resource "aws_kms_key" "this" {
description = "${var.name_prefix}-kms-key"
deletion_window_in_days = 7
}

module "firehose" {
source = "../../../"
name = "${var.name_prefix}-delivery-stream"
destination = "newrelic"
buffer_interval = 60
newrelic_endpoint_type = "metrics_eu"
http_endpoint_access_key = var.newrelic_api_key
http_endpoint_retry_duration = 60
http_endpoint_enable_request_configuration = true
http_endpoint_request_configuration_content_encoding = "GZIP"
http_endpoint_request_configuration_common_attributes = [
{
name = "testname"
value = "testvalue"
},
{
name = "testname2"
value = "testvalue2"
}
]
s3_backup_mode = "All"
s3_backup_prefix = "backup/"
s3_backup_bucket_arn = aws_s3_bucket.s3.arn
s3_backup_buffer_interval = 100
s3_backup_buffer_size = 100
s3_backup_compression = "GZIP"
s3_backup_enable_encryption = true
s3_backup_kms_key_arn = aws_kms_key.this.arn
}
19 changes: 19 additions & 0 deletions examples/http-endpoint/newrelic/outputs.tf
@@ -0,0 +1,19 @@
output "firehose_role" {
description = "Firehose Role"
value = module.firehose.kinesis_firehose_role_arn
}

output "kinesis_firehose_arn" {
description = "The ARN of the Kinesis Firehose Stream"
value = module.firehose.kinesis_firehose_arn
}

output "kinesis_firehose_destination_id" {
description = "The Destination id of the Kinesis Firehose Stream"
value = module.firehose.kinesis_firehose_destination_id
}

output "kinesis_firehose_version_id" {
description = "The Version id of the Kinesis Firehose Stream"
value = module.firehose.kinesis_firehose_version_id
}
11 changes: 11 additions & 0 deletions examples/http-endpoint/newrelic/variables.tf
@@ -0,0 +1,11 @@
variable "name_prefix" {
description = "Name prefix to use in resources"
type = string
default = "firehose-to-newrelic"
}

variable "newrelic_api_key" {
description = "New Relic Api Key"
type = string
sensitive = true
}
14 changes: 14 additions & 0 deletions examples/http-endpoint/newrelic/versions.tf
@@ -0,0 +1,14 @@
terraform {
required_version = ">= 0.13.1"

required_providers {
aws = {
source = "hashicorp/aws"
version = ">= 4.4"
}
random = {
source = "hashicorp/random"
version = ">= 2.0"
}
}
}
13 changes: 12 additions & 1 deletion locals.tf
@@ -12,7 +12,8 @@ locals {
splunk : "splunk",
http_endpoint : "http_endpoint",
datadog : "http_endpoint",
coralogix : "http_endpoint"
coralogix : "http_endpoint",
newrelic: "http_endpoint"
}
destination = local.destinations[var.destination]
s3_destination = local.destination == "extended_s3" ? true : false
@@ -165,12 +166,14 @@ locals {
http_endpoint : var.http_endpoint_url
datadog : local.datadog_endpoint_url[var.datadog_endpoint_type]
coralogix : local.coralogix_endpoint_url[var.coralogix_endpoint_type]
newrelic : local.newrelic_endpoint_url[var.newrelic_endpoint_type]
}

http_endpoint_name = {
http_endpoint : var.http_endpoint_name
datadog : "Datadog"
coralogix : "Coralogix"
newrelic: "New Relic"
}

http_endpoint_destinations_parameters = {
@@ -210,6 +213,14 @@
}] : []
)

# New Relic
newrelic_endpoint_url = {
logs_us : "https://aws-api.newrelic.om/firehose/v1"
logs_eu : "https://aws-api.eu.newrelic.com/firehose/v1"
metrics_us : "https://aws-api.newrelic.com/cloudwatch-metrics/v1"
metrics_eu : "https://aws-api.eu01.nr-data.net/cloudwatch-metrics/v1"
}

# Networking
firehose_cidr_blocks = {
redshift : {
16 changes: 15 additions & 1 deletion variables.tf
@@ -14,7 +14,7 @@ variable "destination" {
type = string
validation {
error_message = "Please use a valid destination!"
condition = contains(["s3", "extended_s3", "redshift", "opensearch", "elasticsearch", "splunk", "http_endpoint", "datadog", "coralogix"], var.destination)
condition = contains(["s3", "extended_s3", "redshift", "opensearch", "elasticsearch", "splunk", "http_endpoint", "datadog", "coralogix", "newrelic"], var.destination)
}
}

@@ -934,6 +934,20 @@ variable "datadog_endpoint_type" {
}
}

######
# New Relic Destination Variables
######
variable "newrelic_endpoint_type" {
description = "Endpoint type to New Relic destination"
type = string
default = "logs_eu"
validation {
error_message = "Please use a valid endpoint type!"
condition = contains(["logs_us", "logs_eu", "metrics_us", "metrics_eu"], var.newrelic_endpoint_type)
}
}


######
# Coralogix Destination Variables
######
