Add log section to hdfs integrations (#4632)
* Add log section to hdfs integrations

* Update README.md

* Update hdfs_namenode/README.md

Co-Authored-By: cswatt <[email protected]>

* Update hdfs_datanode/README.md

Co-Authored-By: Christine Chen <[email protected]>

* Update hdfs_namenode/README.md

Co-Authored-By: Christine Chen <[email protected]>

Co-authored-by: cswatt <[email protected]>
Co-authored-by: Christine Chen <[email protected]>
3 people authored Feb 10, 2020
1 parent 4252198 commit 35b224a
Showing 6 changed files with 84 additions and 0 deletions.
24 changes: 24 additions & 0 deletions hdfs_datanode/README.md
@@ -67,6 +67,30 @@ For containerized environments, see the [Autodiscovery Integration Templates][2]
| `<INIT_CONFIG>` | blank or `{}` |
| `<INSTANCE_CONFIG>` | `{"hdfs_datanode_jmx_uri": "http://%%host%%:50075"}` |
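To illustrate how the template values above might be supplied, here is a hedged sketch using Kubernetes pod annotations (not part of this commit; the container name `datanode` and the image are assumptions):

```yaml
# Autodiscovery template rendered as pod annotations (container/image names assumed).
apiVersion: v1
kind: Pod
metadata:
  name: hdfs-datanode
  annotations:
    ad.datadoghq.com/datanode.check_names: '["hdfs_datanode"]'
    ad.datadoghq.com/datanode.init_configs: '[{}]'
    ad.datadoghq.com/datanode.instances: '[{"hdfs_datanode_jmx_uri": "http://%%host%%:50075"}]'
spec:
  containers:
    - name: datanode
      image: example/hadoop-datanode  # placeholder image
```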

#### Log collection

**Available for Agent >6.0**

1. Collecting logs is disabled by default in the Datadog Agent. Enable it in the `datadog.yaml` file with:

```yaml
logs_enabled: true
```

2. Add this configuration block to your `hdfs_datanode.d/conf.yaml` file to start collecting your DataNode logs:

```yaml
logs:
- type: file
path: /var/log/hadoop-hdfs/*.log
source: hdfs_datanode
service: <SERVICE_NAME>
```

Change the `path` and `service` parameter values to match your environment.

3. [Restart the Agent][6].
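Hadoop emits Java stack traces that span multiple lines; a hedged refinement of the block in step 2 groups them into single log events (the `log_processing_rules` entry and its date pattern are assumptions based on the default log4j layout, not part of this commit):

```yaml
logs:
  - type: file
    path: /var/log/hadoop-hdfs/*.log
    source: hdfs_datanode
    service: <SERVICE_NAME>
    log_processing_rules:
      - type: multi_line
        name: new_log_start_with_date
        # Treat lines that start with a date (e.g. 2020-02-10) as the start of a new event.
        pattern: \d{4}\-(0?[1-9]|1[012])\-(0?[1-9]|[12][0-9]|3[01])
```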

### Validation

[Run the Agent's status subcommand][7] and look for `hdfs_datanode` under the Checks section.
17 changes: 17 additions & 0 deletions hdfs_datanode/datadog_checks/hdfs_datanode/data/conf.yaml.example
@@ -209,3 +209,20 @@ instances:
## Whether or not to persist cookies and use connection pooling for increased performance.
#
# persist_connections: false

## Log Section (Available for Agent >=6.0)
##
## type - mandatory - Type of log input source (tcp / udp / file / windows_event)
## port / path / channel_path - mandatory - Set port if type is tcp or udp. Set path if type is file. Set channel_path if type is windows_event
## service - mandatory - Name of the service that generated the log
## source - mandatory - Attribute that defines which Integration sent the logs
## sourcecategory - optional - Multiple value attribute. Used to refine the source attribute
## tags: - optional - Add tags to the collected logs
##
## Discover Datadog log collection: https://docs.datadoghq.com/logs/log_collection/
#
# logs:
# - type: file
# path: /var/log/hadoop-hdfs/*.log
# source: hdfs_datanode
# service: <SERVICE_NAME>
1 change: 1 addition & 0 deletions hdfs_datanode/manifest.json
@@ -2,6 +2,7 @@
"categories": [
"processing",
"os & system",
"log collection",
"autodiscovery"
],
"creates_events": false,
24 changes: 24 additions & 0 deletions hdfs_namenode/README.md
@@ -67,6 +67,30 @@ For containerized environments, see the [Autodiscovery Integration Templates][11
| `<INIT_CONFIG>` | blank or `{}` |
| `<INSTANCE_CONFIG>` | `{"hdfs_namenode_jmx_uri": "https://%%host%%:50070"}` |
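For a plain Docker deployment, the same template values could be expressed as container labels instead of pod annotations (a sketch under assumptions, not part of this commit; the service and image names are placeholders):

```yaml
# docker-compose fragment: Autodiscovery via Docker labels (image name assumed).
services:
  namenode:
    image: example/hadoop-namenode  # placeholder image
    labels:
      com.datadoghq.ad.check_names: '["hdfs_namenode"]'
      com.datadoghq.ad.init_configs: '[{}]'
      com.datadoghq.ad.instances: '[{"hdfs_namenode_jmx_uri": "https://%%host%%:50070"}]'
```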

#### Log collection

**Available for Agent >6.0**

1. Collecting logs is disabled by default in the Datadog Agent. Enable it in the `datadog.yaml` file with:

```yaml
logs_enabled: true
```

2. Add this configuration block to your `hdfs_namenode.d/conf.yaml` file to start collecting your NameNode logs:

```yaml
logs:
- type: file
path: /var/log/hadoop-hdfs/*.log
source: hdfs_namenode
service: <SERVICE_NAME>
```

Change the `path` and `service` parameter values to match your environment.

3. [Restart the Agent][6].
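The `path` in step 2 tails standard log4j-formatted Hadoop logs. To illustrate the line shape these settings assume, here is a small parsing sketch (the sample line and the exact layout are assumptions, not taken from this commit):

```python
import re

# Typical Hadoop log4j line: "2020-02-10 12:00:00,123 INFO org.apache...: message"
LOG_LINE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<level>[A-Z]+) (?P<logger>\S+): (?P<msg>.*)$"
)

def parse_line(line):
    """Return the timestamp, level, logger, and message fields, or None."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None

sample = ("2020-02-10 12:00:00,123 INFO "
          "org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode")
print(parse_line(sample)["level"])  # -> INFO
```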

### Validation

[Run the Agent's status subcommand][117] and look for `hdfs_namenode` under the Checks section.
17 changes: 17 additions & 0 deletions hdfs_namenode/datadog_checks/hdfs_namenode/data/conf.yaml.example
@@ -209,3 +209,20 @@ instances:
## Whether or not to persist cookies and use connection pooling for increased performance.
#
# persist_connections: false

## Log Section (Available for Agent >=6.0)
##
## type - mandatory - Type of log input source (tcp / udp / file / windows_event)
## port / path / channel_path - mandatory - Set port if type is tcp or udp. Set path if type is file. Set channel_path if type is windows_event
## service - mandatory - Name of the service that generated the log
## source - mandatory - Attribute that defines which Integration sent the logs
## sourcecategory - optional - Multiple value attribute. Used to refine the source attribute
## tags: - optional - Add tags to the collected logs
##
## Discover Datadog log collection: https://docs.datadoghq.com/logs/log_collection/
#
# logs:
# - type: file
# path: /var/log/hadoop-hdfs/*.log
# source: hdfs_namenode
# service: <SERVICE_NAME>
1 change: 1 addition & 0 deletions hdfs_namenode/manifest.json
@@ -2,6 +2,7 @@
"categories": [
"processing",
"os & system",
"log collection",
"autodiscovery"
],
"creates_events": false,
