CloudWatch Logs Plugin for Fluentd
$ gem install fluent-plugin-cloudwatch-logs
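If you manage Fluentd plugins with Bundler, the same gem can go in a Gemfile instead (standard RubyGems usage, nothing plugin-specific):

# Gemfile
gem 'fluent-plugin-cloudwatch-logs'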
Create an IAM user with a policy like the following:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:*",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:logs:us-east-1:*:*",
        "arn:aws:s3:::*"
      ]
    }
  ]
}
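One way to create such a user from the command line, using standard AWS CLI commands (the user and policy names here are placeholders, and the policy above is assumed to be saved as policy.json):

$ aws iam create-user --user-name fluentd-cloudwatch
$ aws iam create-policy --policy-name fluentd-cloudwatch-logs --policy-document file://policy.json
$ aws iam attach-user-policy --user-name fluentd-cloudwatch --policy-arn arn:aws:iam::YOUR_ACCOUNT_ID:policy/fluentd-cloudwatch-logs
$ aws iam create-access-key --user-name fluentd-cloudwatch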
Set region and credentials:
$ export AWS_REGION=us-east-1
$ export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
$ export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_ACCESS_KEY"
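If environment variables are not an option, the credentials can instead be given per plugin instance. The parameter names below (aws_key_id, aws_sec_key, region) mirror the variables used by the test task at the end of this document; treat this snippet as an assumption about the plugin's configuration keys:

<match **>
  type cloudwatch_logs
  aws_key_id YOUR_ACCESS_KEY
  aws_sec_key YOUR_SECRET_ACCESS_KEY
  region us-east-1
  # ... remaining output options as shown below
</match>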
Start fluentd:
$ fluentd -c example/fluentd.conf
Send sample log to CloudWatch Logs:
$ echo '{"hello":"world"}' | fluent-cat test.cloudwatch_logs.out
Fetch sample log from CloudWatch Logs:
# stdout
2014-07-17 00:28:02 +0900 test.cloudwatch_logs.in: {"hello":"world"}
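For reference, an example/fluentd.conf wired for the walkthrough above might look like the following sketch (group, stream, and state-file names are placeholders; fluent-cat delivers to the forward source on its default port):

<source>
  type forward
</source>

<match test.cloudwatch_logs.out>
  type cloudwatch_logs
  log_group_name fluent-plugin-cloudwatch-example
  log_stream_name fluent-plugin-cloudwatch-example
  auto_create_stream true
</match>

<source>
  type cloudwatch_logs
  tag test.cloudwatch_logs.in
  log_group_name fluent-plugin-cloudwatch-example
  log_stream_name fluent-plugin-cloudwatch-example
  state_file /tmp/fluent-plugin-cloudwatch-example.state
</source>

<match test.cloudwatch_logs.in>
  type stdout
</match>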
Configuration for the output plugin (out_cloudwatch_logs):

<match tag>
  type cloudwatch_logs
  log_group_name log-group-name
  log_stream_name log-stream-name
  auto_create_stream true
  #message_keys key1,key2,key3,...
  #max_message_length 32768
  #use_tag_as_group false
  #use_tag_as_stream false
  #include_time_key true
  #localtime true
  #log_group_name_key group_name_key
  #log_stream_name_key stream_name_key
  #remove_log_group_name_key true
  #remove_log_stream_name_key true
</match>
- log_group_name: name of log group to store logs
- log_stream_name: name of log stream to store logs
- auto_create_stream: create log group and stream automatically
- message_keys: keys to send messages as events
- max_message_length: maximum length of the message
- max_events_per_batch: maximum number of events to send at once (default 10000)
- use_tag_as_group: use the tag as the group name
- use_tag_as_stream: use the tag as the stream name
- include_time_key: include the time key as part of the log entry (defaults to UTC)
- localtime: use the localtime timezone for include_time_key output (overrides the UTC default)
- log_group_name_key: use the specified field of records as the log group name
- log_stream_name_key: use the specified field of records as the log stream name
- remove_log_group_name_key: remove the field specified by log_group_name_key
- remove_log_stream_name_key: remove the field specified by log_stream_name_key
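For example, to route each record to a group and stream taken from its own fields, the dynamic-name parameters can be combined like this (a sketch; the field names group_name and stream_name are placeholders):

<match app.**>
  type cloudwatch_logs
  auto_create_stream true
  log_group_name_key group_name
  log_stream_name_key stream_name
  remove_log_group_name_key true
  remove_log_stream_name_key true
</match>

A record such as {"group_name":"app","stream_name":"web-1","hello":"world"} would then land in group app, stream web-1, with the two routing fields stripped from the stored event.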
Configuration for the input plugin (in_cloudwatch_logs):

<source>
  type cloudwatch_logs
  tag cloudwatch.in
  log_group_name group
  log_stream_name stream
  #use_log_stream_name_prefix true
  state_file /var/lib/fluent/group_stream.in.state
</source>
- tag: fluentd tag
- log_group_name: name of log group to fetch logs
- log_stream_name: name of log stream to fetch logs
- use_log_stream_name_prefix: use log_stream_name as a log stream name prefix (default false)
- state_file: file to store the current state (e.g. next_forward_token)
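When one group contains many similarly named streams, use_log_stream_name_prefix lets a single source follow all of them. A sketch (the group name and prefix are placeholders):

<source>
  type cloudwatch_logs
  tag cloudwatch.in
  log_group_name group
  log_stream_name web-
  use_log_stream_name_prefix true
  state_file /var/lib/fluent/group_stream.in.state
</source>

Here log_stream_name is treated as a prefix, so streams such as web-1 and web-2 would both be fetched.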
Set credentials:
$ export AWS_REGION=us-east-1
$ export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY"
$ export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_KEY"
Run tests:
$ rake test
Or, if you do not want to use an IAM role or environment variables (this is just like writing them in the configuration file):
$ rake aws_key_id=YOUR_ACCESS_KEY aws_sec_key=YOUR_SECRET_KEY region=us-east-1 test
TODO:

- out_cloudwatch_logs
  - if the data is too big for the API, split it into multiple requests
  - format
  - check data size
- in_cloudwatch_logs
  - format
  - fall back to start_time because next_token expires after 24 hours
To contribute:

- Fork it ( https://github.com/[my-github-username]/fluent-plugin-cloudwatch-logs/fork )
- Create your feature branch (git checkout -b my-new-feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin my-new-feature)
- Create a new Pull Request