Now let's create the Logstash configuration to parse our Nginx logs. First, we want to read the Nginx access logs from a file. Let's break down the input section:
input {
  file {
    path => "/var/log/nginx/access.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json
  }
}
file: We are using the file input plugin to read data from a file.
path: Specifies the path to your Nginx access log file.
start_position => "beginning": Logstash starts reading the log file from the beginning.
sincedb_path => "/dev/null": Prevents Logstash from remembering file read positions between restarts.
codec => json: Indicates that the input data is in JSON format (a sample record is shown below).
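Because we set codec => json, Logstash expects every line of the access log to already be a JSON object. A record could look like the sample below; this is purely illustrative and assumes Nginx has been configured, via a custom log_format directive, to write its access log as JSON with field names like these.

{"time_local": "10/Oct/2023:13:55:36 +0000", "remote_addr": "192.168.1.10", "request": "GET /index.html HTTP/1.1", "status": 200, "body_bytes_sent": 1024}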
The filter section defines data transformations and processing. In our case, we are parsing the time_local
field from Nginx logs. Here's the filter section:
filter {
  date {
    match => [ "time_local", "dd/MMM/yyyy:HH:mm:ss Z" ]
    target => "time"
    remove_field => ["time_local"]
  }
}
date: We use the date filter to parse the time_local field.
match: Specifies the source field (time_local) and its date format.
target => "time": Parsed timestamps are stored in the time field.
remove_field => ["time_local"]: Removes the original time_local field after parsing (a quick way to test this filter is sketched below).
The output section determines where Logstash should send the processed data. We're sending it to Elasticsearch. Here's the output section:
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    manage_template => false
    index => "nginx-access-logs-%{+YYYY.MM.dd}"
  }
}
elasticsearch: We configure the elasticsearch output plugin to send data to Elasticsearch.
hosts => "elasticsearch:9200": Specifies the Elasticsearch instance's address and port.
manage_template => false: Disables Logstash's management of index templates.
index => "nginx-access-logs-%{+YYYY.MM.dd}": Defines the index pattern for Elasticsearch. The %{+YYYY.MM.dd} adds a date-based suffix for time-based indexing (see the quick check below).
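Once the complete pipeline (the input, filter, and output sections combined into a single configuration file) is running, you can confirm that the daily indices are being created by asking Elasticsearch to list them. The command below is only a quick check and assumes Elasticsearch is reachable from your machine on localhost:9200, for example because the container's port 9200 is published to the host.

# List the Nginx access log indices along with their document counts
curl "http://localhost:9200/_cat/indices/nginx-access-logs-*?v"

If everything is wired up correctly, you should see an index named after the current date, for example nginx-access-logs-2023.10.10, with its document count growing as new log lines arrive.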