Incorporate review comments
karenzone committed Jul 18, 2018
1 parent 2d6a134 commit 31a77d2
Showing 1 changed file with 75 additions and 107 deletions.
182 changes: 75 additions & 107 deletions docs/static/azure-module.asciidoc
=== Azure Module [Experimental]
experimental[]

The https://azure.microsoft.com/en-us/overview/what-is-azure/[Microsoft Azure]
module in Logstash helps you easily integrate your Azure activity logs and SQL
diagnostic logs with the Elastic Stack.

You can monitor your Azure cloud environments and SQL DB deployments with
deep operational insights across multiple Azure subscriptions. You can explore
the health of your infrastructure in real-time, accelerating root cause analysis
and decreasing overall time to resolution. The Azure module helps you:

* Analyze infrastructure changes and authorization activity
* Identify suspicious behaviors and potential malicious actors
* Perform root-cause analysis by investigating user activity
* Monitor and optimize your SQL DB deployments.

NOTE: The Logstash Azure module is an
https://www.elastic.co/products/x-pack[{xpack}] feature under the Basic License
and is therefore free to use.

The Azure module consumes data from Azure Event Hubs and includes a
suite of {kib} dashboards to help you start exploring your data immediately.
[[azure-dashboards]]
==== Dashboards

These {kib} dashboards are available and ready for you to use. You can use the
dashboards as they are, or tailor them to meet your needs.

===== Infrastructure activity monitoring

* *Overview*. Top-level view into your Azure operations, including info about users, resource groups, service health, access, activities, and alerts.

* *Alerts*. Alert info, including activity, alert status (activated, resolved, succeeded), and an alerts heatmap.

* *User Activity*. Info about system users, their activity, and requests.

===== SQL database monitoring

* *SQL DB Overview*. Top-level view into your SQL databases, including counts for databases, servers, resource groups, and subscriptions.

* *SQL DB Database View*. Detailed info about each SQL database, including wait time, errors, DTU and storage utilization, size, and read and write input/output.

* *SQL DB Queries*. Info about SQL database queries, including DTU Utilization, errors, and query duration and wait time.

* *User Activity*. Info about system users, their activity, and requests.



==== Azure_event_hubs plugin

The Azure module uses the `azure_event_hubs` plugin. A basic understanding of
the plugin and its options is helpful when you set up the Azure module. See
{logstash-ref}/plugins-inputs-azure_event_hubs.html[azure_event_hubs plugin
documentation] for more information about configurations and options.
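
For orientation, here is a minimal sketch of the plugin configured as a
standalone Logstash input. The connection strings are placeholders, and the
options shown are the same ones the module exposes through its
`azure.var.input.azure_event_hubs.*` settings later in this topic.

[source,ruby]
----
input {
  azure_event_hubs {
    # One connection string per Event Hub; EntityPath identifies the hub
    event_hub_connections => ["Endpoint=sb://example1...;EntityPath=insights-logs-errors"]
    threads => 8
    consumer_group => "logstash"
    decorate_events => true
    # Azure Storage account used to persist processing state across restarts
    storage_connection => "DefaultEndpointsProtocol=https;AccountName=example...."
  }
}
----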


[[azure-module-prereqs]]
==== Prerequisites

The Elastic Stack and Microsoft Azure Event Hubs are required for this module.

[[azure-elk-prereqs]]
===== Elastic prereqs

Logstash, Elasticsearch, and Kibana should be installed and running. The
products are https://www.elastic.co/downloads[available to download] and easy to
install.

The Elastic Stack version 6.4 (or later) is required for this module.

NOTE: Logstash, Elasticsearch, and Kibana must run locally. You can also run
Elasticsearch, Kibana and Logstash on separate hosts to consume data from Azure.

[[azure-prereqs]]
===== Azure prereqs

Azure Monitor should be configured to stream logs to one or more Event Hubs.
See <<azure-resources>> at the end of this topic for links to Microsoft Azure documentation.
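
Each Event Hub that receives the stream provides a connection string that you
pass to the module. As a sketch, the namespace, key, and hub name below are
placeholders rather than values from this topic, but the shape is what the
module expects:

["source","shell"]
-----
Endpoint=sb://example-namespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>;EntityPath=insights-operational-logs
-----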

["source","shell"]
-----
bin/logstash-plugin install logstash_input_azure_event_hubs
-----

[[azure-module-setup]]
==== Set up the module

Modify this command for your environment, and run it from the Logstash
directory.

["source","shell",subs="attributes"]
-----
Expand All @@ -109,21 +115,35 @@ NOTE: The `--setup` option is intended only for first-time setup. If you include
You can specify <<azure_config_options, options>> for the Logstash Azure module in the
`logstash.yml` configuration file or with overrides through the command line.

* *Command line configuration.* You can pass configuration options and run the module from the command line.
The command line configuration uses the configuration options you provide and supports only one Event Hub.

* *Basic configuration.* You can use the `logstash.yml` file to configure inputs from multiple Event Hubs that share the same configuration.

* *Advanced configuration.* The advanced configuration is available for deployments where different Event Hubs
require different configurations. The `logstash.yml` file holds your settings. Advanced configuration is not necessary or
recommended for most use cases.

See {logstash-ref}/plugins-inputs-azure_event_hubs.html[azure_event_hubs plugin
documentation] for more information about basic and advanced configuration
models.

===== Command line configuration sample

You can use the command line to set up the basic configuration for a single
Event Hub. This command starts the Azure module with command line arguments,
bypassing any settings in `logstash.yml`.

["source","shell",subs="attributes"]
-----
bin/logstash --modules azure -M "azure.var.elasticsearch.host=es.mycloud.com" -M "azure.var.input.azure_event_hubs.threads=8" -M "azure.var.input.azure_event_hubs.consumer_group=logstash" -M "azure.var.input.azure_event_hubs.decorate_events=true" -M "azure.var.input.azure_event_hubs.event_hub_connections=Endpoint=sb://example1...EntityPath=insights-logs-errors" -M "azure.var.input.azure_event_hubs.storage_connection=DefaultEndpointsProtocol=https;AccountName=example...."
-----
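
Each `-M` flag overrides one module setting and follows the pattern
`azure.var.<setting>=<value>`. For readability, the same command can be written
with shell line continuations; this is only a formatting variant of the command
above, not an additional option set.

["source","shell",subs="attributes"]
-----
bin/logstash --modules azure \
  -M "azure.var.elasticsearch.host=es.mycloud.com" \
  -M "azure.var.input.azure_event_hubs.threads=8" \
  -M "azure.var.input.azure_event_hubs.consumer_group=logstash" \
  -M "azure.var.input.azure_event_hubs.decorate_events=true" \
  -M "azure.var.input.azure_event_hubs.event_hub_connections=Endpoint=sb://example1...EntityPath=insights-logs-errors" \
  -M "azure.var.input.azure_event_hubs.storage_connection=DefaultEndpointsProtocol=https;AccountName=example...."
-----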


===== Basic configuration sample

The configuration in the `logstash.yml` file is shared between Event Hubs.

["source","shell",subs="attributes"]
-----
Expand All @@ -140,20 +160,10 @@ modules:
- "Endpoint=sb://example2...EntityPath=insights-metrics-pt1m"
-----


===== Advanced configuration sample

Advanced configuration in the `logstash.yml` file supports Event Hub specific options.
Advanced configuration is not necessary or recommended for most use cases. Use
it only if it is required for your deployment scenario.

You must define the `header` array with `name` in the first position; the other
columns can follow in any order. The advanced configuration options match the
basic configuration options, with the following exceptions. The basic
configuration uses `event_hub_connections`. The advanced configuration uses
`event_hubs` and `event_hub_connection`.
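
The following is a minimal sketch of how the header-array layout might look in
`logstash.yml`, assuming the same `var.input.azure_event_hubs.*` naming used in
the basic sample. The column names after `name` and `event_hub_connection` are
chosen for illustration from options shown elsewhere on this page; use the
options that your deployment needs.

["source","shell",subs="attributes"]
-----
modules:
  - name: azure
    var.input.azure_event_hubs.config_mode: "advanced"
    var.input.azure_event_hubs.event_hubs:
      - ["name", "event_hub_connection", "consumer_group", "decorate_events"]
      - ["insights-logs-errors", "Endpoint=sb://example1...EntityPath=insights-logs-errors", "logstash", "true"]
      - ["insights-metrics-pt1m", "Endpoint=sb://example2...EntityPath=insights-metrics-pt1m", "logstash", "true"]
-----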

[id="plugins-{type}s-{plugin}-config_mode"]
===== `config_mode`
* Value type is <<string,string>>
* Valid entries are `basic` or `advanced`
* Default value is `basic`

[source,ruby]
----
azure_event_hubs {
  event_hub_connections => ["Endpoint=sb://example1...;EntityPath=event_hub_name1" , "Endpoint=sb://example2...;EntityPath=event_hub_name2" ]
}
----

[id="plugins-{type}s-{plugin}-event_hubs"]
===== `event_hubs`
* Value type is <<array,array>>
include::shared-module-options.asciidoc[]
[[run-azure]]
==== Start the module

. Be sure that the `logstash.yml` file is <<configuring-azure,configured correctly>>.
. Run this command from the Logstash directory:

["source","shell",subs="attributes"]
-----
bin/logstash --modules azure
-----

assurances of information accuracy or passivity.

Many of the logs contain a "properties" top level field. This is often where the
most interesting data lives. There is not a fixed schema between log types for
properties fields coming from different sources.

For example, one log may have
properties.type where one log sets this as a String type and another sets this as an
Integer. To avoid type conflicts, the properties fields are renamed with a prefix:
properties.type may end up as sql_diagnostics_Errors_properties.type or
activity_log_Security_properties.type depending on the group/category from where
the event originated.
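
As a hypothetical illustration (the field values are invented for this
example), an event from the SQL diagnostics `Errors` category and an event from
the activity log `Security` category might carry the same original field under
different prefixes:

[source,json]
----
{ "sql_diagnostics_Errors_properties": { "type": 40544 } }
{ "activity_log_Security_properties": { "type": "Warning" } }
----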



==== Testing

The easiest way to test modules is to use Docker Compose to stand up local instances of Elasticsearch and Kibana. The following Docker Compose file can be used for quick testing.

[source,yaml]
----
version: '3'
# docker-compose up --force-recreate
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.2.4
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx512m -Xms512m"
      discovery.type: "single-node"
    networks:
      - ek
  kibana:
    image: docker.elastic.co/kibana/kibana:6.2.4
    ports:
      - "5601:5601"
    networks:
      - ek
    depends_on:
      - elasticsearch
networks:
  ek:
    driver: bridge
----
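
Once the containers are up, you can point the module at this local stack with
command line overrides. The `localhost` hosts below simply match the port
mappings above, and `azure.var.kibana.host` is assumed to follow the standard
Logstash module conventions:

["source","shell",subs="attributes"]
-----
bin/logstash --modules azure --setup \
  -M "azure.var.elasticsearch.host=localhost:9200" \
  -M "azure.var.kibana.host=localhost:5601" \
  -M "azure.var.input.azure_event_hubs.event_hub_connections=Endpoint=sb://example1...EntityPath=insights-logs-errors"
-----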

[[azure-production]]
==== Deploying the module in production

Use security best practices to secure your configuration.
See {stack-ov}/xpack-security.html for details and recommendations.
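
For example, if your cluster requires authentication, credentials can be passed
to the module alongside the other settings. The variable names below follow the
general Logstash module conventions and are shown here as an assumption, not as
settings documented on this page:

["source","shell",subs="attributes"]
-----
modules:
  - name: azure
    var.elasticsearch.host: "es.mycloud.com"
    var.elasticsearch.username: "logstash_writer"
    var.elasticsearch.password: "changeme"
-----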

[[azure-resources]]
==== Microsoft Azure resources

Microsoft is the best source for the most up-to-date Azure information.

* https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-azure-monitor[Overview of Azure Monitor]
* https://docs.microsoft.com/en-us/azure/sql-database/sql-database-metrics-diag-logging[Azure SQL Database metrics and diagnostics logging]
* https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-stream-activity-logs-event-hubs[Stream the Azure Activity Log to Event Hubs]

