The Elastic Integration filter plugin for Logstash allows you to process data from Elastic integrations by executing their ingest pipelines within Logstash, before forwarding the data to Elastic.
Why should I use it?
This approach has the advantage of offloading data processing from your Elastic deployment onto Logstash, giving you flexibility over where that work happens.
Additionally, with Logstash as the final hop for your data before ingestion into Elastic, it can save you from having to open different ports and set different firewall rules for each agent or Beats instance, since Logstash can aggregate the output from all of these components.
Prerequisites
You have an Elastic agent running on a server, with one or more integrations in its agent policy. If you need to install an Elastic agent, you can follow the guide here.
Steps
We will:
- Install Logstash, but not run it until all steps are complete
- Generate custom certificates and keys on our Logstash server, to enable secure communication between Fleet server and Logstash
- Configure Fleet to add a Logstash output
- Set up Logstash, including a custom pipeline that receives input from Elastic agent, uses the integration filter plugin, and finally forwards the events to Elastic
- Start Logstash
- Update an agent policy to use that new Logstash output
Install Logstash
Use this guide to install Logstash on your server.
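As one example, on a Debian or Ubuntu server you can install Logstash from Elastic's APT repository; the 8.x repository shown below is just an illustration, so use whichever repository matches your stack version:

# Add Elastic's signing key and APT repository, then install Logstash
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update && sudo apt-get install logstash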
Set up SSL/TLS on the Logstash server
Use this guide to create custom certificates and keys for securing the Logstash output connection that will be used by Fleet. We need to do this before we set up a custom pipeline file for Logstash, as we’ll refer to some of the certificate values in that config.
As per the guide, I downloaded the Elasticsearch archive and extracted it so I could use the elasticsearch-certutil tool that is included.
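For reference, the certificate and key generation from that guide looks roughly like the following, run from the extracted Elasticsearch directory; the hostname and file paths are placeholders you'll need to adapt:

# Generate a CA certificate and key (produces elastic-stack-ca.zip; unzip it before the next step)
./bin/elasticsearch-certutil ca --pem
# Generate a certificate and key for Logstash, using the DNS name agents will use to reach the Logstash server
./bin/elasticsearch-certutil cert --name logstash --ca-cert ca/ca.crt --ca-key ca/ca.key --dns your-logstash-hostname --pem
# Convert the Logstash key to PKCS#8 format for use by the elastic_agent input
openssl pkcs8 -inform PEM -in logstash.key -topk8 -nocrypt -outform PEM -out logstash.pkcs8.key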
Add a Logstash output to Fleet in Kibana
With our certificates and keys to hand, we can complete the steps necessary to set up a Logstash output for Fleet from within Kibana. Do not yet set the Logstash output on an agent policy, as we need to configure a custom pipeline in Logstash first.
Set up a custom pipeline for Logstash
We need to add a custom pipeline config file, which will include our Elastic agent input and the integration filter. The typical structure of a Logstash pipeline definition is this:
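# Every pipeline config follows the same three-stage structure
input {
  # one or more input plugins
}
filter {
  # optional filter plugins
}
output {
  # one or more output plugins
}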
Our custom pipeline config file will start with the Elastic agent input plugin, the guide for which is here.
We will then have the integration filter, and an output to Elastic Cloud that differs depending on whether you are ingesting into a hosted Cloud deployment or a serverless project.
Your completed file should look something like this:
input {
  elastic_agent {
    port => 5044
    ssl_enabled => true
    ssl_certificate_authorities => ["/pathtoca/ca.crt"]
    ssl_certificate => "/pathtologstashcrt/logstash.crt"
    ssl_key => "/pathtologstashkey/logstash.pkcs8.key"
    ssl_client_authentication => "required"
  }
}
filter {
  elastic_integration {
    cloud_id => "Ross_is_Testing:123456"
    cloud_auth => "elastic:yourpasswordhere"
  }
}
output {
  # For cloud hosted deployments
  elasticsearch {
    cloud_id => "Ross_is_Testing:123456"
    cloud_auth => "elastic:yourpasswordhere"
    data_stream => true
    ssl => true
    ecs_compatibility => v8
  }
  # For serverless projects
  elasticsearch {
    hosts => ["https://projectname.es.us-east-1.aws.elastic.cloud:443"]
    api_key => "yourapikey-here"
    data_stream => true
    ssl => true
    ecs_compatibility => v8
  }
}
The above syntax for the output section is valid; you can specify multiple outputs!
For cloud hosted deployments you can use the deployment's Cloud ID to connect, which you can get from the Elastic Cloud console, on the deployment overview screen:
I'm also using a username and password, but you could instead specify an API key if desired.
For serverless projects, you’ll need to use your Elasticsearch endpoint and an API key to connect Logstash, as documented here. You can get the Elasticsearch endpoint from the manage project screen of the cloud admin console:
Ensure the main pipelines.yml file for Logstash also includes a reference to our custom pipeline file:
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
# https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
- pipeline.id: fromagent
  path.config: "/etc/logstash/conf.d/agent.conf"
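Before starting Logstash, you can optionally check that the pipeline config parses cleanly. The paths below assume a package install and are just an example:

# Validate the pipeline config syntax without starting Logstash
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/agent.conf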
We can then start Logstash. As we haven't yet updated an Elastic agent policy to use our Logstash output, no events will be flowing through Logstash at this point.
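If you installed Logstash from a package, starting it as a service and following its logs typically looks like this (adjust for your install method):

# Start the Logstash service and tail its logs
sudo systemctl start logstash
sudo journalctl -u logstash -f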
Update an agent policy to use our Logstash output
With Logstash running, we can now set our configured Logstash output on an agent policy of our choosing.
Complete
Events from the integrations on the chosen agent policy will now be sent through Logstash, and the relevant ingest pipelines will run within Logstash to process the data before it is sent on to Elastic Cloud.
The release and timing of any features or functionality described in this post remain at Elastic's sole discretion. Any features or functionality not currently available may not be delivered on time or at all.