Configure Centralized Pipeline Management
To configure centralized pipeline management:
- Verify that you are using a license that includes the pipeline management feature.
  For more information, see https://www.elastic.co/subscriptions and License management.
- Specify configuration management settings in the logstash.yml file (a minimal sketch of these settings follows the steps below). At a minimum, set:
  - xpack.management.enabled: true to enable centralized configuration management.
  - xpack.management.elasticsearch.hosts to specify the Elasticsearch instance that will store the Logstash pipeline configurations and metadata.
  - xpack.management.pipeline.id to register the pipelines that you want to centrally manage.
- Restart Logstash.
- If your Elasticsearch cluster is protected with basic authentication, assign the built-in logstash_admin role as well as the logstash_writer role to any users who will use centralized pipeline management. See Secure your connection for more information.
Centralized management is disabled until you configure and enable security features.
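As a quick reference, here is a minimal sketch of the settings named in step 2; the host URL and pipeline IDs are placeholders taken from the fuller example later on this page:

xpack.management.enabled: true
xpack.management.elasticsearch.hosts: "http://localhost:9200"
xpack.management.pipeline.id: ["apache", "cloudwatch_logs"]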
After you’ve configured Logstash to use centralized pipeline
management, you can no longer specify local pipeline configurations. This means
that the pipelines.yml file and settings like path.config and
config.string are inactive when this feature is enabled.
Configuration Management Settings in Logstash
You can set the following xpack.management settings in logstash.yml to enable centralized pipeline management.
For more information about configuring Logstash, see logstash.yml.
The following example shows basic settings that assume Elasticsearch and Kibana are installed on localhost with basic authentication enabled, but no SSL. If you’re using SSL, you need to specify additional SSL settings.
xpack.management.enabled: true
xpack.management.elasticsearch.hosts: "http://localhost:9200/"
xpack.management.elasticsearch.username: logstash_admin_user
xpack.management.elasticsearch.password: t0p.s3cr3t
xpack.management.logstash.poll_interval: 5s
xpack.management.pipeline.id: ["apache", "cloudwatch_logs"]
- xpack.management.enabled - Set to true to enable X-Pack centralized configuration management for Logstash.
- xpack.management.logstash.poll_interval - How often the Logstash instance polls for pipeline changes from Elasticsearch. The default is 5s.
- xpack.management.pipeline.id - Specify a comma-separated list of pipeline IDs to register for centralized pipeline management. After changing this setting, you need to restart Logstash to pick up changes. Pipeline IDs support * as a wildcard for matching multiple IDs.
- xpack.management.elasticsearch.hosts - The Elasticsearch instance that will store the Logstash pipeline configurations and metadata. This might be the same Elasticsearch instance specified in the outputs section in your Logstash configuration, or a different one. Defaults to http://localhost:9200.
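  For example, a sketch that points centralized management at more than one node; the hostnames are placeholders, and a single URL string (as in the example above) also works:
    xpack.management.elasticsearch.hosts: ["http://es-node-1:9200", "http://es-node-2:9200"]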
- xpack.management.elasticsearch.username and xpack.management.elasticsearch.password - If your Elasticsearch cluster is protected with basic authentication, these settings provide the username and password that the Logstash instance uses to authenticate for accessing the configuration data. The username you specify here should have the built-in logstash_admin role and the customized logstash_writer role, which provides access to system indices for managing configurations. Starting with Elasticsearch version 7.10.0, the logstash_admin role inherits the manage_logstash_pipelines cluster privilege for centralized pipeline management. If a user has created their own roles and granted them access to the .logstash index, those roles will continue to work in 7.x but will need to be updated for 8.0.
- xpack.management.elasticsearch.proxy - Optional setting that allows you to specify a proxy URL if Logstash needs to use a proxy to reach your Elasticsearch cluster.
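  For instance, a sketch assuming a forward proxy at a placeholder address:
    xpack.management.elasticsearch.proxy: "http://proxy.example.com:3128"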
- xpack.management.elasticsearch.ssl.certificate_authority - Optional setting that enables you to specify a path to the .pem file for the certificate authority for your Elasticsearch instance.
- xpack.management.elasticsearch.ssl.truststore.path - Optional setting that provides the path to the Java keystore (JKS) to validate the server’s certificate.
- xpack.management.elasticsearch.ssl.truststore.password - Optional setting that provides the password to the truststore.
- xpack.management.elasticsearch.ssl.keystore.path - Optional setting that provides the path to the Java keystore (JKS) to validate the client’s certificate.
- xpack.management.elasticsearch.ssl.keystore.password - Optional setting that provides the password to the keystore.
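  To illustrate how these SSL settings fit together, here is a hedged sketch of an HTTPS connection; the paths and passwords are placeholders, and you would typically use either the PEM-style certificate authority or the JKS-style truststore/keystore settings, depending on how your certificates are packaged:
    xpack.management.elasticsearch.hosts: "https://localhost:9200"
    xpack.management.elasticsearch.ssl.certificate_authority: "/etc/logstash/certs/ca.pem"
    # JKS alternative (placeholder paths and passwords):
    # xpack.management.elasticsearch.ssl.truststore.path: "/etc/logstash/certs/truststore.jks"
    # xpack.management.elasticsearch.ssl.truststore.password: "changeme"
    # xpack.management.elasticsearch.ssl.keystore.path: "/etc/logstash/certs/keystore.jks"
    # xpack.management.elasticsearch.ssl.keystore.password: "changeme"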
- xpack.management.elasticsearch.cloud_id - If you’re using Elasticsearch in Elastic Cloud, you should specify the identifier here. This setting is an alternative to xpack.management.elasticsearch.hosts. If cloud_id is configured, xpack.management.elasticsearch.hosts should not be used. This Elasticsearch instance will store the Logstash pipeline configurations and metadata.
- xpack.management.elasticsearch.cloud_auth - If you’re using Elasticsearch in Elastic Cloud, you can set your authentication credentials here. This setting is an alternative to both xpack.management.elasticsearch.username and xpack.management.elasticsearch.password. If cloud_auth is configured, those settings should not be used. The credentials you specify here should be for a user with the logstash_admin role, which provides access to system indices for managing configurations.
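  As a hedged sketch of an Elastic Cloud setup, with a placeholder cloud_id (copied from your deployment) and placeholder credentials in user:password form:
    xpack.management.elasticsearch.cloud_id: "my_deployment:ZXhhbXBsZS1jbG91ZC1pZA=="
    xpack.management.elasticsearch.cloud_auth: "logstash_admin_user:t0p.s3cr3t"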
xpack.management.elasticsearch.api_key -
Authenticate using an Elasticsearch API key. Note that this option also requires using SSL.
The API key Format is
id:api_keywhereidandapi_keyare as returned by the Elasticsearch Create API key API.
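  For example, a sketch assuming an API key has already been created and the cluster is reachable over HTTPS; both values below are placeholders:
    xpack.management.elasticsearch.hosts: "https://localhost:9200"
    xpack.management.elasticsearch.api_key: "example_id:example_api_key_value"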
Wildcard support in pipeline ID
Pipeline IDs must begin with a letter or underscore and contain only letters, underscores, dashes, and numbers.
You can use * in xpack.management.pipeline.id to match any number of letters, underscores, dashes, and numbers.
xpack.management.pipeline.id: ["*logs", "*apache*", "tomcat_log"]
In this example, "*logs" matches all IDs ending in logs. "*apache*" matches any IDs with apache in the name.
Wildcard support in pipeline IDs is available starting with Elasticsearch 7.10. Logstash can pick up a new pipeline without a restart if the new pipeline ID matches the wildcard pattern.