Glossary of Terms
- @metadata
- A special field for storing content that you don’t want to include in output events. For example, the @metadata field is useful for creating transient fields for use in conditional statements (see the @metadata example after this glossary).
- codec plugin
- A Logstash plugin that changes the data representation of an event. Codecs are essentially stream filters that can operate as part of an input or output. Codecs enable you to separate the transport of messages from the serialization process. Popular codecs include json, msgpack, and plain (text).
- conditional
- A control flow that executes certain actions based on whether a statement (also called a condition) is true or false. Logstash supports if, else if, and else statements. You can use conditional statements to apply filters and send events to a specific output based on conditions that you specify (see the conditional example after this glossary).
- event
- A single unit of information, containing a timestamp plus additional data. An event arrives via an input, and is subsequently parsed, timestamped, and passed through the Logstash pipeline.
- field
- An event property. For example, each event in an apache access log has properties, such as a status code (200, 404), request path ("/", "index.html"), HTTP verb (GET, POST), client IP address, and so on. Logstash uses the term "fields" to refer to these properties.
- field reference
- A reference to an event field. This reference may appear in an output block or filter block in the Logstash config file. Field references are typically wrapped in square brackets ([]), for example [fieldname]. If you are referring to a top-level field, you can omit the [] and simply use the field name. To refer to a nested field, you specify the full path to that field: [top-level field][nested field]. See the field reference example after this glossary.
- filter plugin
- A Logstash plugin that performs intermediary processing on an event. Typically, filters act upon event data after it has been ingested via inputs, by mutating, enriching, and/or modifying the data according to configuration rules. Filters are often applied conditionally depending on the characteristics of the event. Popular filter plugins include grok, mutate, drop, clone, and geoip. Filter stages are optional. See the filter example after this glossary.
- gem
- A self-contained package of code that’s hosted on RubyGems.org. Logstash plugins are packaged as Ruby Gems. You can use the Logstash plugin manager to manage Logstash gems.
- hot thread
- A Java thread that has high CPU usage and executes for a longer-than-normal period of time.
- input plugin
- A Logstash plugin that reads event data from a specific source. Input plugins are the first stage in the Logstash event processing pipeline. Popular input plugins include file, syslog, redis, and beats.
- indexer
- A Logstash instance that is responsible for indexing event data into an Elasticsearch cluster.
- message broker
- Also referred to as a message buffer or message queue, a message broker is external software (such as Redis, Kafka, or RabbitMQ) that acts as an intermediate store for messages from the Logstash shipper instance until they are processed by the Logstash indexer instance.
- output plugin
- A Logstash plugin that writes event data to a specific destination. Outputs are the final stage in the event pipeline. Popular output plugins include elasticsearch, file, graphite, and statsd.
- pipeline
- A term used to describe the flow of events through the Logstash workflow. A pipeline typically consists of a series of input, filter, and output stages. Input stages get data from a source and generate events; filter stages, which are optional, modify the event data; and output stages write the data to a destination. Inputs and outputs support codecs that enable you to encode or decode the data as it enters or exits the pipeline without having to use a separate filter. See the sample pipeline after this glossary.
- plugin
- A self-contained software package that implements one of the stages in the Logstash event processing pipeline. The list of available plugins includes input plugins, output plugins, codec plugins, and filter plugins. The plugins are implemented as Ruby gems and hosted on RubyGems.org. You define the stages of an event processing pipeline by configuring plugins.
- plugin manager
- Accessed via the bin/logstash-plugin script, the plugin manager enables you to manage the lifecycle of plugins in your Logstash deployment. You can install, remove, and upgrade plugins by using the plugin manager Command Line Interface (CLI). See the plugin manager example after this glossary.
- shipper
- An instance of Logstash that sends events to another instance of Logstash, or to some other application.
- worker
- The filter thread model used by Logstash, where each worker receives an event and applies all filters, in order, before emitting the event to the output queue. This allows scalability across CPUs because many filters are CPU intensive.
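
The configuration sketches below illustrate several of the terms above; they are minimal examples, and the field names, paths, and index names they use are hypothetical rather than required. First, the @metadata field carrying a transient routing value that never appears in the indexed event:

```
filter {
  # Store a routing value under @metadata; it is visible to the rest of
  # the pipeline but is not included in the event sent to the output.
  mutate {
    add_field => { "[@metadata][target_index]" => "app-%{+YYYY.MM.dd}" }
  }
}

output {
  elasticsearch {
    # sprintf-style reference to the transient @metadata sub-field
    index => "%{[@metadata][target_index]}"
  }
}
```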
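A conditional in an output block that routes events by a hypothetical loglevel field, showing if, else if, and else:

```
output {
  if [loglevel] == "ERROR" {
    elasticsearch { index => "errors-%{+YYYY.MM.dd}" }
  } else if [loglevel] == "WARN" {
    file { path => "/var/log/logstash/warnings.log" }
  } else {
    stdout { codec => rubydebug }
  }
}
```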
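Field references in a filter block. [response] is a top-level field and [geoip][city_name] is a nested field; both names are assumed for the sake of the example:

```
filter {
  if [response] == "404" {
    mutate { add_tag => ["not_found"] }
  }
  if [geoip][city_name] {
    # Copy the nested value into a new top-level field.
    mutate { add_field => { "city" => "%{[geoip][city_name]}" } }
  }
}
```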
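A filter stage that chains the grok, geoip, and mutate plugins, assuming Apache access-log lines arrive in the message field:

```
filter {
  # Parse the raw line into structured fields such as clientip and response.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Enrich the event with location data derived from the client IP.
  geoip {
    source => "clientip"
  }
  # Drop a field that is not needed downstream.
  mutate {
    remove_field => ["agent"]
  }
}
```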
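A complete single-file pipeline with input, filter, and output stages. The file path and index name are placeholders; the json codec on the input decodes each line as it enters the pipeline, so no separate parsing filter is needed for the JSON itself:

```
input {
  file {
    path  => "/var/log/myapp/events.json"   # hypothetical source file
    codec => json                            # decode JSON on the way in
  }
}

filter {
  # Use the event's own timestamp field (assumed to be ISO 8601)
  # to set @timestamp.
  date {
    match => ["timestamp", "ISO8601"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myapp-%{+YYYY.MM.dd}"
  }
}
```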
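Typical plugin manager invocations, run from the Logstash home directory; logstash-output-kafka is just one example of a plugin name:

```
bin/logstash-plugin list                             # list installed plugins
bin/logstash-plugin install logstash-output-kafka    # install a plugin
bin/logstash-plugin update logstash-output-kafka     # upgrade a plugin
bin/logstash-plugin remove logstash-output-kafka     # remove a plugin
```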