Es_bulk codec plugin v3.1.0
- Plugin version: v3.1.0
- Released on: 2021-08-19
- Changelog
For other versions, see the overview list.
To learn more about Logstash, see the Logstash Reference.
Getting help
For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.
Description
This codec decodes the Elasticsearch bulk format into individual events and places the bulk action metadata into the @metadata field.
Encoding is not supported at this time, because the Elasticsearch output already submits Logstash events in bulk format.
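For reference, a minimal bulk payload (index names, IDs, and field values here are hypothetical) alternates action lines and source lines; the codec emits one event per source document and attaches the accompanying action metadata:

```
{ "index" : { "_index" : "logs", "_id" : "1" } }
{ "message" : "hello" }
{ "index" : { "_index" : "logs", "_id" : "2" } }
{ "message" : "world" }
```

This payload would decode into two events, one with message set to "hello" and one with message set to "world".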
Codec settings in the logstash-input-http plugin

The input-http plugin has two configuration options for codecs: codec and additional_codecs.

Values in additional_codecs are prioritized over those specified in the codec option. That is, the default codec is applied only if no codec for the request’s content-type is found in the additional_codecs setting.
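As a sketch, es_bulk can be mapped to a specific content-type through additional_codecs; the port and the content-type mapping below are illustrative assumptions, not defaults:

```
input {
  http {
    port => 8080
    additional_codecs => { "application/x-ndjson" => "es_bulk" }
  }
}
```

With this configuration, requests with the application/x-ndjson content-type are decoded by es_bulk, while all other requests fall back to the codec option.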
Event Metadata and the Elastic Common Schema (ECS)
When ECS compatibility is disabled, the metadata is stored in the [@metadata] field.
When ECS is enabled, the metadata is stored in the [@metadata][codec][es_bulk] field.
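As an illustration (the concrete keys and values below are hypothetical, shown only to contrast the two layouts), the action metadata for an index operation might land as follows:

```
# ECS compatibility disabled: metadata directly under [@metadata]
@metadata => { "_index" => "logs", "_id" => "1" }

# ECS compatibility v1: metadata namespaced under [@metadata][codec][es_bulk]
@metadata => { "codec" => { "es_bulk" => { "_index" => "logs", "_id" => "1" } } }
```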
ES Bulk Codec Configuration Options
Setting | Input type | Required
---|---|---
ecs_compatibility | string | No
target | string | No
ecs_compatibility

- Value type is string
- Supported values are:
  - disabled: unstructured metadata added at @metadata
  - v1: uses [@metadata][codec][es_bulk] fields

Controls this plugin’s compatibility with the Elastic Common Schema (ECS).
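For example, a sketch enabling the v1 behavior (the stdin input is chosen only for illustration; any input that accepts a codec works the same way):

```
input {
  stdin {
    codec => es_bulk {
      ecs_compatibility => "v1"
    }
  }
}
```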
target
- Value type is string
- There is no default value for this setting.
Define the target field for placing the values. If this setting is not set, the data will be stored at the root (top level) of the event.
For example, if you want data to be put under the document field:
input { kafka { codec => es_bulk { target => "[document]" } } }