csv

  • Version: 3.0.2
  • Released on: July 14, 2016
  • Changelog

Getting Help


For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.

Description


CSV output.

Write events to disk in CSV or another delimited format. Based on the file output, many config values are shared. Uses the Ruby csv library internally.
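For example, a minimal pipeline writing a few event fields to a delimited file might look like the following sketch (the field names and path are hypothetical):

output {
 csv {
   # Hypothetical field names; use fields that exist on your events.
   fields => ["@timestamp", "host", "message"]
   # Hypothetical output path.
   path => "/var/log/logstash/export.csv"
 }
}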


Synopsis


This plugin supports the following configuration options:

Required configuration options:

csv {
    fields => ...
    path => ...
}

Available configuration options:

Details


codec

  • Value type is codec
  • Default value is "plain"

The codec used for output data. Output codecs are a convenient method for encoding your data before it leaves the output, without needing a separate filter in your Logstash pipeline.

create_if_deleted

  • Value type is boolean
  • Default value is true

If the configured file is deleted, but an event is being handled by the plugin, the plugin will recreate the file.

csv_options

  • Value type is hash
  • Default value is {}

Options for CSV output. This is passed directly to the Ruby stdlib to_csv function. Full documentation is available on the Ruby CSV documentation page. A typical use case would be to use alternative column or row separators, e.g. `csv_options => {"col_sep" => "\t" "row_sep" => "\r\n"}` gives tab-separated data with Windows line endings.
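As a sketch, the snippet above written as a full output block might look like this (the field names and path are hypothetical):

output {
 csv {
   fields => ["field1", "field2"]
   path => "/tmp/output.tsv"
   # Tab-separated columns with Windows-style line endings.
   csv_options => {"col_sep" => "\t" "row_sep" => "\r\n"}
 }
}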

dir_mode

  • Value type is number
  • Default value is -1

Dir access mode to use. Note that due to a bug in JRuby, the system umask is ignored on Linux: https://github.com/jruby/jruby/issues/3426. Setting it to -1 uses the default OS value. Example: `"dir_mode" => 0750`

enable_metric

  • Value type is boolean
  • Default value is true

Disable or enable metric logging for this specific plugin instance. By default we record all the metrics we can, but you can disable metrics collection for a specific plugin.

fields

  • This is a required setting.
  • Value type is array
  • There is no default value for this setting.

The field names from the event that should be written to the CSV file. Fields are written to the CSV in the same order as the array. If a field does not exist on the event, an empty string will be written. Supports field reference syntax, e.g. `fields => ["field1", "[nested][field]"]`.
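For instance, a sketch assuming an event that carries a top-level `status` field and a nested `[user][name]` field:

output {
 csv {
   # Columns are written in this order; missing fields become empty strings.
   fields => ["@timestamp", "status", "[user][name]"]
   path => "/tmp/events.csv"
 }
}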

file_mode

  • Value type is number
  • Default value is -1

File access mode to use. Note that due to a bug in JRuby, the system umask is ignored on Linux: https://github.com/jruby/jruby/issues/3426. Setting it to -1 uses the default OS value. Example: `"file_mode" => 0640`
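A sketch combining dir_mode and file_mode (the path is hypothetical; the mode values mirror the examples above):

output {
 csv {
   fields => ["message"]
   path => "/var/log/logstash/out.csv"
   # Directories created by the plugin get 0750, files get 0640.
   dir_mode => 0750
   file_mode => 0640
 }
}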

filename_failure

  • Value type is string
  • Default value is "_filepath_failures"

If the generated path is invalid, the events will be saved into this file, inside the defined path.

flush_interval

  • Value type is number
  • Default value is 2

Flush interval (in seconds) for flushing writes to log files. 0 will flush on every message.

gzip

  • Value type is boolean
  • Default value is false

Gzip the output stream before writing to disk.

id

  • Value type is string
  • There is no default value for this setting.

Add a unique ID to the plugin configuration. If no ID is specified, Logstash will generate one. It is strongly recommended to set this ID in your configuration. This is particularly useful when you have two or more plugins of the same type, for example, if you have 2 grok filters. Adding a named ID in this case will help in monitoring Logstash when using the monitoring APIs.

output {
 stdout {
   id => "my_plugin_id"
 }
}

path

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

This output writes events to files on disk. You can use fields from the event as parts of the filename and/or path.

By default, this output writes one event per line in json format. You can customise the line format using the line codec like

output {
 file {
   path => ...
   codec => line { format => "custom format: %{message}"}
 }
}

The path to the file to write. Event fields can be used here, like `/var/log/logstash/%{host}/%{application}`. One may also utilize the path option for date-based log rotation via the Joda-Time format. This will use the event timestamp. E.g.: `path => "./test-%{+YYYY-MM-dd}.txt"` to create `./test-2013-05-29.txt`.

If you use an absolute path you cannot start with a dynamic string. E.g.: `/%{myfield}/` and `/test-%{myfield}/` are not valid paths.
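As a sketch, combining event fields and date-based rotation in the path (the directory layout is hypothetical; the path begins with a static segment as required above):

output {
 csv {
   fields => ["@timestamp", "message"]
   # One file per host per day, using the event's timestamp for the date.
   path => "/var/log/logstash/%{host}/events-%{+YYYY-MM-dd}.csv"
 }
}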

spreadsheet_safe

  • Value type is boolean
  • Default value is true

Controls whether string values are escaped/munged to be safe for spreadsheet applications. Please note that turning off this option may leave values that are unsafe to open in your spreadsheet application.
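For example, to write raw values without spreadsheet munging (a sketch; only reasonable if the file will not be opened in a spreadsheet application):

output {
 csv {
   fields => ["message"]
   path => "/tmp/raw.csv"
   # Assumption: with this off, values are written as-is, without munging for spreadsheet safety.
   spreadsheet_safe => false
 }
}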

workers

  • Value type is string
  • Default value is 1

This setting is a holdover from the legacy plugin worker model and is slated for removal once the `:legacy` worker type is no longer supported.