Filter and enhance the exported data
Your use case might require only a subset of the data exported by Filebeat, or you might need to enhance the exported data (for example, by adding metadata). Filebeat provides a couple of options for filtering and enhancing exported data.
You can configure each input to include or exclude specific lines or files. This allows you to specify different filtering criteria for each input. To do this, you use the include_lines, exclude_lines, and exclude_files options under the filebeat.inputs section of the config file (see Configure inputs). The disadvantage of this approach is that you need to implement a configuration option for each filtering criterion that you need.
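For example, a minimal sketch of input-level filtering (the paths and patterns below are placeholders, not part of the original example):

filebeat.inputs:
- type: log
  paths:
    - /var/log/app/*.log           # placeholder path
  include_lines: ['^ERR', '^WARN'] # keep only lines starting with ERR or WARN
  exclude_files: ['\.gz$']         # skip gzipped files
- type: log
  paths:
    - /var/log/other/*.log         # placeholder path
  exclude_lines: ['^DBG']          # drop debug lines from this input only

Each input carries its own filtering options, which is what makes this approach flexible but also repetitive across inputs.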
Another approach (the one described here) is to define processors to configure global processing across all data exported by Filebeat.
Processors
You can define processors in your configuration to process events before they are sent to the configured output. The libbeat library provides processors for:
- reducing the number of exported fields
- enhancing events with additional metadata
- performing additional processing and decoding
Each processor receives an event, applies a defined action to the event, and returns the event. If you define a list of processors, they are executed in the order they are defined in the Filebeat configuration file.
event -> processor 1 -> event1 -> processor 2 -> event2 ...
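For example, a minimal sketch of an ordered processor chain (the field names and values are placeholders); each processor operates on the event returned by the previous one:

processors:
  - add_fields:                        # runs first
      target: project
      fields:
        name: myproject                # placeholder value
  - drop_fields:                       # runs second, on the event produced by add_fields
      fields: ["some.unwanted.field"]  # placeholder field name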
Drop event example
The following configuration drops all the DEBUG messages.
processors:
  - drop_event:
      when:
        regexp:
          message: "^DBG:"
To drop all the log messages coming from a certain log file:
processors:
  - drop_event:
      when:
        contains:
          source: "test"
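Conditions can also be combined. For example, a sketch that drops an event when either of the two conditions above matches (using the standard or condition):

processors:
  - drop_event:
      when:
        or:
          - regexp:
              message: "^DBG:"
          - contains:
              source: "test"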
Decode JSON example
In the following example, the fields exported by Filebeat include a field, inner, whose value is a JSON object encoded as a string:
{ "outer": "value", "inner": "{\"data\": \"value\"}" }
The following configuration decodes the inner JSON object:
filebeat.inputs:
- type: log
  paths:
    - input.json
  json.keys_under_root: true

processors:
  - decode_json_fields:
      fields: ["inner"]

output.console.pretty: true
The resulting output looks something like this:
{ "@timestamp": "2016-12-06T17:38:11.541Z", "beat": { "hostname": "host.example.com", "name": "host.example.com", "version": "7.4.2" }, "inner": { "data": "value" }, "input": { "type": "log", }, "offset": 55, "outer": "value", "source": "input.json", "type": "log" }