Timestamp

This functionality is in beta and is subject to change. The design and code are less mature than official GA features and are being provided as-is with no warranties. Beta features are not subject to the support SLA of official GA features.
The timestamp processor parses a timestamp from a field. By default the timestamp processor writes the parsed result to the @timestamp field. You can specify a different field by setting the target_field parameter. The timestamp value is parsed according to the layouts parameter. Multiple layouts can be specified, and they will be tried sequentially to parse the timestamp field.
The timestamp layouts used by this processor are different from the formats supported by the date processors in Logstash and Elasticsearch Ingest Node.
The layouts are described using a reference time that is based on this specific time:

Mon Jan 2 15:04:05 MST 2006

Since MST is GMT-0700, the reference time is:

01/02 03:04:05PM '06 -0700

To define your own layout, rewrite the reference time in a format that matches the timestamps you expect to parse. For more layout examples and details, see the Go time package documentation.

If a layout does not contain a year, then the current year in the specified timezone is added to the time value.
Table 2. Timestamp options

| Name | Required | Default | Description |
|---|---|---|---|
| field | yes | | Source field containing the time to be parsed. |
| target_field | no | @timestamp | Target field for the parsed time value. The target value is always written as UTC. |
| layouts | yes | | Timestamp layouts that define the expected time value format. In addition, UNIX and UNIX_MS are accepted as layouts. |
| timezone | no | UTC | Timezone (e.g. America/New_York) to use when parsing a timestamp not containing a timezone. |
| ignore_missing | no | false | Ignore errors when the source field is missing. |
| ignore_failure | no | false | Ignore all errors produced by the processor. |
| test | no | | A list of timestamps that must parse successfully when loading the processor. |
| id | no | | An identifier for this processor instance. Useful for debugging. |
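As a minimal sketch of how several of these options combine (the event_time source field and the layout shown are illustrative, not fields Winlogbeat defines):

```yaml
processors:
  - timestamp:
      field: event_time            # hypothetical source field
      target_field: event.created  # write somewhere other than @timestamp
      layouts:
        - '2006-01-02 15:04:05'    # timestamp format with no zone information
      timezone: 'America/New_York' # zone assumed for zone-less values
      ignore_missing: true         # don't fail if event_time is absent
```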
Here is an example that parses the start_time field, writes the result to the @timestamp field, and then deletes the start_time field. When the processor is loaded, it will immediately validate that the two test timestamps parse with this configuration.
```yaml
processors:
  - timestamp:
      field: start_time
      layouts:
        - '2006-01-02T15:04:05Z'
        - '2006-01-02T15:04:05.999Z'
      test:
        - '2019-06-22T16:33:51Z'
        - '2019-11-18T04:59:51.123Z'
  - drop_fields:
      fields: [start_time]
```