Log monitoring
Logs are an important tool for ensuring the performance and reliability of your applications and infrastructure. They provide important information for debugging, analyzing performance, and managing compliance.
On this page, you’ll find resources for sending log data to Elasticsearch, configuring your logs, and analyzing your logs.
Get started with logs
For a high-level overview of ingesting, viewing, and analyzing logs with Elastic, refer to Get started with logs and metrics.
To get started ingesting, parsing, and filtering your own data, refer to these pages:
- Stream any log file: send log files from your system to Elasticsearch using a standalone Elastic Agent, and configure the Elastic Agent and your data streams using the elastic-agent.yml file (see the configuration sketch after this list).
- Parse and organize logs: break your log messages into meaningful fields that you can use to filter and analyze your data.
- Filter and aggregate logs: find specific information in your log data to gain insight and monitor your systems.
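To make the elastic-agent.yml workflow concrete, here is a minimal sketch of a standalone agent policy that streams one log file to Elasticsearch. The endpoint, API key, IDs, dataset, and path below are placeholder assumptions; refer to Stream any log file for the full, version-specific steps.

```yaml
# Minimal standalone elastic-agent.yml sketch (placeholders only).
outputs:
  default:
    type: elasticsearch
    hosts: ["<your-elasticsearch-endpoint>:443"]  # placeholder endpoint
    api_key: "<your-api-key>"                     # placeholder credentials

inputs:
  - id: my-logfile-input            # hypothetical input ID
    type: filestream                # reads lines from log files
    streams:
      - id: my-logfile-stream       # hypothetical stream ID
        data_stream:
          dataset: example          # events land in the logs-example-default data stream
        paths:
          - /var/log/my-app.log     # placeholder path to the file you want to stream
```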
The following sections provide resources on important concepts and advanced use cases for working with your logs.
Send log data to Elasticsearch
You can send log data to Elasticsearch in different ways depending on your needs:
- Elastic Agent: a single agent for logs, metrics, security data, and threat prevention. It can be deployed either standalone or managed by Fleet:
  - Standalone: Manually configure, deploy, and update an Elastic Agent on each host.
  - Fleet: Centrally manage and update Elastic Agent policies and lifecycles in Kibana.
- Filebeat: a lightweight, logs-specific shipper for forwarding and centralizing log data.
Refer to the Elastic Agent and Beats capabilities comparison for more information on which option best fits your situation.
Install Elastic Agent
The following pages detail installing and managing the Elastic Agent in different modes.
- Standalone Elastic Agent
  Install an Elastic Agent and manually configure it locally on the system where it’s installed. You are responsible for managing and upgrading the agents.
  Refer to Stream any log file to learn how to send a log file to Elasticsearch using a standalone Elastic Agent and configure the Elastic Agent and your data streams using the elastic-agent.yml file.
- Fleet-managed Elastic Agent
  Install an Elastic Agent and use Fleet in Kibana to define, configure, and manage your agents in a central location.
  Refer to Install Fleet-managed Elastic Agent.
- Elastic Agent in a containerized environment
  Run an Elastic Agent inside of a container, either with Fleet Server or standalone.
  Refer to Install Elastic Agent in a containerized environment; a brief docker-compose sketch follows this list.
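For the containerized option, a Fleet-managed agent container is typically configured through environment variables. The docker-compose sketch below is an illustrative assumption: the image tag, Fleet Server URL, and enrollment token are placeholders, and the image path should be verified against the current installation docs.

```yaml
# Illustrative docker-compose sketch for a Fleet-managed Elastic Agent container.
services:
  elastic-agent:
    image: docker.elastic.co/beats/elastic-agent:8.15.0   # placeholder image/version; verify against the docs
    environment:
      - FLEET_ENROLL=1                               # enroll into Fleet on startup
      - FLEET_URL=https://<your-fleet-server>:8220   # placeholder Fleet Server URL
      - FLEET_ENROLLMENT_TOKEN=<your-enrollment-token>
```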
Install Filebeat
Filebeat is a lightweight shipper for forwarding and centralizing log data. Installed as a service on your servers, Filebeat monitors the log files or locations that you specify, collects log events, and forwards them either to Elasticsearch or Logstash for indexing.
- Filebeat overview: general information on Filebeat and how it works.
- Filebeat quick start: basic installation instructions to get you started.
- Set up and run Filebeat: information on how to install, set up, and run Filebeat.
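As a quick orientation, here is a minimal filebeat.yml sketch that tails a log file and sends events straight to Elasticsearch. The input ID, path, endpoint, and credentials are placeholder assumptions; the Filebeat quick start covers the complete, version-specific setup.

```yaml
# Minimal filebeat.yml sketch (placeholders, not a production config).
filebeat.inputs:
  - type: filestream              # input type for reading log files
    id: my-app-logs               # hypothetical input ID
    paths:
      - /var/log/my-app/*.log     # placeholder path to your log files

output.elasticsearch:
  hosts: ["https://localhost:9200"]   # placeholder Elasticsearch endpoint
  api_key: "<id>:<api-key>"           # placeholder API key (id:key format)
```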
Parse and organize your logs
To get started parsing and organizing your logs, refer to Parse and organize logs for information on breaking unstructured log data into meaningful fields you can use to filter and aggregate your data.
The following resources provide information on important concepts related to parsing and organizing your logs:
- Data streams: Efficiently store append-only time series data in multiple backing indices partitioned by time and size.
- Data views: Query log entries from the data streams of specific datasets or namespaces.
- Index lifecycle management: Configure the built-in logs policy based on your application’s performance, resilience, and retention requirements.
- Ingest pipeline: Parse and transform log entries into a suitable format before indexing.
- Mapping: Define how data is stored and indexed.
View and monitor logs
With the Logs app in Kibana, you can search, filter, and tail all your logs ingested into Elasticsearch in one place.
The following resources provide information on viewing and monitoring your logs:
- Logs Explorer: monitor all of your log events flowing in from your servers, virtual machines, and containers in a centralized view.
- Inspect log anomalies: use machine learning to detect log anomalies automatically.
- Categorize log entries: use machine learning to categorize log messages to quickly identify patterns in your log events.
- Configure data sources: specify the source configuration for logs in the Logs app settings in the Kibana configuration file.
Monitor Kubernetes logs
You can use the Elastic Agent with the Kubernetes integration to collect and parse Kubernetes logs. Refer to Monitor Kubernetes.
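If you run the Elastic Agent standalone instead of using the Fleet-managed Kubernetes integration, the agent policy needs an input that reads container log files from each node. The sketch below is an illustrative assumption of what such an input can look like (IDs, dataset, and paths are placeholders); the Kubernetes integration configures parsing and metadata enrichment for you, so refer to Monitor Kubernetes for the supported setup.

```yaml
# Illustrative container-logs input for a standalone Elastic Agent policy
# on Kubernetes (placeholders only).
inputs:
  - id: container-logs              # hypothetical input ID
    type: filestream
    use_output: default
    streams:
      - id: container-logs-stream   # hypothetical stream ID
        data_stream:
          dataset: kubernetes.container_logs
        paths:
          - /var/log/containers/*.log
        prospector.scanner.symlinks: true   # container log paths are symlinks on most nodes
```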
View and monitor application logs
Application logs provide valuable insight into events that have occurred within your services and applications.
Refer to Stream application logs.
Create a log threshold alert
You can create a rule that sends an alert when a log aggregation exceeds a threshold.
Refer to Log threshold.
Configure the default logs template
Configure the default logs template using the logs@custom component template.
Refer to the Logs index template reference.