GCP Dataflow templates
In this tutorial, you'll learn how to ship logs directly from the Google Cloud Console with the Dataflow template for analyzing GCP audit logs in the Elastic Stack.
What you'll learn
You'll learn how to:
- Export GCP audit logs through Pub/Sub topics and subscriptions.
- Ingest logs using Google Dataflow and view those logs in Kibana.
Before you begin
Create a deployment using our hosted Elasticsearch Service on Elastic Cloud. The deployment includes an Elasticsearch cluster for storing and searching your data, and Kibana for visualizing and managing your data.
Step 1: Install the GCP integration
You'll start by installing the Elastic GCP integration to add pre-built dashboards, ingest node configurations, and other assets that help you get the most out of the GCP logs you ingest.
- Find Integrations in the main menu or use the global search field.
- Search for gcp.
- Click the Elastic Google Cloud Platform (GCP) integration to see more details about it, then click Add Google Cloud Platform (GCP).
- Click Save integration.
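If you prefer to automate this step, the integration can also be installed through the Kibana Fleet API instead of the UI. The following is a minimal sketch using Python's requests library; the Kibana URL, API key, and package version are placeholders, and the endpoint path should be verified against the Fleet API documentation for your stack version.

```python
import requests

# Placeholders/assumptions: verify the Fleet API endpoint and the current
# package version of the GCP integration for your stack version.
KIBANA_URL = "https://my-deployment.kb.us-central1.gcp.cloud.es.io"
API_KEY = "<Base64-encoded API key>"
PACKAGE = "gcp"
VERSION = "2.11.0"  # hypothetical version; check the integration's current version

resp = requests.post(
    f"{KIBANA_URL}/api/fleet/epm/packages/{PACKAGE}/{VERSION}",
    headers={
        "kbn-xsrf": "true",                      # required by Kibana APIs
        "Authorization": f"ApiKey {API_KEY}",
    },
    json={"force": False},
)
resp.raise_for_status()
print(resp.json())
```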
Step 2: Create a Pub/Sub topic and subscription
Before configuring the Dataflow template, create a Pub/Sub topic and subscription from your Google Cloud Console where you can send your logs from Google Operations Suite.
There are three available filesets: audit, vpcflow, and firewall. This tutorial covers the audit fileset.
- Go to the Logs Router page to configure GCP to export logs to a Pub/Sub topic. Use the search bar to find the page.
  To set up the logs routing sink, click Create sink. Set the sink name to monitor-gcp-audit-sink. Select Cloud Pub/Sub topic as the sink service and create a new Cloud Pub/Sub topic named monitor-gcp-audit. Finally, under Choose logs to include in sink, add logName:"cloudaudit.googleapis.com" (it includes all audit logs), then click Create sink.
- Now go to the Pub/Sub page to add a subscription to the topic you just created. Use the search bar to find the page.
  To add a subscription to the monitor-gcp-audit topic, click Create subscription. Set monitor-gcp-audit-sub as the Subscription ID, leave the Delivery type as Pull, then scroll down and click Create.
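The console steps above can also be scripted. Here is a minimal sketch using the google-cloud-pubsub and google-cloud-logging Python client libraries; the project ID is a placeholder, and the sink's writer identity may still need the Pub/Sub Publisher role on the topic before logs start flowing.

```python
from google.cloud import logging as gcp_logging  # pip install google-cloud-logging
from google.cloud import pubsub_v1               # pip install google-cloud-pubsub

PROJECT_ID = "my-gcp-project"  # placeholder

# Create the Pub/Sub topic and a pull subscription on it.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, "monitor-gcp-audit")
publisher.create_topic(request={"name": topic_path})

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, "monitor-gcp-audit-sub")
subscriber.create_subscription(
    request={"name": subscription_path, "topic": topic_path}
)

# Create the Log Router sink that exports audit logs to the topic.
logging_client = gcp_logging.Client(project=PROJECT_ID)
sink = logging_client.sink(
    "monitor-gcp-audit-sink",
    filter_='logName:"cloudaudit.googleapis.com"',
    destination=f"pubsub.googleapis.com/{topic_path}",
)
sink.create()

# The sink's writer identity may still need permission to publish to the topic.
print("Writer identity to authorize:", sink.writer_identity)
```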
Step 3: Configure the Google Dataflow template
After creating a Pub/Sub topic and subscription, go to the Dataflow Jobs page and configure your template to use them. Use the search bar to find the page.

To create a job, click Create Job From Template. Set the Job name to auditlogs-stream and select Pub/Sub to Elasticsearch from the Dataflow template dropdown menu:

Before running the job, fill in the required parameters:

For Cloud Pub/Sub subscription, use the subscription you created in the previous step. For Cloud ID and Base64-encoded API Key, use the values from your Elastic Cloud deployment. If you don't have an Error output topic, create one the same way you created the topic in the previous step.
After filling in the required parameters, click Show Optional Parameters and add audit as the log type parameter.

When you are all set, click Run Job and wait for Dataflow to execute the template, which takes a few minutes.
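If you want to launch the same job programmatically rather than through the console, something like the sketch below can work. It is only a sketch: it assumes the Pub/Sub to Elasticsearch template is launched as a Flex Template through the Dataflow v1b3 API, and the template path and parameter names (inputSubscription, connectionUrl, apiKey, errorOutputTopic, dataset) are assumptions you should verify against the parameter list shown in the Dataflow UI for your region and template version.

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

PROJECT_ID = "my-gcp-project"  # placeholder
REGION = "us-central1"         # placeholder

dataflow = build("dataflow", "v1b3")
request = dataflow.projects().locations().flexTemplates().launch(
    projectId=PROJECT_ID,
    location=REGION,
    body={
        "launchParameter": {
            "jobName": "auditlogs-stream",
            # Assumed template path; confirm it in the Dataflow template picker.
            "containerSpecGcsPath": f"gs://dataflow-templates-{REGION}/latest/flex/PubSub_to_Elasticsearch",
            # Parameter names are assumptions; check the template's required and
            # optional parameters in the console before running.
            "parameters": {
                "inputSubscription": f"projects/{PROJECT_ID}/subscriptions/monitor-gcp-audit-sub",
                "connectionUrl": "<Cloud ID>",
                "apiKey": "<Base64-encoded API key>",
                "errorOutputTopic": f"projects/{PROJECT_ID}/topics/monitor-gcp-audit-errors",
                "dataset": "audit",  # the optional log type parameter
            },
        }
    },
)
print(request.execute())
```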
Finally, navigate to Kibana to see your logs parsed and visualized in the [Logs GCP] Audit dashboard.
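To confirm that documents are arriving without opening the dashboard, you can query the backing data stream directly. A minimal sketch with the Python Elasticsearch client follows; the index pattern logs-gcp.audit-* is an assumption based on the integration's naming scheme, and the Cloud ID and API key are placeholders.

```python
from elasticsearch import Elasticsearch  # pip install elasticsearch

# Placeholders: use your deployment's Cloud ID and an API key with read access.
es = Elasticsearch(
    cloud_id="<Cloud ID>",
    api_key="<Base64-encoded API key>",
)

# Assumed index pattern for GCP audit logs shipped by the template.
resp = es.search(
    index="logs-gcp.audit-*",
    size=5,
    sort="@timestamp:desc",
    query={"match_all": {}},
)
for hit in resp["hits"]["hits"]:
    source = hit["_source"]
    print(source.get("@timestamp"), source.get("event", {}).get("action"))
```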

Besides collecting audit logs from your Google Cloud Platform, you can also use Dataflow integrations to ingest data directly into Elastic from Google BigQuery and Google Cloud Storage.