Configure functions

Functionbeat runs as a function in your serverless environment.

Before deploying Functionbeat, you need to configure one or more functions and specify details about the services that will trigger the functions.

You configure the functions in the functionbeat.yml configuration file. When you're done, you can deploy the functions to your serverless environment.

The following example configures two functions: cloudwatch and sqs. The cloudwatch function collects events from CloudWatch Logs. The sqs function collects messages from Amazon Simple Queue Service (SQS). Both functions forward the events to Elasticsearch.

functionbeat.provider.aws.deploy_bucket: "functionbeat-deploy"
functionbeat.provider.aws.functions:
  - name: cloudwatch
    enabled: true
    type: cloudwatch_logs
    description: "lambda function for cloudwatch logs"
    triggers:
      - log_group_name: /aws/lambda/my-lambda-function
        #filter_pattern: mylog_
  - name: sqs
    enabled: true
    type: sqs
    description: "lambda function for SQS events"
    triggers:
      - event_source_arn: arn:aws:sqs:us-east-1:123456789012:myevents

cloud.id: "MyESDeployment:SomeLongString=="
cloud.auth: "elastic:SomeLongString"

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

Configuration options

You can specify the following options to configure the functions that you want to deploy.

If you change the configuration after deploying the function, use the update command to update your deployment.
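For example, after changing the configuration of the cloudwatch function shown earlier, you might run something like the following from the Functionbeat install directory (a sketch; -v and -e are optional logging flags, adjust the path and flags to your setup):

./functionbeat -v -e update cloudwatch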

provider.aws.deploy_bucket

A unique name for the S3 bucket that the Lambda artifact will be uploaded to.

name

A unique name for the Lambda function. This is the name of the function as it will appear in the Lambda console on AWS.

type

The type of service to monitor. For this release, the supported types are:

cloudwatch_logs
Collects events from CloudWatch logs.
sqs
Collects data from Amazon Simple Queue Service (SQS).
kinesis
Collects data from a Kinesis stream.

description

A description of the function. This description is useful when you are running multiple functions and need more context about how each function is used.

triggers

A list of triggers that cause the function to execute. The valid triggers depend on the type. If type is cloudwatch_logs, specify a list of log groups. If type is sqs, specify a list of Amazon Resource Names (ARNs).

filter_pattern

A regular expression that matches the events you want to collect. Setting this option may reduce execution costs because the function only executes if there is data that matches the pattern.
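For example, building on the cloudwatch function shown earlier, the following sketch collects only events that contain the string ERROR (the pattern value is illustrative):

functionbeat.provider.aws.functions:
  - name: cloudwatch
    enabled: true
    type: cloudwatch_logs
    triggers:
      - log_group_name: /aws/lambda/my-lambda-function
        filter_pattern: ERROR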

concurrency

The reserved number of instances for the function. Setting this option may reduce execution costs by limiting the number of functions that can execute in your serverless environment. The default is unreserved.
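For example, to cap the cloudwatch function at five concurrent instances, you could add a setting like this (a sketch that assumes concurrency sits at the function level, alongside description and memory_size; the value 5 is illustrative):

functionbeat.provider.aws.functions:
  - name: cloudwatch
    enabled: true
    type: cloudwatch_logs
    concurrency: 5
    triggers:
      - log_group_name: /aws/lambda/my-lambda-function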

memory_size

The maximum amount of memory to allocate for this function. Specify a value that is a multiple of 64 MiB. There is a hard limit of 3008 MiB for each function. The default is 128 MiB.
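For example (a sketch; the MiB suffix is assumed, check the reference configuration shipped with Functionbeat for the exact value syntax):

functionbeat.provider.aws.functions:
  - name: cloudwatch
    enabled: true
    type: cloudwatch_logs
    memory_size: 256MiB   # assumed syntax; must be a multiple of 64 MiB, up to 3008 MiB
    triggers:
      - log_group_name: /aws/lambda/my-lambda-function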

dead_letter_config.target_arn

The dead letter queue to use for messages that can’t be processed successfully. Set this option to an ARN that points to an SQS queue.
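For example, to send events that can't be processed to an SQS dead letter queue (a sketch; the queue ARN is a placeholder, and the setting is assumed to sit at the function level):

functionbeat.provider.aws.functions:
  - name: sqs
    enabled: true
    type: sqs
    dead_letter_config.target_arn: arn:aws:sqs:us-east-1:123456789012:my-dead-letter-queue
    triggers:
      - event_source_arn: arn:aws:sqs:us-east-1:123456789012:myevents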

batch_size

The number of events to read from a Kinesis stream. The minimum value is 100 and the maximum is 10,000. The default is 100.

starting_position

The starting position to read from a Kinesis stream. Valid values are trim_horizon and latest. The default is trim_horizon.
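Both batch_size and starting_position apply to functions of type kinesis. The following sketch sets them on a Kinesis trigger (the stream ARN is a placeholder, and the exact nesting, function level versus trigger level, should be verified against the reference configuration for your version):

functionbeat.provider.aws.functions:
  - name: kinesis
    enabled: true
    type: kinesis
    description: "lambda function for Kinesis events"
    triggers:
      - event_source_arn: arn:aws:kinesis:us-east-1:123456789012:stream/myevents
        batch_size: 200
        starting_position: latest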