protobuf

  • Version: 1.0.0
  • Released on: December 4, 2016
  • Changelog

Installation


For plugins not bundled by default, it is easy to install by running bin/logstash-plugin install logstash-codec-protobuf. See Working with plugins for more details.

Getting Help


For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.

Description


This codec converts protobuf-encoded messages into Logstash events and vice versa.

Requires the protobuf definitions as Ruby files. You can create those using the ruby-protoc compiler (https://github.com/codekitchen/ruby-protocol-buffers).

The following shows a usage example for decoding events from a Kafka stream:

kafka {
  zk_connect => "127.0.0.1"
  topic_id => "your_topic_goes_here"
  codec => protobuf {
    class_name => "Animal::Unicorn"
    include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
  }
}

Synopsis


This plugin supports the following configuration options:

Required configuration options:

protobuf {
    class_name => ...
    include_path => ...
}

Available configuration options:

Setting         Input type    Required

class_name      string        Yes
enable_metric   boolean       No
id              string        No
include_path    array         Yes

Details


class_name

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

Name of the class to decode. If your protobuf definition contains modules, prepend them to the class name with double colons like so:

class_name => "Foods::Dairy::Cheese"

This corresponds to a protobuf definition starting as follows:

module Foods
  module Dairy
    class Cheese
      # here are your field definitions.

If your class references other definitions, you only need to add the main class here.
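
For illustration, here is a minimal sketch of such a codec block, reusing the hypothetical Cheese and Milk definitions from the include_path example further below: class_name names only the main class, while include_path still lists every file it depends on.

codec => protobuf {
  class_name => "Foods::Dairy::Cheese"
  include_path => ['/path/to/protobuf/definitions/Milk.pb.rb','/path/to/protobuf/definitions/Cheese.pb.rb']
}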

enable_metric

  • Value type is boolean
  • Default value is true

Disable or enable metric logging for this specific plugin instance. By default we record all the metrics we can, but you can disable metrics collection for a specific plugin.
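
For example, a sketch based on the Kafka example above, with metric collection disabled for this codec instance (the class name and path are placeholders):

kafka {
  zk_connect => "127.0.0.1"
  topic_id => "your_topic_goes_here"
  codec => protobuf {
    class_name => "Animal::Unicorn"
    include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
    enable_metric => false
  }
}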

id

  • Value type is string
  • There is no default value for this setting.

Add a unique ID to the plugin configuration. If no ID is specified, Logstash will generate one. It is strongly recommended to set this ID in your configuration. This is particularly useful when you have two or more plugins of the same type, for example, if you have 2 grok filters. Adding a named ID in this case will help in monitoring Logstash when using the monitoring APIs.

output {
 stdout {
   id => "my_plugin_id"
 }
}

include_path

  • This is a required setting.
  • Value type is array
  • There is no default value for this setting.

List of absolute paths to files with protobuf definitions. When using more than one file, make sure to arrange the files in reverse order of dependency so that each class is loaded before it is referred to by another.

Example: a class Cheese referencing another protobuf class Milk

module Foods
  module Dairy
    class Cheese
      set_fully_qualified_name "Foods.Dairy.Cheese"
      optional ::Foods::Cheese::Milk, :milk, 1
      optional :int64, :unique_id, 2
      # here be more field definitions

would be configured as

include_path => ['/path/to/protobuf/definitions/Milk.pb.rb','/path/to/protobuf/definitions/Cheese.pb.rb']

When using the codec in an output plugin:

  • Make sure to include all the desired fields in the protobuf definition, including the timestamp. Remove fields that are not part of the protobuf definition from the event by using the mutate filter.
  • The @ symbol is currently not supported in field names when loading the protobuf definitions for encoding. Make sure to call the timestamp field "timestamp" instead of "@timestamp" in the protobuf file. Logstash event fields will be stripped of the leading @ before conversion.
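
As an illustration, here is a sketch of an encoding pipeline along those lines, reusing the Cheese definition from above; the removed field names, the paths, and the Kafka output are placeholders rather than a prescribed setup:

filter {
  # drop event fields that do not exist in the protobuf definition
  mutate {
    remove_field => [ "@version", "host" ]
  }
}

output {
  kafka {
    topic_id => "your_topic_goes_here"
    codec => protobuf {
      class_name => "Foods::Dairy::Cheese"
      include_path => ['/path/to/protobuf/definitions/Milk.pb.rb','/path/to/protobuf/definitions/Cheese.pb.rb']
    }
  }
}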