Creating Reusable Components for Telemetry Pipelines

Kai Alvason

9.18.24

4 MIN READ

Kai is a Senior Technical Editor at Mezmo.

One challenge to the widespread adoption of telemetry pipelines by SRE teams within an organization is knowing where to start when building a pipeline. Faced with a wide assortment of sources, processors, and destinations, setting up a telemetry pipeline can seem like trying to build a Lego set without any instructions.

The solution is to provide teams with pre-defined components that deliver specific functionality, which they can then use to build pipelines that meet their own requirements. For example, Mezmo Telemetry Pipelines include components like Shared Sources and Processor Modules that enable the creation of standard components and the centralization of data governance and management standards.

What is a Shared Source?

When you create a Pipeline and add a Source to it, that Source is exclusive to that Pipeline, and the Pipeline itself is self-contained. There are many situations in which you may want to use the same Source in multiple Pipelines, but you would need to define and configure it within each Pipeline independently. This complicates the management of that Source across multiple Pipelines and also increases costs for ingress and egress data volume.

Shared Sources offer a solution to these issues. Shared Sources are configured at the global level and can be shared across multiple Pipelines. With a Shared Source, the same data is transmitted once across all Pipelines, rather than being transmitted separately for each Pipeline.

Shared Sources also incorporate additional data processing functionality, including automatic parsing of source data and the generation of a data profile for the source.
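
To make the idea concrete, here is a minimal Python sketch of a shared source fanning the same data out to several pipelines. The class and method names are hypothetical stand-ins used only to illustrate the "ingest once, share everywhere" concept; they are not Mezmo's actual API or configuration format, and the automatic parsing and profiling features are not modeled.

    # Illustrative sketch only -- hypothetical names, not Mezmo's API.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    Event = Dict[str, str]


    @dataclass
    class Pipeline:
        name: str

        def process(self, event: Event) -> None:
            # Stand-in for the pipeline's processors and destinations.
            print(f"{self.name} received {event}")


    @dataclass
    class SharedSource:
        # Defined once at the global level and reused by any number of pipelines.
        name: str
        subscribers: List[Callable[[Event], None]] = field(default_factory=list)

        def attach(self, pipeline: Pipeline) -> None:
            # Each pipeline registers itself; the source is configured only once.
            self.subscribers.append(pipeline.process)

        def emit(self, event: Event) -> None:
            # The event is ingested a single time, then fanned out to every
            # attached pipeline instead of being sent separately for each one.
            for process in self.subscribers:
                process(event)


    # One shared source feeding two independent pipelines.
    source = SharedSource(name="k8s-logs")
    for p in (Pipeline("compliance"), Pipeline("analytics")):
        source.attach(p)
    source.emit({"msg": "pod restarted", "level": "warn"})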

What is a Processor Module?

A Processor Module is a set of Processors that perform a specific function within your Pipeline. For example, you may have a series of Encrypt Field and Redact Processors that function as a Compliance module for Personally Identifiable Information, or you may have an Event to Metric Processor and an Aggregate Processor that work together in an Event to Metric module.

By grouping these processors together into a module, you can simplify your Pipeline Map, and more easily manage the configuration of the Processors in relation to each other. Published modules can also be shared with other members of your organization.
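
As a rough illustration, the following Python sketch shows how individual processors might be grouped into a single reusable module. The function names echo the Compliance example above, but this is only a conceptual sketch under assumed names, not Mezmo's Processor Module implementation.

    # Hypothetical sketch of grouping processors into a reusable module.
    from typing import Callable, Dict, List

    Event = Dict[str, str]
    Processor = Callable[[Event], Event]


    def encrypt_field(fields: List[str]) -> Processor:
        # Stand-in for an Encrypt Field processor; real encryption is omitted here.
        def run(event: Event) -> Event:
            return {k: (f"enc({v})" if k in fields else v) for k, v in event.items()}
        return run


    def redact(fields: List[str]) -> Processor:
        # Replace the listed fields with a redaction marker.
        def run(event: Event) -> Event:
            return {k: ("[REDACTED]" if k in fields else v) for k, v in event.items()}
        return run


    def module(*processors: Processor) -> Processor:
        # Chain the processors so the group behaves as a single step in the pipeline.
        def run(event: Event) -> Event:
            for p in processors:
                event = p(event)
            return event
        return run


    # A "Compliance" module built from two processors and applied as one unit.
    compliance = module(encrypt_field(["email"]), redact(["ssn"]))
    print(compliance({"email": "a@example.com", "ssn": "123-45-6789", "msg": "login ok"}))

Treating the chained processors as one unit mirrors the point above: the module appears as a single step on the Pipeline Map, and its internal configuration can be managed and shared as a whole rather than processor by processor.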

Mezmo offers many features that make it easy to create and manage telemetry pipelines. Check out our documentation at docs.mezmo.com for more information.
