
What’s new from KubeCon + CloudNativeCon North America 2024

Mezmo unveils Mezmo Flow for guided data onboarding and log volume optimization

Mezmo today unveiled Mezmo Flow, a guided experience for building telemetry pipelines. With Mezmo Flow, users can quickly onboard new log sources, profile data, and implement recommended optimizations with a single click to reduce log volumes by more than 40%. With this release, Mezmo enables next-generation log management: a pipeline-first log analysis solution that helps companies control incoming data volumes, identify the most valuable data, and glean insights faster, without having to index data in expensive observability tools.

Developers should not have to choose between how much they can log and how fast they can debug and troubleshoot issues, especially with custom applications. SREs need an easy way to understand logs, monitor data spikes, resolve infrastructure issues, and provision data to downstream teams and systems. The new release from Mezmo streamlines both developer and SRE workflows.

With Mezmo Flow, users can create their first log volume reduction pipeline in less than 15 minutes, retaining the most valuable data and preventing unnecessary charges, overages, and spikes. Next-generation log management is a pipeline-first approach to log analysis that improves the quality of critical application logs, raising the signal-to-noise ratio and increasing developer productivity. Alerts and notifications on data in motion help users act quickly on accidental application log volume spikes or changes in metrics.

As part of this release, Mezmo is also introducing a series of new capabilities that simplify action and control for developers and SREs. These include:

  • Data profiler enhancements: Analyze and understand structured and unstructured logs while continuously monitoring log volume trends across applications.
  • Processor groups: Create multifunctional, reusable pipeline components, improving pipeline development time and ensuring standardization and governance over data management.
  • Shared resources: Configure sources once and use them for multiple pipelines. This ensures data is delivered to the right users in their preferred tools with as little overhead as possible.
  • Data aggregation for insights: Collect and aggregate telemetry metrics such as log volume or errors per application, host, and user-defined label. The aggregated data is available as interactive reports for insights such as application log volume or error trends, and can be used to detect anomalies such as volume surges and alert users to help prevent overages.
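To make the aggregation idea concrete, here is a minimal, hypothetical sketch of counting log volume per (application, host) label pair and flagging surges against a simple threshold. It is an illustration of the general technique only, not Mezmo's API; the event shape and threshold are assumptions for the example.

```python
from collections import defaultdict

# Hypothetical log events; in a real pipeline these would stream in
# from configured sources rather than a static list.
events = [
    {"app": "checkout", "host": "web-1"},
    {"app": "checkout", "host": "web-1"},
    {"app": "checkout", "host": "web-2"},
    {"app": "search", "host": "web-1"},
]

# Aggregate log volume per (application, host) label pair.
volume = defaultdict(int)
for event in events:
    volume[(event["app"], event["host"])] += 1

# Flag a surge when a label's volume exceeds a fixed threshold.
# A production system would compare against a trailing baseline
# and raise an alert instead of just collecting the offenders.
THRESHOLD = 1
surges = {labels: count for labels, count in volume.items() if count > THRESHOLD}
```

Aggregating on user-defined labels like these is what lets a report answer questions such as "which application's volume spiked" without indexing every raw log line downstream.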

SIGN UP FOR A 14-DAY TRIAL