Webinar Recap: Myths and Realities in Telemetry Data Handling

    Telemetry data is growing exponentially, but the business value isn’t increasing at a similar pace. Getting the right telemetry data is hard, so I recently had a conversation with Matt Aslett, Director of Research at Ventana Research, now a part of ISG, about five myths and realities in telemetry data handling. 

    Myth #1: We’re going to solve our data handling by improving our practices and culture

    While improving practices and culture is still a worthwhile goal, the reality is that it does nothing on its own to improve data quality. Without the right data, teams make poorer decisions and spend more time preparing and analyzing data for consumption. 

    Myth #2: Telemetry data (e.g., metrics, logs, events, and traces) is just exhaust from infrastructure and apps, and business people do not need that data

    While telemetry data has historically been IT-focused, it contains a lot of transactional information that can be tied to business data, such as understanding customer patterns, meeting compliance requirements, and optimizing experiences. This insight can also help you better understand business risk, for example how a technical incident impacts the business, so you can keep things up and running smoothly. 

    Myth #3: We don't understand our data, but we can't really do much about it, so we don't prioritize it

    Profiling telemetry data is a necessity, not only to understand data patterns but also to support further analysis that detects data drift, anomalies, and data quality issues, so you can address problems before your customers experience them.
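
    As a minimal illustration (not from the webinar), here is a sketch of how you might profile a batch of log events to spot missing fields, volume skew, and latency outliers. The field names, sample events, and thresholds are hypothetical.

```python
from collections import Counter
from statistics import mean, pstdev

# Hypothetical parsed log events; in practice these would come from your
# pipeline or log store.
events = [
    {"service": "checkout", "status": 200, "latency_ms": 120},
    {"service": "checkout", "status": 500, "latency_ms": 940},
    {"service": "search", "status": 200},  # missing latency_ms
]

# Basic profile: field completeness and per-service volume.
field_counts = Counter(k for e in events for k in e)
service_volume = Counter(e.get("service", "unknown") for e in events)
completeness = {f: n / len(events) for f, n in field_counts.items()}

# Naive drift/anomaly check on latency: flag values far from the mean.
latencies = [e["latency_ms"] for e in events if "latency_ms" in e]
mu, sigma = mean(latencies), pstdev(latencies)
outliers = [x for x in latencies if sigma and abs(x - mu) > 2 * sigma]

print("field completeness:", completeness)
print("events per service:", service_volume)
print("latency outliers:", outliers)
```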

    Myth #4: We need to keep all of our telemetry data for compliance purposes, so there's little value in us spending time to optimize or reduce the data volume

    While it has become cheaper to store all of your telemetry data, and in theory more data gives you more insight, you still have to spend time accessing and filtering it. That's why it's important to optimize the data you send for analysis, SIEM use cases, and so on, while routing appropriate full-fidelity data to long-term storage so it's available for rehydration if needed. 
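
    As a rough sketch of that split (not a specific product's configuration), the routine below sends every event to a cheap archive at full fidelity while dropping debug noise and sampling routine events before they reach a SIEM. The log levels, sampling rate, and destinations are assumptions for illustration.

```python
import gzip
import json
import random

def route_event(event, archive, siem):
    """Illustrative routing: full fidelity to the archive, reduced volume to the SIEM."""
    # Everything goes to long-term storage for compliance and later rehydration.
    archive.append(event)

    # Drop noisy debug logs and sample routine info events before the SIEM,
    # keeping errors at full fidelity.
    level = event.get("level", "INFO")
    if level == "DEBUG":
        return
    if level == "INFO" and random.random() > 0.1:  # keep ~10% of INFO events
        return
    siem.append(event)

archive, siem = [], []
for e in [{"level": "DEBUG", "msg": "cache hit"},
          {"level": "ERROR", "msg": "payment failed"},
          {"level": "INFO", "msg": "request ok"}]:
    route_event(e, archive, siem)

# Compress the archived batch before writing it to object storage (not shown).
archived_bytes = gzip.compress(json.dumps(archive).encode())
print(len(archive), "archived,", len(siem), "sent to SIEM")
```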

    Myth #5: Normalizing telemetry data to be consistent across teams can be disruptive

    While there have been good business reasons in the past for different teams to have their own telemetry data taxonomy, organizations won’t get the full value of telemetry data unless there’s wider adoption of shared standards, like OpenTelemetry. Why? To be more competitive in today’s landscape, it’s important to have interoperability between multiple applications and infrastructure so you can reduce ambiguity, improve collaboration, and increase business intelligence. 
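
    As a minimal sketch of what that normalization can look like (the source field names and mapping are hypothetical), the snippet below remaps team-specific field names onto shared, OpenTelemetry-style attribute names so events from different teams can be queried together.

```python
# Map team-specific field names onto shared, OpenTelemetry-style attributes.
FIELD_MAP = {
    "svc": "service.name",
    "service_name": "service.name",
    "resp_code": "http.response.status_code",
    "httpStatus": "http.response.status_code",
}

def normalize(event: dict) -> dict:
    """Rename known fields to the shared taxonomy; pass unknown fields through."""
    return {FIELD_MAP.get(k, k): v for k, v in event.items()}

team_a = {"svc": "checkout", "resp_code": 502}
team_b = {"service_name": "checkout", "httpStatus": 502}

# Both teams' events now share one taxonomy and can be analyzed side by side.
assert normalize(team_a) == normalize(team_b)
```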

    With all this in mind, it's important to apply data engineering principles to your telemetry data: understand, optimize, and respond to it. This is where a telemetry pipeline can help you reduce costs, increase data utility, get deeper insights, accelerate resolution time, improve security, and ensure compliance. 

    Watch the webinar for full commentary
    April Yep

    3.22.24

    April has several years of experience in the observability space, dating back to the days when it was called APM, DevOps, or infrastructure monitoring. April is a Senior Product Marketing Manager at Mezmo and loves cats and tea.
