
Announcing the Control API Suite

4 MIN READ
Albert Feng

10.13.21

Driving product for the Ecosystem team. Avid basketball, tennis and golf player.

LogDNA is now Mezmo but the product that you know and love is here to stay.

As LogDNA has grown, many of our customers have grown too, bringing in more data sources for ingestion and expanding their use cases for logs. To help them manage more data, we’re excited to introduce the Control API suite.

We’ve built four individual APIs that let companies programmatically configure how their log data is ingested and managed.

Below, we’ll cover each new API in detail as well as why they are massively impactful for our customers.

Exclusion Rules API & Terraform Support

One of the most popular features customers use today is the ability to configure Exclusion Rules. Customers can use the LogDNA UI to define rules that exclude certain logs from being stored in our underlying datastore. This helps control cost and filter out noise for more focused debugging and troubleshooting.

Today, we’re excited to announce that customers can programmatically configure Exclusion Rules via our new API endpoints and Terraform Provider. Customers can call the API endpoints to create, read, update, and delete exclusion rules, while the Terraform Provider now recognizes exclusion rules as a configurable resource.

The Exclusion Rules API will go live in the coming week.
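
To give a sense of what this looks like in practice, here’s a minimal Python sketch that creates an exclusion rule over HTTP. The endpoint path, field names, and service-key header below are illustrative assumptions, not the exact contract; see our documentation for the real details.

import requests

SERVICE_KEY = "your-service-key"                 # assumed auth header for illustration
BASE_URL = "https://api.logdna.com/v1/config"    # assumed base path

# Create an exclusion rule that drops DEBUG-level logs from a noisy app.
payload = {
    "title": "Drop debug logs from payments-service",
    "apps": ["payments-service"],
    "query": "level:debug",
    "active": True,
}

resp = requests.post(
    f"{BASE_URL}/exclusions",
    headers={"servicekey": SERVICE_KEY},
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print("Created exclusion rule:", resp.json())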

Start/Stop Ingestion API

When incidents happen and large amounts of logs are created in a short period of time, seeing a flood of logs in Live Tail can cause more chaos, confusion, and a spike in cost. During these situations, developers often already have the logs they need to debug the issue without seeing an influx of the same data over and over again. That’s why we added a way in our UI to start and stop ingestion of all logs as needed.

Now customers can programmatically turn their account ingestion on and off with a few API endpoints.
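
As a sketch, an incident-response script might pause and resume ingestion like this. The endpoint path, payload shape, and service-key header are assumptions for illustration only.

import requests

SERVICE_KEY = "your-service-key"                 # assumed auth header for illustration
BASE_URL = "https://api.logdna.com/v1/config"    # assumed base path

def set_ingestion(enabled: bool) -> None:
    """Toggle account-wide log ingestion (endpoint path and payload are assumed)."""
    resp = requests.patch(
        f"{BASE_URL}/ingestion",
        headers={"servicekey": SERVICE_KEY},
        json={"enabled": enabled},
        timeout=10,
    )
    resp.raise_for_status()

# Pause ingestion while an incident floods Live Tail, then resume afterwards.
set_ingestion(False)
# ... debug with the logs already captured ...
set_ingestion(True)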

Usage API

In the current LogDNA UI, customers can see data ingestion by apps, sources, and tags. This is helpful for understanding broad log trends over time and pinpointing specific applications that are contributing to the highest volume of logs.

Today, we’re announcing our Usage API, which allows customers to programmatically query for which services are creating the most logs. Now, customers can automatically monitor their usage and better understand how their logs change over time.
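
Here’s a minimal sketch of what a usage query could look like in Python. The parameter names and response shape are illustrative assumptions; the exact contract lives in our documentation.

import requests

SERVICE_KEY = "your-service-key"              # assumed auth header for illustration
BASE_URL = "https://api.logdna.com/v1"        # assumed base path

# Query ingestion volume grouped by app for the last 24 hours.
resp = requests.get(
    f"{BASE_URL}/usage",
    headers={"servicekey": SERVICE_KEY},
    params={"group_by": "app", "period": "24h"},
    timeout=10,
)
resp.raise_for_status()

# Print each app and the number of bytes it ingested (assumed response fields).
for entry in resp.json().get("results", []):
    print(f"{entry['app']}: {entry['bytes']} bytes ingested")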

Archiving API and Terraform Support

When logs reach the end of their retention period, customers can set up archiving to a third-party storage provider like an Amazon S3 bucket or IBM Cloud Object Storage. This is helpful because, beyond a certain age, logs are most useful for audit and security purposes rather than for debugging and troubleshooting.

Today, we’re introducing the Archiving API and Terraform Provider, which allow users to programmatically configure their archiving integrations and use Terraform to manage their archiving instances as resources.
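
As a rough sketch, configuring an S3 archive over HTTP could look like the Python below. The endpoint path, field names, and service-key header are assumptions for illustration, not the exact API contract.

import requests

SERVICE_KEY = "your-service-key"                 # assumed auth header for illustration
BASE_URL = "https://api.logdna.com/v1/config"    # assumed base path

# Point archiving at an S3 bucket; these field names are illustrative assumptions.
payload = {
    "integration": "s3",
    "bucket": "my-audit-log-archive",
}

resp = requests.post(
    f"{BASE_URL}/archiving",
    headers={"servicekey": SERVICE_KEY},
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print("Archiving configured:", resp.json())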

Why the Control API suite is impactful

The Control API suite provides the flexibility to integrate logging practices with existing development workflows. Teams can now build rules into their deployment processes that shape how log data is used throughout the organization.

By programmatically querying log usage, customers can define rules to exclude certain logs, or even turn off ingestion entirely during a massive log spike.
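
Tying those pieces together, a nightly job might look something like this sketch: query usage, then add an exclusion rule for any app that crosses a volume threshold. As with the earlier examples, the endpoint paths, parameters, field names, and threshold are assumptions for illustration.

import requests

SERVICE_KEY = "your-service-key"          # assumed auth header for illustration
BASE_URL = "https://api.logdna.com/v1"    # assumed base path
THRESHOLD_BYTES = 50 * 1024**3            # example threshold: 50 GB per day

headers = {"servicekey": SERVICE_KEY}

# 1. Find the apps generating the most log volume (parameters are illustrative).
usage = requests.get(
    f"{BASE_URL}/usage",
    headers=headers,
    params={"group_by": "app", "period": "24h"},
    timeout=10,
)
usage.raise_for_status()

# 2. Exclude DEBUG-level logs from any app above the threshold.
for entry in usage.json().get("results", []):
    if entry["bytes"] > THRESHOLD_BYTES:
        requests.post(
            f"{BASE_URL}/config/exclusions",
            headers=headers,
            json={
                "title": f"Auto-exclude debug logs from {entry['app']}",
                "apps": [entry["app"]],
                "query": "level:debug",
                "active": True,
            },
            timeout=10,
        ).raise_for_status()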

In staging environments, LogDNA can help teams detect usage spikes from certain applications and quickly understand which applications need more attention before they’re deployed to production.

For larger organizations, the new APIs and the LogDNA Terraform Provider make it easier to manage multiple teams using multiple LogDNA accounts. By managing account configuration (i.e., Archiving, Views, Alerts, etc.) as code, organizations can better control and automate how their logs are used across the entire data pipeline.

How to get started

You can check out all of our new APIs here in our documentation. If you have any feedback or questions, I’d love to hear from you — my email is albert.feng@mezmo.com.
