Product
-
May 2, 2022

Introducing Metadata Metrics: instant data observability for your entire data warehouse

Metadata Metrics offers instant coverage for your entire data warehouse, so you can quickly detect the most common data quality issues.

Kyle Kirwan

Meet Metadata Metrics—instant coverage for your entire data warehouse so you can quickly detect the most common data quality issues. Metadata Metrics work by scanning your existing query logs to automatically track key operational metrics including hours since the last table load, rows inserted, and the number of read queries on every dataset. Metadata Metrics take minutes to set up, with zero manual configuration and almost no additional load to your warehouse.

Among data observability solutions, Bigeye is the only platform that monitors broadly across every table and deeply into the most critical datasets, reducing the number of expensive outages affecting business-critical applications.

Instant observability for your entire warehouse

Metadata Metrics give you instant insight into key operational attributes of every table in your warehouse including:

  1. Time since the table was last refreshed
  2. Number of rows inserted per day
  3. Number of queries run per day
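
The mechanics behind these metrics can be illustrated with a small sketch. The record fields below (`table`, `type`, `rows`, `ts`) are hypothetical stand-ins for what a warehouse query log typically exposes; this is not Bigeye's actual implementation.

```python
from datetime import datetime, timezone

# Hypothetical query-log records; real warehouse logs expose similar
# fields (target table, statement type, rows affected, start time).
query_log = [
    {"table": "orders", "type": "INSERT", "rows": 1200,
     "ts": datetime(2022, 5, 1, 6, 0, tzinfo=timezone.utc)},
    {"table": "orders", "type": "SELECT", "rows": 0,
     "ts": datetime(2022, 5, 1, 9, 30, tzinfo=timezone.utc)},
    {"table": "orders", "type": "SELECT", "rows": 0,
     "ts": datetime(2022, 5, 1, 14, 15, tzinfo=timezone.utc)},
]

def metadata_metrics(log, table, now):
    """Derive the three operational metrics for one table from its log."""
    loads = [q for q in log if q["table"] == table and q["type"] == "INSERT"]
    reads = [q for q in log if q["table"] == table and q["type"] == "SELECT"]
    last_load = max(q["ts"] for q in loads)
    return {
        "hours_since_last_load": (now - last_load).total_seconds() / 3600,
        "rows_inserted": sum(q["rows"] for q in loads),
        "read_queries": len(reads),
    }

now = datetime(2022, 5, 1, 18, 0, tzinfo=timezone.utc)
print(metadata_metrics(query_log, "orders", now))
# {'hours_since_last_load': 12.0, 'rows_inserted': 1200, 'read_queries': 2}
```

Because everything is derived from logs the warehouse already keeps, no probes need to run against the tables themselves, which is why the approach adds almost no load.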

With Metadata Metrics enabled, your team will be the first to know about stale data, table updates that are too big or too small, or changes in table utilization, thanks to Bigeye’s best-in-class anomaly detection system.
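
As a rough illustration of how detection over these metrics might work (Bigeye's actual anomaly detection system is not shown here), a simple z-score check can flag a day whose inserted-row count deviates sharply from recent history:

```python
import statistics

def is_anomalous(history, today, z_threshold=3.0):
    """Flag today's value if it sits more than z_threshold standard
    deviations away from the mean of the recent history."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # History is perfectly flat: any change at all is an anomaly.
        return today != mean
    return abs(today - mean) / stdev > z_threshold

daily_rows = [1180, 1225, 1190, 1205, 1210, 1198, 1215]
print(is_anomalous(daily_rows, 1202))  # typical load -> False
print(is_anomalous(daily_rows, 0))     # empty load   -> True
```

The same check applies equally to hours-since-last-load (stale data) or daily read counts (utilization shifts), which is why one log scan can surface several distinct classes of issue.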

T-Shaped Monitoring—wide and deep

Bigeye is the creator of T-shaped Monitoring, a unique approach to data observability that tracks fundamentals across all your data while applying deeper monitoring on the most critical datasets such as those used for financial planning, machine learning models, and executive-level dashboards. This approach ensures you're covered against even the unknown unknowns.

Here’s how it works:

  1. Enable Metadata Metrics to instantly track the basics across all your data.
  2. Go deep on each business-critical dataset using a blend of metrics that Bigeye suggests for each table from its library of 70+ pre-built data quality metrics.
  3. Take it even further by adding custom metrics with templates and virtual tables to ensure custom business logic is being monitored for defects.

T-Shaped Monitoring gives data teams peace of mind, with coverage across the entire warehouse, 24/7. With Metadata Metrics, broad coverage is even faster to set up and deploy, with no configuration hassle. Detect both simple problems, such as stale data, and the subtlest errors in any critical dataset, so your business is never disrupted.

Metadata Metrics are available to all customers now. To learn more, read our documentation about Metadata Metrics. Or, if you’re ready to see it, we’d love to give you a demo.

| Resource | Monthly cost ($) | Number of resources | Time (months) | Total cost ($) |
| --- | --- | --- | --- | --- |
| Software/Data engineer | $15,000 | 3 | 12 | $540,000 |
| Data analyst | $12,000 | 2 | 6 | $144,000 |
| Business analyst | $10,000 | 1 | 3 | $30,000 |
| Data/product manager | $20,000 | 2 | 6 | $240,000 |
| Total cost | | | | $954,000 |
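
Each row's total in the table above is simply monthly cost times headcount times duration in months; a quick sketch of that arithmetic:

```python
# (name, monthly cost, headcount, months) for each row of the table.
resources = [
    ("Software/Data engineer", 15_000, 3, 12),
    ("Data analyst",           12_000, 2, 6),
    ("Business analyst",       10_000, 1, 3),
    ("Data/product manager",   20_000, 2, 6),
]

# Row total = monthly cost x headcount x months.
totals = {name: cost * count * months for name, cost, count, months in resources}
print(totals["Software/Data engineer"])  # 540000
print(sum(totals.values()))              # 954000
```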
| Role | Goals | Common needs |
| --- | --- | --- |
| Data engineers | Overall data flow. Data is fresh and operating at full volume. Jobs are always running, so data outages don't impact downstream systems. | Freshness + volume monitoring; schema change detection; lineage monitoring |
| Data scientists | Specific datasets in great detail. Looking for outliers, duplication, and other, sometimes subtle, issues that could affect their analysis or machine learning models. | Freshness monitoring; completeness monitoring; duplicate detection; outlier detection; distribution shift detection; dimensional slicing and dicing |
| Analytics engineers | Rapidly testing the changes they’re making within the data model. Move fast and not break things, without spending hours writing tons of pipeline tests. | Lineage monitoring; ETL blue/green testing |
| Business intelligence analysts | The business impact of data. Understand where they should spend their time digging in, and when they have a red herring caused by a data pipeline problem. | Integration with analytics tools; anomaly detection; custom business metrics; dimensional slicing and dicing |
| Other stakeholders | Data reliability. Customers and stakeholders don’t want data issues to bog them down, delay deadlines, or provide inaccurate information. | Integration with analytics tools; reporting and insights |
