
Why Your Databricks Metrics Views and Power BI Aren't Seeing Eye-to-Eye

Written by Ust Oldfield | Jul 17, 2025 4:20:18 PM

Databricks is rapidly changing how organisations manage data and analytics. At the Data + AI Summit a few weeks ago, Databricks announced the public preview of Metrics Views within Unity Catalog - effectively creating a semantic layer within Databricks itself, rather than relying on third parties (such as Power BI, Cube, AtScale, etc.) for semantic definitions of business-oriented data.

If you use a BI tool over Databricks such as Sigma or ThoughtSpot, the integration with Databricks' Metrics Views is seamless. However, if you're a heavy Power BI user, you may have noticed that this integration is absent.

At Advancing Analytics, we're huge fans of both Databricks and Power BI, so we want to dig deeper into what Metrics Views are, why they don't currently integrate with Power BI, and, at a high level, what some alternative approaches might be, so you can still benefit from innovation in both worlds.

What are Databricks Metrics Views?

Databricks Metrics Views allow you to define standardised business metrics and dimensions over your lakehouse, with the built-in governance and lineage provided by Unity Catalog - just what you'd want and expect from a semantic layer. With so many avenues for our customers to consume data - from SQL and Dashboards to Genie rooms and Apps within Databricks alone, let alone the tools that in turn consume from Databricks - the need for standardised, governed definitions of metrics and dimensional attributes has never been more important.

Instead of individual analysts or departments defining "revenue" or "customer churn" in their own ways, Metrics Views provide a single, authoritative definition.

These views are created using a straightforward YAML-based syntax and are stored within Unity Catalog, Databricks' unified governance solution. Key features include:

  • Centralised Logic: Define key performance indicators (KPIs) once and have them consistently applied across all your analytics.

  • Dimensions and Measures: Clearly separate your categorical attributes (dimensions) from your aggregated calculations (measures).

  • Advanced Capabilities: Support for joins, window functions, and complex business logic directly within the metric definition.

  • Governance: Leverage Unity Catalog for fine-grained access control to your metrics.

The goal is to provide a "single source of truth" for your business logic, ensuring that everyone is speaking the same language when it comes to data analysis.
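
To make this concrete, here's a minimal sketch of what a metric view definition can look like. The catalog, schema, table and column names are illustrative, and the DDL and YAML fields reflect the public preview documentation as we understand it, so check the current docs before relying on the exact syntax.

    -- Create a metric view in Unity Catalog (all object names below are illustrative).
    -- The view body is a YAML document describing the source, dimensions and measures.
    CREATE OR REPLACE VIEW main.finance.sales_metrics
    WITH METRICS
    LANGUAGE YAML
    AS $$
    version: 0.1
    source: main.finance.fact_sales
    dimensions:
      - name: Region
        expr: region
      - name: Order Month
        expr: DATE_TRUNC('MONTH', order_date)
    measures:
      - name: Total Revenue
        expr: SUM(sales_amount)
      - name: Distinct Customers
        expr: COUNT(DISTINCT customer_id)
    $$;

Once defined, the dimensions are queried like ordinary columns, while the measures are evaluated with the MEASURE() SQL function - which is exactly where the friction with Power BI appears, as we'll see below.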

The Power BI-Databricks Connection

Power BI boasts a robust and widely used connector for Databricks. This allows users to connect to their Databricks SQL warehouses to access tables and standard views with ease. You can import data into Power BI or use DirectQuery for real-time insights. Databricks has even introduced a "Publish to Power BI" feature, which streamlines the process of creating Power BI semantic models from your Databricks data, preserving relationships defined in Unity Catalog.

So, with such a strong connection, why the disconnect with Metrics Views? The answer lies in a fundamental difference in architecture.

A Tale of Two Engines

The core of the issue is not just a simple missing feature. It's a clash of semantic models. Power BI has its own powerful in-memory engine (VertiPaq) and its own language for defining business logic: DAX (Data Analysis Expressions).

A core principle of Power BI development is to define business logic within a Power BI Semantic Model. Analysts define their KPIs and calculations as DAX measures, for example Total Sales = SUM(Sales[SalesAmount]). This makes the Power BI Semantic Model its own "single source of truth" for the reports built on top of it.

Here's where the conflict arises:

  • Databricks Metrics Views want to be the source of truth for metrics, defined in YAML and queried with a specific MEASURE() SQL function.

  • Power BI Semantic Models want to be the source of truth for metrics, defined in DAX and calculated by the VertiPaq engine.

Power BI's connector is built to query underlying tables and views from a source like Databricks. It then expects its own engine to perform the aggregations and calculations. It is not designed to consume a pre-defined semantic metric from another engine.

When Power BI tries to query a Databricks Metrics View, its auto-generated SQL doesn't include the necessary MEASURE() function because its internal logic is geared towards generating standard SQL clauses. It's trying to speak its own language (translating user actions into standard SQL) to a system that is offering a different dialect.
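
To see the mismatch side by side, here's a hedged sketch using the illustrative sales_metrics view from earlier. The first query is the shape a metric view expects; the second is the sort of standard SQL Power BI's connector generates, re-aggregating raw columns itself rather than asking the source engine to evaluate a named, pre-defined measure.

    -- What a metric view expects: dimensions in GROUP BY, measures wrapped in MEASURE().
    SELECT
      `Region`,
      MEASURE(`Total Revenue`) AS total_revenue
    FROM main.finance.sales_metrics
    GROUP BY `Region`;

    -- Roughly the standard SQL Power BI auto-generates against a plain table or view:
    -- an ordinary aggregate query, with no notion of the pre-defined measure.
    SELECT
      region,
      SUM(sales_amount) AS total_revenue
    FROM main.finance.fact_sales
    GROUP BY region;

The first shape simply isn't something Power BI's auto-generated SQL produces today.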

What's a Data Analyst to do?

You have two powerful tools, Databricks and Power BI, and you want to use the newer features in Databricks (e.g., SQL, Genie or Apps) to enable other users to access data in a way that matters to them, while maintaining reusability, repeatability, and consistency across the two platforms. But if Metrics Views enable your Databricks use cases yet can't be reused in Power BI, what are your options?

It might be tempting to reach for intermediate layers or native query workarounds to bring the two together, but let me tell you now - they don't currently work. So your options are:

  • Use Databricks AI/BI Dashboards instead. Databricks has its own dashboarding capabilities that are fully aware of and integrated with Metrics Views. For analyses that can live within the Databricks ecosystem, this is the most seamless option. But if you have a large Power BI estate, this may not be feasible.
  • Build a mapping between Databricks Metrics Views and Power BI Semantic Models. They both use YAML after all... 

The Road Ahead

The lack of direct integration between Power BI and Databricks Metrics Views highlights a classic challenge in the modern data stack: where should business logic live? True integration would require more than the Power BI connector simply supporting a new SQL function. It would require Power BI to be able to either:

  • Translate: Intelligently read the Databricks Metrics View and automatically create a corresponding DAX measure in its own model.

  • Delegate: Operate in a mode where it fully cedes calculation control to the source, trusting the Databricks MEASURE() to deliver the correct result.

For now, understanding the architectural differences between these tools is key to navigating their limitations and building a robust analytics pipeline.

Our team at Advancing Analytics specialises in architecting efficient, scalable data platforms that bridge the gap between best-in-class tools like Databricks and Power BI. If this post resonates with the challenges you're facing, let's have a more detailed conversation.