Published: 14 Jun 2023 | Author: James Beresford

The latest iteration of Microsoft’s Data & AI offering – Fabric – was released with much fanfare on May 25. Microsoft hails it as a major uplift in the capability available to its Data & AI customers. This article is here to cut through the hype and give an honest perspective. I will caveat that there may be some inaccuracies here, as a) it’s all very new and b) it’s still in preview, so some of my observations may not remain valid for long.

First, what it is.

Fabric is a unification of a collection of Microsoft Data & AI capabilities in a single platform. It covers Data Storage, Data Engineering, Machine Learning, Streaming data & Self-Service Analytics.

  • Data Storage is primarily through OneLake – an implementation of the Delta Lake concept. This is, in my view, the big news, and I cover it in more depth later. It also supports a traditional SQL Warehouse approach, though again this is underpinned by Delta Lake storage.
  • Data Engineering is through one of three paths: Data Factory, Notebooks, or Dataflows (i.e., Power Query).
  • Data Science is through Notebooks (Python/Jupyter) and MLflow.
  • Streaming capability is provided by support for Kusto databases.
  • Self-Service Analytics is through Power BI, which is the main interface for visualising data.

All this, of course, sits under Purview for data governance and potentially information protection as well.


Is this Synapse repackaged?

Well… kind of. A while back, Synapse was the answer to everything for Analytics. Today, it’s Fabric. Many of the moving parts are the same – but Power BI has been brought to the forefront of the offering, instead of just being a subcomponent.

So why have Microsoft done this? My analysis is that there are three reasons:

  • Power BI has dominated the Self-Service BI market and carries more brand weight than Synapse could ever hope to achieve. Most business buyers will be motivated by front-end considerations.
  • Snowflake and other platforms have been eating Azure’s Data Engineering lunch. OneLake is a strong counter to that (and initial reports on performance are very good).
  • It gives Microsoft predictable revenue across the Data Analytics portfolio. With all workloads now captured under the Power BI Premium (Fabric) capacity, organisations will be locked into fixed, long-term revenue streams.


What does this mean for licensing?

It looks like if you have Power BI Premium, you can turn all this capability on at no extra cost. This means all your workloads (engineering, storage, reports) fall under the one capacity. That is good news for predictability of Azure costs – the kicker is that whenever you hit capacity limits, you need to upgrade your capacity. I suspect you will hit those limits quicker than under the current Power BI-only workload model.

It’s not available for the Power BI Premium Per User or Embedded SKUs.

What’s the big deal, then?

In my opinion the biggest piece is OneLake. By going this route, Microsoft have effectively endorsed Delta Lake as the pattern for managing all data storage going forward. It’s a more user-friendly implementation over ADLS Gen 2 that can be accessed from your desktop.
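To give a feel for what the Delta Lake pattern actually is, here is a deliberately simplified sketch – not Fabric or Delta Lake code, just the core idea it implements: an append-only transaction log of add/remove file actions over Parquet files, where the current table state is a replay of that log (real Delta Lake keeps these commits as JSON under a `_delta_log/` folder):

```python
# Toy sketch of the Delta Lake transaction-log concept that OneLake
# builds on. Each commit is a list of "add"/"remove" file actions;
# the table at any version is the replay of the log up to that point.

def apply_commit(files: set, commit: list) -> set:
    """Replay one commit (a list of actions) over the current file set."""
    files = set(files)
    for action in commit:
        if action["op"] == "add":
            files.add(action["path"])
        elif action["op"] == "remove":
            files.discard(action["path"])
    return files

def table_state(log: list) -> set:
    """Current table = replay of every commit, oldest first."""
    files = set()
    for commit in log:
        files = apply_commit(files, commit)
    return files

# Three commits: initial load, an append, then a compaction.
log = [
    [{"op": "add", "path": "part-000.parquet"}],
    [{"op": "add", "path": "part-001.parquet"}],
    [{"op": "remove", "path": "part-000.parquet"},
     {"op": "remove", "path": "part-001.parquet"},
     {"op": "add", "path": "part-002.parquet"}],
]

print(sorted(table_state(log)))      # ['part-002.parquet']
print(sorted(table_state(log[:2])))  # ['part-000.parquet', 'part-001.parquet']
```

Replaying a prefix of the log is also how "time travel" to an earlier table version works, which is one reason the pattern has become so popular for analytics storage.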

It promises lightning-fast data connectivity through DirectLake, which offers the performance benefits of Import Mode Power BI datasets but in a Direct Query model (so no more dataset refreshes required). Early reports I have heard say it lives up to that promise as well.
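To make the Import-vs-DirectLake distinction concrete, here is a toy illustration (not Fabric code, and the class names are my own): an imported dataset is a cached snapshot that goes stale until the next refresh, while a direct read always sees the source’s current state.

```python
# Toy illustration of why Import Mode needs scheduled refreshes
# while a DirectLake / Direct Query style read does not.

source = {"sales": 100}           # stands in for the lake / warehouse

class ImportedDataset:
    """Snapshot copied in at refresh time; stale until the next refresh."""
    def __init__(self, src):
        self.src = src
        self.refresh()
    def refresh(self):
        self.cache = dict(self.src)  # copy the data into the dataset
    def query(self, key):
        return self.cache[key]       # fast, but reads the snapshot

class DirectDataset:
    """Reads the source on every query; no refresh step at all."""
    def __init__(self, src):
        self.src = src
    def query(self, key):
        return self.src[key]

imported = ImportedDataset(source)
direct = DirectDataset(source)

source["sales"] = 250             # new data lands in the lake

print(imported.query("sales"))    # 100 -- stale until refreshed
print(direct.query("sales"))      # 250 -- sees the update immediately
imported.refresh()
print(imported.query("sales"))    # 250 -- only after a refresh
```

DirectLake’s pitch is essentially getting the second behaviour while keeping the query speed of the first, by reading the Delta/Parquet files directly rather than going through a SQL endpoint.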

It also seems to be another nail in the coffin for relational storage for analytics purposes. By allowing the creation of SQL-style warehouses while using Delta Lake as the actual storage engine, it implies a directional shift away from relational SQL. Whether this holds up for more complex use cases is another question.


So, who is it for?

Given Power BI Premium still carries a significant price tag, the reality is that this is still for Enterprise customers only. The rolling in of Data Engineering and Storage, however, means the entry point will become lower than the current economics of roughly 250 end users.

If you already have separate workloads in Azure, it may be worth migrating to Fabric, but the cost of migration will probably exceed the benefits. The reality is probably that existing Power BI Premium users will move to Fabric over time, putting only new workloads into it. I can see growing into Fabric being a bit challenging: you’ll reach a tipping point where it makes sense from an ongoing cost perspective, but the one-off migration costs may still outweigh the benefits.


In summary

Fabric represents a good step in the ongoing push to democratise data in an organisation. It puts more tools in the hands of business users, increasing self service capability.

Like all things, good governance, training and delivery approaches will need to be created to maximise the impact of this. We’ll be updating our methodologies to incorporate Fabric’s offerings – especially around how we approach Modern Data Platform. As I emphasised earlier, the shift to OneLake & the embracing of Delta Lake as a data storage approach is the biggest change Fabric offers to the Data & AI Landscape.
