Published: 14 Jun 2023 | Author: James Beresford
The latest iteration of Microsoft’s Data & AI offering – Fabric – was released with much fanfare on May 25. Microsoft hails it as a major uplift in the capability available to their Data & AI customers. This article is here to cut through the hype and give an honest perspective on it. One caveat: there may be some inaccuracies here, as a) it’s all very new and b) it’s still in preview, so some of my observations may not remain valid for long.
Fabric is a unification of a collection of Microsoft Data & AI capabilities in a single platform. It covers Data Storage, Data Engineering, Machine Learning, Streaming data & Self-Service Analytics.
All this, of course, sits under Purview for data governance and potentially information protection as well.
Well… kind of. A while back, Synapse was the answer to everything for Analytics. Today, it’s Fabric. Many of the moving parts are the same – but Power BI has been brought to the forefront of the offering, instead of just being a subcomponent.
So why have Microsoft done this? My analysis is that there are three reasons:
It looks like if you have Power BI Premium, you can turn all this capability on at no extra cost. This means all your workloads (engineering, storage, reports) fall under the one capacity. That’s good news for the predictability of Azure costs – though the kicker is that whenever you hit capacity limits, you need to upgrade your capacity. I suspect you will hit those limits quicker than you would under the current Power BI-only workload model.
It’s not available for Power BI Premium Per User or Embedded SKUs.
In my opinion the biggest piece is OneLake. By going this route, Microsoft have effectively endorsed Delta Lake as the pattern for managing all data storage going forward. It’s a more user-friendly implementation of ADLS Gen 2 that can be accessed from your desktop.
It promises lightning-fast data connectivity through Direct Lake connectors that offer the performance benefits of Import Mode Power BI datasets but in a Direct Query model (so no more dataset refreshes required). Early reports I have heard say it lives up to that promise as well.
It also seems to be another nail in the coffin for relational storage for analytics purposes. By allowing the creation of SQL-style warehouses while using Delta Lake as the actual storage engine, it implies a directional shift away from relational SQL. Whether this holds up for more complex use cases is another question.
Given Power BI Premium still carries a significant price tag, the reality is that this is still for Enterprise customers only. The rolling in of Data Engineering and Storage, however, means the entry point will drop below the roughly 250 end users at which Premium currently becomes economical.
If you already have separate workloads in Azure, migrating them to Fabric may be tempting, but the cost of migration will probably exceed the benefits. The likely reality is that existing Power BI Premium users will grow into Fabric over time, putting only new workloads there at first. That growth could be a bit challenging: you’ll eventually reach a tipping point where consolidation makes sense from an ongoing cost perspective, but by then the migration costs may outweigh the benefits.
Fabric represents a good step in the ongoing push to democratise data in an organisation. It puts more tools in the hands of business users, increasing self-service capability.
As with any new platform, good governance, training and delivery approaches will be needed to maximise its impact. We’ll be updating our methodologies to incorporate Fabric’s offerings – especially around how we approach the Modern Data Platform. As I emphasised earlier, the shift to OneLake and the embrace of Delta Lake as a data storage approach is the biggest change Fabric brings to the Data & AI landscape.