Microsoft Fabric for Power BI developers: what actually changes
Moving from Power BI to Fabric feels like a big shift. Microsoft keeps saying it's the future of analytics and that you should migrate. But what does that actually mean for you as a Power BI developer?
I've spent the last two years working with Fabric, and honestly, most of what you already know still works. The core stuff hasn't changed. But there are some new concepts you need to wrap your head around.
What is Fabric anyway?
Fabric is Microsoft's unified analytics platform. Instead of having Power BI, Synapse, Data Factory, and a bunch of other tools scattered around, everything lives in one place now.
Think of it as Power BI plus a proper data platform underneath. You still build reports the same way, but now you have better options for where your data lives and how it gets there.
What stays the same
This is the important part. Most of your existing skills transfer directly:
- Power BI Desktop - still the same tool for building reports
- DAX - all your measures and calculations work exactly the same
- Report building - visuals, interactions, formatting, nothing changed here
- Semantic models - they renamed datasets to semantic models but it's the same concept
- Row-level security - same implementation, same patterns
- Publishing - still click publish, still goes to the service
If you know how to build Power BI reports, you already know 80% of Fabric.
What changes: the mental model
The big shift is how you think about data storage and workspaces.
OneLake storage
Everything in Fabric stores its data in OneLake. It's basically Microsoft's version of a data lake, but built into the platform.
Your data lives in Delta tables by default. This is different from traditional Power BI, where data got imported into the semantic model and that was it.
Benefits:
- One copy of data that multiple things can use
- Direct Lake mode means reports query the lakehouse directly
- No more duplicate datasets eating storage
The tradeoff is that you need to think more about data architecture upfront. You can't just throw everything into a .pbix file anymore.
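To make "your data lives in Delta tables" concrete, here's a minimal sketch of loading a file into a lakehouse table from a Fabric notebook. It assumes a notebook attached to a lakehouse; the file path and the sales table name are hypothetical.

```python
# Run inside a Fabric notebook attached to a lakehouse.
# The CSV path and the "sales" table name are hypothetical examples.
df = spark.read.csv("Files/raw/sales.csv", header=True, inferSchema=True)

# saveAsTable stores the data as a Delta table in OneLake, where the
# SQL endpoint, other notebooks, and Direct Lake semantic models can
# all read the same single copy.
df.write.format("delta").mode("overwrite").saveAsTable("sales")
```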
Workspace concept expansion
Workspaces aren't just for Power BI reports now. In Fabric, a workspace can contain:
- Lakehouses
- Warehouses
- Dataflows
- Notebooks
- Data pipelines
- Semantic models
- Reports
It's more like a project container than just a place to publish reports.
Start simple
Create one workspace per project or data domain. Put your lakehouse, dataflows, semantic models, and reports all in the same workspace. It makes permissions and organization way easier.
Data sources and refresh
This is where things get interesting. You have way more options now:
Old world: a dataflow or DirectQuery to the source, import into the semantic model, set up a refresh schedule.
Fabric world: multiple paths depending on your needs:
- Use Dataflows Gen2 to load data into a lakehouse
- Use notebooks for more complex transformations
- Use data pipelines to orchestrate everything
- Build semantic models on top of lakehouses using Direct Lake mode
The refresh schedule moves earlier in the chain. You refresh the lakehouse, and your reports automatically see the new data.
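For the notebook path, a transformation step can be as small as this sketch. It assumes a Fabric notebook attached to the lakehouse; the source path, column names, and table name are hypothetical.

```python
# Read raw files already landed in the lakehouse's Files area.
raw = spark.read.json("Files/raw/orders/")

# Transformations that would be painful in Power Query are plain
# Spark here; the column names are hypothetical.
cleaned = (
    raw
    .dropDuplicates(["order_id"])
    .filter("order_date IS NOT NULL")
)

# Land the result as a Delta table. Anything downstream (SQL endpoint,
# semantic models) sees the new data on its next query.
cleaned.write.format("delta").mode("overwrite").saveAsTable("orders_clean")
```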
Storage options: lakehouse vs warehouse
Fabric gives you two main places to store data and this choice matters.
Lakehouse: stores data as Delta tables, accessed via Spark or a SQL endpoint. More flexible, better for large-scale data.
Warehouse: a traditional SQL database, accessed via T-SQL. Familiar if you come from the SQL Server world.
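The dual access on the lakehouse side is easy to see from a notebook. A minimal sketch, assuming an attached lakehouse with a hypothetical sales table:

```python
# Spark access to a lakehouse table (run in an attached Fabric notebook).
df = spark.table("sales")
df.groupBy("region").count().show()

# The lakehouse's SQL endpoint exposes the same Delta table read-only
# to T-SQL clients, so the equivalent query works from SQL tools too:
#   SELECT region, COUNT(*) AS orders FROM sales GROUP BY region;
```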
For most Power BI developers starting with Fabric, I'd recommend the lakehouse. It's where the platform is heading, and it integrates better with the newer features.
I wrote a detailed comparison of lakehouses vs warehouses if you want to dig deeper into which one fits your use case.
Direct Lake mode: the actual benefit
This is the feature that makes Fabric worth it for Power BI work.
Traditional import mode: data gets copied into the semantic model. You set a refresh schedule. Data is always slightly stale. Large datasets eat up capacity.
Direct Lake mode: the semantic model queries the lakehouse Delta tables directly. No data duplication. Always-current data. Way better performance at scale.
Caveats:
- Only works with OneLake tables (lakehouses and warehouses), not external sources
- Has some limitations on row counts in lower capacity tiers
- Not all DAX patterns are supported (queries fall back to DirectQuery when needed)
But when it works it's legitimately better than the old way.
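One practical habit that keeps Direct Lake healthy: compact your Delta tables, since the guardrails count Parquet files per table as well as rows. A minimal sketch, assuming a Fabric notebook and a hypothetical sales table:

```python
# Compact many small Parquet files into fewer, larger ones. Fewer
# files per table means less risk of hitting Direct Lake guardrails
# and falling back to DirectQuery. "sales" is a hypothetical table.
spark.sql("OPTIMIZE sales")

# Clean up files no longer referenced by the Delta log (the default
# retention window applies).
spark.sql("VACUUM sales")
```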
Capacity matters
Direct Lake mode has row limits based on your Fabric capacity SKU. On an F2 capacity the guardrail is 300 million rows per table. If you exceed it, queries fall back to DirectQuery, which is slower.
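A quick way to sanity-check this before committing to Direct Lake is to count rows from a notebook. A rough sketch; the limit below matches the F2 guardrail mentioned above, so adjust it for your SKU:

```python
# Rough guardrail check: count rows in each lakehouse table and flag
# anything over the Direct Lake per-table limit for your SKU.
LIMIT = 300_000_000  # F2 guardrail; larger SKUs allow more rows per table

for t in spark.catalog.listTables():
    rows = spark.table(t.name).count()
    flag = "  <-- over the Direct Lake guardrail" if rows > LIMIT else ""
    print(f"{t.name}: {rows:,} rows{flag}")
```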
When should you actually migrate?
Don't migrate just because microsoft says to. Migrate when fabric solves a real problem you have.
Good reasons to move to fabric:
- You're hitting dataset size limits in Power BI
- You need better data transformation capabilities than Power Query offers
- You're already using Synapse or Data Factory and want to consolidate
- You need Spark processing for large-scale data
- You want to avoid data duplication between systems
Not great reasons:
- Just because it's new
- You have 5 small reports and no data engineering needs
- Your current setup works fine
Regular Power BI still works and isn't going anywhere soon. Fabric is for when you need the extra data platform capabilities.
Getting started path
If you're ready to try Fabric, here's the practical path:
- Start with a non-critical project or a copy of existing work
- Create a fabric workspace
- Create a lakehouse in that workspace
- Use a Dataflow Gen2 to load some data into the lakehouse
- Create a semantic model on top of the lakehouse
- Build your report in Power BI Desktop connected to that semantic model
- Publish and test
Once you've done this flow once, the rest makes way more sense.
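If you want to check the end of that chain from code, the semantic-link (sempy) package available in Fabric notebooks can list and query semantic models. A rough sketch; the model and measure names are hypothetical, and the exact call signatures may differ between sempy versions:

```python
# semantic-link (sempy) ships with Fabric notebooks; the model and
# measure names below are hypothetical, and signatures may vary.
import sempy.fabric as fabric

# Confirm the semantic model exists in the workspace.
print(fabric.list_datasets())

# Evaluate a measure to confirm the model returns data end to end.
result = fabric.evaluate_measure("Sales Model", measure="Total Sales")
print(result)
```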
What to learn next
Focus on these areas in order:
- Lakehouses - understand Delta tables and the SQL endpoint
- Dataflows Gen2 - your main tool for getting data in
- Direct Lake mode - how to set it up and when it's worth using
- Notebooks - Spark basics for when Power Query isn't enough (see the sketch below)
You don't need to become a data engineer overnight. Learn the pieces as you need them.
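For that last item, the Spark basics map closely onto Power Query steps you already know. A minimal sketch, assuming a Fabric notebook and hypothetical orders and customers tables:

```python
# PySpark equivalents of common Power Query steps. Table and column
# names here are hypothetical.
orders = spark.table("orders")
customers = spark.table("customers")

result = (
    orders
    .filter(orders.status == "shipped")             # filter rows
    .join(customers, on="customer_id", how="left")  # merge queries
    .groupBy("region")                              # group by
    .agg({"amount": "sum"})
)
result.show()
```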
Final thoughts
Fabric isn't a total rewrite of Power BI. It's Power BI plus a proper data platform underneath.
Your existing Power BI skills are still valuable. The report-building part hasn't changed. What changed is having better options for data storage and transformation before you get to the semantic model.
Start small. Pick one project. Try out a lakehouse and Direct Lake mode. See if it actually solves problems you have. Then decide if you want to move more stuff over.
Not everything needs to move to Fabric. But when you hit the limits of traditional Power BI, it's a solid next step.