What We Learned at Fabric February

As I, Megan Livadas, one of Dufrain’s Heads of BI & Analytics, sit in Oslo airport about to head back to London, it feels like the perfect time to sum up a fantastic day at Fabric February! This was my first international conference, attending with my colleague David Mills, representing Team Dufrain. I’m leaving with a notebook full of ideas and insights. There are plenty of topics I’d love to explore in more detail, but for now, here’s a rundown of the key takeaways from the day.


Power BI: The Microsoft Bridge Between Tech and Business

Kim Manis’ keynote focused on the unified data culture that the Microsoft ecosystem supports. With Power BI bridging both the traditional and modern tech landscapes, it’s essential to remember that for many business users, Power BI is the first step into a data-driven culture.

The low-code UI of Power BI allows users to engage with data in new ways, making it a crucial tool for organisations embracing Microsoft Fabric. As adoption grows worldwide, users will increasingly rely on Microsoft Fabric and Microsoft Fabric Copilot – so we, as data professionals, must be prepared.

Adoption stories from Oslo, as well as major brands like Chanel and Porsche, demonstrated the widespread impact of Power BI. With more monthly users than the population of Australia, there has never been a better time to champion Power BI. As the platform evolves rapidly, helping teams integrate Power BI effectively is key to fostering a data-driven culture that delivers real business value.


Copilot for Power BI: A Game Changer

In some exclusive previews from Kim Manis and Patrick Le Blanc, I was genuinely excited to see the upcoming Microsoft Fabric Copilot for Power BI UI. Soon, users will be able to navigate to a Copilot tab in Power BI Service, ask questions, and receive instant insights – eliminating the need to manually search through models and reports.

No more “Can you send me that link again? Mine isn’t working anymore!” Microsoft Fabric Copilot will automatically choose the right semantic model, ensuring users only see data they have access to, all while maintaining Row-Level Security (RLS). This delivers a ChatGPT-like experience within Power BI, making it easier than ever to gain insights quickly.


Why Copilot is a Game Changer for AI Adoption

There’s a lot of scepticism around AI (and I admit, I’ve been a little sceptical myself), but features like Microsoft Fabric Copilot always get me excited for two reasons:

  1. It drives user engagement and adoption – even those hesitant about AI will be curious to try it.
  2. It makes a compelling case for investment – executives and boards are more likely to approve budgets when they see AI’s potential in action.

For many organisations, investing in data strategy and solid foundations is old news – leaders are looking for the next big thing. AI, and by extension, Copilot, is that thing. With Microsoft continuously improving Copilot for Power BI, I truly believe this will be a game changer.

That being said, organisations must ensure their Power BI environments are well-managed, or AI adoption will quickly become a challenge.


Poorly Managed Power BI Environments Will Kill AI Adoption

Let’s face it – many organisations have let Power BI grow organically, and now they’re dealing with workspaces that nobody owns, redundant reports, and security risks that keep people up at night. Busy environments like these are a victim of their own success. People used the tool, which is great! But now, it’s time to regain control before AI adoption makes things even messier.

Organisations must:

  • Audit existing reports and models to ensure efficiency.
  • Lock down access and implement governance policies.
  • Clean up legacy workspaces that are no longer in use.
  • Implement structured change control processes.

A messy Power BI environment won’t support seamless Microsoft Fabric Copilot integration. Before unlocking AI’s potential, organisations must first get the fundamentals right.
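To make the clean-up step above a little more concrete, here is a minimal, hypothetical sketch in Python: given workspace metadata (the kind you might export from the Power BI admin portal or admin APIs), it flags ownerless or long-inactive workspaces as clean-up candidates. The field names, sample data, and staleness threshold are illustrative assumptions, not a real API.

```python
from datetime import date, timedelta

# Hypothetical workspace metadata, e.g. exported from the Power BI
# admin portal or admin APIs; field names are illustrative assumptions.
workspaces = [
    {"name": "Finance Reporting", "owner": "jane@contoso.com", "last_activity": date(2025, 2, 1)},
    {"name": "Old POC 2021", "owner": None, "last_activity": date(2021, 6, 30)},
    {"name": "Sales Dashboards", "owner": "raj@contoso.com", "last_activity": date(2024, 12, 15)},
]

def cleanup_candidates(workspaces, as_of, stale_after_days=365):
    """Flag workspaces with no owner, or no activity within the window."""
    cutoff = as_of - timedelta(days=stale_after_days)
    return [
        ws["name"]
        for ws in workspaces
        if ws["owner"] is None or ws["last_activity"] < cutoff
    ]

# Old POC 2021 is flagged: it has no owner and hasn't been touched in years.
print(cleanup_candidates(workspaces, as_of=date(2025, 2, 20)))
```

In practice you would feed this from a tenant-wide inventory rather than a hand-written list, but the principle is the same: decide the staleness rules first, then let a script surface the candidates for review.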


Write-Back Functionality: A New Era for Power BI

Another major highlight was Power BI’s new write-back functionality in Fabric. During the keynote, Patrick Le Blanc demonstrated a use case where users could view insights, apply discounts, and update data in real-time – all within Power BI.

For finance teams, this could be transformational – approving deals straight from Power BI without needing to open Excel. Instead of merging multiple Excel files, business users will be able to tag, group, and manipulate data directly within Microsoft Fabric, ensuring consistency and centralised governance.


Pausing Fabric Capacity Might Cost More Than You Think

Managing Fabric capacity is a hot topic, and a session led by Benni De Jagere highlighted the importance of smoothing and throttling in Microsoft Fabric. Many organisations assume they can pause their capacity to save costs, but the reality is more complex.

Understanding how Fabric manages capacity – especially for pay-as-you-go vs. reserved pricing models – is critical to preventing unnecessary overages. This is a topic I’ll definitely be revisiting with more thoughts soon.


No Version Control Tech Can Beat People-First Processes

Heini Ilmarinen presented a great session on Version Control in Fabric, stressing the importance of keeping it simple. While it may be tempting to create custom scripts and automate every process, organisations should consider whether the effort and complexity outweigh the benefits within Microsoft Fabric.

A better approach? Focus on clearly defined roles, improved collaboration, and streamlined approval processes. Over-governance can slow teams down, leading to workarounds that undermine security. Instead, fostering collaboration and communication between teams can help prevent bottlenecks while maintaining control.


How Do We Support Data Teams as a Value Centre, Not a Cost Centre?

Chargeback reporting was a key topic in a keynote by Patrick Le Blanc. With enhanced monitoring coming to Microsoft Fabric, organisations will soon have better visibility into which departments are consuming the most capacity.

By attributing costs to specific teams or functions, organisations can justify further investment in data teams. Instead of being seen as a cost centre, data teams can demonstrate their value as enablers of business growth.


Let’s Get Real(Time) with Power BI

Real-time reporting is in high demand, with sessions showcasing how organisations are using DirectQuery and DirectLake for instant insights. The consensus? While real-time is powerful, not all scenarios need millisecond updates – context is key.


If You’re Using DirectLake, You Have to Be Using Semantic Link Labs

For many who have been tinkering with DirectLake models in their early Fabric proofs of concept, DirectLake may have – dare I say – disappointed. That first load, expecting lightning-fast performance but instead seeing the dreaded buffer wheel, is a familiar experience. However, there is plenty of content available to help navigate this. DirectLake will fall back to DirectQuery if used incorrectly – I attended a great session on this from Chris Webb back at SQLBits 2024, which you can watch here. The key is not to assume this new supercharged storage mode eliminates the need for an optimised model; we still want that slick star schema. There is also a model setting that allows you to disable DirectQuery fallback using tools like Tabular Editor or any other TMDL viewer.
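For reference, the fallback setting mentioned above is the model-level DirectLakeBehavior property; setting it to DirectLakeOnly makes queries fail fast rather than silently dropping to DirectQuery. As a rough sketch of how this appears in a TMDL view (treat the exact casing and placement as an assumption – verify in your own tooling):

```tmdl
model Model
	directLakeBehavior: directLakeOnly
```

With this set, a query that would have triggered fallback errors instead, which is usually what you want while tuning a proof of concept.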

At Fabric February, I checked out a great session from Marc Lelijveld and Mathias Halkjaer on this topic to hear about their experience optimising with Semantic Link Labs. Personally, I much prefer the idea of navigating this optimisation from the comfort of a Fabric Notebook rather than jumping to third-party tooling. Their point about the lack of Power Query in DirectLake models being a good thing is one I absolutely echo. If you’ve gone to the effort of developing a brand-new Fabric solution that supports a DirectLake model, don’t fill it with tactical transformations. Move that upstream into your notebooks, or if necessary, a Dataflow Gen2, which has all the functionality of Power Query if you’re not quite ready to give up your M code yet. While this may feel a bit inconvenient for Power BI developers looking to tactically transform their data while modelling, I see it as a great way to push us towards adopting Microsoft Fabric methodologies.

There’s a great blog from Kurt over at Data Goblin on Semantic Link Labs in Fabric Notebooks that you can check out here.


The End of P SKUs: A New Era for Power BI Licensing

With Power BI Premium Capacity SKUs set to retire, businesses must transition to Microsoft Fabric License models. This shift will provide greater flexibility and cost efficiency, especially for those leveraging Copilot and AI-driven analytics.


Diving Into Fabric February: David’s Thoughts

There were plenty of sessions to attend at Fabric February, covering the latest innovations in Copilot, Artificial Intelligence, Real-Time Analytics, and Power BI. While these topics were undeniably exciting, I deliberately focused on the not-so-sexy but essential sessions – Fabric Security, Disaster Recovery, Data Quality and Governance, and Fabric Capacity Administration.

Why? Because we all love AI, real-time insights, and automation – until something goes wrong. Imagine this:

  • Your data gets stolen by hackers.
  • Your data gets deleted or becomes corrupted.
  • Your data becomes inaccessible in a disaster scenario.

At Fabric February, I prioritised sessions that help prevent these worst-case scenarios. After all, the most powerful AI models and real-time dashboards mean nothing if your Microsoft Fabric platform lacks security, governance, and resilience.

My key learnings from the security-based MVP-led sessions included:

  • Fabric Security & Access Control – fortifying a Fabric environment with private networking, robust access policies, and continuous monitoring.
  • Fabric Disaster Recovery – what every architect should consider to prepare for real-life DR scenarios.
  • Data Quality & Governance – ensuring organisations not only secure their data but also maintain its integrity, accuracy, and compliance.
  • Fabric Capacity Administration – tips on configuring capacities, access controls, domains, environments, and workspaces.

I also attended a fantastic session on SQL Databases in Microsoft Fabric, a relatively new feature still in Preview. While there are still performance and functionality quirks to iron out, the MVP presenting the session was confident this will soon become a valuable component of the Fabric platform.


Final Word

What a joy to attend a tech event organised by three women! Catherine, Marthe and Emilie pulled off a blinder and I was so glad to get a chance to say hello and thank them for organising such a smooth day.

You could really feel the female touch on things, from the fun tone of voice of the day, to the baskets of ladies’ accessories in the bathrooms, and not to mention the best food I’ve had at an event. #SNÆCKS anyone?
