As data estates continue to grow in complexity, the burden on IT teams to deliver accurate, timely insights is heavier than ever. Many organisations whose data environments have grown organically over years are experiencing data sprawl, infrastructure inefficiencies and limited interoperability. The question now becomes how to evolve into something more unified, scalable, and future-ready.
That’s exactly why we were eager to deliver an interactive Fabric Analyst in a Day workshop, in partnership with Microsoft. This workshop provided data analysts, BI specialists and technical decision makers with a guided, hands-on introduction to Microsoft Fabric, grounded in real practice rather than high-level theory.
Following our most recent workshop we caught up with our course leaders and technical consultants, Andy Jones and Kabita Thapa, to discover the key insights from the day.
What is Microsoft Fabric?
For many attendees, Fabric was something they had heard of but never had the opportunity to properly explore. As Andy Jones explained during the session, Fabric brings together what used to be multiple, separate Azure and Power BI components into one cohesive platform.
Traditionally, delivering an analytics project meant stitching together different services and platforms, each with its own configuration, deployment steps, security model, and costs. Fabric replaces this complexity with a single, integrated environment where:
- Data Factory
- Data Engineering
- Data Science
- Data Warehouse
- Real‑Time Intelligence
- Power BI reporting
- Databases
…all live in one place.
This integrated experience ensures your whole data team, from data analysts to senior data engineers, has the capabilities they need. The result is a more cohesive and efficient way to unlock business value from data.
Hands‑on learning in Fabric: The advantages of practical application
A key consideration for the day was participants wanting real experience in Fabric, not just another slide deck. Many had been working in Power BI or other analytics tools for years but had never stepped into the broader Fabric environment.
That’s why the hands‑on labs were so powerful.
Attendees moved through each stage of the analytics lifecycle throughout the day, from ingestion to transformation to visualisation. At each stage, practical tasks gave attendees the opportunity to explore the platform independently, using synthetic data to replicate how they might use the tool in the real world. Kabita and Andy guided attendees one-on-one to build confidence and answer any questions.
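The lab flow described above (ingest, transform, visualise) can be sketched in plain Python with synthetic data. This is an illustrative example only, not Fabric-specific code – in a Fabric notebook the same shape of workflow would typically run in PySpark against OneLake, with the final step handled by a Power BI report. All names and values here are made up.

```python
import random
from collections import defaultdict

# --- Ingest: generate synthetic sales records (a stand-in for a real data source) ---
random.seed(42)
regions = ["North", "South", "East", "West"]
raw_sales = [
    {"region": random.choice(regions), "amount": round(random.uniform(10, 500), 2)}
    for _ in range(100)
]

# --- Transform: aggregate revenue per region ---
revenue_by_region = defaultdict(float)
for sale in raw_sales:
    revenue_by_region[sale["region"]] += sale["amount"]

# --- Visualise: a simple text bar chart (a dashboard would do this in Power BI) ---
for region, total in sorted(revenue_by_region.items()):
    bar = "#" * int(total // 500)
    print(f"{region:<6} {total:>10.2f} {bar}")
```

The value of the hands-on format is that attendees see all three stages in one place, rather than hopping between separate ingestion, processing, and reporting tools.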
One participant, who had previously relied heavily on Excel for reporting and visualisation, remarked how refreshing it was to experiment with Fabric ahead of their organisation rolling it out, giving them essential insight into how it can evolve their reporting into a scalable, governed model.
Key Fabric benefits highlighted
1. A single place for all your data: introducing OneLake
Think of OneLake, Fabric's central data hub, as the OneDrive for your data. OneLake brings all organisational data into one governed location rather than scattering it across services and storage accounts. This resonated strongly with attendees during the workshop, and reflects one of Fabric's most compelling benefits: fewer moving parts mean more control.
2. Built‑In AI for Faster Insight
Attendees were excited to hear about Copilot in Fabric, where AI assistance is embedded directly in the platform. From transforming data to narrating visuals to suggesting insights, AI is infused throughout the Fabric platform.
3. Practical Skills That Apply Across Roles
Whether you’re a Power BI analyst, a data scientist, or an IT professional responsible for governance and security, Fabric offers benefits that map naturally to different job roles. It also empowers a data culture across the business, with seamless integration from data to visualisation.
Common Microsoft Fabric Misconceptions
A recurring myth uncovered in the workshop is that adopting Microsoft Fabric means rebuilding everything or starting from scratch. Andy addressed this directly:
“In reality, attendees saw how Fabric can complement and extend existing Microsoft data investments while simplifying the overall architecture.”
Fabric works with, not against, your existing Microsoft investments. Teams can modernise at their own pace without wholesale migration.
Another misconception is treating Fabric as a set of separate features. As both Andy and Kabita emphasised, the real value comes from the interconnectedness of the platform, not individual components.
Why Attendees Found the Day Valuable
- They gained exposure to parts of the Microsoft data stack they’d never used before
- They could troubleshoot in real time with two expert instructors
- They left with clarity on how Fabric fits into their organisation’s analytics maturity
- They experienced the end‑to‑end journey of a modern analytics workflow
Feedback from attendees:
“Allowing a VM (Virtual Machine) environment so we can effectively trial the tool in a protected environment was really good.”
“The content was really detailed and useful to understand Microsoft Fabric and the trainers were really proactive and helpful.”
“Course leaders were very knowledgeable and helpful if attendees had any issues with the labs.”
What’s Next? Pathways following the Fabric Analyst in a Day Workshop
All attendees received a certificate for completing the workshop, but what’s next on their Fabric journey? Depending on their roles there are many paths they can take, including:
- Exploring Fabric’s free trial capabilities
- Diving deeper into the workloads most relevant to their role
- Working towards Microsoft Fabric certifications, such as Fabric Analytics Engineer Associate and Fabric Data Engineer Associate.
This workshop is the start of a clear route to certification that can make learning Fabric feel more concrete and something attendees could integrate into their professional development goals.
If you would like to attend a future Fabric Analyst in a Day workshop, or want to discuss how Microsoft Fabric can better enable your organisation for innovation, reach out to our experts using the form below.
It’s becoming increasingly apparent that artificial intelligence will be integral to how organisations operate effectively and remain competitive. But responsibility is a topic that regularly rears its head, and the question of how you use AI responsibly isn’t one purely for IT but for your organisation’s executive leadership. Here we consider how to benefit from AI, while remaining true to your organisation’s values, obligations and stakeholders.
Much is written and spoken of AI’s power to drive business transformation, efficiency and innovation. But as the saying goes, ‘with great power comes great responsibility’.
Using AI responsibly isn’t just about regulatory compliance. It’s about trust, safeguarding reputation, and ensuring that AI strengthens rather than undermines the organisation’s values and purpose. There are three important topics to consider – People, Planet, and Policy.
People
Let’s start by confronting the really big question: jobs. We’ve all seen and heard carefully worded references to AI’s labour-saving capabilities. Less work for people can mean fewer people, but does that also mean redundancies? This question needs considering, carefully, at a very senior level, and very early on.
Senior leaders will need to know the expected time savings and whether these affect fractions of roles or entire roles. They should also consider timeframes in each area, and how these compare to natural attrition, retirement and contract expiration timescales. They’ll also need to know recruitment pipelines, so hiring can be slowed or redirected rather than abruptly frozen, as well as redeployment opportunities and the skills required for new or expanded roles such as AI oversight, data literacy, and creative problem-solving.
This will impact much of what follows.
Enablement, not displacement
Responsible AI should augment human judgement and not replace it – freeing people from repetitive work. But to achieve this, it needs to be accompanied by reskilling and digital literacy programmes that enable employees to work effectively with AI systems. Success should be measured in terms of human productivity and satisfaction, and not headcount reduction.
Transparent and ethical
Everyone involved with AI, from developers to decision-makers, must understand what AI can and cannot do. Build a culture of AI literacy and ethical awareness supported by specific training on responsible data use, bias awareness, and explainability. Employees using AI outputs should be able to interpret and justify its decisions, especially in regulated sectors. Staff must appreciate that humans remain accountable for AI-assisted outcomes and feel confident challenging algorithmic decisions without recrimination.
Inclusion and fairness
Similarly, fairness and inclusion must be embedded in your use of AI. These systems will typically maintain or amplify any biases in their training data, so involve diverse teams in AI design and validation. Train models with diverse data sets and monitor for bias, especially in HR, credit, or customer-facing use cases. Treat governance of AI fairness with the importance of a workplace equality and diversity issue, rather than a technical issue.
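As a concrete illustration of the kind of monitoring described above, one of the simplest bias checks is to compare positive-outcome rates across groups (a demographic-parity style metric). The sketch below is hypothetical – the field names, data, and threshold are invented for illustration, and real bias auditing (for example with a library such as Fairlearn) involves far more than one metric.

```python
# Hypothetical bias check: compare approval rates across groups.
decisions = [
    {"group": "A", "approved": True},  {"group": "A", "approved": True},
    {"group": "A", "approved": False}, {"group": "A", "approved": True},
    {"group": "B", "approved": True},  {"group": "B", "approved": False},
    {"group": "B", "approved": False}, {"group": "B", "approved": False},
]

def selection_rates(records):
    """Share of positive outcomes per group."""
    totals, positives = {}, {}
    for r in records:
        g = r["group"]
        totals[g] = totals.get(g, 0) + 1
        positives[g] = positives.get(g, 0) + (1 if r["approved"] else 0)
    return {g: positives[g] / totals[g] for g in totals}

rates = selection_rates(decisions)
disparity = max(rates.values()) - min(rates.values())
print(rates)      # {'A': 0.75, 'B': 0.25}
print(disparity)  # 0.5 – a large gap that should trigger human review
```

A metric like this doesn’t prove discrimination on its own, but a large disparity is exactly the kind of signal a governance process should surface for investigation.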
Planet
AI’s benefits should also be considered in the context of its environmental impact and sustainability. During training AI models can consume significant energy, and operationally AI infrastructure has a significant carbon footprint. But with the right actions, this can be mitigated.
Opt for energy-efficient architectures
Data centres powered by renewable energy, with liquid cooling, and using energy-optimised GPUs (Graphics Processing Units) and ASICs (Application-Specific Integrated Circuits) are more energy efficient. Also consider scheduling AI workloads to optimise power use.
Actively manage your technology lifecycle
Using cloud and hybrid models can allow you to dynamically scale, without having an over-provisioned on-premises infrastructure. Apply sustainability principles to AI hardware: responsibly sourcing, refurbishing and/or reusing, and recycling at end-of-life.
Use AI for sustainability
‘Planet’ doesn’t just mean mitigating AI’s environmental impact. AI can also make a positive contribution towards meeting corporate sustainability goals through data-driven energy optimisation, intelligent logistics routing that lowers emissions, predictive maintenance to reduce waste, and carbon accounting.
Policy
Responsible use of AI also depends on robust governance that ensures transparency, accountability, and compliance. A key consideration for the board is who will be accountable for AI ethics and compliance, and how governance can be shown to be effective.
A best practice approach combines collective ownership with clear executive accountability. It is likely to blend existing structures with some new, specialised capabilities. This might take the form of a Chief Information Officer or Chief Digital/Technology Officer with primary accountability, working with a cross-functional AI Governance Board. This would include Technology, Data, HR/People, Legal, Compliance, Risk, Operations, your ESG (Environmental, Social, and Governance) team, and business unit leaders.
This will provide the basis for effectively actioning the following.
Establish an AI governance framework
Determine the principles which will guide your use of AI. These need to be consistent with your organisation’s values and risk appetite and will typically encompass fairness, transparency, accountability, privacy, and sustainability. Bear in mind that different contexts may require different ethical considerations – what’s appropriate in one area may not be in another. AI ethics will touch IT, legal, HR and compliance so ensure that there is clear ownership within and across these areas.
Control and oversight
Integrate AI risk management into existing risk frameworks, with a focus on model validation, auditability, explainability, and version control. Track who built which model, with what data, and how it is performing. Require human-in-the-loop oversight for all critical decisions and systems.
Regulatory alignment
There will be external interest in your AI use from regulators, customers, investors and other stakeholders, so aim to stay ahead of expectation. There is an EU AI Act, with most provisions applying from August 2026, and a UK AI Assurance Framework. The Information Commissioner’s Office has provided AI guidance, with sector-specific guidance expected in several areas (like from the FCA in financial services). Maintain audit trails for AI models, data lineage, and decision logic to satisfy auditors and regulators.
But, above all, be transparent about how AI is used, governed, and improved.
A final thought
Using AI responsibly requires deliberate, pre-emptive leadership. It means ensuring that AI use aligns with organisational purpose, is trusted by employees and other stakeholders, and contributes to sustainable growth. Many will do this badly, but those that do it well can successfully position their organisations as trustworthy and responsible innovators.
Cloud Direct can help you successfully benefit from AI in a real and responsible way. Request a call with a subject matter expert through the form below.
Cloud Direct’s Data & AI Practice Lead Dan Knott explains how you can strike a workable balance between speed of delivery, cost, and effectiveness with Microsoft Fabric.
I spend a lot of my time talking to executives and technologists. I understand the time and cost constraints; I understand the pressure to implement fast; I understand that many don’t have the appetite for lengthy assessments and strategising. But I also know that without some consideration of four key factors outside of, but directly impacting, Microsoft Fabric you’ll probably fail.
But first, a quick reminder.
Microsoft Fabric in a nutshell
You probably already know of Microsoft Fabric. It is a one-stop shop for data: a unified data platform that can ingest, process, analyse, and visualise your data. It centralises data storage in OneLake – a single, integrated data lake that supports structured, semi-structured, and unstructured data – and combines capabilities from Power BI, Azure Synapse, and Data Factory into a seamless experience.
Since its November 2023 launch into General Availability, Microsoft has continued to add functionality and, if you’re not already, you should now be looking at it.
Is Fabric a quick win?
Many in IT are going to look at this as a relatively easy implementation. But is it a quick win?
In one sense, yes. Fairly quickly, you can get to the point where it’s installed, providing some nice dashboards, and offering an incremental improvement over Power BI.
But in terms of delivering genuine business value, it isn’t.
You’re going to hit obstacles. I know this because those that have gone before you are consistently telling me: “we tried to implement Fabric”, “we hit some bottlenecks”, and “the adoption wasn’t quite there”.
Navigating the pitfalls and driving real value from Fabric
While Fabric will happily ingest data from anywhere, it won’t fix fundamental data issues and it relies on users asking the right questions.
So, consider how the business is going to benefit from Fabric. There are valid analytical, AI, and machine learning use cases. If your use case is analytical, for example, and your interest is in sales, are you looking forwards or backwards? If you’re looking back, what lessons are you trying to take from this? If your focus is the future, how does this need to align with your growth or business strategy?
Regardless of your objectives, if people don’t trust the data then they’ll soon stop using Fabric. This, in itself, raises questions around the data, like its reliability and accuracy (realistically some areas will be better than others), who owns it, and security and governance considerations around who can access what.
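Trust in the data can be made tangible with simple quality checks run before a dataset feeds any reporting. The sketch below is purely illustrative (the records, field names, and checks are invented); in practice these checks would run inside your pipeline against real tables, but the idea – measure duplicates and missing values rather than assume the data is fine – is the same.

```python
# Illustrative data-quality checks before trusting a dataset in reporting.
rows = [
    {"id": 1, "customer": "Acme",  "amount": 120.0},
    {"id": 2, "customer": None,    "amount": 80.0},
    {"id": 2, "customer": "Beta",  "amount": 80.0},   # duplicate id
    {"id": 3, "customer": "Gamma", "amount": None},
]

def quality_report(records, key="id"):
    """Summarise row count, duplicate keys, and per-column null rates."""
    n = len(records)
    ids = [r[key] for r in records]
    duplicate_ids = sorted({i for i in ids if ids.count(i) > 1})
    null_rate = {
        col: sum(1 for r in records if r[col] is None) / n
        for col in records[0]
    }
    return {"rows": n, "duplicate_ids": duplicate_ids, "null_rate": null_rate}

report = quality_report(rows)
print(report)
# rows: 4, duplicate ids: [2], null rates: customer 0.25, amount 0.25
```

Publishing a report like this alongside each dataset gives users evidence for how reliable the data is, and gives the data owner a concrete list of issues to fix.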
Given the chance, I’ll always argue passionately for a strategic consideration of what I call your four key pillars: innovation, platform and technology, process and tools, and people and culture. It’ll help you to understand where you currently have gaps, where you can reliably use Fabric now, any priority areas for action, and enable you to make longer term plans. In short, it’ll enable you to ensure that your organisation can derive real business value from Fabric straight away.
Reality bites
Set against this, there are time and budget pressures: “we need to get this in”, “let’s do it and find out”, “what’s the worst that could happen?”
But from what I’m seeing and hearing, without a bit of thought and planning your implementation won’t get much beyond a tick in the box.
The adoption of Fabric goes far wider than just putting the tech in, and if you’re familiar with project management’s ‘Iron Triangle’ you’ll know that when it comes to cheap, fast and good, you can only have two of them.
Striking the right balance
With a little planning and thought, a lot of the pitfalls can be managed and, to an extent, avoided.
Your journey probably won’t be the same as everybody else’s, but if we think in terms of the four pillars I mentioned, you’ll already know that there are some gaps.
What do you want to gain from your data? It needs to be grounded in purpose.
Are there data quality issues? Who’s accountable for this data? Are there governance considerations, perhaps around compliance and who can see which data? Do users have the skills to use the data well?
This will quickly tell you if ‘just do it’ feels rash or even scary, and whether or not you’re setting up Fabric to ultimately fail.
So, why not incorporate a bit of planning up front? Make sure you’ve got the whole picture and have given some thought to those other areas that will impact the wider implementation of Fabric.
There’s often a lot of value to be gained from a thorough Data Strategy Assessment, but much depends on where you already are and, of course, time and budget pressures. This is where one of our Maturity Assessments will help you quickly create solid foundations for your Fabric implementation.
Microsoft Fabric really can show the value, purpose, and reliability of your data – but please, please, please put a little time into ensuring that your project can deliver business value, and ultimately succeed, before you get started.
If you’d like an informal chat about how you can best approach your use of Fabric, you can get in touch, using the form below.
It’s no secret that businesses are producing more data than ever, and you will only be able to drive greater efficiencies and enable innovation once you have a clear strategy with the right tools in place.
Earlier this year, Microsoft launched Fabric, their all-in-one data solution, for general availability. Microsoft Fabric combines some of Microsoft’s most powerful tools, such as Data Factory, Synapse Analytics, Data Explorer, and Power BI into a unified, cloud-based platform to help simplify your data workflow. The combination of the tools will enable you to innovate with AI safely and securely by managing your data in a single user-friendly platform.