
FabCon Takeaways

Did you attend FabCon during the last week of March in Las Vegas, NV? If you could not make it, you missed a week packed with great knowledge, exciting announcements, and wonderful data professionals to bounce ideas off of. I was lucky enough to attend with my colleague and friend, Michael Heath. At every break when we reunited, we were excited to share what we had just learned.

Picture of Michael Heath and Dominick Raimato at FabCon.
Michael and I connecting in person at FabCon!

I know not everyone got to attend this event. And even for those of us who were there, it was a lot of information to take in over three days. It is difficult to distill an entire conference into a single blog post, but I wanted to at least summarize my top takeaways and highlight the key moments that made me go aha!

FabCon Takeaway #1 – Out of the Box

While I had heard this many times before, I was finally able to wrap my head around it during the conference. Fabric is an all-inclusive platform that simplifies the management of data for all kinds of projects. Whether it is warehousing, reporting, or data science, the platform is ready to use.

Like Buying a Car

The best analogy I heard all week was the concept of buying a car. Before Fabric, you needed to pick out each part individually: your storage medium, ETL tool, virtual network components, and data science tools. This is the equivalent of going to the auto parts store, picking out each component one by one, and then assembling them into a car. While some of you might choose to do this, it is a lot of work. You not only need the skills to do the data work, but also the experience to set up your Azure environment.

Instead, Fabric allows you to buy the car right off the lot. When you set up your workspace, you can add components without the hassle of configuration. It just works, which is a really nice feeling. Even better, because you are using a capacity license, it is easier to estimate costs. I know there are calculators out there for Azure, but you always forget something when you are attempting to build a budget. Fabric eliminates that challenge with a clear pricing structure. This is a huge step forward for self-service data science outside of the traditional IT function.

Microsoft has been pushing its “5×5” approach with Fabric: 5 seconds to sign up and 5 minutes to your first wow. While I appreciate the ambition, I think there is still a little work to do in this space. If you have a guide to help you, I believe it to be true. However, if you are brand new to Fabric, the experience is not quite that intuitive yet. Long story short, make sure you do a little learning before you try it on your own.

What About Data Engineers?

Does this mean data engineers are out of a job? Not at all! They will still be essential. If anything, I think Fabric redirects their efforts toward more valuable workloads. Instead of fighting with Azure, they can focus on getting value out of the platform. It also opens the door to data engineers and scientists with deep domain knowledge of the business. That is a game changer!

Does this mean our existing data infrastructure is obsolete? Also no! These tools are still essential. If anything, this might allow less critical workloads to be shifted to Fabric, reducing administrative burden. That allows you to focus on the mission-critical workloads within your existing infrastructure. While Microsoft would like you to move your workloads to Fabric, not everything has to go.

FabCon Takeaway #2 – CI/CD Is Close

If you are unfamiliar with the term, CI/CD stands for Continuous Integration and Continuous Deployment. To support it, you need a mechanism to manage the source code for your Fabric components. While the story is not 100% there yet, the future is quite bright. Some might not like how it is implemented, but it is a huge step forward.

The integration with Azure DevOps is slick. With a few clicks, you can create a feature branch, spin up a new workspace, and populate it with your components from the main branch via Git. When you are finished with your feature, you can easily merge it back in. For me, the nicest part is that you can retire the feature workspace as part of the merge.
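
If you would rather script this flow than click through the portal, the Fabric REST API exposes Git endpoints for it. Here is a minimal sketch, assuming the `git/status` and `git/updateFromGit` endpoints and payload shapes from the REST API docs; the workspace GUID and token are placeholders, so double-check the current documentation before relying on it.

```python
# Minimal sketch: syncing a Fabric workspace with its connected Git branch
# via the Fabric REST API. Assumes you already have an Entra ID bearer token
# with Fabric scopes; endpoint paths and payloads are based on the public
# docs but may change, so treat this as illustrative.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # your workspace GUID
TOKEN = "<bearer-token>"  # acquired via MSAL / azure-identity

headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Check how the workspace compares to the connected branch.
status = requests.get(
    f"{FABRIC_API}/workspaces/{WORKSPACE_ID}/git/status", headers=headers
).json()

# 2. Pull the latest commit from the branch into the workspace.
requests.post(
    f"{FABRIC_API}/workspaces/{WORKSPACE_ID}/git/updateFromGit",
    headers=headers,
    json={
        "remoteCommitHash": status["remoteCommitHash"],
        "conflictResolution": {
            "conflictResolutionType": "Workspace",
            "conflictResolutionPolicy": "PreferWorkspace",
        },
    },
)
```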

I also love the fact that you can choose your deployment path. If you prefer to use Deployment Pipelines in Fabric, you can do so. However, if you would rather have Git repos for Dev, Test, and Prod, you can do that too. This gives you the flexibility to manage and control your deployments as you see fit.

Stay tuned for an article on this in the near future!

FabCon Takeaway #3 – Integration Points

For me, the biggest question I kept asking was how well I could integrate these components into other solutions. Creating a machine learning model is fantastic, but it is useless if you cannot integrate it with other applications. The answers I heard were promising!

I need to investigate how this works further, but the promise was pretty clear. The idea is that your data scientists can perform their analysis and train a model within Microsoft Fabric. Once complete, they can expose it via endpoints to serve predictions. I have done this with Azure ML in the past, but the process was a tad clunky. I am hoping this experience gets better with Fabric.
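
I have not dug into the endpoint story yet, but the training side is already familiar territory: Fabric's data science workload tracks experiments and models with MLflow. Here is a minimal sketch of that train-log-predict loop; the dataset is synthetic and the flow is illustrative rather than Fabric-specific.

```python
# Minimal sketch of the Fabric data science flow: train a model in a
# notebook, log it with MLflow (which Fabric uses for experiment tracking),
# then load it back to generate predictions.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for your real training data.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)

with mlflow.start_run() as run:
    model = LogisticRegression(max_iter=1000).fit(X, y)
    mlflow.sklearn.log_model(model, artifact_path="model")

# Later (or from another notebook), load the logged model and predict.
loaded = mlflow.sklearn.load_model(f"runs:/{run.info.run_id}/model")
print(loaded.predict(X[:5]))
```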

Another teaser was the ability to build your own custom Copilot chat bots within Fabric. This is not using Copilot to create a model, but rather training a Copilot on your data. Eventually, this data could find its way into other tools. To be honest, the Q&A feature in Power BI already does some of this today. However, if Microsoft can streamline this into a single Copilot experience, it would be huge. Being able to query your data without having to go into Fabric would go a long way.

Anything Else from FabCon?

Yes, there is a ton more to talk about, but these were the standouts for me. What is hard to capture in a post is how outstanding the conversations with other professionals were. Meeting some of the conquering heroes of the Power BI world is always fun, but eating lunch with different people was where the biggest insights came from. Learning about everyone's unique scenarios helps you refocus your vision of the platform. I always recommend people attend conferences for that reason alone!

Also, since everyone asks: yes, Michael and I were able to enjoy some of the fun around us. It was a busy week at the conference, and work did not slow down in our absence. While we did not get to any shows, we did get to unwind with other professionals and even meet one of the best Elvis impersonators I have seen in my lifetime.

Michael and I posing with The King

Conclusion

This coming year is going to be exciting when it comes to Microsoft Fabric. Between what was announced at FabCon and what is on the roadmap, the future is bright! I look forward to sharing more insights over the coming months.

Did you make it to FabCon? What were your key insights? Anything I missed in my summary? Tell me in the comments below!

Get Ready for the DP-600 Exam

I recently took the Implementing Analytics Solutions Using Microsoft Fabric (DP-600) exam, and it was a unique experience for me. This was the first time I took a beta exam from Microsoft. It was also the first time I took an exam under the new rules that allow you to search Microsoft Learn. It was a different experience, but a successful one, as I did pass.

It would be unethical to discuss specific questions and details around the exam. However, there are a few things you can do to make sure you are prepared for this exam. A little preparation can go a long way!

Content Breakdown for the Exam

This exam assesses your knowledge of Microsoft Fabric. Because Power BI is a big part of Fabric, you will find elements of it throughout, and existing Power BI users might find portions of the exam easier. The breakdown of the exam is as follows:

  • Plan, implement, and manage a solution for data analytics (10–15%)
  • Prepare and serve data (40–45%)
  • Implement and manage semantic models (20–25%)
  • Explore and analyze data (20–25%)

With this breakdown in hand, it is easier to understand what you need to focus on as you prepare. Clearly, preparing and serving data is a hefty portion of the exam, and if you have not touched Fabric, you will not perform well. But all is not lost if you come from Power BI, as you will need that knowledge to be successful as well!

Studying for the DP-600

Ideally, everyone would attend a training for the DP-600. However, that is not always realistic. My work schedule has been chaotic, so even if I wanted to attend a class, how would I do it?!

To start, Microsoft has created some learning paths that can get you on the right track. They walk you through the core components of Microsoft Fabric and are linked from the DP-600 exam page. It might be best to begin with the DP-600 Study Guide from Microsoft. It will help you wrap your head around the core components of the exam and points you to the additional learning paths.

However, you will want to get some hands-on experience with Microsoft Fabric. You can fire up a free trial if you have an active Microsoft 365 environment in place. To get started, sign in to Power BI, go to your account in the upper right-hand corner, and start the trial:

Screenshot of account view where you can initiate a trial of Microsoft Fabric.

You will have sixty days to explore Microsoft Fabric at no cost. Make sure you are ready to commit to your learning to ensure you do not run out of time before you take the exam!

Practice Makes Perfect!

I am always prompted to add a paid practice exam when I sign up for an exam. However, Microsoft also offers a free practice assessment you can take to get a baseline of your knowledge. Take a look at this screenshot from the DP-600 exam page:

Screenshot of the scheduling pane for the DP-600 exam

This practice assessment runs you through fifty multiple-choice questions. It does not fully mimic the actual exam: there are no case studies, and the real exam includes some question types beyond multiple choice. It is not an exact representation, but it does a good job of giving you a baseline of your knowledge.

If you have not taken many certification exams, you might want to invest in the paid practice exam. In general, most exams are broken into three sections. The first, and largest, is a collection of questions you can come back and review before moving on to the next section; once you move on, however, you cannot go back. The same is true for the lab section. Finally, there is a section where you cannot update previous answers at all. That causes anxiety for some people, so taking a paid practice exam can help prepare you for that experience.

Regardless of the route you choose, I love that you can check the answer on the spot. When you do, they provide a link to the documentation on Microsoft Learn that explains why the answer is correct. Sometimes you are just guessing and get lucky, but this will help you understand why the correct answer is correct.

Anything Else to Consider?

The resources above helped me pass the exam. However, you might need to know a little more than what is outlined there. Being multilingual, in the coding sense, will help you succeed with this exam: you need to know your DAX, M (both code and the GUI), T-SQL, and PySpark syntax and best practices. You might not be an expert in all of them, but a strong command of the basics is critical.
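
To make that concrete, here is a small taste of the level of PySpark you should be comfortable reading and writing, with the equivalent T-SQL in a comment. The `sales` table and its columns are hypothetical examples, not from the exam.

```python
# A taste of the basic PySpark the exam expects you to read and write.
# In a Fabric notebook a SparkSession named `spark` is provided for you;
# the "sales" table and its columns here are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already available in Fabric
df = spark.read.table("sales")

# Equivalent T-SQL:
#   SELECT Region, SUM(Amount) AS TotalAmount
#   FROM sales
#   GROUP BY Region
#   ORDER BY TotalAmount DESC;
summary = (
    df.groupBy("Region")
      .agg(F.sum("Amount").alias("TotalAmount"))
      .orderBy(F.col("TotalAmount").desc())
)
summary.show()
```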

Microsoft Learn was helpful for me, as there were a few items I got stuck on. In fact, my wife commented that I spent more time on this exam than any other I have taken. Having Microsoft Learn at my disposal helped a lot, but it was not a replacement for my existing knowledge. While it will assist you, relying on it too heavily will result in you running out of time.

Conclusion

I felt this was one of the better exams I have taken in the recent past. So many certification exams feel like they are trying to trick you. The breadth of content in this exam made it challenging yet meaningful. It may sound weird, but I really enjoyed this exam! If you are looking to continue your analytics journey, you will want to add it to your list!

Have you taken the DP-600 yet? What did you think about it? Did you find it meaningful? Tell me in the comments below!
