Drill Through for Deeper Insights with Power BI

It is a pet peeve of mine and it drives me nuts: I hate it when someone tries to fit as many visuals on the canvas as humanly possible. It is cluttered, ugly, and hurts the performance of your report. So how do we get around this issue? The answer is simple: add a drill through to your report!

I will be honest – I was confused by the initial setup, just like you might be. It is a little intimidating at first, but once you understand how to configure the drill through, it is easy to implement. As I said at the end of my article about bookmarks in Power BI, you will probably get things wrong the first time. However, with some guidance, I hope you will be able to implement drill throughs with ease!

Preparing Your Report

Generally, this starts with a basic report page and a goal. Right now, I have a Sales Data report page that contains a lot of information. However, I want to drill down on a manufacturer and learn more about the products being sold. If you look at our current Sales Data tab, you will note that there is some product information, but not the context I want:

Screenshot of the Sales Data report page with a line chart comparing sales data year over year, revenue by state on a map, a table of the top 25 products and their manufacturers by revenue, and a bar chart of quantity sold by manufacturer.
Current state of the Sales Data report

Our first challenge is to create a meaningful drilldown page that shows information by manufacturer. In my scenario, I wanted to see the top products by both revenue and quantity. I also wanted to see product segments and have the ability to filter by category. The result is a drilldown page that looks like this:

Screenshot of the manufacturer drilldown that shows two tables returning the top 20 products by revenue and quantity respectively. There is a donut chart showing the distribution of revenue by segment and a tile slicer for product category.
Manufacturer drilldown report page

You will note that I did not place a slicer for manufacturer on the canvas. This is because the drill through will handle the filtering for us. We will tackle that next.

Add Drill Through Logic

With our detailed report created, we need to enable the drill through functionality. On the report page, we must add the field we want to use for the drill through. Since we want to look at the manufacturer, we will add that field to our visualization pane under drill through:

Adding the manufacturer field from the data panel to the drill through section for the visualization panel.
Adding the drill through field

One thing you might not have noticed right away is the back arrow that was added to the page in the upper left-hand corner. This allows you to return to the original page after you drill down on your data. By default, it simply goes back to the last page. However, you might want to hard code a page navigation. This ensures you return to the correct page and is a requirement if you allow drill through on multiple levels:

Screenshot of the back button that is added to the page along with the configuration of the button.
Configuring the back button added to the report canvas

You will note that the drill through comes with two additional settings. The first, on by default, is to keep all filters. This maintains the filtering from the page where the drill through occurred, so if that page is filtered to a specific year, the filter will be maintained.

The cross-report drill through setting allows you to make this report page available to other reports. When published to the Power BI Service, other reports in the workspace can drill through to this page and see the detail. This allows you to reuse the page across multiple reports, which is very convenient.

Testing the Drill Through

With everything in place, it is time to test our drill through. Going back to the Sales Data page, we can right click on one of the bars in our quantity by manufacturer visual. From there, we can see the drill through menu option with our new report page:

Screenshot of the drill through menu on our quantity by manufacturer visualization.
Testing our drill through settings

If you have the ability to select your new page, then you did it right! Go ahead and click on it and you will see the new detail page filtered on the manufacturer:

Screenshot of the drill through report page filtered on a manufacturer.
Drill through report page filtered for Van Arsdel

It takes a little work to configure, but the results are beautiful. A drill through does a lot to help clean up your canvas while letting you get more detail on a specific dimension. With a clean look like this, your report is easier to consume, with the ability to dig further for details.

Anything Else to Consider?

What if you want to use other fields for the drill through on the same report? You can easily facilitate that by adding them to the drill through field well. This gives you the ability to use the same page for multiple dimensions, reducing effort.

In addition, consider the detail of your page when designing it. The ability to reuse it over and over again will eliminate a lot of repeated effort. If you leverage the cross-report feature, your drill through page could become an extremely helpful addition to any report you create.

Conclusion

This is one of the unsung heroes of Power BI reports. It is hidden in plain sight and not always easy to create. But with a little effort, it can really transform your report. It is worth the time, as it will take the experience of your reports to the next level!

So, have you used drill throughs on your reports? What are your use cases? Did I miss anything? If so, tell me in the comments below!

Refresh from Files Without a Power BI Gateway?!

I cannot begin to tell you how many people I meet who think they need a Power BI Gateway to refresh their dataset using a file as a source. Yes, if you have selected a file data source, you will need to use a Power BI Gateway. However, there is a way around this challenge. It takes a little extra work to prepare your dataset, but you can make it work.

Before everyone starts moaning about using files as a data source, sometimes this is unavoidable. I worked with a third-party business travel platform that did not provide an endpoint for me to connect directly to the data. As a result, I needed to export data on a weekly basis to feed my dashboard. Since then, that platform has resolved this issue.

I also have a current scenario where I do not have access to the APIs for an ERP solution. Because of roles and permissions, I am not allowed to access this data directly. A creative solution is to export the data through a scheduled job that sends a CSV file to my email. From there, I can use Power Automate to position the data so the Power BI Service can access it without a gateway. Is it ugly? No doubt. Is the process somewhat obnoxious? You betcha! But does it work? Absolutely!

What is the Power BI Gateway?

The Power BI Gateway might be new to you. If you are not an administrator of your Power BI environment, you probably do not know about it. The concept is simple – the gateway brokers access to your data sources that reside on premises within your network. Instead of opening up firewall ports or creating crazy firewall rules, you can deploy a gateway to broker the connection.

There are two types of gateways. The enterprise gateway provides an enterprise-wide connection for your data sources, with the controls to manage access to them. Generally, you install the gateway on a server that has access to your data sources. Once it is in place, you configure your data connections within the Power BI Service by adding credentials and delegating access. The biggest benefit, besides the controls, is that you can provide redundancy for your gateway.

The other type is the personal gateway, which is deployed by an individual user. The data source must be accessible from the device where the personal gateway is installed, and controls of the data source are essentially inherited by the user. This is a quick way to get a gateway in place, but it creates a risk because it is tied to an individual user with no oversight. In addition, your device must remain on for the refresh to be successful, which does not always work out.

Regardless of the type, the Power BI Gateway allows you to connect to any data source that resides on premises. While files work with gateways, they are a bit of a burden to manage. Unlike other sources such as SQL Servers, files have a tendency to move around or change. If we can manage this without the gateway, it will be easier for end users to handle without having to involve an administrator in the process.

Bypassing the Power BI Gateway

In short, the easiest way to bypass using a Power BI Gateway for your files is to use SharePoint Online. Since it is part of Microsoft 365 and already in the cloud, you do not need a gateway to connect to your file. Start by making sure your file is in SharePoint and open it in the desktop app from the site. From there, go to the file ribbon, select info, and click on copy path.
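For example, the copied path will look something like https://<tenant>.sharepoint.com/sites/NYCTaxi/Shared%20Documents/Taxi%20Data/taxi_zone_lookup.xlsx?web=1 – note the ?web=1 suffix, which we will remove in a moment.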

Once you have the file path, create a new source in Power Query and select the web connector. Paste your URL into the field, but remove the ?web=1 at the end of the path. Click OK and use your Organizational Account to access your file. From there, you will be prompted to navigate the data based on the file structure. I have talked about the web connector with REST APIs before, and the concept is the same if you have never used this connector.

Screenshot of the Web Connector configured to connect to a file inside of SharePoint Online without the Power BI Gateway.
Configuring the Web Connector for your file in SharePoint

Now, this process works well for Excel workbooks and CSV files, as they can be opened with Microsoft Excel. But what about JSON, XML, or Parquet files? You can get around this challenge by adding any Microsoft Office file to the directory, copying its path, and replacing the file name with the correct one. This will save you the effort of building out the URL.
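As a minimal sketch, assuming a hypothetical taxi_zones.json file sitting in the same SharePoint library, the query would parse the response with Json.Document instead of Excel.Workbook:

let
  // Hypothetical file and path – swap in your own tenant, site, and file name
  Source = Json.Document(
    Web.Contents(
      "https://<tenant>.sharepoint.com/sites/NYCTaxi/Shared%20Documents/Taxi%20Data/taxi_zones.json"
    )
  )
in
  Source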

But I Already Have Done So Much!

So often, people tell me they have already spent hours transforming their query and fear they need to start from scratch. Believe it or not, it is easy to resolve. Let’s start with this block of code in your advanced editor:

let
  Source = Excel.Workbook(
    File.Contents("c:\data\taxi_zone_lookup.xlsx"),
    null,
    true
  ),
  ExcelData = Source{[Item = "TaxiZoneTable", Kind = "Table"]}[Data]
in
  ExcelData

We can see from the code block above that we are accessing the file from a folder in the root of our hard drive. Once we have it inside of SharePoint, we can replace the File.Contents call with the web connector:

let
  Source = Excel.Workbook(
    Web.Contents(
      "https://<tenant>.sharepoint.com/sites/NYCTaxi/Shared%20Documents/Taxi%20Data/taxi_zone_lookup.xlsx"
    ),
    null,
    true
  ),
  ExcelData = Source{[Item = "TaxiZoneTable", Kind = "Table"]}[Data]
in
  ExcelData

As you can see, it was a simple swap in the code. With the change in place, we did not need to make any additional changes to the downstream code. While this code block is small, we could have a lot of transformations in place, which would have required considerable rework.

Anything Else to Consider?

A big benefit to moving to the cloud location is that you can easily share your Power BI Desktop file for development work without having to update the file path. You will need to share the file with users who will be working directly with the source, but that is easy to manage.

In addition, you can easily add files to your data source without getting an administrator involved. Regardless of the type of Power BI Gateway you want to use, you will likely need an administrator's help: with the enterprise gateway, a platform administrator must configure the connection, and with a personal gateway, you might need someone with administrator rights to install the software on your machine.

Conclusion

I wish more people talked about managing file refreshes this way. It is a great option, yet Microsoft tends to keep quiet about it. In fact, you can leverage this method to provide a refreshable report from a form you created. Patrick from Guy in a Cube created a video on how to achieve this. Hard to believe a few small changes could make your life so much easier!

So have you ever created a refreshable interface with Power BI like this? Do you have a use case for this? Tell me about it in the comments below!

Artificial Intelligence is Available Inside of Power Query?!

Unless you have lived under a rock for the past year, you know that artificial intelligence is a popular topic. Whether it is Copilot, ChatGPT, Bard, or OpenAI, artificial intelligence is here to stay. It is time to accept that and learn how to embrace it for our benefit.

But surely the use of artificial intelligence inside of Power Query is brand new, right? Wrong! I remember seeing it available five years ago, and it is a powerful tool to integrate with your models! I am not going to be able to cover all of the details of artificial intelligence in this article, but rather help you understand how to integrate it within your existing queries.

What Is Artificial Intelligence?

In the past few years, Artificial Intelligence (AI) has become a force in society. It permeates the world we live in and has become an integral part of how things are done. We see examples of it in our lives through things like chat bots, self-driving cars, and even social media. While some may find it intimidating, the reality is that AI has the power to make our lives easier.

Artificial Intelligence describes technologies that perform tasks that once required human intelligence. When properly deployed, AI models can recognize patterns, solve problems, make decisions, and understand natural language. These models do not exist organically. They are created through extensive training to ensure the results you receive are valid. Some of the work we perform with Power Query may result in the creation of a model, while other work will leverage an existing one.

While generative artificial intelligence (GenAI) is all the rage right now, we will not be using it for our queries. Instead, we will be integrating models that have been trained to provide a response. You might be the creator of some of these models, while others are created for you. The result is the opportunity to bring back a predictive response based upon your query.

How to Get Started?

First, while Power Query is available in both Microsoft Excel and Power BI, this functionality is only found in Power BI. You can leverage similar AI models in Microsoft Excel, but that is done outside of Power Query. This is an important item to note if you prefer using Power Query inside of Excel.

Second, you must have some form of premium workspace to enable this functionality. This can be a Premium Per User, Premium Capacity, or Power BI Embedded environment in Azure. If you already have one of these environments at your disposal, you are set. If not, check out my article on licensing strategies around Power BI. I recommend spinning up a Power BI Embedded SKU in Azure or activating a Premium Per User trial if you want to try these features at minimal cost.

What if you do not have a Power BI environment available to you? Unfortunately, you will not be able to take advantage of this functionality inside of Power Query. I recommend you work with your organization to spin up a trial to explore this functionality.

Pre-Built Models in Power Query

Power Query allows us to leverage pre-built models. Leveraging Azure Cognitive Services, we can perform text and image analysis on our queries. We can do this from the add column ribbon inside of Power Query:

Screenshot of the add column ribbon inside of Power Query with an arrow pointing to the text analytics function.
Activating the Text Analytics function

If you are doing this for the first time, you will be prompted to allow the analysis of your data. Once you have done that, you can select which model you wish to use:

Screenshot of the Text Analytics screen with options to select the detect language, extract key phrases, and score sentiment models.  Next, you can select the column to analyze and language associated with the text.
Selecting a Text Analytics model for your analysis

As you can see, you can select from multiple pre-built models to perform your analysis. With text-based analysis, you should consider both the text and the language being used; this helps ensure the proper analysis is performed. By default, it will use your tenant’s default language. If your column uses the same language consistently in the query, you can hard code it in this screen. However, if you have multiple languages, you might need to use the detect language model to bolster your analysis.

You can also identify tags for images you ingest with Power Query. This is not easily performed and is only practical for small images, so you will likely not use the vision model often. However, you can try it out for yourself:

Screenshot of the tag images model in the vision module from Azure Cognitive Services.
Configuring the tag images model

These models can provide quick analysis and meaningful results with minimal effort. I find that combining these results with other dimensions allows me to sort through results and find the most meaningful responses in surveys.

Custom Artificial Intelligence Models

If you have created a custom model using Azure Machine Learning, you can leverage it within Power Query. Leveraging a public dataset around diabetes, I created a custom model with Azure Machine Learning. With an endpoint created, I can access it within Power Query with a few clicks:

Screenshot of the home ribbon in Power Query with an arrow pointing to the Azure Machine Learning function.
Activating the Azure Machine Learning function

Once I have selected Azure Machine Learning, I can select which endpoint I wish to use. I have already deployed my model and selected it below:

Screenshot of an Azure Machine Learning endpoint being configured for Power Query.
Configuring my predictive model in Power Query

As you can see, there are multiple inputs I must identify for my endpoint to return a predictive response. I can map the different columns in my query and return a result. We will need to identify values for each field for this to work.

It is important to consider that Azure Machine Learning uses a consumption pricing model. Every time you refresh your query, it is seeking new predictions. Keep this in mind when building out your queries as you could rack up some extra costs without realizing it!

Any Other Options?

If you are a user of R or Python, you can leverage custom algorithms and models inside of Power Query as well. You need to deploy software on your device to make this work, which can be prohibitive if you do not have admin access on your machine. I go into detail on these tools in Chapter 11 of my book but decided not to discuss them in this post. Primarily, I wanted this post to be about out-of-the-box, easy implementation.

Conclusion

It is amazing how artificial intelligence has made its way into Power Query. Many have no clue it is there or what is needed to activate it. With a few clicks and a little effort, you can really transform your queries. I recommend you try it out to see how you can harness the value of artificial intelligence in your queries!

So did you know about this feature in Power Query? Have you used it before? Are you going to try it out? Tell me about it in the comments below!

Self Service Analytics Does Not Equal Easy

This is a difficult post to write. From a personal perspective, I really struggle with this topic. In my daily work, I assist organizations with enabling a self service data strategy. Together, we spend hours working to find the right path that matches the needs and goals of the organization with its realities. I am a huge advocate for self service analytics and believe in it. Because of this, I want to be ridiculously helpful with everyone I encounter.

Sadly, I regularly run into people who want “a little help” to perform some analysis with Power BI or Excel. The problem is that the ask is quite complicated. The reality is that I could make it happen quickly myself. I can spend an hour or two on a report and it will look beautiful. Having worked with Power BI for almost ten years, I have a pile of experience to make it happen efficiently. But the requestor wants me to help them build it themselves. This takes three to five times longer because I am guiding someone through the process. And I am all for doing this, because it helps enable another person to use Power BI or Excel.

While I am all for helping people, I find that their interest in learning about these tools is often non-existent. They have a goal that needs to be completed, and they think Power BI or Excel is an easy way out. It is a self service analytics tool, so it should be easy! And yet they find the opposite. Without taking the time to understand the basics, they end up lost.

What makes this hard for me and others is that it cheapens the skills we possess as data professionals. Data is not easy. Personally, I find this approach a little insulting. So how do we fix this problem?

Setting Expectations for Self Service Analytics

I started responding to every request I receive with clear expectations around self service analytics. I explain the challenges and the skills they need to possess before they embark on such a project. Sometimes this scares them off, but it is a reasonable way to find alignment.

I often share the basics they will need to learn to be successful. I start with understanding data modeling. We talk about things like building a star schema, using a date table, and managing relationships. Sometimes a flat table from Excel or SharePoint is fine, but other times they want more depth and dimension in their reports. Without this foundation in place, they will never achieve their goal.
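If it helps to make the date table concrete, here is a minimal sketch of one built in DAX; the date range and column list are placeholders you would adapt to your own model:

Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2020, 7, 1 ), DATE ( 2025, 6, 30 ) ),  -- hypothetical range
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM" )
)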

Next, we discuss how to build out measures. Like every beginner, including myself at one time, they rely on implicit measures. These measures provide basic aggregations and often get the job done. However, the requestors often want something more complex, such as time-based analysis or custom filtering. For that, they are going to need to learn how to write DAX expressions for explicit measures.
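As a minimal illustration of the jump from implicit to explicit measures – assuming a hypothetical bi_sales fact table and a marked bi_date table – the pair might look like this:

-- Hypothetical table and column names, for illustration only
Revenue = SUM ( bi_sales[Revenue] )

Revenue YTD = TOTALYTD ( [Revenue], bi_date[Date] )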

Lastly, we will discuss what report design looks like. Often, I have a theme file or template for the organization available, so this helps with branding. But it is important to address best practices with data visualizations. This avoids confusion and provides a better consumer experience.

Assign Some Homework

With expectations set, it is time to assign some homework. Self service analytics requires some investment. But before helping anyone, I assign some basic training. The concept is simple – if someone does not know the basics, they are never going to understand the advanced topics.

Microsoft Learn is my go to for training others around the basics of Power BI. Because it is free, the only real investment is time. It covers a lot of the basics of the tool and allows me to focus on the more advanced topics. I, just like you, need to protect my time. The more I can push to other resources, the easier it is for me to stay focused on my work.

There are two benefits to using this method. First, I can ask for a badge validating that they completed the training. Second, this method helps gauge the interest of someone who is asking for help. The course linked above takes about three hours to complete. If they cannot invest three hours in themselves, then why should you invest three hours in them? The answer to that question is up to you, but it sets expectations for the relationship you will have throughout this engagement.

Create Working Sessions

At this point, if they have completed their homework, I recommend creating working sessions with clear boundaries. I start by setting up 45 minute sessions with a clear agenda. Ideally, a weekly working session is best, though you might need to schedule more frequent sessions based upon the business requirements. It is important that you agree upon the cadence to avoid conflict in the future.

For each working session, start with a recap of the previous one. Review the previous topic and any progress made since you last met. Then, work through the assigned topic for the current week. Make sure the session is recorded so it can be referenced afterwards. Lastly, decide what the topic of the next working session will be. This helps keep future sessions focused and your timeline compact.

I recommend that you continue assigning homework. For example, if you have an upcoming working session that will focus on a particular DAX statement, provide resources ahead of time. If you have a measure that needs to use the SWITCH() function, send them to an article like this one to help them understand the concept ahead of time. You can also assign work to be completed in their report before the next session to streamline your working sessions.
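For instance, a measure using SWITCH() might look like the sketch below; the MetricList slicer table and the [Quantity] measure are hypothetical stand-ins:

Selected Metric =
SWITCH (
    SELECTEDVALUE ( MetricList[Metric] ),  -- hypothetical disconnected slicer table
    "Revenue", [Revenue],
    "Quantity", [Quantity],
    BLANK ()
)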

As a rule of thumb, I work to keep sessions scheduled. An ad-hoc working session often ends up being a mess, and you have other deliverables and expectations from your leadership that could get in the way. If someone needs time with me between sessions, I generally ask them to schedule it. The “Can we hop on a quick call? I have a quick question!” conversation always seems to come at the most inopportune time. Setting these boundaries goes a long way toward maintaining balance in your workday.

Provide Some Learning

It never hurts to provide additional learning and training. There are several books and sites out there to help. If you need help with Power Query, you can check out my book as an option. For DAX, I recommend The Definitive Guide to DAX by Marco Russo and Alberto Ferrari. Lastly, you can suggest Now You See It by Stephen Few if they need help with data visualization.

At some point, they will need to embrace self service analytics on their own. By providing materials, they can focus on their learning and be able to sustain their own solutions. These resources will provide assistance both during your working sessions and in the future.

If self service analytics is a goal, investment in learning is key. I see customers who continue to struggle because they do not invest in training. When you are working with an individual, at some point they will need to invest more in themselves to achieve that goal. If you do not help them head down that path, you will be stuck helping them forever.

Conclusion

I love helping people discover the world of self service analytics, and I hope you do too! Just make sure you help others in a sustainable manner. These types of relationships are good to have, but you need to manage them well. Be ridiculously helpful – just take care of yourself as well!

Have you encountered this situation? Do you struggle with people who think self service analytics is easy? If so, tell me about it in the comments below!

Custom Headers with Calculation Groups

While working with a customer a few weeks back, I ran into another odd scenario. This was a new ask for me, but it made sense in a post COVID-19 world. Normally when I build out time intelligence with Calculation Groups, I only perform comparisons between the previous year and the current year. However, this customer wanted to go back a few years to see the post COVID-19 impact on their metrics. Naturally, Calculation Groups make quick work of a request such as this. However, there was no way to create custom headers, and the customer found the generic ones confusing:

Screenshot of generic headers assigned from calculation groups.
Matrix with generic column headers

The use of generic headers led to confusion when reading the visualization. On top of that, they had a fiscal year that started on July 1st, which made it more complicated to understand which fiscal year they were currently in. In short, they wanted dynamic column headers in the matrix visual that aligned with the associated fiscal year. If you were looking at the 2023-2024 fiscal year, the labels needed to translate accordingly:

Calculation Group Label    Custom Column Label
CYm3                       2020-2021
CYm2                       2021-2022
CYm1                       2022-2023
CY                         2023-2024
YoY                        Change from Last Year
YoY_Pct                    % Change from Last Year
CYm3 Year End              2020-2021 YE
CYm3 Variance YE           2020-2021 Var YE
CYm2 Year End              2021-2022 YE
CYm2 Variance YE           2021-2022 Var YE
CYm1 Year End              2022-2023 YE
CYm1 Variance YE           2022-2023 Var YE
Generic versus custom labels

Well, this was a challenge, and after all of my research, I found a possible solution. While I was able to get it to work, I am not 100% sure it is the best solution. However, it worked for my situation. And while the performance is not the greatest, it still got the job done.

To get started, you need to understand Calculation Groups and Tabular Editor. If you have not used them before, the folks over at SQLBI have a great series on how to use Calculation Groups. And if you have not already done so, head over to TabularEditor.com to download the latest version to create your Calculation Groups. For our scenario today, we are going to assume you already have your time intelligence measures set up. Our focus is on applying the logic for the headers.

Creating the Column Headers

There are a few challenges with the creation of these column headers. One, they must map to a calculation group item. Two, they must follow the naming convention outlined above. And three, they must be properly sequenced.

We are building our logic inside of Power Query under the assumption that we do not have access to our data source to enact this solution. This shows how versatile Power Query can be for our projects. However, you are welcome to perform this logic upstream in SQL Server if you have one available.

If you are not comfortable with Power Query, I suggest you check out my book, which will give you the knowledge and experience to create this table.

Reference your date table

The first step is to right click on our date table and reference it in a new query. From there, we will right click on the Fiscal Year column and remove all other columns. Once completed, we will remove the duplicates in the fiscal year column so our results look like this:

Screenshot of our custom column headers query starting with the fiscal year data extracted from bi_date.
Referencing bi_date for our custom column headers

This is the easiest way to ensure that your custom column headers will align with your current date table. This will eliminate any issues with it being out of sync in the future.
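In the advanced editor, these opening steps match the start of the final query shown later in this article:

let
  Source = bi_date,
  #"Removed Other Columns" = Table.SelectColumns(Source, {"FiscalYear"}),
  #"Removed Duplicates" = Table.Distinct(#"Removed Other Columns")
in
  #"Removed Duplicates"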

Add mappings for Calculation Groups

Our next step is to add mappings so our custom columns will leverage the right calculation group measures. Referencing the table above, we will add a new column for each calculation group item and fill it with the sequence value in the order we wish to display these items.

For each measure, go to the Add Column ribbon and select Custom Column. Change the column name from “Custom” to the label from the Calculation Group. Then add the sequence number in the formula box below:

Screenshot of creating a custom column for each measure from Calculation Groups
Create a custom column for each measure

When finished, your list of labels should look something like this:

Screenshot of all measures from Calculation Groups added into the query as columns.
All measures loaded as columns

The last thing we need to do is unpivot our columns so they become rows in the dataset. Right click on the Fiscal Year column and select “Unpivot Other Columns”. This will turn the other columns into rows. Rename the new columns Measure and MeasureSequence respectively:

Screenshot of measures from the Calculation Groups unpivoted into rows
Measures unpivoted into rows for use

Now that this has been done, it is time to build out our logic for the labels.

Assigning labels with a conditional column

At this point, you have two choices. Because of the complex nature of our labels, they ultimately need to be written out in the advanced editor. The question is whether you would prefer to get the conditional column logic in place first. If you are not comfortable with the code, you can use the graphical interface to rough in the logic. Otherwise, you can write the if/then/else statements directly in the advanced editor.

I am not going to get into the weeds with this code, as a number of functions were used. However, I will provide a full printout of the code for this query when we are finished. In the end, our goal is to have our labels show up accordingly:

Screenshot of custom column header labels for our calculation groups
Creating custom column header labels

This process will take some time, but the trick is to use the measure sequence to simplify the creation of the column headers. It takes some practice, but you will get there quickly.

Address sequence of columns

This was the trickiest part. We will implement this solution using conditional columns, but there is a catch. We will be sorting our custom header column with this new sequence column, which means every header must map to a distinct sequence value. So how do we make that work?

The trick is grouping. CY, CYm1, CYm2, and CYm3 will be part of group one. YoY will be in group two, and YoY_Pct will be group three. Then the year end and year end variance columns will be part of group four.

Now, you might already be concerned about a collision. And you are correct, there is one at this moment. But we are going to do one more thing to fix it. For groups one and four, we are going to add the fiscal year as a decimal value. This allows us to use consistent headers and maintain sequencing. I promise you that I have spent hours figuring this out, and I do not know of a better way to achieve this outcome. If you think there is a better way, give it a try. But I know for a fact that this approach works:

Screenshot of the header sequence column for our custom headers.
Sequence column added for our custom headers
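To make the decimal trick concrete with the logic from the final query below: for FiscalYear 2023, CYm3 Year End receives a sequence of (2023 - 3) / 10000 + 4 = 4.202, while CYm3 Variance YE receives 4.20201. The pair sorts together, stays in order, and never collides with another fiscal year.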

Now that we have this in place, we can look at the final code.

Final Code – Custom Column Headers

As you can tell, this was not an easy process. It took a lot of custom coding to make it work. However, the results will speak for themselves shortly. Regardless, here is the code we used to achieve our outcome:

let
  Source = bi_date, 
  #"Removed Other Columns" = Table.SelectColumns(
    Source, 
    {"FiscalYear"}
  ), 
  #"Removed Duplicates" = Table.Distinct(#"Removed Other Columns"), 
  #"Added CY - 3" = Table.AddColumn(#"Removed Duplicates", "CYm3", each 0, Int64.Type), 
  #"Added CY - 2" = Table.AddColumn(#"Added CY - 3", "CYm2", each 1, Int64.Type), 
  #"Added CY - 1" = Table.AddColumn(#"Added CY - 2", "CYm1", each 2, Int64.Type), 
  #"Added CY" = Table.AddColumn(#"Added CY - 1", "CY", each 3, Int64.Type), 
  #"Added YoY" = Table.AddColumn(#"Added CY", "YoY", each 4, Int64.Type), 
  #"Added YoY%" = Table.AddColumn(#"Added YoY", "YoY_Pct", each 5, Int64.Type), 
  #"Added CY - 3 YE" = Table.AddColumn(#"Added YoY%", "CYm3 Year End", each 6, Int64.Type), 
  #"Added CY - 3 Var YE" = Table.AddColumn(
    #"Added CY - 3 YE", 
    "CYm3 Variance YE", 
    each 7, 
    Int64.Type
  ), 
  #"Added CY - 2 YE" = Table.AddColumn(
    #"Added CY - 3 Var YE", 
    "CYm2 Year End", 
    each 8, 
    Int64.Type
  ), 
  #"Added CY - 2 Var YE" = Table.AddColumn(
    #"Added CY - 2 YE", 
    "CYm2 Variance YE", 
    each 9, 
    Int64.Type
  ), 
  #"Added CY - 1 YE" = Table.AddColumn(
    #"Added CY - 2 Var YE", 
    "CYm1 Year End", 
    each 10, 
    Int64.Type
  ), 
  #"Added CY - 1 Var YE" = Table.AddColumn(
    #"Added CY - 1 YE", 
    "CYm1 Variance YE", 
    each 11, 
    Int64.Type
  ),
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(
    #"Added CY - 1 Var YE", 
    {"FiscalYear"}, 
    "Measure", 
    "MeasureSequence"
  ), 
  #"Added ColumnHeaders" = Table.AddColumn(
    #"Unpivoted Columns", 
    "ColumnHeaders", 
    each 
      if [MeasureSequence] = 0 then
        Number.ToText([FiscalYear] - 3)
          & "-"
          & Number.ToText([FiscalYear] - 2)
      else if [MeasureSequence] = 1 then
        Number.ToText([FiscalYear] - 2)
          & "-"
          & Number.ToText([FiscalYear] - 1)
      else if [MeasureSequence] = 2 then
        Number.ToText([FiscalYear] - 1)
          & "-"
          & Number.ToText([FiscalYear])
      else if [MeasureSequence] = 3 then
        Number.ToText([FiscalYear])
          & "-"
          & Number.ToText([FiscalYear] + 1)
      else if [MeasureSequence] = 4 then
        "Change from Last Year"
      else if [MeasureSequence] = 5 then
        "% Change from Last Year"
      else if [MeasureSequence] = 6 then
        Number.ToText([FiscalYear] - 3)
          & "-"
          & Number.ToText([FiscalYear] - 2)
          & " YE"
      else if [MeasureSequence] = 7 then
        Number.ToText([FiscalYear] - 3)
          & "-"
          & Number.ToText([FiscalYear] - 2)
          & " Var YE"
      else if [MeasureSequence] = 8 then
        Number.ToText([FiscalYear] - 2)
          & "-"
          & Number.ToText([FiscalYear] - 1)
          & " YE"
      else if [MeasureSequence] = 9 then
        Number.ToText([FiscalYear] - 2)
          & "-"
          & Number.ToText([FiscalYear] - 1)
          & " Var YE"
      else if [MeasureSequence] = 10 then
        Number.ToText([FiscalYear] - 1)
          & "-"
          & Number.ToText([FiscalYear])
          & " YE"
      else if [MeasureSequence] = 11 then
        Number.ToText([FiscalYear] - 1)
          & "-"
          & Number.ToText([FiscalYear])
          & " Var YE"
      else
        "Error", 
    type text
  ), 
  #"Added HeaderSequence" = Table.AddColumn(
    #"Added ColumnHeaders", 
    "HeaderSequence", 
    each 
      if [MeasureSequence] = 0 then
        (([FiscalYear] - 3) / 10000) + 1
      else if [MeasureSequence] = 1 then
        (([FiscalYear] - 2) / 10000) + 1
      else if [MeasureSequence] = 2 then
        (([FiscalYear] - 1) / 10000) + 1
      else if [MeasureSequence] = 3 then
        (([FiscalYear]) / 10000) + 1
      else if [MeasureSequence] = 4 then
        2
      else if [MeasureSequence] = 5 then
        3
      else if [MeasureSequence] = 6 then
        (([FiscalYear] - 3) / 10000) + 4
      else if [MeasureSequence] = 7 then
        (([FiscalYear] - 3) / 10000) + 4.00001
      else if [MeasureSequence] = 8 then
        (([FiscalYear] - 2) / 10000) + 4
      else if [MeasureSequence] = 9 then
        (([FiscalYear] - 2) / 10000) + 4.00001
      else if [MeasureSequence] = 10 then
        (([FiscalYear] - 1) / 10000) + 4
      else if [MeasureSequence] = 11 then
        (([FiscalYear] - 1) / 10000) + 4.00001
      else
        0, 
    type number
  )
in
    #"Added HeaderSequence"

It almost looks worse than you might have imagined. However, keep in mind that some of this code exists to ensure a solid experience with our custom headers.

At this point, we will close and apply our query changes and load them into our data model.

Preparing the data model

At this point, we have created our custom headers. Next, we must map them for use with our Calculation Groups and fit them into our model. We will need to build a relationship from our date table to our custom headers table and create a custom measure that leverages our calculation group.

Connect date and custom headers table

Our first task is to connect our date table with our custom column headers table. This will filter our custom headers down to only those needed for a selected fiscal year. We simply connect the fiscal year columns on both tables to build the relationship.

Now, you might be concerned about the many-to-many relationship being established. While often discouraged, there are times when they are required, and this is one of those moments. We will go ahead and click OK to create the relationship:

Screenshot of the many to many relationship being created between the date and custom column headers tables.
Connecting the date and custom column headers tables

With this relationship in place, we can now focus on creating a measure to return our desired results!

Custom measure for Calculation Groups

The last step before we create our visualization is to create a measure that leverages our calculation groups but is driven from the custom column headers table. A fun fact that you might not realize is that you can use your calculation groups as a filter in a CALCULATE() function. We are going to use the measure names we loaded into the custom column headers table to perform that exact filtering!

The formula is simple. We are going to use CALCULATE(), call our Revenue measure, and select our time intelligence measure to apply. The result is a measure that looks like this:

Total Revenue = 
   CALCULATE (
       [Revenue],
       TimeIntelligence[Name] = MAX ( CustomColumnHeaders[Measure] )
   )

With our measures set, we just need to create our matrix.

Add custom headers to our matrix

To complete the process, we simply need to update our matrix. For the column headers, replace the time intelligence field with the custom headers from our custom column headers table. Then replace the original measure with our new custom measure to ensure the data comes back correctly. The result is the following:

Screenshot of our custom column headers associated with our calculation groups.
Custom column headers for our matrix

This is quite the process to set up, but it can be rewarding when it works out. It will stretch your skills in a few areas within Power BI. However, the results speak for themselves.

Anything else to consider?

Keep in mind that if you go this route, updating the logic for your calculation groups will grow increasingly complex. If you need to add a new measure, this adds a burden to the update process: each new item must be updated manually in Power Query, resulting in additional effort.

My recommendation is that you take the time upfront to establish all of the requirements. This will help avoid rework in the future.

Conclusion

First of all, if you read all of this and understood it – congratulations! This is a very complex topic and not easy to digest. You might need to re-read this article a few times to fully comprehend what happened, but I promise it will work out. What is important is that you have a grasp of the key elements.

So have you ever run into a requirement like this? Do you have a more efficient way to achieve this outcome? Do you plan on using something like this in the future? Tell me about it in the comments below!

Dynamically Compare Customers in a Report

I was working with a customer recently, and she came up with an interesting use case for Power BI. The idea was simple – she needed to compare a manufacturer against the top three manufacturers within a period of time. Thinking it was pretty straightforward, I suggested she just use a bar chart, filter down to those manufacturers, and call it a day. If only it were that simple!

She came back and explained that she needed to dynamically select the chosen manufacturer, and in a perfect world, she would like to change the threshold of top manufacturers. Naturally, this whole scenario became far more complex! Being someone who does not back down from a challenge, I decided to give it a shot!

To help break down how this works, I decided to use DAX Studio. It is a free tool that you can download and install from here. It is not a prerequisite, but it does make explaining the measure easier.

Part 1 – Design the measure in DAX Studio

The first step in the process is to build out our base query. To start, we will pull back a list of manufacturers and the revenue associated with them, ordered with the highest-revenue manufacturer at the top (a code sketch follows the screenshot):

DAX Studio window with a list of manufacturers and their associated revenue totals.
List of all manufacturers and their associated revenue totals
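If you want to follow along without the screenshots, here is a sketch of the base query, using the same tables and [Revenue] measure that appear in the final code below:

EVALUATE
ADDCOLUMNS (
    SUMMARIZE ( bi_manufacturer, bi_manufacturer[Manufacturer] ),
    "Revenue", [Revenue]
)
ORDER BY [Revenue] DESC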

Next, we need to limit the list to only the top number of manufacturers selected. We will add a variable with a hard coded value of three for now and make it dynamic later on. We also need to add a filter to our list of manufacturers so only the top three return; we will use RANKX() to achieve that (see the sketch after the screenshot).

Query in DAX Studio updated to only return the top three manufacturers and their associated revenue totals.
Returning the top three manufacturers by revenue
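At this stage, the query has grown to include the variable and the RANKX() filter; this intermediate sketch is assembled from the final code below:

DEFINE
    VAR TopRank = 3
    VAR ListOfManufacturers =
        ADDCOLUMNS (
            SUMMARIZE ( bi_manufacturer, bi_manufacturer[Manufacturer] ),
            "Revenue", [Revenue]
        )
    VAR FilteredList =
        FILTER (
            ListOfManufacturers,
            RANKX ( ListOfManufacturers, [Revenue],, DESC ) <= TopRank
        )
EVALUATE
FilteredList
ORDER BY [Revenue] DESC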

Next, we want to declare a manufacturer and return its associated revenue. For now, we will hard code the manufacturer Fama and update it later.

Screenshot of DAX Studio showing the top three manufacturers revenue plus our revenue for Fama.
Final code from DAX Studio

If you are following along at home, the final code for what we put together in DAX Studio is as follows:

DEFINE
    VAR TopRank = 3
    VAR SelectedManufacturer = "Fama"
    VAR ListOfManufacturers =
        ADDCOLUMNS (
            SUMMARIZE ( bi_manufacturer, bi_manufacturer[Manufacturer] ),
            "Revenue", [Revenue]
        )
    VAR FilteredList =
        FILTER (
            ListOfManufacturers,
            RANKX ( ListOfManufacturers, [Revenue],, DESC ) <= TopRank
        )
    VAR SelectedManufacturerRevenue =
        CALCULATE ( [Revenue], bi_manufacturer[Manufacturer] = SelectedManufacturer )
    VAR UnifiedList =
        UNION (
            FilteredList,
            { ( SelectedManufacturer, SelectedManufacturerRevenue ) }
        )

EVALUATE
UnifiedList
ORDER BY [Revenue] DESC

With that set, we can now turn our attention back to Power BI to prepare our report for the next step.

Part 2 – Preparing Power BI

There are two elements we must prepare for this scenario. First, we must prepare an independent manufacturer list. To avoid any conflicts with filtering, we must create this as a new query that parallels our existing bi_manufacturer table and is used in our calculation.

Next, we need to add our top manufacturer threshold selection. Right now, we have hard coded the value three in our formula, but we want it to be dynamic. We will use a what-if parameter to achieve that goal.

Preparing the manufacturer list

The next step in our process is to prepare a list of distinct manufacturers. You could create a DAX table to achieve this, but I suggest performing this in Power Query. I simply referenced my bi_manufacturer table, removed the extra columns, and renamed the query ManufacturerList.

Image of my prepared manufacturer list inside of Power Query.
List of manufacturers in Power Query

Next, I need to create a measure to identify the selected manufacturer. This allows me to select my manufacturer from a list, and if a single value has not been selected, it returns a blank value instead.

Selected Manufacturer = SELECTEDVALUE(ManufacturerList[Manufacturer], "")

With all of this in place, we just need to add a slicer on our report page.

Screenshot of our Top N Compare report canvas with the Manufacturer slicer present and set to Fama.
Top N Compare report page with Fama selected from the ManufacturerList

With this in place, it is time to add our threshold measure.

Preparing the Top N selector

In the past, I wrote an article on leveraging what-if parameters in Power BI. In that scenario, we used decimal values for percentages; this time, we will use whole numbers. A small change has occurred since that article was published: there are now two types of parameters to select from, and we will be using the numeric range parameter. Head to the modeling ribbon and select the numeric range option from the new parameter dropdown, then update the highlighted fields below:

Screenshot of selecting and updating a numeric range parameter in Power BI.
Adding a numeric range parameter in Power BI

Leaving the add slicer option checked, click create. This will generate a new table and measure that work in concert with the new slicer. We do not have to do anything extra at this point, but it is worth noting that you could create these elements manually if you prefer.
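For reference, the generated table and measure are roughly equivalent to the DAX below; the 1 to 10 range and the default of 3 are assumptions you set in the dialog:

Top N Manufacturers = GENERATESERIES ( 1, 10, 1 )

Top N Manufacturers Value = SELECTEDVALUE ( 'Top N Manufacturers'[Top N Manufacturers], 3 )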

Part 3 – Modify DAX Studio code for use in Power BI

Time for the grand finale! With all our hard work in place, the last few steps are about making some small adjustments to our DAX Studio code. We will start by creating a new measure and adapting our code for use in Power BI.

Taking the code from above, the first step is to remove the word DEFINE from the block. Next, we will update our hard coded values for the top rank and the selected manufacturer. If you create a report measures table, note that the [Top N Manufacturers Value] measure will not appear there unless you move it – check under the new table we just created.

Next, we will add a new variable at the top called ManufacturerLine and use it to bridge our revenue to the right manufacturer. The code looks like this:

VAR ManufacturerLine = MAX(bi_manufacturer[Manufacturer])

Lastly, we will change our EVALUATE to a RETURN and update the value to perform a SUMX() over the revenue of our unified list where the ManufacturerLine value matches the Manufacturer. Our final code for the measure should look like this:


VAR TopRank = [Top N Manufacturers Value]
VAR SelectedManufacturer = [Selected Manufacturer]
VAR ManufacturerLine =
    MAX ( bi_manufacturer[Manufacturer] )
VAR ListOfManufacturers =
    ADDCOLUMNS (
        SUMMARIZE ( bi_manufacturer, bi_manufacturer[Manufacturer] ),
        "Revenue", [Revenue]
    )
VAR FilteredList =
    FILTER (
        ListOfManufacturers,
        RANKX ( ListOfManufacturers, [Revenue],, DESC ) <= TopRank
    )
VAR SelectedManufacturerRevenue =
    CALCULATE ( [Revenue], bi_manufacturer[Manufacturer] = SelectedManufacturer )
VAR UnifiedList =
    UNION (
        FilteredList,
        { ( SelectedManufacturer, SelectedManufacturerRevenue ) }
    )
RETURN
    SUMX ( FILTER ( UnifiedList, [Manufacturer] = ManufacturerLine ), [Revenue] )

At this point, I added my Top N Revenue measure to my table to compare, and I discovered two issues. One, it is returning revenue for every line, not just the Top N and the selected manufacturer. Two, it is doubling the revenue for Fama:

Screenshot of issues from our new Top N Revenue measure.
Top N Revenue measure not working properly

To fix every value coming back, we simply add an ALL() around our original list of manufacturers. This makes sure all manufacturers are accounted for in this calculation:


VAR TopRank = [Top N Manufacturers Value]
VAR SelectedManufacturer = [Selected Manufacturer]
VAR ManufacturerLine =
    MAX ( bi_manufacturer[Manufacturer] )
VAR ListOfManufacturers =
    ADDCOLUMNS (
        SUMMARIZE ( ALL ( bi_manufacturer ), bi_manufacturer[Manufacturer] ),
        "Revenue", [Revenue]
    )
VAR FilteredList =
    FILTER (
        ListOfManufacturers,
        RANKX ( ListOfManufacturers, [Revenue],, DESC ) <= TopRank
    )
VAR SelectedManufacturerRevenue =
    CALCULATE ( [Revenue], bi_manufacturer[Manufacturer] = SelectedManufacturer )
VAR UnifiedList =
    UNION (
        FilteredList,
        { ( SelectedManufacturer, SelectedManufacturerRevenue ) }
    )
RETURN
    SUMX ( FILTER ( UnifiedList, [Manufacturer] = ManufacturerLine ), [Revenue] )

To avoid a potential collision where our selected manufacturer is also in the top list, we just need to add some logic to our return statement to check if our manufacturer is present in the top list. If it is, we simply return the value from the top list and move on.

VAR TopRank = [Top N Manufacturers Value]
VAR SelectedManufacturer = [Selected Manufacturer]
VAR ManufacturerLine =
    MAX ( bi_manufacturer[Manufacturer] )
VAR ListOfManufacturers =
    ADDCOLUMNS (
        SUMMARIZE ( ALL ( bi_manufacturer ), bi_manufacturer[Manufacturer] ),
        "Revenue", [Revenue]
    )
VAR FilteredList =
    FILTER (
        ListOfManufacturers,
        RANKX ( ListOfManufacturers, [Revenue],, DESC ) <= TopRank
    )
VAR SelectedManufacturerRevenue =
    CALCULATE ( [Revenue], bi_manufacturer[Manufacturer] = SelectedManufacturer )
VAR UnifiedList =
    UNION (
        FilteredList,
        { ( SelectedManufacturer, SelectedManufacturerRevenue ) }
    )
RETURN
    IF (
        SUMX ( FILTER ( FilteredList, [Manufacturer] = ManufacturerLine ), [Revenue] ) > 0,
        SUMX ( FILTER ( FilteredList, [Manufacturer] = ManufacturerLine ), [Revenue] ),
        SUMX ( FILTER ( UnifiedList, [Manufacturer] = ManufacturerLine ), [Revenue] )
    )

And the results speak for themselves:

Screenshot of the Top N Revenue values corrected.
Top N Revenue working properly

Is it a lot of work to enact? Yes. But is there an opportunity to extract valuable insights from this type of report? Also yes.

Where to go next?

If you want to take this to the next level, consider adding some conditional formatting. Go back and read my article on creating conditional formatting with a column chart. This will allow you to call out your selected manufacturer much more easily. It is a small adjustment, but it could go a long way toward enhancing your report design.

It is worth noting that my queries to achieve this outcome were very simple. Depending on what measures you are comparing, this could become a very cumbersome solution. It is important that you take the time to review your report’s performance and ensure it does not overwhelm your capacity in the Power BI Service. Use the performance analyzer and continue to tune your measures to ensure there are no issues rendering this data.

Conclusion

This is a bit of a challenging project. Even while writing this one, I was getting exhausted. It seems simple in concept, but it requires a lot of effort to achieve the desired outcome. I hope that if you run into a scenario with this requirement, you will find this solution helpful!

So, have you seen anything like this before? Have you run into challenges like this? If so, make sure to share in the comments below!

Enable Self Help Overlays for Power BI Report Viewers

We can all use a little self help when it comes to Power BI Reports. I often need help with visualizations, formulas, or even our data sources themselves. A little help can go a long way! However, this post is not about that.

As a report creator, how often do you get asked about details of your visualizations? The questions I receive most frequently focus on definitions of terms, how a metric is calculated, or how a particular visualization is filtered. In fairness, our report consumers do not live inside of our heads, so they do not know what we were thinking when we designed our reports. However, a little design change can make all the difference. By adding a self help overlay, we can eliminate these questions once and for all!

Preparing for self help

To get started, we need to take a screenshot of our report. I will use SnagIt to capture the report canvas only, as it allows me to be precise with my selection. If you do not have SnagIt or a similar tool, you can easily use the Snipping Tool app inside of Windows.

Screenshot of our Power BI Report to be used to design our self help overlay.
Screenshot of my report canvas

Once I have my screenshot, I will open PowerPoint. I know, it seems crazy, but hang with me on this! I will create a new blank presentation, insert a blank slide, and paste my report screenshot onto it.

Adding a screenshot of our Power BI Report inside of PowerPoint to prepare our self help overlay.
Adding my report screenshot to a slide in PowerPoint

With our canvas in place, it is time to work some magic in PowerPoint!

Create self help overlay objects

Our first step is to add a shape on top of the screenshot. Go to the insert ribbon and expand the shape menu. Next, select the rectangle shape and cover your entire screenshot. Do not worry that we cannot see the screenshot; we will fix that next.

Inserting a rectangle shape in PowerPoint to serve as our self help overlay base
Inserting a shape inside of PowerPoint

Next, right click on the shape and select format shape. From there, expand the fill menu and change the color to black. Then adjust the transparency to a level where you can still see the canvas clearly but realize there is something in front of it. In this scenario, 85% seems right for this report. You can also adjust the border of the shape under the line section. This is a personal preference when building out self help, but I usually leave it enabled.

Formatting the overlay shape in PowerPoint as a base for our self help.
Making my overlay shape transparent

With our overlay in place, it is time to add our self help comments. Go back to the insert ribbon, expand shapes, and select one of the callout boxes at the bottom.

Inserting callout shapes in PowerPoint for the self help overlay.
Inserting callouts for the self help overlay

Keep adding callouts until you are satisfied with your overlay. I recommend erring on the side of too many callouts. However, make sure you still see the report canvas!

Final version of our self help overlay containing five callouts for our Power BI Report.
Report callouts for our self help overlay

With our overlay designed, it is time to add it to our Power BI Report!

Insert self help overlay into your Power BI Report

The first thing we must do is delete the old screenshot from the slide. Right click on your overlay shape and select send to back. Once you have done that, you can select and delete the screenshot from the slide.

View of PowerPoint showing the self help overlay objects without the screenshot from Power BI
Self help overlay objects with no screenshot in PowerPoint

Next, we must save our self help objects as an image. Use Control + A to select all objects and Control + G to put them into a single group. After that, we right click on the overlay group and select save as picture. Save this picture to your device so you can access it in Power BI.

Right clicking on the self help overlay group and selecting save as picture so it can be used.
Save self help overlay group as an image in PowerPoint

Next, head back to Power BI, open the insert ribbon, select image, and insert the saved picture. Then, expand the image so it covers the entire report canvas. You will note two things. First, the image is transparent, so it does not hide the report canvas. Second, the self help overlay does not reach the edges of the canvas. We can fix that by going to the general section of the format pane, expanding properties, and setting the padding for all sides to zero.

Inserting a self help overlay image into a Power BI Report canvas and adjusting the padding.
Inserting self help overlay into Power BI

Now that this has been inserted, we just need to add a few bookmarks to make it appear when we want it to be visible!

Expose self help overlay with bookmarks

Now that we have our overlay in place, it is time to hide it. To accomplish this task, we will use bookmarks. I have already written extensively about how to leverage bookmarks in a previous post. If you are not sure how to leverage them, please use that link to learn more. In this article, I am just going to show you how I quickly configure them so the self help overlay will appear and disappear when desired.

I have already added two bookmarks – one to open and one to close the self help overlay. I will simply configure each bookmark to only update the selected visuals I have clicked on. In the image below, I am opening the self help overlay, so it is visible. For closing the overlay, I will simply hide those same visuals.

Updating the bookmarks to only show the self help overlay when selected and not impact the entire report canvas.
Updating bookmarks for self help overlay

Now that this is in place, I just need to update the actions on my help button and overlay image. To apply the close self help overlay action, I will select the overlay image in the report, go to the format pane, and expand action. Once there, I will set the type of action to bookmark and select my close self help overlay bookmark. I will do the same with the open self help button inside of my report footer.

Setting the action properties on our image to use the close self help overlay to hide our help image
Applying the close self help overlay bookmark to our overlay image

Now that you have this in place, you can easily open and close your self help overlay. To test this in Power BI Desktop, you will need to hold down the control key to activate the action. However, that will not be required once you publish the report to the Power BI Service.

Opening and closing the self help overlay on the Power BI Report canvas.
Opening and closing the self help overlay

And just like that, we have enabled our viewers to learn more about the report with a single click!

Anything else to consider?

As convenient as the self help overlay is for your viewers, it adds a burden for you: keeping the overlay up to date. I recommend saving your PowerPoint file for easy access, which will save you time when updates are needed in the future.

I always recommend having a checklist of steps that must be completed when publishing content to the Power BI Service. If you choose to use a self help overlay, add a step to verify it is still valid. This will prevent any confusion for your report viewers in the future.

Conclusion

This seems like a trivial addition to a report, but I promise it makes life easier. Any chance you have to eliminate the questions coming to you is time well spent. While it is a little design intensive and adds a burden to future updates, it creates a better overall experience for your report viewers. I highly recommend trying it out once or twice in a development environment so you feel comfortable doing it in production.

So, have you ever created a self help overlay? Have you found it helpful? Or are you a viewer who has benefited from an overlay? Tell me about it in the comments below!

Save Space with a Slicer Panel

Nothing is worse than a cluttered report canvas. This is one of the most common pieces of feedback I provide to customers. The idea that you should pack as much information as possible into a single view is honestly obnoxious and leads to confusion. If readers have a hard time understanding the canvas, they will not understand the message your report is communicating. And the experience is even worse if half of the canvas is covered with slicers!

In addition to the confusion, it can impact your report performance. Each visual you add to the Power BI canvas must be rendered and its data queried from the model, which hurts responsiveness when the page loads. But a little-known secret about Power BI is that if you hide a visualization on the canvas, it does not impact performance.

This is where slicer panels come in. Instead of having a cluttered canvas full of slicers that is slow to load, we can hide these slicers and activate them when needed. This is a small change to your canvas, but it provides a lasting impact. It only takes a few minutes to implement if you know how to do it.

Example of a cluttered report canvas.
Example of a cluttered report canvas

To resolve this issue, we will simply add a slicer panel to our report canvas. The best part of this is that we will be able to leverage our existing slicers to save us time!

Prepare the Slicer Panel

The first step in preparing the panel is adding a background. Go to the insert ribbon and insert a square shape from the dropdown. Once it appears, resize it and place it on the canvas. Personally, I add the slicer panel on the left side, but you may choose to have it appear somewhere else. My recommendation is to use whatever placement enhances your report design best.

Inserting a shape from the Insert ribbon to serve as the slicer panel
Inserting a shape for the slicer panel

Next, we need to add a way to close the panel. Under the insert ribbon, we will expand the button drop down. From here, you can pick the one that works best for you. I personally use the back button, but the reset button seems to work well for others too. You might need to perform some updates to the formatting so it is easily visible on your slicer panel.

Inserting an exit button from the button drop down to help us close the slicer panel.
Inserting an exit button for your slicer panel

Now that we have our panel in place, it is time to add our slicers.

Loading Slicers into our Panel

In our example, we already have our slicers in place. We are going to use the selection pane, found under the view ribbon. From there, we will re-order the items on our canvas so our panel appears above any visualization we add to our report. We also want to make sure our slicers sit on top of our shape so they are not hidden. Once this has been completed, you can arrange your slicers on the shape.

Updating the canvas order and placing the slicers in the right sequence for our slicer panel
Updating our canvas order and arranging our slicers

Next, as a best practice, we will rename our shape and our button. The traditional way to perform this task is to update the title under the general settings of the format pane. However, there is an easier way! Just double click on the object in the selection pane and you can update the name right there.

Renaming objects on the Power BI canvas using the selection pane by double clicking on the object name.
Renaming objects on the Power BI canvas

You will note that I have also grouped these objects together by selecting them, right-clicking, and selecting group. I renamed the group the same way I did with the other objects. This is not required, but makes life a lot easier for the next step.

Show and Hide your Slicer Panel

Now that we have our panel in place, it is time to hide it. To accomplish this task, we will use bookmarks. I have already written extensively about how to leverage bookmarks in a previous post. If you are not sure how to leverage them, please use that link to learn more. In this article, I am just going to show you how I quickly configure them so the panel will appear and disappear when desired.

I have already added two bookmarks – one to open and one to close the slicer panel. I will simply configure each bookmark to only update the selected visuals I have clicked on – in this case, the slicer panel group. In the image below, I am opening the slicer panel, so it is visible. For closing the panel, I will simply hide those same visuals.

Configuring bookmarks to only update the selected visuals ensuring that only the slicer panel will be affected by the opening and closing of my report.
Updating bookmarks to open and close the slicer panel

Now that this is in place, I just need to update the actions on my buttons. To apply the close slicer panel action, I will select my button in the slicer panel, go to the format pane, and expand action. Once there, I will set the type of action to bookmark and select my Close Slicer Panel bookmark. I will do the same with the open slicer button inside of my report footer.

Adding the bookmark action to the button to close the slicer panel
Applying the close slicer panel action to the close slicer panel button

Now that you have this in place, you can easily open and close your slicer panel. To test this in Power BI Desktop, you will need to hold down the control key to activate the action. However, that will not be required once you publish the report to the Power BI Service.

Short video of the slicer panel opening and closing in Power BI
Opening and closing the slicer panel

Just like that, we have provided more space on our report canvas and made it easier to view!

Anything else to consider?

If you remember, I put my slicers in a group with the panel objects. While not required, it is a best practice. The reason is that you will, at some point, need to add a new slicer to your report. The group allows you to update the objects without having to update your bookmarks.

When I am in a hurry, I feel like bookmarks never update the way they are supposed to. The result is a rushed experience that never works out. However, because I used groups, I can simply drag a new slicer into the group without making any adjustments to the bookmarks. It is an extra step in the process, but it will save you considerable time and headaches in the future!

Conclusion

This is a small update for a report, but it really makes an impact. I know it takes extra time, but I promise you that it is worth the effort to preserve your canvas. This is especially helpful if you have reports using more than five slicers for a page.

Have you used a slicer panel before? Do you find them helpful? If so, tell me in the comments below!

Microsoft 365 Group to Drive Row Level Security (RLS)?

As you read this post, you fall into one of two groups. The first group is saying “Wow – I never thought to use existing Microsoft 365 Groups to drive my Row Level Security!” The other group is screaming “Microsoft 365 Groups don’t work for Row Level Security!”

Technically, the second group is correct. If you try to apply a Microsoft 365 Group for Row Level Security, you will discover that it is not available. This is very frustrating for a number of report creators. But as with all great solutions, necessity is the mother of invention.

The Scenario

In the last month, I have been approached by two customers about how to leverage existing Microsoft 365 Groups in Power BI. Both had built out their security models using these groups. With Microsoft heavily relying on groups to drive security, this throws a wrench in things. And unlike SharePoint, which allows you to leverage existing Active Directory groups, Teams does not.

So you bought into Teams and started using it to allow self-service collaboration. Your users are successfully updating memberships with no issues. But now you need to create a security group and replicate those memberships. Doing it manually takes time and is frustrating at best. Plus, it will likely lead to multiple service desk calls when things do not work as planned.

And while I am not lazy, I like to make simple solutions that enable users. So let’s automate the solution to make it easier!

Power Automate to the Rescue

Power BI and Power Automate are part of the Microsoft Power Platform. While powerful alone, they are better together when it comes to creating solutions. Power Automate enables end users to create self-service automation solutions. We can quickly build a flow that will automate our replication process.

The concept is simple – we will create a flow that monitors a Microsoft 365 Group’s membership and updates our security group. You might have looked into Power Automate in the past and decided not to use it because of licensing costs. But the best part of this solution is that the free tier of Power Automate is enough to enable it!

Implementing Microsoft 365 Group Replication

We could create the flow from scratch, but why do that when I have already built it for you? The prebuilt package will save you some time and make it easy to re-deploy for other groups.

Step 1 – Create a Security Group

To get started, you must create the security group you will use. Since I am 100% cloud based, I go to the Microsoft 365 Admin Center to create the group. Go to Teams and Groups in the left hand navigation and select Active Teams and Groups. From there, create a new group, select the Security group type, and click next. Give your group a name and finish up the group creation.

Creating a Security Group

Once created, go ahead and open the group. Make sure whatever account is creating the flow in Power Automate is added as the group owner. Lastly, before you close the window, copy the group ID out of the URL as you will need it shortly. You can accomplish the exact same task in Azure Active Directory, but I feel like this is easier.

Finalize the setup of your Security Group

As a point of convenience, you might want to add the existing members of the Microsoft 365 Group into the security group. This will need to happen anyway, so why not do it now?

One last thing – if the account creating the flow is not already a member of the Microsoft 365 Group, make sure they are added before moving on to the next step!

Step 2 – Import the Flow

First, download the prebuilt package I created. This will save us some time with the deployment of this solution and will make it easier for you to replicate.

Once you have downloaded the package, log in to Power Automate using your Microsoft 365 credentials. Once signed in, go to My Flows on the left, select Import, and pick Import Package. Click upload, pick the package you downloaded, and click OK.

Once you have loaded the package, you need to update the connection details for the Microsoft 365 Groups connector and the Azure AD connector. Click on Select during import and specify which account to use. Click Import when you are finished.

Setting up connections when importing the flow

The import will take a few moments. You will receive a message when it is finished. Once the import is complete, click on Open flow.

Import complete message

Next, we need to update the flow to use the groups you specify.

Step 3 – Specify the Microsoft 365 Group and Security Group

Updating the Microsoft 365 Group is really easy. Just clear out the existing content and select your group from the drop down. It is as easy as that!

For the security group, it is a little harder. All you need to do is add the group ID you copied when you created the group to our variable. Just set the value field of the Initialize Security Group ID Variable block. We assign the ID to a variable because we need to use it with two actions. We don’t have to do it this way, but I like it because I only have to update the ID once. This way I don’t miss anything!

Updating groups in the flow

Make sure you save your work before moving on to the next step!

Step 4 – Test Your Flow

Now that everything is in place, go ahead and test your flow. Add a new user to the Microsoft 365 Group. It will take a minute or two, but the user should appear in the security group.

Once the user shows up, you can remove them from the Microsoft 365 Group. Again, it will take a minute or two, but they should be removed from the security group as well.

If it passes the test, you are all set! Congratulations! Go and use the security group as appropriate in Power BI!

Final Thoughts

This is a simple solution, but keep in mind that it is a very specific solution for a very specific problem. I only use it for a very targeted set of groups. You essentially need to duplicate these groups and create a flow for each and every group you need to replicate, and that can get messy really quickly. Personally, I would limit it to fewer than 10 groups, but you can decide what is right for you.

I would also consider putting this flow under a service account. If the person who sets up the flow uses their own account, the flow will stop working when that account is disabled. A service account prevents this from happening.

Have you run into this challenge? Do you think this solution works well? Tell me in the comments below!

Time Zone Management in Power Query

I ran into a unique challenge this past week when it came to managing time zones in Power Query. I have a dataset that comes in using Coordinated Universal Time (UTC). Like a majority of the world, my key stakeholders do not reside in the UTC time zone. On top of that, they wanted to use multiple time zones, so this could become quite complicated.

You might be reading this article and asking why this would be such a complex issue. Well, there are a couple of factors to consider. Naturally, you need to consider the time zone itself. In addition, you must consider Daylight Savings Time. In the United States, every state except Arizona and Hawaii observes Daylight Savings Time. As a result, I have to adjust my model accordingly. And while 99% of my work is based in the United States, many other countries across the world observe Daylight Savings Time.

To add to the complication, one of the required time zones was Indian Standard Time (IST). If you are not familiar with IST, it has an additional 30-minute offset from UTC. In short, if it is 1:00 AM UTC, then it is 6:30 AM IST (UTC+5:30).

Since I know I am not the only person with this challenge, I thought it would be helpful to share my experience and how you might be able to work around this issue.

Convert UTC to IST Time Zone

Let’s start with an easy conversion. India does not observe Daylight Savings Time, so setting the IST time zone is straightforward!

To get started, we are going to create a custom column in Power Query and convert our date/time in the UTC time zone. However, the dataset does not assign the time zone to the timestamp. We will assign it using the DateTime.AddZone() function. The function accepts three parameters:

  1. Date/Time value (in our case, the Timestamp column)
  2. Hour offset
  3. Minute offset
=DateTime.AddZone([Timestamp], 0, 0)

This formula will replicate your current timestamp and add “+00:00” at the end to show no offset from UTC. Now that we have a time zone assigned, we need to convert to IST. We will wrap our last function with the DateTimeZone.SwitchZone() function to adjust the offset. The parameters are the same as the function above.

=DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), 5, 30)

Now the column has been updated with “+05:30” at the end, but the time has been incremented by the same amount. It is that easy!

Now, you might not want the time zone offset showing in the report canvas. We can easily remove it from view by wrapping the last formula with DateTimeZone.RemoveZone():

=DateTimeZone.RemoveZone(DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), 5, 30))

Now there is just one pesky problem left – our new column is not typed as date/time. This is an easy fix! Just crack open your advanced editor and add the column type in the same step:

Custom Column = Table.AddColumn(#"Changed Type", "TimestampIST", each DateTimeZone.RemoveZone(DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), 5, 30)), type datetime)

Yes, you can update the column type in a separate step. But I am a fan of keeping things compact and clean. Since you are already doing the transformation, it is an easy addition that saves some compute time!
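
For reference, the separate-step version would look something like this (a sketch – the step name #"Added Custom" is a placeholder for whatever your previous step is called):

=Table.TransformColumnTypes(#"Added Custom", {{"TimestampIST", type datetime}})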

If your time zone is not IST, you can look up the offset here. You just need to update the code above with the correct offset.
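
For example, here is a sketch for Nepal Standard Time, which sits at UTC+5:45:

=DateTimeZone.RemoveZone(DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), 5, 45))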

What About Daylight Savings Time?

The biggest challenge is knowing what rules define the time shift. In the United States, the formula is simple: Daylight Savings Time starts on the second Sunday in March and ends on the first Sunday in November. So how do we put that knowledge to work?

One option is to add a field on your date dimension to show whether Daylight Savings Time applies. That works, but it requires long-term maintenance. Another option is to create a table of dates, but once again, it is a lot of work to keep current.

But there is a way to scale this logic using a few functions in Power Query.

Find the Start and End of Daylight Savings Time

Like before, let’s start off easy by finding the end of Daylight Savings Time. The first thing we need to do is establish the first day of November. We will use the #date(year, month, day) function to set the date. But remember, our goal is to make it scalable, so we need to make the year dynamic:

=#date(Date.Year([Timestamp]), 11, 1)

Now that we have the first day of the month established, we will use it to find the first Sunday of the month. We will use the Date.EndOfWeek() function to establish that day. Passing a second parameter with the value of 1 (Day.Monday) shifts the end of the week from Saturday to Sunday.

=Date.EndOfWeek(#date(Date.Year([Timestamp]), 11, 1), 1)

To find the second Sunday of March, we will use the same formula as a base, changing the 11 to a 3 for March. But there is one difference: we wrap the formula in Date.AddDays() to add 7 days, which takes us from the first Sunday to the second Sunday in March.

=Date.AddDays(Date.EndOfWeek(#date(Date.Year([Timestamp]), 3, 1), 1), 7)

Like I said above, your time zone might have a different set of rules. However, with some research, I found that the rules are generally similar – you just need to add or remove weeks to get to your desired date. You might use the end of the month and work backwards. If your change happens on a Saturday, you can drop the extra parameter. And if you are in one of the countries that observes the change on a Friday, just use Date.AddDays() to work forward or back the appropriate number of days.
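
As a concrete variation, here is a sketch of the European rules (assuming Daylight Savings Time runs from the last Sunday of March to the last Sunday of October). It anchors on the end of the month and steps back to the previous Sunday:

=Date.StartOfWeek(Date.EndOfMonth(#date(Date.Year([Timestamp]), 3, 1)), Day.Sunday)

Swap the 3 for a 10 to find the last Sunday of October. Because Day.Sunday sets Sunday as the start of the week, Date.StartOfWeek() lands on the Sunday on or before the last day of the month.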

Apply Dates to Time Zone Offset

Now that we have the start and end of Daylight Savings Time, we just need a basic conditional statement to provide the offset. Because our start and end dates are pure dates with no time associated, we will need to do our comparison using the DateTime.Date() function to convert our timestamp.

=if DateTime.Date([Timestamp]) >= Date.AddDays(Date.EndOfWeek(#date(Date.Year([Timestamp]), 3, 1), 1), 7) and DateTime.Date([Timestamp]) < Date.EndOfWeek(#date(Date.Year([Timestamp]), 11, 1), 1)

This will check our timestamp and make sure it is between our two dates. Now we just need to use the same logic from above to show the offset:

=DateTimeZone.RemoveZone(if DateTime.Date([Timestamp]) >= Date.AddDays(Date.EndOfWeek(#date(Date.Year([Timestamp]), 3, 1), 1), 7) and DateTime.Date([Timestamp]) < Date.EndOfWeek(#date(Date.Year([Timestamp]), 11, 1), 1) then DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), -4, 0) else DateTimeZone.SwitchZone(DateTime.AddZone([Timestamp], 0, 0), -5, 0))

Since I am using Eastern Time in the US, there is an offset of 5 hours from UTC, but only 4 during Daylight Savings Time. With this all in place, we now have a formula that will scale with Daylight Savings Time over the years. It feels like a lot of work to get here, but it works well!

Get Fancy with a Time Zone Function

Now, you might be thinking that this could get painful if you need to provide multiple time zones over and over again. For those of us in the United States, the same logic applies across several time zones. So instead of doing everything manually, you can create a function to make your life easier!

Using the same logic, the function will accept two parameters: the timestamp and the standard offset. A few lines of code give us a function we can reuse over and over:

(TimestampValue as datetime, TimeOffset as number) =>
let
    // Strip the time so we can compare against pure dates
    TimestampDate = DateTime.Date(TimestampValue),
    TimestampYear = Date.Year(TimestampValue),
    // Add an hour to the offset during US Daylight Savings Time (second Sunday in March to first Sunday in November)
    Source = DateTimeZone.RemoveZone(
        if TimestampDate >= Date.AddDays(Date.EndOfWeek(#date(TimestampYear, 3, 1), 1), 7)
            and TimestampDate < Date.EndOfWeek(#date(TimestampYear, 11, 1), 1)
        then DateTimeZone.SwitchZone(DateTime.AddZone(TimestampValue, 0, 0), TimeOffset + 1, 0)
        else DateTimeZone.SwitchZone(DateTime.AddZone(TimestampValue, 0, 0), TimeOffset, 0)
    )
in
    Source

With a few substitutions, I simplified the logic and made it easier to understand. Now I can just invoke the function with a timestamp and an offset. Not too shabby!
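
As a quick sketch of that invocation – assuming you saved the query above as fnConvertTimeZone, though the name is yours to choose – adding an Eastern Time column would look like this:

=Table.AddColumn(#"Changed Type", "TimestampEastern", each fnConvertTimeZone([Timestamp], -5), type datetime)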

Anything Else to Consider?

This solution uses only dates and ignores the official time of day when the change takes place. In my use case, that level of precision was not important – we are talking about 4 hours out of the 8,760 hours in a year. If you need that precision, you will need to add time to the logic in the formulas.
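
As a rough sketch of that extension – assuming the US rule that clocks change at 2:00 AM, and leaving aside that the raw timestamp is still in UTC – you can combine a date with a time literal using the & operator and compare full date/time values:

=[Timestamp] >= (Date.AddDays(Date.EndOfWeek(#date(Date.Year([Timestamp]), 3, 1), 1), 7) & #time(2, 0, 0))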

I will say, this is a gaping hole in Power BI. Power Automate, by comparison, can perform precise time zone conversions much more easily. A majority of my work uses dates and ignores time, but there are use cases, like audit data, where we need this information. Hopefully Microsoft will add functionality here one day – users have been remarking on this challenge for years. I might be better off waiting for the United States to pass a law eliminating Daylight Savings Time. But that is not looking too good either…

Anyway, how have you handled this in the past? Do you have a different approach? If so, tell me in the comments below!
