
Copilot Reports with Microsoft Fabric – Part 3

We have been on a journey the past two weeks to deploy Copilot Reports with the new user activity reporting endpoint. While the journey has taken some time, we have arrived at the final step: deploying our reports! You might have already performed some of this, but I developed a template to make your life easier.

At the end of this article, you will have a report deployed to help you track week over week utilization of Copilot for Microsoft 365 within your organization. As a bonus, we will implement a few other reports to help you along the way on your adoption journey.

By the way, if you have not deployed Copilot for Microsoft 365 yet, you will still want to deploy this report. There is a special page that identifies potential top users of Microsoft 365 within your organization. These users are generally prime candidates for a Copilot for Microsoft 365 pilot program and could potentially serve as champions for your organization.

First Things First – I Made an Error

If you have been following this series, you should know that I discovered an error in our Dataflow code. I resolved this error on November 16th, 2024 at 9:30 AM Eastern Time (New York City time) and updated my GitHub repository. If you deployed your Dataflow before that correction was made, you will need to edit the CopilotData_Raw table. You can open the advanced editor and replace it with the following code:

let
  // Recursively page through the endpoint, following @odata.nextLink
  // until no further pages are returned
  GetCopilotData = (Params) =>
    let
      Path = if Text.Length(Params) > 0 then Params else "$top=1000", 
      Source = Json.Document(
        Web.Contents(
          "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D30')", 
          [RelativePath = "?" & Path, Headers = [Authorization = "Bearer " & #"Get-AccessToken"()]]
        )
      ), 
      NextList = @Source[value], 
      // If a nextLink exists, append the next page's rows; otherwise stop
      result = try
        @NextList & @GetCopilotData(Text.AfterDelimiter(Source[#"@odata.nextLink"], "?"))
      otherwise
        @NextList
    in
      result, 
  CopilotData = GetCopilotData(""), 
  #"Converted to table" = Table.FromList(
    CopilotData, 
    Splitter.SplitByNothing(), 
    null, 
    null, 
    ExtraValues.Error
  ), 
  #"Expanded Column1" = Table.ExpandRecordColumn(
    #"Converted to table", 
    "Column1", 
    {
      "reportRefreshDate", 
      "userPrincipalName", 
      "displayName", 
      "lastActivityDate", 
      "copilotChatLastActivityDate", 
      "microsoftTeamsCopilotLastActivityDate", 
      "wordCopilotLastActivityDate", 
      "excelCopilotLastActivityDate", 
      "powerPointCopilotLastActivityDate", 
      "outlookCopilotLastActivityDate", 
      "oneNoteCopilotLastActivityDate", 
      "loopCopilotLastActivityDate", 
      "copilotActivityUserDetailsByPeriod"
    }, 
    {
      "reportRefreshDate", 
      "userPrincipalName", 
      "displayName", 
      "lastActivityDate", 
      "copilotChatLastActivityDate", 
      "microsoftTeamsCopilotLastActivityDate", 
      "wordCopilotLastActivityDate", 
      "excelCopilotLastActivityDate", 
      "powerPointCopilotLastActivityDate", 
      "outlookCopilotLastActivityDate", 
      "oneNoteCopilotLastActivityDate", 
      "loopCopilotLastActivityDate", 
      "copilotActivityUserDetailsByPeriod"
    }
  ), 
  #"Removed columns" = Table.RemoveColumns(
    #"Expanded Column1", 
    {"displayName", "copilotActivityUserDetailsByPeriod"}
  ), 
  #"Renamed columns" = Table.RenameColumns(
    #"Removed columns", 
    {
      {"lastActivityDate", "Last Activity Date"}, 
      {"copilotChatLastActivityDate", "Copilot Chat"}, 
      {"microsoftTeamsCopilotLastActivityDate", "Copilot for Teams"}, 
      {"wordCopilotLastActivityDate", "Copilot for Word"}, 
      {"excelCopilotLastActivityDate", "Copilot for Excel"}, 
      {"powerPointCopilotLastActivityDate", "Copilot for PowerPoint"}, 
      {"outlookCopilotLastActivityDate", "Copilot for Outlook"}, 
      {"oneNoteCopilotLastActivityDate", "Copilot for OneNote"}, 
      {"loopCopilotLastActivityDate", "Copilot for Loop"}
    }
  )
in
  #"Renamed columns"

In short, I missed the paging requirement for this endpoint. I assumed that it behaved like the other Graph reports endpoints, which return their full result set in a single response. But then again, we all know what happens when we assume! Regardless, with this small change in place, you are ready to collect data for your entire organization.
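The fix follows the standard Graph paging pattern: keep requesting the @odata.nextLink URL until the payload stops returning one. A minimal Python sketch of the same loop, where fetch stands in for the authenticated HTTP call (e.g. requests.get(...).json()):

```python
# Minimal sketch of Graph @odata.nextLink paging. `fetch` is a stand-in
# for the HTTP call and returns the parsed JSON payload for one page.
def collect_all_rows(fetch, first_url):
    rows, url = [], first_url
    while url:
        page = fetch(url)
        rows.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the final page
    return rows
```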

If you are not up to speed, go back to my first post in this series and start collecting data to build your Copilot reports.

Deploying the Copilot Reports Template

To get started, go to my GitHub repository to download the Copilot Utilization Report template. It is a .pbit file that connects to the semantic model built off your Lakehouse. Once you have downloaded the template, you can open it. You will immediately receive an “error”, but that is simply because you still need to connect to your semantic model:

Connecting to our semantic model associated with our Lakehouse.

With our template activated, we can save and deploy the report by publishing it to Power BI! Before we get too far, let’s do a quick tour of the report!

Copilot Reports – High Level Overview

The template comes with nine pages. Three are focused on Copilot activity, four are focused on collaboration behaviors within Microsoft 365, and one is the Collaboration Score page. The ninth is a report template page that you can use to create additional custom reports.

Each page has a slicer panel to filter based upon your user information from Entra ID. With the caveat that your user information must be accurate, you can easily filter to see how users leverage tools within Microsoft 365. You can filter by department, company, office location, or even job title. You can also filter based upon the licenses assigned to users!

In addition to slicers, each page has a help button that, when activated, provides an overlay explaining the report visualizations. This will help you and any future consumers understand how to interpret the report and how values are calculated. Because of this, I will not go into great detail here, but will give you a high-level overview of the pages themselves.

How to Use the Copilot Reports

There are three pages dedicated to Copilot utilization. The first page, Copilot Activity, dives into the utilization of Copilot over the past week. It also allows you to trend utilization over time. Remember, we can only capture the date someone last used Copilot, and Microsoft is inconsistent about when it releases the data, so it is best to smooth the numbers out and compare week over week.
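The week-over-week comparison can be sketched with pandas, assuming weekly snapshot rows shaped like the CopilotData table (column names illustrative): a user counts as active for a snapshot when their last activity falls within the seven days leading up to that refresh.

```python
import pandas as pd

# Hypothetical weekly snapshots: one row per user per refresh, with the
# refresh date and the user's last Copilot activity date.
df = pd.DataFrame({
    "reportRefreshDate": ["2024-11-03", "2024-11-03", "2024-11-10", "2024-11-10"],
    "userPrincipalName": ["a@contoso.com", "b@contoso.com", "a@contoso.com", "b@contoso.com"],
    "lastActivityDate":  ["2024-11-01", "2024-10-01", "2024-11-08", "2024-11-09"],
})
for col in ("reportRefreshDate", "lastActivityDate"):
    df[col] = pd.to_datetime(df[col])

# Active = last activity within the 7 days leading up to the refresh
active = (df["reportRefreshDate"] - df["lastActivityDate"]).dt.days < 7
weekly = df[active].groupby("reportRefreshDate")["userPrincipalName"].nunique()
```

In this toy data, the first week shows one active user and the second shows two, which is exactly the kind of trend the Copilot Activity page visualizes.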

The Active Copilot Users page allows us to quickly identify which users used Copilot over the past week. In addition, it shows how many applications Copilot was leveraged within. For example, if you used Copilot for Microsoft 365 within Microsoft Teams, Word, and PowerPoint, you would have an app count of three. This can help you find users for a focus group to learn more about use cases.
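The app count is simply a tally of how many of the per-app last-activity columns are populated for a user. A toy pandas illustration (column names illustrative):

```python
import pandas as pd

# Toy single-week snapshot: one row per user, one column per application,
# holding the user's last activity date in that app (None if never used).
snapshot = pd.DataFrame({
    "user": ["a@contoso.com", "b@contoso.com"],
    "Copilot for Teams":      ["2024-11-08", None],
    "Copilot for Word":       ["2024-11-07", "2024-11-06"],
    "Copilot for PowerPoint": ["2024-11-05", None],
})
app_columns = [c for c in snapshot.columns if c != "user"]

# App count = how many per-app last-activity columns are populated
snapshot["app_count"] = snapshot[app_columns].notna().sum(axis=1)
```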

The Inactive Copilot Users page allows you to identify who is not using their assigned license. During a pilot, we want to make sure that everyone is leveraging Copilot, and this report helps us track and activate dormant licenses. It also helps us find users who have used Copilot for Microsoft 365 in the past but have not used it for over two weeks. This might prompt you to reassign licenses to users who have more time to spend exploring Copilot.
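The dormant-license logic boils down to two flags: users with no activity at all, plus users whose last activity is more than two weeks before the refresh. A sketch under those assumptions (column names illustrative):

```python
import pandas as pd

# Toy data: the latest refresh date plus each user's last Copilot activity
# (None means the user has never used their license).
refresh = pd.Timestamp("2024-11-10")
users = pd.DataFrame({
    "user": ["a@contoso.com", "b@contoso.com", "c@contoso.com"],
    "lastActivityDate": ["2024-11-08", "2024-10-20", None],
})
users["lastActivityDate"] = pd.to_datetime(users["lastActivityDate"])

never_used = users["lastActivityDate"].isna()                # dormant license
lapsed = (refresh - users["lastActivityDate"]).dt.days > 14  # quiet for 2+ weeks
inactive = users[never_used | lapsed]
```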

While not a perfect science, these three reports will help you drive a quality pilot program. Beyond your pilot, they will help you drive broader adoption across the organization as well.

How to Use the Activity Analysis Reports

Why do we care about utilization of Microsoft 365 tools? Remember, Copilot for Microsoft 365 uses data from your tenant. The better you use the platform, the better your experience with Copilot for Microsoft 365 will be!

Each page is dedicated to the core utilization metrics available from Microsoft Graph. While these are high level metrics, it is enough to understand general trends over the past thirty days. Our goal is to see balance between applications and drivers of quality collaboration.

How to Use the Collaboration Score Report

There is a difference between using software and adopting software. Even though Microsoft Teams has been the standard since Skype for Business was retired in 2020, it is not well adopted everywhere. The concept of the Collaboration Score is to see how well people are using their collaboration tools. But how does it work?

The Collaboration Score focuses on behaviors that lead to sharing knowledge. You could have people who work in their OneDrive and private Teams Chats all day, but they are siloed. Even if they send 500 messages a day, they will have a low score. The reason? They do not collaborate with others in shared spaces!

The better users leverage centralized collaboration repositories, the broader the corpus of knowledge will be. In addition, when new people join your organization, they will be able to draw upon that knowledge right away with Copilot. If it is siloed, it might take a while for that knowledge to be shared through individual collaboration tools like private chats and OneDrive.
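The post does not spell out the exact formula, but a score with this behavior might weight shared-space activity well above private activity. A purely illustrative sketch, where every metric name and weight is my own assumption:

```python
# Purely illustrative: weight shared-space activity (channels, SharePoint,
# Viva Engage) far above private, siloed activity (private chats, OneDrive).
# Metric names and weights are hypothetical, not the published formula.
WEIGHTS = {
    "channel_messages": 3.0,
    "sharepoint_files_shared": 2.0,
    "viva_engage_posts": 2.0,
    "private_chat_messages": 0.1,
    "onedrive_files_edited": 0.1,
}

def collaboration_score(activity: dict) -> float:
    return sum(w * activity.get(metric, 0) for metric, w in WEIGHTS.items())

# A heavy private-chat user still scores below a moderate shared-space user
siloed = {"private_chat_messages": 500}
shared = {"channel_messages": 40, "sharepoint_files_shared": 10}
```

The design point is the weighting, not the exact numbers: 500 private messages contribute less than a modest amount of channel and SharePoint activity.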

What the Heck is a Collaboration Score?

I credit the Collaboration Score to my friend and colleague Kristen Hubler. She recently wrote a great article on using versus adopting software. We collaborated to find a way to identify top users of Microsoft 365. What started as a hunch proved out time and time again when it came to Copilot.

The concept was simple: the more diverse a user's Microsoft 365 utilization, the better their Copilot experience. The higher the Collaboration Score, the higher the self-reported use case count and savings. In fact, my favorite thing to do is temper “Copilot sucks” feedback with Collaboration Scores. If a user says that and has a Collaboration Score of 10, it is likely because they do not use their tools well. However, if they collaborate well, there is a better corpus of knowledge for Copilot to draw upon.

In addition to signaling the data Copilot will have access to, the score shows a user’s propensity to adopt new technology. It has been four years since Microsoft Teams became the official standard for collaboration. If some of your users have not adopted these tools, they might struggle with Copilot.

What does this all mean? You might need to work on your adoption of Microsoft 365 in addition to Copilot. However, that can be part of your overall rollout of Copilot for Microsoft 365 going forward.

Next Steps

With these reports in place, you can now track utilization of Copilot for Microsoft 365 to drive adoption. If your adoption is low, start working with your organizational change management team to get the most out of your investment. Technology professionals often do not consider the behavior changes required by a tool like Copilot, but having the data can help shape that roadmap.

After you have deployed this report, drop some comments below on how it is going! I want to know what you discovered with your data and some changes you might make! Also, please do not hesitate to provide feedback as I will likely be releasing new versions of this report template over time.

Copilot Reports with Microsoft Fabric – Part 1

Over the past year, I have been helping customers with their Copilot for Microsoft 365 deployments. The number one complaint I have received from customers has to do with the ability to produce quality Copilot reports. If you are reading this, you probably share that frustration, and you are not alone. However, some recent changes have come about that will help us get better reporting.

Microsoft recently released reporting APIs for Copilot user activity. While these endpoints are in preview with a targeted release of November 2024, we can start taking advantage of them today. The biggest challenge is that, in order to provide quality reporting, we will need to start populating data now. To make this a reality, this week’s post will be 100% focused on populating data in your Fabric Lakehouse.

It is important to note that I covered a lot of the mechanics of how to make this happen in previous posts. I have linked to the appropriate posts where applicable but wanted to highlight this now. There are a lot of moving pieces, and there is some prerequisite knowledge you will need to be successful. Please take the time to reference these posts when applicable.

Elephant in the Room with Copilot Reports

Before we get started, we need to address the headwinds with Copilot Reports. If you know me personally, you know I am a balls and strikes kind of person. I call things as I see them and sometimes that means calling out some of the issues that come with new solutions.

Challenges with the Data

If you are a data person like me, you find the reports available from Microsoft frustrating. Viva Insights details specific behaviors, but you cannot drill down far enough to see where use cases are being leveraged. The only way to look at individual user activity has been the Microsoft 365 Admin Reports for Copilot, and this is the data the API endpoints pull at this time. While the individual detail is nice, it does not resolve the frustration we have been seeing.

The reason for this frustration is twofold. Unlike the endpoints we explored in my previous series, these endpoints have two drawbacks. First, I cannot identify activity for a specific date; I can only pull for a trailing period of activity. Second, I do not get a count of activities for each use of Copilot. Instead, I just get the date Copilot was last used. Most of my customers have been vocal about this second point.

Overcoming the Challenges

I will not speculate on why Microsoft has chosen this path. Instead, I pushed to find a way to work around the challenge and make something happen. Rather than focusing on individual days of data, we will look at data week over week. Is this an imperfect system? Absolutely! But I would rather focus on what I can do than what I cannot. Previously, I did this by exporting the data on a weekly basis. Annoying? Yes, but now we can automate it thanks to the new reporting endpoint. I was even able to enhance other elements of my analysis as a result!

This all means our reporting will simply look at Copilot user activity week over week. This will help us drive the value we seek in a meaningful way instead of guessing at how people are leveraging Copilot. My customers have found this insight valuable in driving better adoption, which results in stronger utilization.

Foundations of Copilot Reports

If you followed along with my series on Graph API reports, you should already be familiar with the key steps for this project. While we did a lot of this in that series, we will be revisiting parts of it again. To simplify the process, we will walk through the steps at a high level. If you need a primer, please go back to my previous series for the details on how to prepare a Lakehouse in Microsoft Fabric.

Build a Fabric Lakehouse

To get started, sign into Microsoft Fabric and create a workspace for your Copilot Reports. You will likely want to share these reports with others, so it might make sense to create a dedicated workspace. The key point is that we want a new workspace to streamline the data collection process, as it will be changing.

New Fabric Lakehouse for my Copilot Reports

With our Lakehouse in place, we are ready for the next step in the process.

Creating Tables in the Lakehouse

Just like we did before, we will use a Fabric Notebook to create our tables. To get started, download the Create Graph Tables notebook I created for you. Remember that you must swap your context to the Data Engineering view before you can import the notebook.

Swapping to the Data Engineering view to import your Notebook

Once imported, you can open the notebook and connect it to your new Lakehouse. From there, you can run the notebook. This process will take a few minutes, but will hopefully make some parts of this process much easier for you long term.

Creating tables in our Fabric Lakehouse

With our tables connected, we need to start harvesting data from Microsoft Graph.

Harvesting Data for our Copilot Reports

Now that we have our foundation in place, it is all about getting our data liberated from Graph and into the Lakehouse. To get this process started, download the Copilot Reporting Template I created for your Dataflow. Once downloaded, create a Dataflow Gen2 and import your template:

Importing my Power Query template in a Dataflow

Once imported, you will need to update your parameters with your Service Principal information. If you have never done this before, follow the steps I outlined in my Reporting with the Graph API post a few weeks back to ensure you have one configured properly. If you do have one, just update the parameters:

Update your parameters with your Service Principal details

Now that you have updated your Service Principal information, you just need to update your connection properties, ensuring all endpoints use an anonymous connection, since the Service Principal handles authentication. From there, you just need to map your data to your new Lakehouse:

Mapping our data to our Lakehouse

We will need to map the following tables with the following properties:

Query Name             | Lakehouse Table Name      | Behavior
GraphUsers             | graph_users               | Replace
Date_Table             | date_table                | Replace
LicensingData          | licensing_data            | Replace
TeamsUserActivity      | teams_user_activity       | Append
OneDriveUserActivity   | onedrive_user_activity    | Append
SharePointUserActivity | sharepoint_user_activity  | Append
VivaEngageUserActivity | viva_engage_user_activity | Append
CopilotData            | copilot_data              | Append
CopilotLastActivity    | copilot_last_activity     | Append
Table mappings for our Dataflow to our Fabric Lakehouse

Once this is completed, we can publish our Dataflow.

Next Steps

Now that we have all of this in place, it is time to set up the refresh. If you remember from the beginning, we will be looking at this data on a weekly basis, so we will want to set up the refresh to pull the data weekly. For myself, I set up the refresh to occur Sunday mornings at 9 AM, as that is a quiet time of the week. It also gives me the best shot at pulling the data on the same day of the week. Remember, we are at the mercy of Microsoft and when they release the data, so we have to somewhat hope for the best in this situation.

With the refreshes running weekly, we will start to collect some meaningful data we can use. Next week, we will work on preparing the semantic model to allow us to connect our Power BI report. Until then, we will keep harvesting data to help bring better value to our report!
