Making sense of challenging data topics one step at a time.

Author: Dominick Raimato

How do I connect to REST APIs with Power BI? (Part 3)

In my last article, I showed you how to connect the quote endpoint of the Yahoo Finance API to Power BI. By leaning on the documentation and a little copy/paste, we connected to the API without writing a single line of code. In reviewing the outcome, we noted that the response only brought back data for a single stock.

While that is helpful, you probably want to refresh data for more than one stock. In fact, I happen to have a list of stocks that I am currently watching:

A list of stocks and when I wish I had bought them back in the day.

My next challenge is to take my original query and make it sustainable for however many stocks I have on my watch list.

Options are Plentiful

As with every Microsoft product, there are several different ways I could tackle this challenge. Each has its benefits and drawbacks. Let’s look at them and see which would work best for my scenario.

Option 1 – Append Individual Queries

The first solution you might have considered is building a separate connection for each stock on our list. Each query would be created individually and then appended into a master query to keep things organized. This solution would work, but it has drawbacks.

The biggest drawback is that I have to manually create a new query every time I want to add a stock. Each one only takes a few minutes of effort, which is not a big deal if you only need to add one or two stocks to the model, but it adds up quickly with 20-30 of them.

The other drawback is that you will have to keep managing the data model. If the list of stocks is dynamic, you either need to build a model with every single stock in it up front or keep maintaining it over time. Neither approach is sustainable, since new stocks enter the market regularly and every change requires manual maintenance.

Option 2 – Go Back to the Documentation

Sometimes the endpoint itself gives you options for expanding your query. If we go back to the documentation for the quote endpoint, you will see that we can specify up to 10 stock symbols in a single query.

Note that our API documentation says you can select up to 10 stock symbols all separated with a comma.

Perfect! So instead of having to write out 25 separate queries, I can condense them into three. That is much more manageable and efficient! Well – almost more efficient…
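As a rough sketch, one of those three batched queries might look like this. The symbols here are purely illustrative, and the API key is a placeholder:

```
// Hypothetical example: one request covering 10 symbols, separated by commas
let
    Source = Json.Document(
        Web.Contents(
            "https://yfapi.net/v6/finance/quote?region=US&lang=en&symbols=MSFT,AAPL,AMZN,GOOG,TSLA,META,NFLX,NVDA,INTC,AMD",
            [Headers=[Accept="application/json", #"X-API-KEY"="YOUR_API_KEY"]]
        )
    )
in
    Source
```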

Remember the drawback from above: because we are still hard-coding the stock symbols in each query, we still have to intervene manually every time our stock list changes. This gets harder as we maintain the list over time, because when a stock needs to be removed later, we will spend time hunting for the query that contains it.

Option 3 – Scale Out our Original Query as a Function

The last option we have is to make our current query scalable. How do we make a query scalable? By converting it into a function.

Functions allow us to reuse that query over and over again, which simplifies our refresh. Much like our first option, we can apply a function to a list of stock symbols and quickly bring back the quote information for each one. Now, instead of managing queries, we are just managing a list of stock symbols. As symbols are added and removed, the data model adjusts accordingly. The result is an automated refresh that never requires us to adjust the data model in the future.
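If you have never written one, a Power Query function can be tiny. A minimal, purely illustrative example looks like this:

```
// A minimal Power Query function: accepts a number and doubles it
(x as number) => x * 2
```

Invoking it with 5 returns 10. Our stock function will follow the same pattern, just with a text parameter and a much bigger body.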

Converting a Query to a Function

Remember how I said you might need some minimal coding? That time has come. We will need to do something you may never have done before – open the Advanced Editor window in Power Query.

Before we do that, I want to rename the current query to “Get-StockInfo” so it is a little easier to read. Just update the name in the Query Settings pane on the right. I like putting the word “get” in front to help quickly identify it as a function rather than an ordinary query.

Once you have done that, click Advanced Editor on the Home ribbon and strap in for the ride!

Step 1 – Convert the Query to a Function

Our goal is to make this function flexible enough to handle any stock symbol. That means a stock symbol must be supplied every time the function is called.

When you open the Advanced Editor, the first line contains the single word “let”. We are going to insert a line above it containing a pair of parentheses followed by an equals sign and a greater-than sign: ()=>. This small change converts the query into a function.

()=>
let
    Source = Json.Document(Web.Contents("https://yfapi.net/v6/finance/quote?region=US&lang=en&symbols=MSFT", [Headers=[Accept="application/json", #"X-API-KEY"="bHjSKbiRtr4cOzmOopdrI5TUKX5kMKXU6weUteeL"]])),
    #"Converted to Table" = Table.FromRecords({Source}),
    #"Expanded quoteResponse" = Table.ExpandRecordColumn(#"Converted to Table", "quoteResponse", {"result", "error"}, {"quoteResponse.result", "quoteResponse.error"}),
    #"Expanded quoteResponse.result" = Table.ExpandListColumn(#"Expanded quoteResponse", "quoteResponse.result"),
    #"Expanded quoteResponse.result1" = Table.ExpandRecordColumn(#"Expanded quoteResponse.result", "quoteResponse.result", {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", "regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"}, {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", 
"regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"})
in
    #"Expanded quoteResponse.result1"

However, we have not done anything to require a stock symbol when the function is called. To do that, we will simply add a parameter called StockSymbol and specify that it must be passed as text. Now a stock symbol is required any time you leverage this function.

(StockSymbol as text)=>
let
    Source = Json.Document(Web.Contents("https://yfapi.net/v6/finance/quote?region=US&lang=en&symbols=MSFT", [Headers=[Accept="application/json", #"X-API-KEY"="bHjSKbiRtr4cOzmOopdrI5TUKX5kMKXU6weUteeL"]])),
    #"Converted to Table" = Table.FromRecords({Source}),
    #"Expanded quoteResponse" = Table.ExpandRecordColumn(#"Converted to Table", "quoteResponse", {"result", "error"}, {"quoteResponse.result", "quoteResponse.error"}),
    #"Expanded quoteResponse.result" = Table.ExpandListColumn(#"Expanded quoteResponse", "quoteResponse.result"),
    #"Expanded quoteResponse.result1" = Table.ExpandRecordColumn(#"Expanded quoteResponse.result", "quoteResponse.result", {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", "regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"}, {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", 
"regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"})
in
    #"Expanded quoteResponse.result1"

Step 2 – Make the Connection String Dynamic

Now that we have converted the query to a function and required a stock symbol to be passed, it is time to use that parameter in the connection string.

On line 3, you can see our endpoint URL and the parameters that specify MSFT as our designated stock. We need to replace MSFT with our new parameter. Simply delete the current symbol, add an ampersand (&) outside of the quotes, and specify StockSymbol to concatenate the two values together.

(StockSymbol as text)=>
let
    Source = Json.Document(Web.Contents("https://yfapi.net/v6/finance/quote?region=US&lang=en&symbols=" & StockSymbol, [Headers=[Accept="application/json", #"X-API-KEY"="bHjSKbiRtr4cOzmOopdrI5TUKX5kMKXU6weUteeL"]])),
    #"Converted to Table" = Table.FromRecords({Source}),
    #"Expanded quoteResponse" = Table.ExpandRecordColumn(#"Converted to Table", "quoteResponse", {"result", "error"}, {"quoteResponse.result", "quoteResponse.error"}),
    #"Expanded quoteResponse.result" = Table.ExpandListColumn(#"Expanded quoteResponse", "quoteResponse.result"),
    #"Expanded quoteResponse.result1" = Table.ExpandRecordColumn(#"Expanded quoteResponse.result", "quoteResponse.result", {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", "regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"}, {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", 
"regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"})
in
    #"Expanded quoteResponse.result1"

With the parameter in place in the connection string, the function will now pull stock quote information for whatever symbol we pass to it.

Step 3 – Resolve Connection String Errors

If you clicked Done and used this function in your report, everything would work fine inside of Power BI Desktop. However, you are going to run into an issue when you publish to the Power BI Service. Because we are building the URL dynamically, the cloud service cannot determine how to handle authentication. The result is an error in the Power BI Service and a report that will not refresh.

To resolve this issue, we will split our connection string into two sections. The first section serves as the static connection string and is what Power BI authenticates against. The second section is treated as a dynamic portion of the query and has no bearing on the authentication process.

In short, all we are going to do is move a portion of the connection string into a relative path to prevent the error.

(StockSymbol as text)=>
let
    Source = Json.Document(Web.Contents("https://yfapi.net/v6/finance/", [RelativePath="quote?region=US&lang=en&symbols=" & StockSymbol, Headers=[Accept="application/json", #"X-API-KEY"="bHjSKbiRtr4cOzmOopdrI5TUKX5kMKXU6weUteeL"]])),
    #"Converted to Table" = Table.FromRecords({Source}),
    #"Expanded quoteResponse" = Table.ExpandRecordColumn(#"Converted to Table", "quoteResponse", {"result", "error"}, {"quoteResponse.result", "quoteResponse.error"}),
    #"Expanded quoteResponse.result" = Table.ExpandListColumn(#"Expanded quoteResponse", "quoteResponse.result"),
    #"Expanded quoteResponse.result1" = Table.ExpandRecordColumn(#"Expanded quoteResponse.result", "quoteResponse.result", {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", "regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"}, {"language", "region", "quoteType", "quoteSourceName", "triggerable", "currency", "firstTradeDateMilliseconds", "postMarketChangePercent", "postMarketTime", "postMarketPrice", "postMarketChange", "regularMarketChange", "regularMarketChangePercent", "regularMarketTime", "regularMarketPrice", "regularMarketDayHigh", 
"regularMarketDayRange", "regularMarketDayLow", "regularMarketVolume", "regularMarketPreviousClose", "bid", "ask", "bidSize", "askSize", "fullExchangeName", "financialCurrency", "regularMarketOpen", "averageDailyVolume3Month", "averageDailyVolume10Day", "fiftyTwoWeekLowChange", "fiftyTwoWeekLowChangePercent", "fiftyTwoWeekRange", "fiftyTwoWeekHighChange", "fiftyTwoWeekHighChangePercent", "fiftyTwoWeekLow", "fiftyTwoWeekHigh", "dividendDate", "marketState", "earningsTimestamp", "earningsTimestampStart", "earningsTimestampEnd", "trailingAnnualDividendRate", "trailingPE", "trailingAnnualDividendYield", "epsTrailingTwelveMonths", "epsForward", "epsCurrentYear", "priceEpsCurrentYear", "sharesOutstanding", "bookValue", "fiftyDayAverage", "fiftyDayAverageChange", "fiftyDayAverageChangePercent", "twoHundredDayAverage", "twoHundredDayAverageChange", "twoHundredDayAverageChangePercent", "marketCap", "forwardPE", "priceToBook", "sourceInterval", "exchangeDataDelayedBy", "pageViewGrowthWeekly", "averageAnalystRating", "tradeable", "priceHint", "exchange", "shortName", "longName", "messageBoardId", "exchangeTimezoneName", "exchangeTimezoneShortName", "gmtOffSetMilliseconds", "market", "esgPopulated", "displayName", "symbol"})
in
    #"Expanded quoteResponse.result1"

This is a subtle but important change. By placing the dynamic content inside the relative path, we have eliminated the ambiguity during the authentication process. Power BI authenticates against the static connection string and then concatenates the relative path to complete the request. You only need to do this when your queries are dynamic, but when they are, it is an essential step!
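Stripped of the stock-specific details, the pattern looks like this (the domain here is a made-up placeholder):

```
// Static base URL: Power BI authenticates against this part only.
// RelativePath: the dynamic part, appended to the base after authentication.
Source = Json.Document(
    Web.Contents(
        "https://api.example.com/",
        [RelativePath = "quote?symbols=" & StockSymbol]
    )
)
```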

Step 4 – Test the Function

After clicking Done, your view in Power Query will have changed. The query list now shows an “fx” symbol to indicate a function, and instead of a tabular preview of data you have a box to enter a stock symbol. This lets us quickly test whether our function works. Go ahead and enter MSFT and click Invoke to see it in action.

The outcome from converting a query to a function.

After you invoke the function, you will be taken to a new query called “Invoked Function” where you can see the stock data for Microsoft. That query was just for testing, so go ahead and delete it because we will not need it again.

Applying the Function

With a working function in hand, it is time to apply it to our data model. I already have a list of stocks in Excel, so I will import that list to Power BI. If your stock list is somewhere else, that is perfectly fine. Just get that list of symbols inside of Power BI.

Once imported, go to the “Add Column” tab and select “Invoke Custom Function”. Select your function from the list, and you will notice the StockSymbol parameter appear. We need to specify which column of values will be passed to that parameter. Select the symbol column and click OK.

Setting up the custom function to use the stock symbols from the spreadsheet.
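Behind the scenes, Power Query writes this step for you. Assuming the imported table is called Stocks and the column holding the tickers is called Symbol (your names may differ), the generated step looks roughly like this:

```
// Apply Get-StockInfo to every row, passing the value from the Symbol column
#"Invoked Custom Function" = Table.AddColumn(Stocks, "Get-StockInfo", each #"Get-StockInfo"([Symbol]))
```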

At this point, you may be prompted about data privacy. If you are, just select ignore privacy levels for now, as this is a demonstration.

You should now have a new column with the word Table highlighted in each row. Go ahead and expand those values and confirm that current stock quote info comes back for each symbol.

And just like that you created your own function and applied it in Power Query!

Next Steps

With a working function, what is left to discuss? As I stated at the beginning of this series – each API is like a snowflake. They all are unique and have their own challenges. This is a single example of how to connect a REST API to Power BI. There are challenges and other considerations you must balance when creating your reports. In my next post, I will dive into challenges you will see when leveraging other APIs inside of Power BI.

Until then, was this your first time creating a function? Did you find it challenging? Do you have other applications for a function inside of Power BI? Tell me in the comments below!

How do I connect to REST APIs with Power BI? (Part 2)

In my last article, I discussed the basics around REST APIs. We did not do anything with Power BI. While I did not get to the real purpose of this series, it was important to establish the basics. Using the elements we discussed, you can copy and paste them into Power BI to connect to your API.

For this post, I am using the Yahoo Finance API to demonstrate how this works. I do not receive any kickback for sharing this API. It has a free tier, and it is a great API for beginners. If you want to try this on your own, you can follow along with this article.

RTFM – Read the Fine Manual

Before you do anything with APIs, you must review the documentation. I worked with an API a few years ago that required a unique API key for each individual endpoint. If I had spent some time reviewing the documentation, I would have saved myself hours of frustration as I tried to connect to various endpoints. If you are like me, you might find reading documentation tedious. However, I find it far less frustrating than constantly struggling to get an endpoint to work inside of Power BI.

Identifying Usable Endpoints

The first time I review documentation, I am sorting out which endpoints are usable. I need to find out which API endpoints will work with Power BI. I opened the documentation and was excited about what I saw…

Yahoo Finance API Documentation – Welcome Page

At a quick glance, all of the endpoints use the GET connection method. That means every single endpoint can be used in Power BI. That is great to see!

Which Endpoint is for Me?

Knowing what you are looking for is key when reviewing documentation. I want to find stock price information and compare prices. Quality documentation will have one of two things to help you with this process:

  • A data dictionary that clearly defines each column in the response
  • A sample response from the query

Sample Response from the Quote Endpoint

This documentation brings back a sample response, which lets me find the endpoint that meets my needs. I am interested in stock prices, and according to the documentation, the quote endpoint will accomplish my goal.

Setting the Parameters

Now that we have the connection method and endpoint, we need to establish which parameters are needed. Sometimes parameters are identified but not clearly documented, which leaves you frustrated because you cannot tell how to apply them to your query. This API is much easier to understand because of its clearly labeled parameters with sample inputs.

Sample Parameters for the Quote Endpoint

As you can see, the documentation provides sample values for each parameter so you know how to leverage it. This results in far less frustration because you know what values are accepted for each parameter.

Since this API has an option to try it out, I updated the parameters to pull back Microsoft’s stock information. This allowed me to validate that my parameters were correct and saved me a ton of headaches. Since it worked, you can see that it came back with a response.

Testing Parameters for the Quote Endpoint

The test came with another valuable bonus: the documentation assembled my connection string for me, putting the endpoint URL and parameters together into a single string. How convenient!

The Connection String is Assembled Automatically

Header Information

Up to this point the Yahoo Finance API documentation has explicitly called out each of the REST API elements. However, the headers in this endpoint are not as easily identified. We have to dig a little deeper to find them.

You probably missed the box above the connection string labeled Curl. cURL is a command-line tool and library for making HTTP requests, often used when developing an application against an API. While we are not developing an application, this box does help us identify our headers. Notice the two lines beginning with “-H” – these are the header values we have been looking for!

You have to look closely, but here are the header values we need for the quote endpoint!

Double Check

At this stage, we have identified all the elements we need for our connection in Power BI: the connection method, endpoint URL, parameters, and headers. Remember that we are not using a POST connection method, so we do not need a body for this request. Now we just need to plug it all into Power BI!

Connecting to our REST API (Finally!)

With all of our elements in hand, we can open up Power BI and start building our connection to the API.

Step 1 – Select the Web Connector

Open Power BI, select “Get Data”, and choose the Web connector:

Step 2 – Copy/Paste Elements into Power BI

First things first – switch to the advanced mode for the Web Connector so you can add all of the required details.

Using the Advanced Web Connector view in Power BI

You will notice that there is no option to select your connection method in this view. Power BI defaults to GET, so you do not need to configure anything. One less thing to worry about!

Go back to the documentation and copy your connection string. If your documentation does not assemble this for you, you might need to do this by hand. Once assembled, paste it into the URL Parts box in the web connector.

Highlighting where the Connection String (endpoint url and parameters) goes in the Advanced View of the Web Connector for Power BI

Next, we can add the headers. This endpoint has two – Accept and X-API-KEY. You will only see one set of boxes at the bottom, but you can simply click “Add header” to accommodate the second header.

For the Accept header, we can select it from the drop down on the left as it is a commonly used header. On the right, copy and paste the value found in the API documentation.

For the X-API-KEY header, type the header name into the drop-down on the left or copy/paste it from your documentation. Then copy your API key from the documentation into the box on the right.

Highlighting where the headers go in the Advanced View of the Web Connector for Power BI

By bridging the documentation with the web connector, we have successfully filled in all of the blanks for the connection method, endpoint url, parameters, and headers. With everything in place, we can click okay and connect!
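For the curious, the dialog is simply building a Power Query expression for you. Based on the values we entered, the generated query should look roughly like this (API key replaced with a placeholder):

```
let
    Source = Json.Document(
        Web.Contents(
            "https://yfapi.net/v6/finance/quote?region=US&lang=en&symbols=MSFT",
            [Headers=[Accept="application/json", #"X-API-KEY"="YOUR_API_KEY"]]
        )
    )
in
    Source
```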

Step 3 – Authentication

Since this is the first time we are using this connection, we are prompted to select an authentication method. If you are using the Yahoo Finance API, you can simply select Anonymous and click Connect, as our API key handles the authentication.

Authenticating with the Power BI Web Connector

On the left, you can see there are other authentication methods. Sometimes your endpoint will require a username and password; I have experienced this with ServiceNow and Honeywell APIs. If you are adventurous and want to try the Microsoft Graph API, you will need to use your organizational account. Review your documentation to determine whether an endpoint requires one of these other forms of authentication.

Transformation Time

By following these steps, you should now see your stock data inside of Power Query. All of the information you saw in the sample response should be in a tabular format and ready to go. For a cleaner view, I made some adjustments so you can compare the values side by side:

Data from the quote endpoint

With this query in place, you can quickly refresh the data and will always have up-to-date stock quote information for Microsoft.

Next Steps

With everything connected, you probably think we are finished. If there were only one stock we cared about, we would be in good shape. However, investors look at several different stocks each and every day, and this query only pulled one. In my next article, I will show you how to scale this query so that whether you track 10 stocks or 1,000, you will have a maintenance-free report that always refreshes.

Until then, did you follow along with the Yahoo Finance API? Were you able to get some valid data? Tell me in the comments below!

How do I connect to REST APIs with Power BI? (Part 1)

We have all been there once before. We are using a 3rd party tool and we want to extract data from it. You might have an option to export the data to a CSV or Excel file, but that requires you to be involved and can result in errors.

Next you might reach out to the support team, and they send back documentation for their APIs. They assume you are a programmer and probably know how to use them. But in a world full of citizen developers, not everyone knows how to use these APIs to bring data into Power BI.

This series aims to demystify the process, get that data into Power BI, and eliminate those data exports. And believe it or not, it will require a minimal amount of code to make it happen!

What is REST?

REST APIs (also known as RESTful APIs) follow a specific architecture for an Application Programming Interface (API). Short for Representational State Transfer, REST defines a standard set of rules that are widely used in the programming world. It scales well and has become a stable, commonly accepted style of API. REST APIs fill a specific need: simplifying the integration of data between different applications.

API requests are fairly straightforward. It starts with a request from you (the client) to an API endpoint. After the request is authenticated, it continues on to a database with the data you need. That data is returned to the endpoint, where it is formatted and then passed back to the client.

The flow of requests with REST APIs

For many, this flow makes a ton of sense. However, if you are not from a programming background, this can be a little intimidating. When you read API documentation, it is often full of code snippets and it can be difficult for you to get started. Sorting through these snippets and figuring out how they work can be complicated and time consuming.

But here is the good news – Power BI can often handle these API requests with no code required! The trick is understanding how REST APIs are structured and then plugging the information from the documentation into the right place.

Anatomy of REST APIs

Since we are going to take the documentation and plug it into Power BI, it is important to understand how REST APIs are constructed. Once we understand the anatomy of an API, it becomes easier to connect via Power BI. There are five common elements that are leveraged with REST APIs:

  • Method
  • Endpoint URL
  • Parameters
  • Headers
  • Body

You will always need a Method and Endpoint URL when using REST. However, the other elements will vary based on the documentation. You will have to read your documentation carefully to make sure you have everything you need.

To better understand these elements, let’s take a look at them individually to understand what they do and how they work.

Method

There are four common methods used with REST APIs: GET, POST, PUT, and DELETE. The method instructs the server what should happen when the API is called. For example, calling the DELETE method would delete content on the server.

Sample methods used with REST APIs

To make our lives easier, the GET method is the most common one we will use and the easiest to implement. POST can sometimes be used, but it requires a little more effort and coding in Power Query to make it work. You will almost never use PUT or DELETE in Power BI, since Power BI only reads data.

Endpoint URL

The Endpoint URL determines which specific API we are calling. Most services expose multiple endpoints, so it is important to determine which one you need. In reviewing your documentation, you should be able to determine which endpoint contains the data you need to query.

Sample of an Endpoint URL

Parameters

Parameters filter the results of your request. Notice the “?” in the connection string: everything before it is the Endpoint URL, and everything after it is the parameters. Multiple parameters are strung together, separated by ampersands (&).

Sample of Parameters tacked on to an Endpoint URL

Parameters speed up requests because they reduce the amount of data returned. They are often optional, but sometimes required. The API documentation will specify which parameters, if any, are required.
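As a quick illustration of how parameters are strung together (Python here purely for demonstration; the endpoint and parameter names are invented):

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameters -- your API documentation
# defines the real names and flags which ones are required.
endpoint_url = "https://api.example.com/v1/quotes"
parameters = {"region": "US", "lang": "en", "symbols": "MSFT"}

# urlencode joins each name=value pair with "&"; the "?" then
# separates the whole parameter string from the endpoint URL.
connection_string = endpoint_url + "?" + urlencode(parameters)
print(connection_string)
# https://api.example.com/v1/quotes?region=US&lang=en&symbols=MSFT
```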

Headers

Headers are designed to pass information about your request. The most common header you will need carries an API key or token, but you may run into others. Another common header is Accept, which specifies the format in which we want to see our response (such as application/json).

Sample of REST API Headers

Body

Since we primarily use the GET method in Power BI, you will rarely need the body. However, it is worth mentioning in case you use a POST method in the future. The body carries an additional set of values, often formatted as JSON, to support the request.
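For the curious, here is a hedged sketch of what a POST request with a body might look like. The endpoint URL and field names are invented for illustration; you would rarely, if ever, build this for Power BI:

```python
import json
from urllib.request import Request

# Invented example values -- a real API's documentation defines
# the endpoint and the fields the body must contain.
body = {"symbol": "MSFT", "shares": 100}

request = Request(
    "https://api.example.com/v1/watchlist",  # hypothetical endpoint
    data=json.dumps(body).encode("utf-8"),   # the body rides along with the request
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(request.get_method())
```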

Documentation is your friend

If you are like me, then you hate reading directions. I love discovering and working through things on my own. But after many lost hours, I have learned to take a few minutes to read the documentation that comes with an API first. It takes time, but there are several reasons to pause and understand how the API functions.

Sample of REST API documentation for Yahoo! Finance

Reason #1 – Each API is like a snowflake

Every API is unique and has its own requirements. You will find some that require certain parameters to be declared. Others might need a token. I even worked with one that required a different API key for each endpoint! Understanding the components of our API will reduce headaches when we go to use it.

Reason #2 – Ensure you are receiving the right data

The best documentation is paired with a data dictionary that identifies the data being returned. You may need two or three endpoints to get all the information you need. Taking a few minutes to understand what data is available will help you build a plan as you build your connections.

Reason #3 – Try it out

APIs often have an option to try them out. You can select an endpoint, fill out the parameters, and then submit the request. It brings back a sample of the data and will even assemble the connection string and headers for you. It also helps identify any required parameters and diagnose issues before you move into Power BI.

Next Steps

Now that we understand the basics of REST APIs, the next step is to transfer what we have learned into Power BI. You might be wondering why I took the time to walk through these basics. As I mentioned before, many of these APIs can be used with no code. With a strong understanding of how they are built, we can copy/paste these elements into Power BI and start querying data. In my next post, I will show you how to connect to the Yahoo Finance API to query stock quotes in real time.

Until then, have you used any APIs in Power BI? Do you have any favorites that you like to use? Tell me in the comments below!

What do I need to know about Hybrid Tables in Power BI?

I was so excited to see the release of Hybrid Tables in Preview with last month's update of Power BI. But just like you, I was wondering if Hybrid Tables were for me. I did some investigating and came back with some answers.

Understanding Import versus DirectQuery

It seems like a simple topic, but it is important to understand the difference between the two data connectivity methods, because Hybrid Tables blend them.

Selecting Import or DirectQuery has an impact on how you build your reports in Power BI

When you select Import for your connectivity method, the data is imported into your Power BI report. When properly used, it will quickly render visualizations. You also have the ability to perform transformations on the data, such as adding columns. (Yes – best practice is to push transformations as far upstream as you can, but that is not always possible for citizen developers.)

When you select DirectQuery, you connect to your data source but never add the data to the report file. You query it every time you need to render visualizations. This keeps the report file size down and eliminates the need to refresh your data. However, it can be slow and allows only limited transformations.

Why is this review so important? The answer is simple – Hybrid Tables are built on top of Incremental Refresh which uses the Import data connectivity method. If you have never used Incremental Refresh before, Microsoft has provided documentation on how to enable it for your report.

Are Hybrid Tables different from Incremental Refresh?

Hybrid Tables are not different from Incremental Refresh. In fact, Hybrid Tables augment Incremental Refresh. To understand how they work together, let’s look at how the two complement each other.

To make it easier, I want you to think about your data in three segments – historical, current, and real time.

Incremental Refresh

Incremental Refresh is the base of Hybrid Tables and handles the historical and current data. Incremental Refresh is set up to limit how much data is imported every time the data model is refreshed. First, you determine how much data you want to bring into the model. In my example, I have set my model up to return the last ten years of data. Then you select how much of that data will be changing and need to be refreshed. This is generally a shorter window of time. My example is set to refresh the last ninety days' worth of data.

Looking at this image, you can see a bar chart showing the breakdown of how your data will be managed in the refresh process. The last ninety days of data will be incrementally refreshed – our current data. The remainder of the selected data will be archived – our historical data.

Incremental Refresh setup to bring in the last 10 years of data, but only refresh the last 90 days

If you have been using Incremental Refresh, this is how your data refreshes have been managed. The only problem with Incremental Refresh is that data is still missing: the most current data, today's, is excluded from the refresh and the results.

Hybrid Tables to the Rescue!

Enabling Hybrid Tables brings you the most current data using DirectQuery. When you check the box, you will see a new segment called Real Time appear. At the same time, “Only refresh complete days” is now checked and greyed out. What is going on?

Getting latest data in real time adds a section to the bar chart at the bottom and greyed out “Only refresh complete days”

Simply put, Incremental Refresh will now bring in all data before today. With that data loaded, DirectQuery brings in all new data that arrives today. This means the data does not need to be refreshed multiple times a day. A single refresh loads the majority of the data model, and any changes that come in today are handled in real time.

What else do I need to know?

If you are excited to try out Hybrid Tables, there are a few things you should know before jumping in.

Import becomes DirectQuery

If you have been using Incremental Refresh and made transformations like adding columns, you will run into issues. As explained above, Import connections allow you to make those transformations. However, Hybrid Tables use DirectQuery for the real-time data, and you will need to play by its rules in order to use this feature.

Power BI Premium is a Must

Hybrid Tables can only be used with Power BI Premium. This can be Power BI Premium Per User (PPU) or Premium Capacity. If you already have Power BI Premium, you will be fine. If not, consider leveraging Power BI Premium Per User trial to see if Hybrid Tables work for you.

Refresh Costs

When using DirectQuery, you might be incurring additional costs whenever you view reports. If your data warehouse charges you per query, this could start to add up. You will need to monitor costs to make sure your queries are running efficiently and not costing more than forecasted.

So Should I use Hybrid Tables?

If you have a rapidly changing data set that you need to access in real time, you should try using it. Hybrid Tables work well and are easy to set up. If your data does not change that frequently, it might not be as valuable.

Have you tried using Hybrid Tables? Is there anything I am missing? What was your experience? Tell me in the comments below!

Stop Wasting Time by Using Power BI Themes

Power BI reports serve several purposes, but they are often consumed by larger audiences. As a result, we strive to provide a quality viewing experience for our report consumers. Most people jump right into the creation of their reports and neglect how the report is going to look and feel. This is common, as we want to get to the good stuff as quickly as we can. But as they say, an ounce of prevention is worth a pound of cure. Taking some time to set up a theme file can really speed up the development process.

Why Are Themes Important?

Before you say this is a lot of work just to make reports look pretty, it is really more than that. The look and feel of your reports is critical – especially in a self-service environment. It can be a subtle signal to the consumer of who created the report and how reliable the data is.

The Reliability of Poorly Branded Reports

I often work with customers whose widely consumed reports use color palettes and fonts that do not align with the company branding. This is common with the first several reports that are created. Eventually, someone gets ahold of a branding guide and starts using its elements in their reports. Updating the bars, colors, or lines to match the corporate branding helps add legitimacy to the report. Little by little, more corporate branding is added until the theme is fleshed out.

But what happened to those original reports? The ones that used no theme or some of the theme? They often remain neglected. New employees receive links to these reports and often ask themselves “Is this information accurate?” Eventually they will trust the data, but they spend weeks asking if it is correct.

It seems silly, but the branding can instill confidence in your reports.

Managing Branding Manually Hurts in the Long Run

Another common challenge I have identified is inconsistency between visualizations. In fact, it is difficult to stay consistent even within a single visualization. When you look at a simple column chart, you would think it is really easy to keep your fonts consistent. Here is a breakdown of where you can change the fonts for a column chart:

  • On the X Axis
    • Value labels
    • X Axis titles
  • On the Y Axis
    • Value labels
    • Y Axis title
  • Legend
  • Small Multiple Grids
  • Column data labels (can be different for each series)
  • Visual title
  • Tooltip text

Add it up and you have a total of nine places to update to keep the font consistent! That is unbelievable! And if I have to do this for multiple visualizations, it becomes a lot to manage. Without a consistent theme, I have to keep making these changes. So how do I prevent this from happening in the future?

Make Themes Work for You

Like I mentioned before, a little planning can go a long way. Here are my steps to help simplify the process.

Step 1 – Find the brand identity documentation for your organization

This can be the hardest part of the process. If you work at a company that has a strong brand identity, it should not be difficult: there is likely a brand identity guide or a PowerPoint template that has all of the details you need, such as the color HEX or RGB codes and fonts, located in the margins of the slides.

If you cannot find a brand identity guide or PowerPoint that has that information, I always recommend becoming friends with someone in your marketing department. They have this information readily available for their projects. It might cost you a few drinks at Happy Hour on Friday night, but it is a lot easier than having to get this information manually.

If you are not as lucky, you can still get this information. It might be a little more difficult, but you can find it embedded in other resources. You can usually find a PowerPoint template with the basic colors and fonts identified. If you select a text box in a presentation, you will likely find the font used for the majority of the presentation. You can also click on the font colors to see if they are the ones your organization uses. If they are there, select a color, then click More Colors to bring up the RGB and HEX codes for the selected color. I like to copy the HEX codes, as they are easier to work with, and paste them into an Excel spreadsheet with the color name so I can quickly reference them.

Still cannot find your colors? Not to worry. Find a digital copy of your company’s logo and add it to a blank PowerPoint presentation. From there, use the eyedropper function under the font color dropdown and use it to extract the key colors to your organization’s logo. Then follow the steps above to extract the color codes.
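If you end up reading colors out of that dialog, converting between the two notations is purely mechanical. A small helper sketch (not part of Power BI or PowerPoint, just a convenience):

```python
def rgb_to_hex(red: int, green: int, blue: int) -> str:
    """Convert an RGB triple (0-255 each) into a #RRGGBB HEX code."""
    return "#{:02X}{:02X}{:02X}".format(red, green, blue)

def hex_to_rgb(hex_code: str) -> tuple:
    """Convert a #RRGGBB HEX code back into an (R, G, B) triple."""
    hex_code = hex_code.lstrip("#")
    return tuple(int(hex_code[i:i + 2], 16) for i in (0, 2, 4))

print(rgb_to_hex(0, 120, 212))  # prints #0078D4
```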

Step 2 – Build out your Power BI Theme

With the RGB or HEX color codes in hand, it is time to update your Power BI theme. Go to the View tab in Power BI Desktop and click on the dropdown to the right of the themes.

Accessing the Power BI Theme customizer

From here, you can start to build out your theme. Start by adding a name for your theme. From there, update the color palette. Simply add the RGB or HEX codes you identified into each of the positions. Keep in mind that the order of your colors will match the order they are selected for legends. If you have a particular combination of 2-3 colors you like to use a lot, make sure they sit in the first three positions so they can easily be leveraged.

Setting up the color palette for your Power BI Theme

Next, you will want to address the text in the report. By default, Power BI uses a blend of the Segoe UI and DIN fonts. The list of fonts is limited to ensure they stay web safe and render in the Power BI service. Your first choice might not be in the list, but most organizations have a secondary font for just such an application. You can also set default sizes and colors for generic text, titles, cards/KPIs, and tab headers.

Customizing the text for your Power BI Theme
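Behind the scenes, the customizer is producing a JSON theme file. As a rough sketch of its shape (the theme name, colors, and fonts below are placeholders, not anyone's real branding), you could even generate one programmatically:

```python
import json

# A minimal theme expressed as a Python dict.  The field names follow
# the published Power BI theme file format; every value is a placeholder
# to swap out for your organization's branding.
theme = {
    "name": "Contoso Corporate",
    "dataColors": ["#0078D4", "#107C10", "#D83B01", "#FFB900"],
    "background": "#FFFFFF",
    "foreground": "#252423",
    "tableAccent": "#0078D4",
    "textClasses": {
        "title": {"fontFace": "Segoe UI", "fontSize": 14, "color": "#252423"},
        "label": {"fontFace": "Segoe UI", "fontSize": 10, "color": "#252423"},
    },
}

# Save as a *.json file that the themes dropdown can import later.
with open("contoso-theme.json", "w") as f:
    json.dump(theme, f, indent=2)
```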

Step 3 – Tweak your Power BI Theme until it is set

Once you have the basics in place, you can start making other adjustments to your theme, such as visual backgrounds, borders, and report backgrounds. You have a lot of flexibility to meet the needs of your reports, so play around with the settings. Whatever you do, try as hard as you can to avoid making one-off changes in the formatting pane for your visuals.

Understand that this process can take a little time, so do not be afraid to spend more time than expected. The more customizations you set in the theme, the fewer you will have to apply to visualizations when you build your reports!

Step 4 – Export your Power BI Theme

When you are comfortable with the theme you have developed, it is time to export your theme so it can be used with other reports.

In Power BI Desktop, go back to the View tab and click on the dropdown next to themes. Select Save Current Theme and save it to your OneDrive. This way it will be automatically backed up in the event of a computer emergency.

Step 5 – Deploy your new Power BI Theme

This is where things can get a little hairy. You have a few options for deploying your theme to others within your organization.

1). You can distribute the Power BI Theme file you created. Users can import the theme every time they want to use it when building a report.

2). You can create a Power BI Report using the theme. Leave it blank and save it as a Power BI Template (*.PBIT) file and distribute it.

Both solutions work, but they feel a little clunky. In my experience, most report creators are quite savvy with theme files, so it is not a big deal. However, if you are trying to build a strong citizen developer community, this could prove challenging.

Best practices/considerations for building Power BI Themes

When you build out your theme, keep these best practices and considerations in mind to help drive the best experience for your report consumers.

Check your theme against multiple visualizations

The temptation is always there: if you only use bar and column charts, you only check those visuals. When building out your theme, try all of the different visualization types to confirm your theme works across every one of them.

Consider a style guide for reports

If you followed the first best practice, this should be an easy one. Take the report file you used to test the theme and create a sample report. From here, add in key information such as how to label data, types of charts, and other organizational standards that have been set forth by your team. This will help content creators have a roadmap for building quality reports.

Consider color contrast to make reports accessible to all

This might be a bigger concern if you do not have an existing style guide from marketing, but is always an important consideration. Content is difficult to consume if creators use poor color selections. Even worse, it causes strain for users with visual disabilities who have trouble viewing the content. Use a tool like WebAIM’s Contrast Checker to ensure the best experience for all report consumers.

Conclusion

It is a lot of front-end work, but I promise you will love how easy it is to deploy reports once you have a custom theme. Take the time to build it out right and you will never have to mess with updating individual visuals again.

Are you using themes already? Do you have some tips/tricks on how to use them? How are you distributing themes? Leave a comment below and let me know!

Welcome and Hello!

Thank you for visiting! I am excited to share my thoughts and experience around all facets of data. I started this journey because of the frustration I so often see when others try to use data effectively. My goal is to share some of the tips and tricks I have picked up along the way with tools like Power BI and Excel to simplify the process.

I am also looking forward to sharing my thoughts on the state of data literacy. It has an impact on the readability of the content created and on the consumers who view it. I frequently find horrible examples of data visualization out in the wild. Sadly, some of these examples are produced by organizations that claim to hold themselves to high standards. I want to bring these examples to light, dissect them, and show you how to improve them. Report viewers do not always think about these things, but together we can change how content creators put together reports so they are more accessible to consumers.

So thank you for joining me here and letting me share my stories! I look forward to us both learning together!
