Using Power Platform Dataflows to extract and process data from Business Central – Post 2
This is the second post in a blog series about how you can use Power Platform Dataflows to get data out of Business Central in a form where it can be accessed by other applications.
The first post covered a basic scenario of Power BI connecting to Business Central’s API. We started there because Power BI and Power Platform Dataflows are closely related, as will quickly become clear in this post as well. The setup we ended with can be illustrated like this:
The dataset has a private data store, and we can refresh the data in that data store either manually or on a schedule. Reports can visualize the data in datasets.
In this blog post, we introduce Power Platform Dataflows. The setup we end up with in this blog post can be illustrated like this:
On first impression, the difference is not huge, but as you can perhaps guess, the fact that the data gets stored in an Azure storage account potentially opens up new possibilities, which we will get to in the third blog post.
Prepare to create the first dataflow
Before we can create dataflows, we need to upgrade our Power BI license to a paid version (see https://powerbi.microsoft.com/pricing). Only users who create and maintain dataflows need paid licenses. Users who only consume dataflows can use the free license.
Power BI is organized around workspaces. Each user has a default workspace called “My workspace”, but it does not support dataflows, so we need to create a new workspace. Click “Workspaces” and then “Create a workspace”:
Give the workspace a name. We’ll call ours “Business Central Workspace”. When we select our new workspace, we can now create dataflows:
Create a dataflow
Click “Get started” under “Dataflows”, click “Add new entities” on the next page, and then choose “OData” as the data source.
Paste the URL (https://api.businesscentral.dynamics.com/v2.0/production/api/v1.0/companies) in the same way as we did in the first blog post:
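As an aside, the dataflow stores this connection as a Power Query (M) expression. A minimal sketch of what it looks like, assuming the default “production” environment (replace the environment name with your own if it differs):

```
let
    // Connect to the companies collection of the Business Central API.
    // "production" is the environment name; adjust it if your
    // environment is named differently.
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/production/api/v1.0/companies",
        null,
        [Implementation = "2.0"]
    )
in
    Source
```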
Sign in to Business Central using an organizational account. Click “Next”.
Now the same list of companies appears as in post 1:
In the row for the company you are interested in, click [Table] on the entity you want (e.g., “items”).
Now you can transform the data in the same way that we could when we used Power BI Desktop. As an example, in the following picture I remove the “type” column:
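Behind the scenes, each transformation is recorded as a step in the query’s M code. A sketch of roughly what the query looks like after this step; the company id is a placeholder, and the exact navigation step the editor generates may differ:

```
let
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/production/api/v1.0/companies",
        null,
        [Implementation = "2.0"]
    ),
    // Drill into the items entity of the selected company
    // (placeholder id; the editor inserts the real company id).
    Items = Source{[id = "00000000-0000-0000-0000-000000000000"]}[items],
    // The step recorded when we removed the "type" column.
    RemovedType = Table.RemoveColumns(Items, {"type"})
in
    RemovedType
```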
Rename the query in the upper-right corner from “Query” to “Items”. When you are done, click “Save & Close”.
You will be prompted to give the dataflow a name. Name it simply “Business Central”. Some people consider it a best practice to have one dataflow for each data source, but that is completely up to you.
Here you also encounter a new concept: the entity.
Each dataflow can contain multiple entities. In the case of Business Central, we can add another entity, Customers, in the same way as before, except that we start by clicking “Add entities” on the existing dataflow. Our dataflow now contains two entities:
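The Customers query follows the same pattern as the Items query, again sketched with a placeholder company id:

```
let
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/production/api/v1.0/companies",
        null,
        [Implementation = "2.0"]
    ),
    // Same navigation as for Items, but into the customers entity.
    Customers = Source{[id = "00000000-0000-0000-0000-000000000000"]}[customers]
in
    Customers
```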
Close this page. Now you see your newly created dataflow:
The “Last Refresh” time is N/A, which means the dataflow hasn’t read the data from the Business Central service yet. You can click the little arrow-circle to refresh the data for the first time. You can also configure recurring refreshes, e.g., once per day.
Notice how we stayed in the online version of Power BI all along; in the first blog post, we used Power BI Desktop to create the connection to Business Central’s API and to transform the data.
Using the dataflow in a report
The dataflow is created, and it has refreshed the data from Business Central, so now the dataflow has a copy of the data. The data is stored in an Azure storage account, but the storage account isn’t visible to us – it’s managed by the Power Platform Dataflows service:
Let’s try to create a report that uses the dataflow. For this purpose, we need to use Power BI Desktop.
First, make sure you are signed in to Power BI Desktop (check the upper-right corner). Then click “Get Data” and locate dataflows (still called “Power BI dataflows” in some places):
Click “Connect”. Now it connects to your Power BI Online account, and you can select one or more of the entities that you created earlier. Notice how it is structured as workspaces -> dataflows -> entities:
I selected both entities. You can choose to transform the data further at this point, on top of the transformations already applied in the dataflow itself, such as the “type” column I removed earlier.
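For the curious: the query Power BI Desktop generates when you pick an entity navigates that same workspaces -> dataflows -> entities hierarchy in M. A sketch with placeholder GUIDs where the navigator inserts the real workspace and dataflow ids; the exact field names may vary between connector versions:

```
let
    // Connect to the Power Platform Dataflows service.
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    // Placeholder GUIDs; the navigator fills in the real ids.
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
    // Pick the Items entity we defined in the dataflow.
    Items = Dataflow{[entity = "Items", version = ""]}[Data]
in
    Items
```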
When you are done, you can create visualizations as usual, based on the two entities.
Finally, publish the report to Power BI Online as we did in the first blog post. You will notice that you still end up with a report and a dataset, in addition to the dataflow. The report doesn’t get its data directly from the dataflow; it still goes via a dataset.
The setup can be illustrated like this:
In the next blog post, we will take the final step to set the data free; we will configure Power Platform Dataflows to store the data in our Azure storage account! Once we have done that, we can access the data using other tools and services. Stay tuned!