Developing Power BI for Dataverse + PowerApps, and Dynamics 365 - Part 1
Scott Sewell
Principal Program Manager @ Microsoft | Focused on Microsoft Fabric, Power BI, and Dynamics 365
Overview
Dataverse is the standard data platform for Microsoft Business Applications such as Dynamics 365 CE and Power Apps canvas apps – along with Forms Pro, Power Automate Approvals, Power Portals, and others.
For the past four years, I've focused on the incredible value of using Power BI with Dataverse. I've learned a ton about what works well – and even more about how to avoid common pitfalls.
This document is intended to serve as a high-level guide to the process, a prompt for the decision process you’ll work through as you evaluate your options, and a pointer to relevant and useful documentation found online at https://docs.microsoft.com.
Value Proposition: Use Power BI to Expand the Visibility of your Biz Apps
Because these apps are easy to configure, quick to deploy, and widely adopted, an increasing volume of data is being managed and stored in Dataverse across organizations – and there is an even greater need and opportunity to bring the value of analytics into those processes.
Power BI is a powerful accelerator for users of these applications – and for their leadership. From the perspective of the organization using these apps, there are several key benefits:
Decision Factors for Getting Dataverse Data into Power BI
There are three primary ways to connect Power BI to Dataverse data. (There are, of course, many more variations, but we’ll cover these as the most common examples.)
Before choosing a connection method, it’s critical to evaluate the report’s use across several decision factors. This will help you choose the right approach for the current reports.
Organizations will need to rank or choose the factors according to the report’s needs – and different reports within the same organization may take different approaches.
The most common decision factors are:
Performance Tolerance
Users will have very different experiences with a report built on imported data versus one built on DirectQuery. The imported-data approach is dramatically faster and more responsive because the data is pre-cached and pre-calculated in the data model: interactions such as filters, cross-filters, and drill-throughs don't require a re-query of the database. A DirectQuery report, in contrast, re-queries the source each time the report is opened or new filters are applied.
DirectQuery should be used only if other decision factors, like the freshness of data or enforcement of the Dataverse security model, require it. Where DirectQuery is needed, the report should be simplified, and the amount of data analyzed in the report minimized, to reduce DirectQuery performance bottlenecks as much as possible. We'll cover specific DirectQuery performance strategies later in this document.
A good resource on the design implications of DirectQuery is Using DirectQuery in Power BI - Power BI.
Data Volume
The amount of data being transferred from the source into the report has a strong, direct impact on which strategy to use. With any approach other than DirectQuery, the impact on the user is minimized because the data is already in the Power BI dataset. With DirectQuery, as more data is requested from the server, the time it takes to query and transfer that data increases, and the user waits longer for the report to refresh.
DirectQuery is generally only considered viable when the report's dataset is not "too large." (That's a subjective measurement, but in this context assume it means either fewer than 100K records in the report's source tables or, after filters are applied, fewer than 100K records returned to Power BI in the results.) See Create a Power BI report using the Dataverse connector - Power Apps. (Note that each query must return less than 80 MB in under 2 minutes. Later in this document, we'll cover adapting to those limits and other essential strategies for Dataverse DirectQuery implementations.)
When the query result size is in the range of a few hundred thousand records, Power BI Premium dataflows or datasets with incremental refresh may be more appropriate; see Incremental refresh for datasets and real-time data in Power BI - Power BI | Microsoft Docs or Using incremental refresh with dataflows - Power Query. Either is an option for an organization that needs somewhat more data in the report and can limit the incremental query to only recently modified records.
Larger datasets, from a few hundred thousand records up to multi-million-record datasets, can use the enterprise scale supported by Azure Synapse Link for Dataverse. This approach sets up an ongoing managed pipeline that copies Dataverse data into Azure Data Lake as CSV or Parquet files. The data can then be ingested from the lake via Azure Synapse and imported into Power BI. See Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace - Power Apps.
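The data-volume tiers above can be summarized as a simple decision helper. This is only a sketch: the function name and the numeric cutoffs (100K rows for DirectQuery viability, "a few hundred thousand" for incremental refresh, millions for Synapse Link) are the rough figures from this section, not hard product limits.

```python
def suggest_strategy(row_count: int, needs_dataverse_security: bool = False) -> str:
    """Suggest a Power BI connection strategy for Dataverse data.

    Thresholds are the approximate figures from this guide, not product limits.
    """
    if needs_dataverse_security:
        # Per-user Dataverse permissions must be enforced at query time.
        return "DirectQuery"
    if row_count < 100_000:
        # Small result sets: import is fast and responsive; DirectQuery also viable.
        return "Import (DirectQuery also viable)"
    if row_count < 1_000_000:
        # A few hundred thousand rows: refresh only recently modified records.
        return "Import with Incremental Refresh (Premium dataflows/datasets)"
    # Multi-million-row tables: managed pipeline into the lake, then import.
    return "Azure Synapse Link for Dataverse"

print(suggest_strategy(50_000))     # Import (DirectQuery also viable)
print(suggest_strategy(5_000_000))  # Azure Synapse Link for Dataverse
```

In practice these factors interact — security needs can override volume, as the next section discusses — so treat the helper as a starting point, not a rule.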
Security Complexity
Dataverse supports incredibly complex security models to control which records are visible to which users. (For example, one sales user might see only their own opportunities, with everyone else's entirely hidden, while a team manager sees all of their team's opportunities but none from another team. The level of complexity implemented will vary with each organization's needs.) Not every organization implements that level of complexity – and not every report needs to partition the data per user.
Suppose the report will be shown to users who each have different permissions controlling which records they may see. In that case, DirectQuery over Dataverse ensures that Power BI displays data based on each user's Dataverse permissions: two users with different permissions in Dataverse will see only the data they are allowed to see when viewing the same report.
Other organizations can manage security simply by controlling which users may see the report: any user with permission to the report sees all data in it. This fits where there are no rules in Dynamics partitioning the data – or where, even if there are, the report's users are managers with permission to see all the data anyway. Here the more flexible and performant approach is a direct import of the data, whether from Dataverse or via Synapse.
A third scenario is where relatively straightforward data-access rules can be implemented using Power BI's row-level security; see Row-level security (RLS) in Power BI Report Server - Power BI. This approach supports data partitioning with better performance than the DirectQuery method.
We caution that an RLS implementation should not attempt to mimic the Dataverse security model if the model implemented in Dataverse has any real complexity through rules around ownership, teams, sharing, inheritance, etc.
There's no fixed guidance on precisely when the rules become too complex to implement in Power BI's row-level security, but keep in mind that a Power BI RLS-based security model is entirely separate from Dataverse and will likely need ongoing management to keep the security rules in Power BI in sync with changes in Dynamics.
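To make the concept concrete: in Power BI, RLS is defined as DAX filter expressions attached to roles, but its effect — each viewer sees only the rows a per-user rule allows — can be illustrated with a small Python sketch. Everything here (the opportunity rows, the user-to-region mapping, the function name) is hypothetical and invented purely for illustration.

```python
# Hypothetical data standing in for a report table and an RLS role mapping.
OPPORTUNITIES = [
    {"name": "Contoso renewal", "region": "East"},
    {"name": "Fabrikam upgrade", "region": "West"},
    {"name": "Adventure Works pilot", "region": "East"},
]

# In Power BI this mapping would live in a roles/security table; the DAX rule
# would resemble a filter on the region column keyed to the signed-in user.
USER_REGION = {"alice@example.com": "East", "bob@example.com": "West"}

def visible_rows(user: str) -> list[dict]:
    """Rows the given user would see after the row-level filter is applied."""
    region = USER_REGION.get(user)
    return [row for row in OPPORTUNITIES if row["region"] == region]

print([r["name"] for r in visible_rows("alice@example.com")])
# ['Contoso renewal', 'Adventure Works pilot']
```

The sketch also shows why the caution above matters: this static mapping must be maintained by hand, whereas Dataverse evaluates ownership, teams, and sharing dynamically — the two models can silently drift apart.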
Configuration Complexity
Connections to Dataverse – whether DirectQuery or import mode – are relatively straightforward and require no additional software or elevated permissions in Dataverse. This is an advantage for organizations or departments that are just getting started, and for those that don't need the enterprise features available in the Synapse solution.
The Synapse option requires system administrator access to Dataverse and an Azure subscription. Configuration is not complex, but experience with Azure Storage and SQL is beneficial. This may limit which organizations can take advantage of this enterprise-oriented feature.
Coming up in future installments:
Conclusion
Power BI is an amazing tool for exposing the value of data stored in Dataverse to business decision makers. Getting the data out of Dataverse and into Power BI is the first step on a rewarding journey of discovery.