When I observe how Dynamics 365 is used, I often see well-established processes and routines for getting data into the system. But using this data is frequently limited to retrieving financial reports that show everything in dollars and cents. The information contained in the system is often of high value, yet effective use of that data has not been implemented. The reason is usually simple: one does not know how. And it can easily end up in an overcomplicated enterprise-scale solution that costs far more than needed.
Here is a short list of what is standard functionality and can quite quickly expose the data to reporting tools like Power BI and Excel:
- Data export
This is the easy way, where you select data entities to be exported as Excel sheets, CSV or XML. Manual, simple and very undemanding to set up.
- OData
OData is also a very simple and easy way to get access to Dynamics 365 F&O data, and it can be consumed directly in Power BI. But it is slow compared to the other options, and I don't recommend using it for transactional data. Microsoft discourages using OData for Power BI reports and encourages the entity store for such scenarios instead. (A minimal code sketch of reading an entity over OData is shown after this list.)
- BYOD – Bring your own database
In Dynamics 365, you can set up an Azure SQL database as a destination when exporting data. Power BI can then read directly from this database, which makes it easy to access the data. But an Azure SQL database can be expensive, and in the long run this way of exporting data will probably become less common; Data Lake will increasingly take over this form of exposing data.
- Entity store
Entity stores are analytical cubes that are already in place in the standard solution. When you go into the different workspaces, there are already many embedded Power BI analyses that can be used directly. But what very few are aware of is that these cubes can be made available in a Data Lake, so they can be used in reports that you create yourself. Dynamics 365 updates the data lake continuously, with only a short delay until the data is available (trickle feed). I'm a bit surprised that very few customers use this option to create additional Power BI reports, or even to open the dataflows directly in Excel. You can literally just select your dimensions and measures directly from the entity store in the data lake. Why is almost nobody using this standard feature?
- Dataverse and dual-write
Dual-write is a built-in solution where data in Dynamics 365 is synchronously updated between the various apps. Typically, this is used to share common registers between the customer engagement apps and the Finance and Operations apps. But in reality, you can use whichever entities you want.
- Virtual entities
With virtual entities, the data stays in Dynamics 365 Finance and Operations, but it is exposed as entities in Dataverse. (You may need to use the legacy connector to access virtual entities in Power BI.)
- Export to Azure Data Lake
This is the solution that will really give the data value in the future. In a coming release it will be easier to select which tables and entities are to be written to the Data Lake in near real time. And it is not only the data that is written, but also metadata that describes the information and its relationships. So keep an eye on the roadmap for this.
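To make the OData option above a bit more concrete, here is a minimal sketch of reading a data entity from code. It assumes an Azure AD app registration with access to the environment; the environment URL, the CustomersV3 entity and the selected fields are placeholders you would swap for your own.

```python
# Minimal sketch: read a Dynamics 365 F&O data entity over OData.
# The tenant, app registration, environment URL and entity fields below
# are hypothetical placeholders.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
ENVIRONMENT = "https://myenv.operations.dynamics.com"  # hypothetical environment URL

# Acquire a token for the environment using client credentials
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{ENVIRONMENT}/.default"])

# Query the OData endpoint; $top keeps the sample small, since OData is slow
# for large transactional volumes.
response = requests.get(
    f"{ENVIRONMENT}/data/CustomersV3",
    params={"$top": "10", "$select": "CustomerAccount,OrganizationName"},
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
response.raise_for_status()

for customer in response.json()["value"]:
    print(customer["CustomerAccount"], customer["OrganizationName"])
```

The same bearer-token pattern works for any public data entity, but as noted above, OData is best suited for small lookups rather than bulk transactional data.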
These ways of exposing data can be set up as dataflows that can be subscribed to, not just by Power BI but also by Excel or other services that need the data. In Power BI you can combine several data sources, so you can build the visualizations and analyses you want.
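As a sketch of what one of those "other services" could look like, the snippet below reads a CSV file that Dynamics 365 has written to Azure Data Lake Storage Gen2 into a pandas DataFrame. The storage account, container and file path are hypothetical; the actual folder layout depends on how the export (entity store in Data Lake or Export to Azure Data Lake) is configured.

```python
# Minimal sketch: a downstream service reading data that Dynamics 365 has
# exported to Azure Data Lake Storage Gen2. Account, container and file path
# are hypothetical placeholders.
import io

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mydatalake.dfs.core.windows.net"   # hypothetical storage account
FILE_SYSTEM = "dynamics365-financeandoperations"          # hypothetical container
FILE_PATH = "Tables/CustTrans/CUSTTRANS_00001.csv"        # hypothetical exported file

# Authenticate with whatever credential is available (CLI login, managed identity, ...)
service = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
file_client = service.get_file_system_client(FILE_SYSTEM).get_file_client(FILE_PATH)

# Download the CSV and load it into a DataFrame for further processing
raw = file_client.download_file().readall()
df = pd.read_csv(io.BytesIO(raw))
print(df.head())
```

From here the data can feed a data warehouse, Excel, or any other tool, without touching the Dynamics 365 application itself.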
Then comes the big question: is it a lot of work to set up? What you may not be aware of is that much of this is already part of Dynamics 365. It takes very few hours to set up, and Power BI is relatively easy to use.
One exciting area that follows in the wake of this is linking the data to machine learning and AI directly from Power BI, so that the system can build prediction models that find connections in the data and produce forecasts. Dynamics 365 Finance already ships with solutions that give good indications of when customers will pay, suggestions for the next budget, or what future cash holdings will look like. Within trade and retail, there have been solutions for product recommendations based on customer profile and shopping cart.
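As a toy illustration of the idea, and not of the built-in Dynamics 365 prediction features, the sketch below trains a simple model on hypothetical invoice history (for example exported through the data lake) to estimate how many days late a customer will pay. The file name and column names are made up for the example.

```python
# Toy sketch: predicting how many days late a customer will pay, from
# hypothetical invoice history. File and column names are placeholders;
# this only illustrates the idea with scikit-learn.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Hypothetical invoice history exported from Dynamics 365
invoices = pd.read_csv("invoice_history.csv")
features = invoices[["InvoiceAmount", "CreditLimit", "PreviousAvgDaysLate", "OpenInvoices"]]
target = invoices["DaysLate"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42
)

model = GradientBoostingRegressor()
model.fit(X_train, y_train)

# R^2 on held-out invoices gives a rough idea of how predictable payment behaviour is
print("R^2 on test set:", model.score(X_test, y_test))
```

In practice you would evaluate more carefully and use richer features, but the point is that the data exposed through the options above is exactly what feeds such models.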
The value of your data is determined by how you use it, and the first step is to make it available for use.
Thank you for providing this information. I just have a couple of clarifications if possible…
Are dual-write and export to Dataverse exporting AXDB or AXDW?
Can an export of Entity Store (AXDW) only be achieved via Export to Data Lake?
Is it possible to export the Entity Store (AXDW) to Data Lake and keep the existing AXDW database maintained by Microsoft? We have existing Power BI reports using DirectQuery, and it appears we will lose this store if we enable export to Data Lake, i.e. Dynamics will migrate from maintaining AXDW to instead maintaining the Data Lake.
Hi Kurt, we’ve just started the D365 implementation project at our company in Turkey. The project will convert the system from the 2009 version to D365, and I know it will be very challenging for us. As the BI project manager on the project, I’m thinking of using entity stores + Power BI for near real-time data requests. But what about the data we need to consume in the data warehouse? Is it possible to use entity stores for exporting transactional data out of D365 in order to use it in the existing data warehouse?
Hi Gulten. I have had many discussions with Microsoft on the ability to use the entity store and DirectQuery with Power BI dataflows. Much of it is documented here: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/entity-store-data-lake The current pain is that Microsoft still sticks to the story that this is in public preview. The reason is that almost everything seems to be centered around setting up Export to Azure Data Lake as described here: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/configure-export-data-lake. My advice is to try out this feature in a Tier-2 environment, and enable Export to Azure Data Lake in production when it is released. Combining dataflows from different sources is quite common.
In short: Azure Data Lake is your path forward 🙂
Hello, Kurt!
We are looking for someone to help with a proof of concept using Data Lakes to replace the existing SQL/DB solution we are using. Our thought is to have someone (such as yourself) help us establish functionality in one of our custom modules that displays Job Cost by Element and Category. We are using D365 F&O with HomebuilderONE (HB1 – a homebuilder ERP) sitting on top of D365.
Would you please reach out to me either by email or phone.
deangori@outlook.com
818-800-0122
Thank you,
Dean
Hi Kurt,
Could you maybe do a short tutorial on how you create a report based on the data in the data lake entity store? I assume you start in Power BI Desktop or Excel… and then you need to connect to the data lake as a data source… and you can select from the entities that are published? No configuration in D365 needed?
Thanks
Martin
Did a little bit of research. It looks like the future is in exporting tables to ADL (code name: Project Como; there is a private Yammer group for it: Finance and Operations Insider Program : PRIVATE: Project Como, https://www.yammer.com/dynamicsaxfeedbackprograms/#/threads/inGroup?type=in_group&feedId=32768909312). The entity store might be deprecated in the future, and this new feature should be GA early next year.
You asked “Why is almost nobody using this standard feature?” about the entity store in Data Lake. Maybe because it is (and has been for a looong time) in public preview? From the docs: “This feature is currently in public preview. Do not enable this feature in production environments.” I’m not holding my breath for this to go GA anytime soon either.
Hopefully we get the “Export to Data Lake” feature GA soon, as it has also been on the roadmap for a long time. That really is a game changer, with the option to expose all tables to the Data Lake. Well, maybe more of a comeback, as direct access to the DB was available back in the on-premises world. 🙂
I have reached out to the Microsoft team responsible for the following documentation page: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/entity-store-data-lake and asked for clarification on the statement “public preview”. I see that the docs page has not been updated since 23/9/2020, so it could be that the issue is simply missing documentation.
Hi Kurt,
Did you ever receive an update from Microsoft regarding when Export to Data Lake will be generally available?
Thanks
Hi. It is GA, and has been for many months. What we have been waiting for is the ‘close to real-time’ part of the solution.