D365 Recurring Integrations Scheduler (RIS)

Today I would like to pay my respects to a free GitHub initiative: Recurring Integrations Scheduler (RIS). RIS is a solution that transports files/messages between on-premises folders and D365FO. It is a way to perform good old file-based integrations, and even to automate import/export of data entities between environments, without having to develop or program anything.

It calls methods exposed by D365FO to import or export files and data packages. It can also monitor the status of D365FO's internal processing of imported data, and based on this status it can move input files to “status” folders.

The solution is quite easy to set up and is also well documented. You can download the installer here. It also means that with a few hours of reading and understanding this tool, you can automate import and export of files.

I will not try to explain the solution here, but I would like to give you some screenshots of how it looks. The tool is a Windows service, but it has a front-end client where you can monitor that the exports and imports are running. Here I have 2 jobs: one that exports customers from D365F&O, and one that imports customers to D365F&O.

These jobs will create a set of folders where files are downloaded to or uploaded from.

So if I add an Excel spreadsheet of customers to the “Upload” folder, it is automatically sent to F&O, and the customers are imported through the Data Management Framework.

In D365 I have 2 data management projects for importing and exporting customers.

In the export (or import) project, I have the data entity, and I also need to set up “Manage recurring data jobs“, as this will automatically export/import according to a recurring schedule.

In the manage scheduled data jobs form, you can set up the frequency.

There is a field named “ID” in the picture above that contains a GUID. This is the activity ID that is used for downloading files from the D365 blob storage.

In the RIS tool, I point the Dynamics data job towards this activity ID. This is how RIS knows what to download or upload.
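Under the hood, RIS passes this activity ID in the URL of the recurring integrations REST endpoints on the D365FO instance. As a rough illustration (this is a sketch, not the actual RIS source code; the host name, entity name and token handling are placeholder assumptions), an import call looks roughly like this:

```python
# Hypothetical sketch of the recurring integrations calls RIS makes.
# The activity ID is the GUID from "Manage recurring data jobs" in D365FO.
import urllib.request

def enqueue_url(base: str, activity_id: str, entity: str) -> str:
    """Build the enqueue URL used to push a file into an import job."""
    return f"{base}/api/connector/enqueue/{activity_id}?entity={entity}"

def dequeue_url(base: str, activity_id: str) -> str:
    """Build the dequeue URL used to pick up files from an export job."""
    return f"{base}/api/connector/dequeue/{activity_id}"

def upload_file(base: str, activity_id: str, entity: str,
                token: str, payload: bytes) -> str:
    # RIS acquires an Azure AD bearer token for the instance by itself;
    # here the token is simply passed in as an assumption.
    req = urllib.request.Request(
        enqueue_url(base, activity_id, entity),
        data=payload,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # The response identifies the message, so processing can be tracked.
        return resp.read().decode()
```

For exports the flow is the reverse: dequeue a file, download it, and acknowledge it so it is not served again. This is exactly the bookkeeping RIS does for you, which is why no programming is needed.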

When opening Manage Messages, we see the download/upload status of each file until it appears in the download folder.


This is a great tool that has been available for a long time. It has not received the attention it deserves. Take a look, and start making simplified integrations!

I also want to thank Tomek Melissa from Microsoft, who has been driving this initiative.

How to learn and try out Dynamics 365 Commerce?

The Dynamics 365 Commerce team at Microsoft has created a great trial site where you can try out all their good stuff and get a guided walk-through of their capabilities. Please check it out:


Here is your to-do list of what you should explore:

  1. B2C experience
  2. Product discovery and cloud powered search
  3. Curb-side pickup
  4. Intelligent recommendations
  5. Ratings and reviews
  6. Quick add to cart
  7. Customer Service
  8. Customer Voice
  9. 3D product visualization and augmented reality (The 3D surf board)
  10. Livestream shopping
  11. Try Adventure Works mobile app


  1. B2B experience
  2. Sign up to be a business partner
  3. Quick order entry using templates
  4. View account credit information

This site is also great preparation for the MB-340 Dynamics 365 Commerce Functional Consultant certification, and if you click on the link you can get some more information.

Take care, friends!

D365: Use your data

When I observe the use of Dynamics 365, I often see well-established processes and routines for getting data into the system. But using this data is often limited to retrieving financial reports that show everything in dollars and cents. The information contained in the system is often of high value, but effective use of the data has not been implemented. The reason is often simple: one does not know how. And it can end up in an overcomplicated enterprise-scale solution that costs much more than needed.

Here is a small list of what is standard and can quite quickly expose the data to reporting tools like Power BI and Excel:

  1. Data export
    This is the easy way, where you select data entities to be exported as Excel sheets, CSV or XML. Manual, simple, and requires very little setup.
  2. ODATA
    OData is also a very simple and easy way to get access to Dynamics 365 F&O data, and it can be consumed directly in Power BI. But it is slow compared to the other options, and I don’t recommend using it for transactional data. Use of OData for Power BI reports is discouraged; using the entity store for such scenarios is encouraged.


  3. BYOD – Bring your own database
    In Dynamics 365, you can set up an Azure SQL database as a destination when exporting data. Power BI can then read directly from this database, which makes it easy to access the data. But an Azure SQL database can be expensive, and in the long run this way of exporting data will probably become less common; Data Lake will take over more of this form of exposing data.
  4. Entity store
    Entity stores are analytic cubes that are already in place in the standard solution. When you go into the different workspaces, there are already many embedded Power BI analyses that can be used directly. But what very few are aware of is that these cubes can be made available in a Data Lake, so they can be used in reports that you create yourself. Dynamics 365 updates the data lake continuously, with only a short delay until the data is available (trickle feed). I’m a bit surprised that very few customers are using this option to create additional Power BI reports, or even to open the data flows directly in Excel. You can literally just select your dimensions and measures directly from the entity store data lake. Why is almost nobody using this standard feature?


  5. Dataverse and dual-write
    Dual-write is a built-in solution where data in Dynamics 365 is synchronously updated between the various apps. Typically, this is used to have shared registers between customer engagement apps and Finance and Operations apps, but in reality you can use whichever entities you want.


  6. Virtual entities
    With virtual entities, the data stays in Dynamics 365 Finance and Operations, but it is exposed as entities in Dataverse. (You may need to use the legacy connector to access virtual entities in Power BI.)
  7. Export to Data Lake
    This is the solution that will really give the data value in the future. In a future release, it will be easier to set up which tables and entities are to be written to the Data Lake in near real time. And it is not only the data that is written, but also metadata that describes the information and its relationships. So keep an eye on the roadmap for this.
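As a small illustration of the OData option (number 2 above), here is a minimal sketch of querying a data entity from the /data endpoint. The entity name CustomersV3, the field names and the bearer-token handling are assumptions for the example, not prescribed by the product:

```python
# Hypothetical sketch: reading a D365FO data entity over OData.
import json
import urllib.parse
import urllib.request

def odata_query_url(base: str, entity: str, select=None, top=None) -> str:
    """Build an OData query URL like {base}/data/{entity}?$select=...&$top=..."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if top:
        params["$top"] = str(top)
    query = ("?" + urllib.parse.urlencode(params)) if params else ""
    return f"{base}/data/{entity}{query}"

def fetch_entities(base: str, entity: str, token: str, **kwargs):
    # Authentication uses an Azure AD bearer token (acquired elsewhere).
    req = urllib.request.Request(
        odata_query_url(base, entity, **kwargs),
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # OData wraps the rows in a "value" array.
        return json.load(resp)["value"]
```

Power BI builds exactly this kind of query for you behind the scenes, which is also why pulling large transactional entities this way gets slow: every row travels through the OData service layer.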

These ways of exposing data can be set up as data flows that can be subscribed to, not just by Power BI but also by Excel or other services that need the data. In Power BI you can subscribe to several data sources, so you can build the visualizations and analyses you want.

Then comes the big question: is it a lot of work to set up? What you may not be aware of is that a lot of this is already part of Dynamics 365. It requires very few hours to set up, and Power BI is also relatively easy to use.

One exciting area that comes in the wake of this is linking the data to machine learning/AI directly from Power BI, so that the system can build prediction models that find connections in the data and come up with predictions. Dynamics 365 Finance comes full of solutions that give good indications of when customers will pay, suggestions for the next budget, or how future cash holdings will develop. Within trade/retail, there are solutions for product recommendations based on customer profile and shopping cart.

The value of your data is determined by how you use it, and the first step is to make it available for use.


D365 – Building a business value review

The latest and greatest tech, faster, better, and new possibilities are, from a consultancy perspective, fun and rewarding. But it is very seldom that these aspects are the deciding factors when it comes to implementing Dynamics 365. Any organization has a finite set of resources in terms of people, knowledge, money, and time. How to best utilize your resources can be difficult, but by building a business case you have a process to evaluate your different options.

A business case can be many things, but in this blog post I will focus on performing a business value review where Dynamics 365 benefits can be subjectively calculated at a high level.

The end-result of a business value review would give some clear indications on:

1. Benefits – scaled and phased

2. Costs – scaled and phased

3. Key ROI measures: Payback, Net Present Value and Internal Rate of Return

4. Cashflow through an implementation and onwards

I’m used to quantifying costs and benefits in a granular way to better see the value of a business case, and even to compare different business cases against each other. The high-level end result of a business case can be presented like this, where the ROI is visualized; here the payback time is 16.5 months.

“Net Present Value” is the current value of the benefit improvement after 36 months. The Internal Rate of Return is the calculated return rate of the investment, and is relevant for comparing against other investments (like building a new warehouse).
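The three measures above can be computed directly from the monthly net cash flows of the business case. Here is a small, self-contained sketch (my own illustration, not the model behind the figures above), where index 0 holds the initial investment as a negative number:

```python
# Payback, NPV and IRR from a list of monthly net cash flows.

def npv(rate_per_period, cashflows):
    """Net present value of cashflows discounted at rate_per_period."""
    return sum(cf / (1 + rate_per_period) ** t
               for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return per period, found by bisection on NPV.

    Assumes the usual profile: one up-front outflow, then inflows,
    so NPV decreases as the discount rate increases."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid  # NPV still positive: discount harder
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

def payback_months(cashflows):
    """First month where the cumulative cash flow turns non-negative."""
    total = 0.0
    for month, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return month
    return None  # never paid back within the horizon
```

For example, an investment of 500 followed by 36 monthly benefits of 40 pays back in month 13, and the monthly IRR can be annualized for comparison against alternative investments.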

The cost of an open tender visualizes the cost of delaying the implementation, and how much savings are needed to justify a delay and keeping an open tender process. Doing nothing also has a cost!

The cost picture of the business case can often be presented based on non-recurring costs and recurring costs per month.

The benefits can be built from predefined templates.

The benefits available for analysis can be divided into suites, and the following list shows some of the benefits that may be relevant to calculate.

General benefits

| Benefit | Input values | Assumption |
| --- | --- | --- |
| Revenue Gain | Annual revenue ($); Increase in revenue (%); Gross margin (%) | For example, D365 may increase cross-sell and up-sell opportunities. |
| Reduce Revenue Leakage | Annual revenue ($); Current revenue leakage (%); Revised revenue leakage (%) | Identifies areas where invoices (revenues) are raised for services that would otherwise have been missed. This is 100% margin, as costs were incurred to provide the service in any case, e.g. chargeable telephone calls to customers. |
| Margin Visibility Improvement | Annual revenue ($); Current gross margin (%); Revised gross margin (%) | D365 gives visibility of order profitability that was not previously available. Orders can be dealt with on the basis of their profitability rather than passively dealing with every request that arrives. |
| Margin – Identify Chargeable Items | Annual number of projects; Average revenue per project ($); Increase in chargeable revenue by passing on costs to customers (%) | D365 identifies areas where costs have been incurred and not passed to customers. These costs, once passed to customers, result in a margin increase. |
| Cost Saving | Current annual spend impacted by D365 ($); Reduction in spend (%) | The departmental spend will be reduced by D365. |
| Improve Cash Collection | Annual revenue ($); Debtor days – today; Debtor days – after; Annual interest cost (%) | Improved business processes that generate electronic invoices will lead to faster, more accurate and more frequent delivery of invoices to clients, inevitably leading to faster invoice approval and payment. |
| Reduce Cash to Cash Cycle Time | Current C2C cash requirements ($); Number of days between payment of accounts payable and collection of accounts receivable (C2C cycle time); Revised C2C cycle time (days); Annual interest cost (%) | Improved business processes will result in a reduction of the C2C cycle, thus reducing liquidity requirements and freeing up cash. |
| Minimize Regulatory Costs | Current annual regulatory costs ($); Revised annual regulatory costs ($) | The customer has obligations to report, and the cost of reporting can be significantly reduced by the proposed D365 solution. Alternatively, non-compliance may lead to fines which could be reduced by better monitoring. |
| Labor Saving | Number of direct staff currently engaged; Direct efficiency improvement (%); Annual loaded cost per direct staff member ($); Number of indirect staff currently engaged; Indirect efficiency improvement (%); Annual loaded cost per indirect staff member ($) | By making staff more efficient, a percentage of the staff can be re-deployed, thus resulting in savings. |
| Improve Efficiency | Annual total departmental/company spend ($); Efficiency improvement (%) | Avoid soft savings such as improving employee effectiveness. Look for hard savings that can improve efficiency in areas such as information availability: remote access to network data, real-time access, single and structured storage of all customer/vendor/business data, wider access through portals, business analytics, strategic enterprise management and business planning, and improved network intelligence and thus decisions. Integrated systems will allow better inter-departmental, inter-company and inter-organizational efficiencies. |
| Process Time Savings | Time spent on manual processes (total FTE hours per month); Annual loaded cost per person ($); Efficiency improvement (%) | D365 automates and removes redundancy in manual steps such as information search and data integration, resulting in savings based on labor time. |
| Activity Time Savings | Time spent on activity (total FTE hours per month); Annual loaded cost per person ($); Efficiency improvement (%); Annual non-labor costs ($); Reduction in non-labor costs (%) | D365 reduces the average time spent on a business activity, leading to savings based on both labor time and other non-labor costs. |
| Waiting Time Avoided | Current ‘dead’ waiting time per day (minutes); Revised ‘dead’ waiting time per day (minutes); Working hours per day per person; Annual loaded cost per person ($); Number of people affected | The proposed D365 solution frees up a proportion of the current amount of wasted time. |
| Avoid Stationery Costs | Current annual stationery spend ($); Reduction in stationery spend (%) | The current annual stationery spend is known, and the new D365 solution leads to a reduction in that spend. |
| Cost Reduction – Rental Items | Monthly rental costs ($); Percentage cost reduction (%) | Improved business processes and data visibility can lead to reduced purchasing spend through consolidation of existing contracts and improved access to information required for price negotiation. |
| Reduce Project Overrun Costs | Annual costs of project overruns ($); Reduction in overrun costs (%) | D365 enables faster project delivery. |
| Improve Staff Effectiveness | Number of staff engaged; Current annual value per person ($); Improvement in effectiveness (%) | Staff generate value much greater than their loaded cost may imply, and making them more effective at their assigned tasks leads to more value for the business. |
| Reduce Capital Costs | Budgeted capital spend ($); Proportion of spend avoided (%); Life or write-down period of D365 (months); Interest/minimum return rate per year (%) | The customer rents the equipment instead of purchasing, or receives a managed D365 solution, which avoids capital spend. |
| Cost of Mistakes | Current annual cost of mistakes ($); Reduction (%) | Improved systems can lead to better monitoring and control of all transactions and hence better customer service and fewer mistakes, leading to fewer penalties. |
| Litigation Avoided | Average cost to employer per case ($); Number of cases avoided per year | Quantifies the savings that could be made by avoiding incidents which lead to litigation. Better Alternative Dispute Resolution (ADR) procedures can shorten time, and therefore cost, in litigation. Assumes that improved safety will lead to fewer accidents. |
| Penalty Avoidance | Current annual penalty value ($); Reduction (%) | Lower penalties or fines imposed by regulatory bodies result from improved tracking of information. |
| Reduce Professional Services Costs | Annual spend on professional services ($); Reduction (%) | Improved business processes, data visibility, recording, retrieval and management lead to a reduced requirement for professional services. |
| Reduce Marketing Costs | Current annual marketing spend ($); Reduction in marketing spend (%) | The current annual marketing spend is known, and more accurate marketing leads to a reduction in that spend. |
| Protect Brand Value / Reputation | Annual value vulnerable to loss ($); Proportion of this value now protected by D365 (%) | Although brand is probably not valued directly on the balance sheet, the customer can use a percentage of revenue to give it a value. The new D365 solution protects the brand and its inherent value by ensuring that legal and Corporate Social Responsibility (CSR) requirements are met. |
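To illustrate how a benefit template turns input values into an annual figure, here is a hypothetical sketch of two of the calculations above. The formulas are my interpretation of the listed input values, not the actual template logic:

```python
# Illustrative benefit-template calculations (assumed formulas).

def revenue_gain(annual_revenue, increase_pct, gross_margin_pct):
    """Extra annual margin from a percentage increase in revenue."""
    return annual_revenue * increase_pct * gross_margin_pct / 10_000

def labor_saving(staff_count, efficiency_pct, loaded_cost):
    """Annual saving from re-deploying a percentage of staff capacity."""
    return staff_count * efficiency_pct * loaded_cost / 100
```

For example, a 5% revenue increase on $10M annual revenue at 40% gross margin yields $200,000 per year; a 10% efficiency gain across 20 staff at $80,000 loaded cost yields $160,000. Feeding such figures into the cash-flow model gives the payback, NPV and IRR discussed earlier.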

Retail and Consumer Goods benefits

| Benefit | Input values | Assumption |
| --- | --- | --- |
| Competitive Advantage – Extra Revenue | Annual revenue ($); Percentage increase in revenue (%); Gross margin (%) | Improved access to and use of customer information will lead to an overall increase in revenue. |
| Sale Value per Transaction – Increase | Number of sales transactions per year; Current average value per transaction ($); Increase in value (%); Gross margin (%) | D365 helps the user increase the value of sales transactions through pricing, recommendations, AI/ML and omnichannel. |
| Inventory – Weeks of Supply | Annual revenue ($); Gross margin (%); Current value of inventory ($); Revised weeks of supply; Annual interest cost (%) | D365 gives rise to better monitoring and handling of inventory transactions, thus reducing the weeks of supply required. |
| Shrinkage Reduction | Annual revenue ($); Gross margin (%); Current shrinkage (%); Revised shrinkage (%) | Improved business processes give rise to better monitoring and handling of inventory transactions, thus reducing the amount of shrinkage. |
| Improved Campaign Success | Number of campaigns per year; Average revenue generated by campaign ($); Improvement (increase in number or success rate) (%); Gross margin (%) | Improved customer data availability will lead to better targeted campaigns, and potentially additional ones within the existing budget, leading to improved revenue. |
| Mark Down Reduction | Annual revenue (sales) ($); Annual value of mark downs ($); Revised marked-down sales as percentage of annual sales (%); Margin on marked-down sales (%); Average gross margin (%) | To quantify the additional revenue/margin that could be generated by reducing the number of goods that are marked down during a sales season. D365 enables the user to identify and minimize the amount of mark downs. |
| Mark Up Improvement | Annual revenue (sales) ($); Annual value of mark ups ($); Revised marked-up sales as percentage of annual sales (%); Margin on marked-up sales (%); Average gross margin (%) | To quantify the additional revenue/margin that could be generated by increasing the number of goods that are marked up during a sales season. D365 enables the user to identify and maximize the amount of mark ups. |
| Sales per sq m – Increase | Annual revenue ($); Sales space (sq m); Increase post implementation (%); Gross margin (%) | To quantify the extra margin that can be made by increasing the value of sales per unit area of sales space. D365 helps the user maximize the value of sales per unit area. |
| Reduce Inventory Holding | Current value of inventory ($); Percentage reduction in inventory holding (%); Annual interest cost (%) | Improved business processes can lead to better monitoring and handling of inventory transactions. Reduced inventory could be brought about by improved vendor relations, supplier reviews, supplier self-service and price/quantity negotiations with suppliers based on data visibility and history. A more efficient, integrated system that looks at the whole business will inevitably allow greater control over re-ordering. Improved data visibility will lead to optimal inventory holding. |
| Maverick Buying Reduction | Current annual procurement value ($); Percentage of procurement that is uncontrolled (%); Uncontrolled percentage with D365 (%); Percentage by which controlled buying is cheaper (%) | To quantify the savings in purchasing costs as a result of better control over buying activity, leading to reduced maverick or ‘rogue’ purchasing. Improved business processes and data visibility can lead to reduced purchasing spend through consolidation of existing contracts and improved access to information required for price negotiation. |
| Better Procurement Margins | Current annual procurement value ($); Price improvement (%) | Improved business processes and data visibility can lead to reduced purchasing spend through consolidation of existing contracts and improved access to information required for price negotiation. |
| Reduce Haulage Costs | Annual spend on haulage ($); Reduction (%) | Better information about route planning, loading, and/or capacity planning allows savings from delivery efficiencies, or savings from consolidation of costs to external suppliers or different shipment methods. |
| Reduce Track and Trace Costs | Current annual spend on track and trace ($); Reduction in spend (%) | The customer has obligations to know, and supply on demand, the exact production details of a product or batch of products, and the cost of this process can be significantly reduced by the proposed D365 solution. Alternatively, non-compliance may lead to fines which could be reduced by better monitoring. |
| Avoid Penalties, Returns, Credit Memos | Current monthly value of returns or credit memos ($); Reduction (%) | To quantify the savings that can be made by avoiding late delivery penalties or by reducing avoidable errors that lead to returns and/or credit memos. Many of these are penalties brought about by inadequate customer service. Improved business processes can lead to better monitoring and control of all transactions, and hence better customer service and fewer penalties. |
| Improve Revenue per Customer | Current number of customers; Average revenue per customer per month ($); Increase in revenue (%); Gross margin (%) | To quantify the additional monthly revenue per customer that would be generated by implementing D365. Extra revenue is generated because D365 allows additional services or products to be sold to the existing customer base, as a result of cross-selling or up-selling. |
| Reduce Inventory Lead Times | Current value of inventory ($); Average lead time (calendar days); Revised lead time (calendar days); Annual interest cost (%) | Improved business processes give rise to better monitoring and handling of inventory transactions, thus reducing lead times. Since inventory ties up working capital, a saving of a few days can result in significant savings. The new lead times could be brought about by improved vendor relations, supplier reviews, supplier self-service and price/quantity negotiations with suppliers based on data visibility and history. An integrated system that looks at the whole business will inevitably allow greater control over inventory re-ordering. |
| Increase Number of Customers | Annual revenue ($); Current number of customers; Current number of customers acquired per year; Annual percentage improvement (%); Gross margin (%) | To quantify the additional monthly revenue and associated margin that would be generated if the proposed D365 solution helps acquire more customers. Each new customer brings extra revenue and associated margin; the number of customers increases year on year. |

Customer relationship management benefits

| Benefit | Input values | Assumption |
| --- | --- | --- |
| Competitive Advantage – Extra Revenue | Annual revenue ($); Percentage increase in revenue (%); Gross margin (%) | Improved access to and use of customer information will lead to an overall increase in revenue. |
| Increase Customer Satisfaction | Annual revenue ($); Percentage improvement from D365 (%); Gross margin (%) | D365 leads to an improvement in customer satisfaction, which in turn leads to improved revenue. |
| Increase Order Value | Average order value ($); Annual number of orders; Increase in order value with D365 (%); Gross margin (%) | Extra revenue is generated because D365 allows resources to be concentrated on customer-facing activities for a greater proportion of the time available. D365 also provides faster access to information which may be used to improve sales effectiveness (price changes, new products, product queries, inventory queries etc.) and cross-selling or up-selling initiatives. The result may be measured as an increased average order value. |
| Improve Customer Retention | Annual revenue ($); Current number of customers; Current annual retention rate (%); Target annual retention rate (%); Gross margin (%) | Improved business processes can lead to better monitoring and control of all transactions, and hence better customer service and happier customers. Improving customer retention impacts the annual turnover and therefore increases revenue year on year. |
| Customer Acquisition Cost Saving | Number of new customers acquired each year; Current cost of acquisition ($); New cost of acquisition with D365 ($) | The current acquisition cost is known, and a reduction will result from improved data availability and more efficient business practices. |
| Re-engage Customers | Number of lapsed customers contacted; Estimated conversion rate (%); Average annual revenue per customer ($); Gross margin (%) | Lapsed customers are easier to re-convert than new customers are to find, so contacting them should result in a high conversion rate. Experience shows that lapsed customers can be your hottest prospects. |
| Improve Bid Win Rate | Number of bids per year; Average bid size ($); Current bid success rate (%); Revised bid success rate (%); Gross margin (%) | Improved information provided by D365 increases the quality of bid responses, leading to higher conversion rates. |
| Increase Customer Self-Service | Annual number of transactions; Percentage of total that could be self-served (%); Current average cost per transaction (not self-serve) ($); Average cost for self-service ($); Percentage of possible transactions that could be encouraged to self-serve (%) | There is a defined cost for serving customer transactions which is greater than the cost that would be incurred if the transactions could be self-served. D365 will move a proportion of transactions to the self-serve option. |

Business Intelligence benefits

Benefit Input values Assumption
Improved Campaign Success -Number of campaigns per year

-Average revenue generated by campaign ($)

-Improvement (increase in number or success rate) (%)

-Gross margin (%)

Improved customer data availability will lead to better targeted campaigns and potentially additional ones within the existing budget leading to improved revenue.
Analytics Savings -Time spent on analytics (total FTE hours per month)

-Annual loaded cost per person ($)

-Analytics efficiency improvement (%)

-Annual non-labor analytics costs ($)

-Reduction in non-labor costs (%)

D365 reduces the average time spent on data preparation for analytics resulting in savings based on both labor time and other non-labor costs.
Model Deployment Savings -Annual number of models

-Average time taken per deployment (working days)

-Annual loaded cost per person ($)

-Reduction in deployment time (%)

-Annual non-labor cost per model deployment ($)

-Reduction in non-labor costs (%)

D365 reduces the average time to create and deploy models resulting in savings based on labor time and other non-labor costs
Reporting Savings -Time spent reporting (total FTE hours per month)

-Annual loaded cost per person ($)

-Reporting efficiency improvement with D365 (%)

-Annual non-labor reporting costs ($)

-Reduction in non-labor costs (%)

D365 reduces the average time spent on data preparation for reporting resulting in savings based on both labor time and other non-labor costs
Data Quality Savings -Time spent on data quality (total FTE hours per month)

-Annual loaded cost per person ($)

-Data quality efficiency improvement (%)

-Annual non-labor data quality costs ($)

-Reduction in non-labor costs (%)

D365 reduces the average time spent on data quality assurance resulting in savings based on both labor time and other non-labor costs.
Consulting Cost Savings -Current annual spend on consultancy or other 3rd parties ($)

-Percentage reduction in spend (%)

D365 can help optimize consulting costs and may be able to eliminate them.
Improved Legacy System Access -IT time spent accessing legacy systems (total FTE hours per day)

-Annual loaded cost per person ($)

-Improvement in legacy system access (%)

-Annual non-labor cost of legacy systems access ($)

-Reduction in non-labor costs (%)

D365 enables significantly faster access procedures which frees up IT time. Depending on the use of the data accessed the speed improvement can lead to more timely / accurate business decision making.
Fraud Reduction -Current annual fraud – missed revenues or refunds ($)

-Current rate of fraud detection (%)

-Rate of detection with D365 (%)

The D365 fraud protection can detect and therefore eliminate the fraud up to a maximum of 100 percentage of the current level.
Earlier Fraud Detection -Total value of annual fraud recovery ($)

-Current average age of claims to be recovered (months)

-Revised average age of claims (months)

-Annual interest cost (%)

The D365 fraud protection will allow the detection of fraudulent claims and will allow the business to take action sooner thus speeding up the recovery process.
Workforce Optimization -Total number of staff-Annual staff turnover (churn) % (%)-Annual loaded cost per person ($)

-Revised annual loaded cost per person ($)

The assumption that the staff that are lost throughout the year can be replaced by lower cost staff. Some D365s use familiar menus and structures which mean lower cost staff can be employed.
Reduce Skills Duplication -Number of staff currently engaged

-Efficiency improvement (%)

-Annual loaded cost per person ($)

D365 reduces the number of duplicated activities, and a percentage of the staff can be re-deployed thus resulting in savings.
Better Forecasting – Control Inventory -Current value of inventory ($)

-Percentage reduction with improved forecasting (%)

-Annual interest cost (%)

Improved customer information leads to improved forecasting which will allow better planning for inventory requirements. Not only will more inventory outages be avoided, but the amount of working capital reduced through the need for lower buffer inventory.
Better Forecasting – Liquidity -Current capital requirements ($)

-Reduction in capital requirements with D365 (%)

-Annual interest cost (%)

Improved business processes will result in a reduction in the liquidity requirements thus freeing up cash.
Avoid Non-competitive Propositions -Annual Revenue ($)

-Current gross margin (%)

-Revised gross margin (%)

D365 gives visibility of order profitability that was not previously available. The orders can be dealt with on the basis of their profitability rather than passively dealing with every request that arrives.
Earlier / Faster Responses to Losses -Monthly losses that may be avoided ($)

-Current time before losses can be avoided (calendar days)

-Revised time (calendar days)

-Additional permanent avoidable monthly losses ($)

D365 enables faster identification of impending losses that may be avoided.
Appropriate Customer Servicing Cost -Total number of customers

-% paying for and receiving ‘Gold’ service (%)

-Percentage of remainder incorrectly receiving ‘Gold’ service (%)

-Annual cost to serve ‘Gold’ customer ($)

-Annual cost to serve standard customer ($)

There is a defined cost for serving those customers at the higher level services and it is higher than that for standard level service. D365 will ensure that those that have paid actually receive the appropriate level of service. The savings come from ensuring customers who have paid for the standard service do not incorrectly receive the higher, more costly service.
First Time Resolution – Escalations Savings -Number of escalations managed per month

-Total FTE time to manage an escalation (hours)

-Annual loaded cost per person ($)

-Other non-labor costs per escalation ($)

-Reduction in escalations (%)

The quantity of escalations can be reduced by ensuring calls are resolved the first time round. This can be achieved by giving all agents access to a 360-degree view of the customer. Making sure you have the right skills, equipment, and parts the first time reduces escalations.
Staffing Demand Forecasting -Number of permanent staff required to cope with peaks of demand

-Percentage of permanent staff replaced by temps to cope with peaks (%)

-Annual loaded cost per person ($)

-% premium paid for temporary staff (%)

-Percentage of time when temporary staff required (%)

The assumption is that a percentage of the staff affected by the proposed D365 can be re-deployed.
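Most of the entries in this table reduce to simple arithmetic on their listed inputs. As an illustration (the functions and figures below are my own sketch, not from the source), here are two of them in Python:

```python
def workforce_optimization_saving(total_staff, churn_pct, cost, revised_cost):
    """Leavers are assumed to be replaced at the lower, revised loaded cost."""
    replaced_per_year = total_staff * churn_pct / 100
    return replaced_per_year * (cost - revised_cost)

def gold_service_saving(customers, gold_pct, incorrect_pct, gold_cost, std_cost):
    """Stop giving 'Gold' service to standard customers who never paid for it."""
    standard = customers * (100 - gold_pct) / 100
    incorrectly_gold = standard * incorrect_pct / 100
    return incorrectly_gold * (gold_cost - std_cost)

# 200 staff, 10% churn, $80k replaced by $70k hires -> $200,000/year
print(workforce_optimization_saving(200, 10, 80_000, 70_000))
# 10,000 customers, 10% Gold, 5% of the rest wrongly on Gold -> $135,000/year
print(gold_service_saving(10_000, 10, 5, 500, 200))
```

The other entries follow the same pattern: multiply the affected volume by the unit cost difference the assumption describes.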

IS/IT related benefits

Benefit Input values Assumption
Technology Refresh Avoided -Cost of planned technology refresh that will be avoided ($)

-Cost of the technology evaluation & research needed in preparation for technology refresh ($)

-Cost of training staff to operate the new technology ($)

-Cost of implementation of the new technology ($)

-How long until refresh due (months)

Every 3 to 5 years (36 to 60 months), new technology arrives that replaces current systems. With D365, it is possible to avoid this spend on new technology.
IT Maintenance Cost Reduction -Current annual support spend ($)

-Revised annual support spend ($)

Improved business processes will result in a reduction in the costs incurred in supporting them. This is brought about by improved efficiency, resilience, reduced numbers of discrete functions etc.
HW & OS Support Savings -Current annual support spend ($)

-Proportion of support spend saved (%)

Managed services result in a reduction in the costs incurred in supporting equipment. This is brought about by improved efficiency, resilience, a reduced number of discrete functions, etc. A general rule of thumb is that maintenance costs represent 80% of the overall costs of an application throughout its life. A single data repository, improved impact-analysis capabilities, and a drag-and-drop design interface could reduce integration maintenance time and costs by a factor of 2 or 3.
Asset Management Cost Reduction -Value of assets under management ($)

-Assets not re-purchased annually (%)

-People required to manage assets

-Annual loaded cost per person ($)

-Management effort reduction (%)

Improved business processes can lead to better monitoring and control of assets. As a consequence, assets are not duplicated and there is a reduction in management effort.
Avoid Software License Costs -Current annual license costs ($)

-Reduction (%)

Licensing costs for the existing systems are known, and a proportion of those licenses are made redundant by the new D365.
Contractor Costs Reduction -Number of contractors / temporary staff engaged

-Number of days worked per year per contractor

-Daily rate per completed working day ($)

-Reduction with D365 (%)

Contractors are required for many reasons including supporting legacy systems. Their activity may not be needed with the proposed D365.
Help Desk Savings -Number of help desk calls handled per month

-Average time spent per call (minutes)

-Annual loaded cost per person ($)

-Non-labor cost per call of help desk services ($)

-Percentage reduction in calls (%)

D365 may significantly reduce the calls coming in to help desks by avoiding calls to retrieve old revisions of documents.
Reduce Trouble Ticket Volume -Annual number of trouble tickets

-Cost per trouble ticket ($)

-Reduction in trouble tickets (%)

The assumption is that the number of trouble tickets can be reduced following the implementation of D365.
Time Savings – IT User -Total number of IT users

-Waiting time per user (minutes per day)

-Annual loaded cost per person ($)

-Percentage waiting time saved (%)

D365 frees up users’ wasted time.
IT Staff Savings -Number of ICT staff currently engaged

-Efficiency improvement (%)

-Annual average loaded cost per person ($)

The assumption is that a percentage of the staff affected by the proposed D365 can be re-deployed to other tasks.
Outage Reduction – SLA Improvement -Outage cost per hour

-Current uptime (or SLA) (%)

-Revised uptime (or SLA) (%)

The outage cost is known and the SLA offered is better than the existing one. It is assumed that the service is provided for 24 hours per day and 365 days per year i.e. 8,760 hours per year.
Reduce Downtime Costs -Current downtime per month (hours)

-Downtime cost per hour ($)

-Reduction in downtime (%)

Improved business processes will result in faster, more secure recovery of data in the event of a loss of system or the absence of a key person, thus reducing downtime.
Servers Saved -Number of servers used for business processes

-Proportion saved/avoided with D365 (%)

-Annual external charges per server (3rd party etc.) ($)

-Annual internal charges per server (allocation) ($)

The proposed D365 uses existing servers more efficiently making a proportion of new server purchases unnecessary or re-deployable to other areas of the business. The monthly 3rd party and / or annual internal cost of maintaining the infrastructure is thus saved.
Datacenter Power Efficiencies -Total installed equipment power rating (kW)

-Current datacenter power usage effectiveness (PUE)

-Alternative datacenter PUE

-Cost per kWh ($)

-Carbon cost per ton ($)

Cloud services can reduce energy consumption and CO2 production.
Energy and Cooling Savings -Cost per kWh ($)

-Present consumption (kWh per month)

-Percentage saving (%)

D365 requires less hardware and therefore leads to reduced energy consumption and running costs.
Update / Patch Deployment Savings -Number of update and/or patch deployments per year

-Current time taken for each deployment (hours)

-Annual loaded cost per person ($)

-Revised time taken for each deployment (hours)

Deployments are performed multiple times throughout the year.
Faster Application Deployment -Number of application deployments per year

-Current time taken for each deployment (hours)

-Annual loaded cost per person ($)

-Revised time taken for each deployment (hours)

Deployments can be performed faster on a cloud D365.
Internal Development Costs Avoided -Annual cost of developing D365 internally ($)

-Reduction in development costs (%)

The customer is considering developing a solution internally. Costs such as manpower, tools, etc. can be avoided with the proposed D365.
Reduce Application Integration Costs -Annual spend on application integration (internal) ($)

-Reduction in internal application integration spend (%)

-Annual spend on application integration (outsourced) ($)

-Reduction in outsourced application integration spend (%)

-Other annual costs avoided (e.g. testing) ($)

D365 has the capability to integrate with other systems. This can reduce the need for expensive IT resource to integrate applications. The task can be done instead by a business administrator.
Avoid Systems Integration Costs -Annual spend on Systems Integration ($)

-Reduction (%)

Current SI costs are known, and a proportion of those are avoided by the proposed D365.
Disaster Recovery Savings -Time spent on DR capability maintenance (total hours per month for all staff)

-Annual loaded cost per person ($)

-Time saving (%)

-Annual non-labor DR costs ($)

-Reduction in non-labor costs (%)

D365 reduces the average time spent on DR resulting in savings based on both labor time and other non-labor costs.
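The IS/IT benefits follow the same pattern of simple arithmetic. For example, the SLA and datacenter-PUE entries above could be computed like this (illustrative only; the 0.4 kg CO2/kWh grid emission factor is my own assumption, not a figure from the source):

```python
HOURS_PER_YEAR = 8_760  # 24 hours x 365 days, as stated in the SLA entry

def sla_improvement_saving(outage_cost_per_hour, current_uptime_pct, revised_uptime_pct):
    """Annual saving from the extra uptime hours of the better SLA."""
    extra_hours = HOURS_PER_YEAR * (revised_uptime_pct - current_uptime_pct) / 100
    return extra_hours * outage_cost_per_hour

def pue_saving(it_load_kw, current_pue, alternative_pue,
               cost_per_kwh, carbon_cost_per_ton, kg_co2_per_kwh=0.4):
    """Annual energy + carbon saving from a datacenter with a better PUE."""
    kwh_saved = it_load_kw * HOURS_PER_YEAR * (current_pue - alternative_pue)
    energy = kwh_saved * cost_per_kwh
    carbon = kwh_saved * kg_co2_per_kwh / 1_000 * carbon_cost_per_ton
    return energy + carbon

# $1,000/hour outage cost, uptime 99.5% -> 99.9%: roughly $35,040/year
print(sla_improvement_saving(1_000, 99.5, 99.9))
# 100 kW IT load, PUE 1.8 -> 1.2, $0.10/kWh, $50/ton CO2: roughly $63,072/year
print(pue_saving(100, 1.8, 1.2, 0.10, 50))
```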

MB-340: Microsoft Dynamics 365 Commerce Functional Consultant (beta)

Microsoft have released a new exam for the Commerce functional consultant. I took this exam at the beginning of July, and it can take a few weeks before the results arrive. But I’m pretty sure I will fail. The exam was quite hard and there was just too little time. I got 62 questions, but only managed to answer approx. 45 of them within the timeframe of 90 minutes. If you think MB-300 was difficult, then MB-340 is harder!

This blogpost is therefore my study notebook for retaking the exam (if I fail 😊). You will not find any of the exam questions or answers here, but if you follow these steps, I think you will be closer to achieving your goal of passing the MB-340 exam, and also to understanding the topic better. The chapters and headlines are organized according to the skills measured list of MB-340.

Disclaimer: Some texts and pictures are copied from Microsoft Learn, Docs, demo environments, and other sites. In most cases you should find a hyperlink to the original source for a deeper study, and there are no guarantees that everything here is right. You should always refer to the official Microsoft documentation and training.

Enjoy, and good luck studying.

Configure Dynamics 365 Commerce Headquarters (25-30%)

Configure prerequisites and commerce parameters

Create employee and customer address books

Employee and customer address books are used to limit personnel to specific stores/POS, or to control what information the employee should see in HQ. How to create an address book is documented here. One nice feature is that the address book is the basis for the extensible data security (XDS) policy that limits which stores and transactions an HQ user can access. To make XDS work, remember to assign the correct Retail* roles to the users. This ensures that the address book filtering works as expected.

What I normally recommend is to have one address book per store, one for the region, one global address book, and one address book for customers. If there are multiple brands, I also have a customer address book per brand. Keep in mind that you can select multiple address books for stores, employees, and customers. I also like to think of address books as a hierarchy; this allows regional managers to see their information and transactions across stores, while a store clerk only sees information relevant to his job.

Configure and manage retail workers

Retail workers are set up using features from HR, but the workers table is accessible from the retail menu. Commerce lets you perform the following employee management functions:

  • Create entities for jobs, positions, and workers
  • Assign workers to more than one store
  • Configure different language settings for each user and user-based POS screen layouts
  • Limit the list of allowed operations

Additionally, you can configure the POS permission groups to associate different employees with different roles. The task management feature offers productivity enhancements for employees at the stores, where task lists can be assigned by the system.

Setting up the structure of workers and responsibilities is an important task, and doing it the right way can save you from a lot of work. The use of positions and jobs can ensure that users get the correct level of access. Let’s say we have the retail organization shown in the next figure.

Here we have 2 stores, and it is relevant to ensure that the employees in one store only access the store they are working in.   We could model this hierarchy by assigning multiple address books to workers, customers and stores.

The workers can be connected to a position like “Store manager Bergen”. The position is then connected to a job like “Store manager”. The job can then be connected to a POS permission group. Then you don’t need to assign individual POS permissions; it is the position that decides what is allowed.

Please spend time on this HR part, because it gives a lot of benefits when done properly.  And please add pictures of the employee.  It gives a more personal touch to the user😊

The permission group assigned to a job (or overridden on the worker) is the feature that controls what the worker is allowed to do in the POS. The fields are mostly self-explanatory, but unfortunately not very well documented.

Here are some additional blogs on the subject.

Assign address books to customers, channels, and workers

Assigning address books to these records is easy.  Find the address book field, click on the lookup button and select the address books that the record should be a part of.

A small tip: you cannot just copy/paste the address book values when marking the record with multiple address books. You need to actually click on the lookup and then select each of them.

Create email templates and email notification profiles

The email templates are located at Organization administration –> Setup –> Organization email templates. The email template can be defined per language.

If you click on the “edit” button there is an upload capability in the screen.  You also see a list of placeholders, where the system will fill in the specific information.  The exact placeholders available can be found here.

Here is a small but valuable tip from me that makes it easy to manually transfer your templates between installations: add the column “Email”.

Then a field with all the HTML is available for copy/paste. Press CTRL-A to mark all contents, and then you can paste it into any HTML editor. This is great if you just want to copy email templates from the Contoso demo data and make some minor adjustments.

In a typical retail installation you would create quite a few email templates, and in the Contoso demo environment you can see a relevant set, linked to a commerce email notification profile. As you see here, there are emails for new order, confirmation, shipping, etc. You can also create separate emails per mode of delivery.

To configure the email batch, the following docs are relevant to read.

Configure organizational hierarchies and hierarchy purposes

Organizational hierarchies are a powerful way of grouping a set of stores so you can view and report from various perspectives. I often make hierarchies based on reporting purposes that mimic the hierarchy of the address books, but you may also bring geographical elements into the hierarchy. Keep in mind that the levels in the hierarchy correspond to internal organizations of the party entity, and that the retail channel is most often the lowest level. Therefore, think through this setup carefully to ensure a correct structure. Also take a look at the following video by André Arnaud de Calavon for a valuable walkthrough of the feature. Remember that when you publish a hierarchy you select a date, and it is not allowed to make changes prior to this date. I therefore recommend that you publish the hierarchy on an earlier date, so that you have the possibility to make corrections.

The organizational hierarchies can be used in a few places, like on assortments, where you can select which organizational entities should have a specific assortment. You also use them for financial and transactional statement processing. One relevant use is in sales reporting, where the reporting is grouped according to the organizational hierarchy.

With some minor extensions it is possible to add additional levels, like region, franchisee, etc., by creating additional types of internal organizations. I have done this at a few customers with success 😊

Configure Commerce shared parameters

The shared parameters are common to all legal entities, and the most important setup here is the number sequences and identity providers.

Configure company-specific Commerce parameters

The Commerce parameters are per legal entity and contain the more specific setup and number sequences. One important element is the “Initialize” function, which creates the default configuration data.

Microsoft often have a set of configuration parameters for enabling features, like the “ProductSearch.UseAzureSearch”, that is essential when setting up the eCommerce.

The parameters here are actually not very well documented, and it often takes some time “googling” or asking the community to find answers about each of them. But most should be self-explanatory if you have been working with Dynamics 365 for a while.

Describe and configure additional functionality

Create and configure channel and sales order attributes

Attributes are documented here. Channel attributes are attributes that are captured on the transactions that originate from the channel/store. These attributes can then be used for information and reporting purposes. Let’s say we want to record whether the customer was happy when performing a purchase. The first thing we need to do is create an attribute type:

Then we create an attribute based on this attribute type.

I would then place the attribute into an attribute group, as I will probably later add more attributes I would like to record per sales order.

On the store/channel, I will add the happy group, and state that I would like to record this on the order header.

This will allow me to record whether the customer was happy when we performed the transaction. (Remember to run distribution job 1070 first.)

When we then have performed a transaction, and run the P-job for importing the transactions to HQ, we can see the recorded attribute on the transaction.

As seen here, attributes open up many possibilities for recording information happening in your sales channel. Keep in mind that when using the call center, the attributes also come into play, and you can record attributes when creating the sales order.

Currently only simple strings are supported, but more types will be available in later releases.

I hope you can see that attributes in D365 are very flexible and allow for a lot of scenarios. I also see that the feature is underused in implementations, and I hope that by using attributes we can avoid a lot of extensions related to data recording.

Using attributes in Power BI is also great for more in-depth analytics.

Configure commissions and sales representatives

Sales commission can be a nice way to reward hardworking employees who manage to close the right deals. The documentation is available here; also take a look at the following blog for how to set up commission groups. The way this works is that when performing a sale, the sales order is stamped with a sales representative, and in the sales group each sales representative can have a commission share.

I can then set up a commission calculation where, in this case, I give a commission of 2% on the revenue.

This will be posted to ledger according to the commission posting.
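As a back-of-the-envelope illustration of the calculation above (the rep names and share percentages are invented, not from the source), the 2% revenue commission split across a sales group could look like:

```python
def commission_per_rep(order_revenue, rate_pct, rep_shares):
    """Split a revenue-based commission pool across the sales reps in a
    commission sales group according to their share percentages."""
    pool = order_revenue * rate_pct / 100
    return {rep: pool * share / 100 for rep, share in rep_shares.items()}

# a $10,000 order at 2%: a $200 pool split 60/40 between two reps
print(commission_per_rep(10_000, 2, {"Alice": 60, "Bob": 40}))
```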

The process of paying out commissions to the sales representatives is not very well documented. So I guess the process is a bit manual: look at the breakdown on a sales order, click the Invoice tab on the specified sales order, and then select Invoice journals, which opens a new form. In that form, select Commission and click Commission transactions.

If you want a list of all commission transactions, you may add the following to your D365 URL: “&mi=CommissionTrans”. This will list commissions per sales representative.
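The deep-link trick above is just query-string concatenation; a small hypothetical helper makes the pattern explicit (the environment URL below is invented):

```python
def d365_menu_item_url(base_url, menu_item):
    """Build a deep link to a D365FO form by appending the 'mi'
    (menu item) query parameter, as in the tip above."""
    separator = "&" if "?" in base_url else "?"
    return f"{base_url}{separator}mi={menu_item}"

# hypothetical environment URL
print(d365_menu_item_url("https://myenv.operations.dynamics.com/?cmp=USMF",
                         "CommissionTrans"))
```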

If you cannot see the menu item, it is because you have enabled the project operations integration at “Global project management and accounting parameters”. It seems that there is a “collision” between the commission feature and the new project operations integration feature. I have no clue why Microsoft have done this.

Configure payment methods and card types

The menu item for these is located at Retail and commerce –> Channel setup –> Payment methods.

You start with defining the card types:

The next step is to define the card numbers, which are used to identify the card type based on the card number.

Payment methods list the acceptable payment methods. If you are looking for the Norwegian SAF-T cash register codes, you can find them here.

Configure and manage gift cards

The menu items for gift cards are the following. In addition, you have to define a service product that will be used when adding amounts to a card.

The gift card is essentially a card number with a balance.

For gift cards, the following documentation gives some insight, especially if you need to integrate with an external gift card provider. If a retailer’s operations are run entirely in D365, internal gift cards are the best solution. For complex retailers that span multiple countries or regions, and multiple point of sale (POS) systems, it’s often best to use a third party to manage gift card balances and enable gift cards to be used across those systems. The out-of-box payment connector for Adyen supports external gift cards through SVS and Givex in POS, the call center, and the e-commerce storefront.

Describe Omni-channel capabilities including payments, orders, and returns

BOPIS is a term for “buy online, pick up in store”. The following omni-channel payment scenarios are supported:

  • Buy online, pick up in store
  • Buy in call center, pick up in store
  • Buy in store A, pick up in store B
  • Buy in store A, ship to customer

There is a lot to write about this, but let’s show an example where we buy some snacks online and pick them up in the store. Here is my Vitalia site, where I have a snack.

When I click on this, I see that this item is available for pickup in store.

I select to pick up the item in the Bergen flagship store. Then I check out.

Then I can pick up this order in my CPOS:

I can then choose a payment method, and the transaction is complete. I even got some nice confirmation messages by email.

Also check out the omni-channel payments overview and Payments FAQ for additional information on payment options in an omni-channel scenario. Pay special attention to tokens, which represent payment card numbers, payment authorizations, and previous payment captures. Tokens are important because they help keep sensitive data out of the point of sale (POS) system.

Configure data distribution

The database that stores data for a channel is separate from the Commerce database. The channel database holds only the data that is required for transactions. For example, master data is set up in Headquarters and then distributed to channels; on the other side of the transaction, transactional data is created in the point of sale (POS) or the online store and then uploaded to Headquarters. Microsoft have created an updated Commerce Data Exchange best practices article that is quite new (15/7/2021) and discusses some topics worth exploring. The following figure shows the flow of the distributions.

There is a set of data that requires real-time direct access:

  • Issuing and redeeming gift cards
  • Redeeming loyalty points
  • Issuing and redeeming credit memos
  • Creating and updating customer records
  • Creating, updating, and completing sales orders
  • Receiving inventory against a purchase order or transfer order
  • Performing inventory counts
  • Retrieving sales transactions across stores and completing return transactions

Create info codes, sub-codes, and info code groups

Info codes are used to capture additional information about actions that occur in the POS and call center Commerce channels. Depending on the input type, users can assign various limitations and restrictions to certain info code types. Some info codes can require input, trigger only once per transaction (regardless of products), link multiple info codes together, and more.

Describe Dynamics 365 Fraud Protection purchase protection, loss prevention, and account protection

The following image shows several examples of what type of fraud can occur at different phases in the customer journey.

Fraud Protection offers three capabilities that can be integrated together or used individually, giving merchants the option to use the capability that best suits their business needs. Here is a picture of how the application looks.

Purchase Protection – Helps merchants preserve genuine purchases and decrease fraud. It helps protect online revenue by improving the acceptance rates of commerce transactions with insights and tools that help balance revenue opportunity versus fraud loss and checkout friction.

Account Protection – Helps merchants combat account creation and account takeover fraud. It helps improve customer experience at critical steps of the account life cycle by enabling merchants to block fraudulent activities and protect their customers’ accounts.

Loss Prevention – Helps protect revenue by identifying anomalies and potential fraud that are affecting returns and discounts that arise from omnichannel purchases, enabling store managers and loss prevention officers to quickly take action to mitigate losses.

Manage statements

Describe advantages of using trickle feed-based posting

With trickle feed-based order creation, transactions are processed throughout the day, and only the financial reconciliation of tenders and other cash management transactions are processed at the end of the day. This functionality splits the load of creating sales orders, invoices, and payments throughout the day, providing better perceived performance and the ability to recognize revenue and payments in the books in near real-time.

See also the following blog post.

Validate retail transactions by using the transaction consistency checker

When a statement is posted, posting can fail due to inconsistent data in the commerce transaction tables. The data issue may be caused by unforeseen issues in the point of sale (POS) application, or by transactions that were incorrectly imported from third-party POS systems. Examples of how these inconsistencies may appear include:

  • The transaction total on the header table does not match the transaction total on the lines.
  • The line count on the header table does not match with the number of lines in the transaction table.
  • Taxes on the header table do not match the tax amount on the lines.

Transactions that fail the validation check and transactions that have not yet been validated will not be pulled into statements. During the “Calculate statement” process, users will be notified if there are transactions that could have been included in the statement but weren’t. If a validation error is found, there is a capability so that users can fix the records that failed through the user interface. Logging and auditing capabilities will also be made available to trace the history of the modifications.
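The three checks listed above are easy to picture as a small sketch (the field names below are hypothetical; the real consistency checker works against the commerce transaction tables):

```python
def check_transaction(header, lines):
    """Sketch of the consistency checks described above: header total,
    line count, and tax amounts must all agree with the lines."""
    errors = []
    if abs(header["total"] - sum(l["amount"] for l in lines)) > 0.005:
        errors.append("Header total does not match sum of line amounts")
    if header["line_count"] != len(lines):
        errors.append("Header line count does not match number of lines")
    if abs(header["tax"] - sum(l["tax"] for l in lines)) > 0.005:
        errors.append("Header tax does not match sum of line taxes")
    return errors

# a consistent transaction passes all three checks
header = {"total": 30.0, "line_count": 2, "tax": 5.0}
lines = [{"amount": 10.0, "tax": 2.0}, {"amount": 20.0, "tax": 3.0}]
print(check_transaction(header, lines))
```

A transaction that returns a non-empty error list would be held back from the statement, mirroring how unvalidated transactions are excluded until they are fixed.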

Configure and manage retail statement calculations and posting

The statement posting process uses the distribution schedule to pull a set of POS transactions into the headquarters (HQ) client. The following illustration shows the statement posting process. In this process, transactions that are recorded in the POS are transmitted to the client by using the Commerce scheduler. After the client receives the transactions, you can create, calculate, and post the transaction statement for the store.

When I set up these, I normally schedule them as batch jobs.

Troubleshoot statement posting issues

When posting retail statements, there are a lot of checks to ensure that the transactions are correctly posted. In addition to the retail transaction consistency check, we often see issues that are common when invoice-updating a sales order. This involves ensuring that the inventory posting is set up correctly, currency rounding, financial dimensions, etc. I have also several times encountered issues related to the inventory being closed when the transactions are imported. Most of them can be fixed from within the D365 application. The posting error should in most cases look like this:

A quite common error is that the cash declaration is wrong. In such cases it means that there is either too much or too little cash in the EOD statement. Then you need to figure out why cash is missing; this can be faulty cash handling or even fraud. When the reason has been found, correct the statement and manually create a journal/voucher that settles the difference. If it becomes too complicated, a Microsoft support ticket may be required.

Configure Distributed Order Management (DOM)

DOM is a quite new feature. If I should boil it down to a single sentence, I would say that the end result selects which warehouse or store the sales order lines should be delivered from, including the ability to pick up the order in POS or ship it from POS. The concept is to have DOM process the orders through a set of profiles and then create a fulfillment plan.

I have not performed any implementations of DOM myself yet, and I also see that parts of the functionality will soon be integrated with the Dynamics 365 Intelligent Order Management order orchestration capabilities. Here are some videos on the topic on YouTube: https://www.youtube.com/watch?v=CTAYXXj5Cak and https://www.youtube.com/watch?v=-0PvV3-7wZs

Configure fulfillment profiles

The fulfillment profiles are the main area where you can control which store or fulfillment center should deliver the order. In the picture below, I have created a very simplified rule stating that I don’t want to have more than 100 open orders per store.

Then I add the rule to a fulfillment profile to balance the order volume across my physical stores for pickup and delivery, and to ensure that no store is assigned more than 100 open sales orders in its queue per day.
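To make the rule concrete, here is a hypothetical Python sketch of how such a “maximum open orders per store” restriction could drive assignment (the store names and data shape are invented; the real DOM solver is far more sophisticated):

```python
def assign_orders(orders, stores, max_open=100, open_orders=None):
    """Assign each order to the eligible store with the fewest open
    orders, honoring a 'max open orders per store' restriction.
    Orders that no store can take become exceptions."""
    open_orders = dict(open_orders or {s: 0 for s in stores})
    plan, exceptions = {}, []
    for order in orders:
        eligible = [s for s in stores if open_orders.get(s, 0) < max_open]
        if not eligible:
            exceptions.append(order)  # shows up as an exception line
            continue
        store = min(eligible, key=lambda s: open_orders.get(s, 0))
        plan[order] = store
        open_orders[store] = open_orders.get(store, 0) + 1
    return plan, exceptions

plan, exceptions = assign_orders(["o1", "o2", "o3"], ["Bergen", "Oslo"], max_open=1)
print(plan, exceptions)
```

With `max_open=1` in this toy run, two orders are balanced across the two stores and the third lands in the exception list, much like the exception lines you monitor in the DOM workspace.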

As seen in the documentation, there are 7 rules that you can apply in combination.

Configure cost components including shipping, handling, and packaging costs

The cost configuration feature lets retailers define and configure additional cost components that will be calculated and factored in to determine the optimal location to fulfill order lines from. Here is a cost factor of $3 per order line.

And I can add this to the fulfillment profile.

The idea here is to have multiple profiles, so the optimization engine can select the best possible outcome based on the rules/restrictions and cost factors. So you should create multiple fulfillment groups that the engine can select among: typically one for DCs (distribution centers) and one or more for other fulfillment centers or stores. I have created several order-processing engines in the past (AX 2009 and AX 2012), and DOM seems to touch the basics of these requirements. But I still feel the solution has the potential for a lot more rules covering ATP, wait/hold, parallelization, smart reservation, optimal order mix, etc. I guess much of this can be created as extensions.
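To illustrate how cost components might steer the engine’s choice, here is a hypothetical sketch (the $3-per-line figure mirrors the example above; the location names and cost fields are invented, and the real solver optimizes across the whole order, not line by line):

```python
def cheapest_location(line_qty, candidates, per_line_cost=3.0):
    """Pick the candidate fulfillment location with the lowest total
    cost for one order line: per-unit shipping + handling + a flat
    per-line cost component (as in the $3/line example above)."""
    def total_cost(loc):
        return loc["shipping_per_unit"] * line_qty + loc["handling"] + per_line_cost

    return min(candidates, key=total_cost)["name"]

locations = [
    {"name": "DC", "shipping_per_unit": 1.0, "handling": 2.0},
    {"name": "Store Bergen", "shipping_per_unit": 0.5, "handling": 5.0},
]
# for 2 units: DC = 1*2 + 2 + 3 = 7.0; Store Bergen = 0.5*2 + 5 + 3 = 9.0
print(cheapest_location(2, locations))
```

Note how the break-even shifts with quantity: for larger lines the lower per-unit shipping of the store eventually beats the DC’s lower handling cost.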

Configure management rules and parameters

The parameters for DOM include two calculation types: Production Solver and Simplified Solver. The Production Solver requires a special license key that, by default, is licensed and deployed in production environments. For non-production environments, this license key must be deployed manually. The Production Solver improves performance (such as the number of orders and order lines that can be handled in a run) and convergence of results (since a batch of orders might not yield the best result in some scenarios). Some rules, like the Partial orders rule and the Maximum number of locations rule, require the Production Solver.

Monitor fulfillment plans and order exceptions

DOM has a workspace where the current fulfillment can be monitored.  What is most relevant to keep an eye on is the exception lines, because they show lines that fail fulfillment processing.

The result of the fulfillment plan shows how the order is fulfilled and where the fulfillment should happen from.

It is important to understand that DOM only looks at orders that are created from commerce channels (call center, POS, and e-Commerce). Sales orders are identified as commerce orders when the Commerce sale option is set to Yes. Also keep in mind that DOM hasn’t been thoroughly tested with advanced warehouse management features.

Configure order fulfillment

Dynamics 365 has the capability to fulfill and deliver sales orders from both stores and distribution centers (DC).  This is one of the elements of creating a true omnichannel experience.  Buy-online-pickup-in-store is one of these scenarios, but so is the capability of transferring goods to other stores or back to the DC to ease overstock scenarios. The result of the fulfillment can be seen in the POS in the menu items “Orders to pick up” and “Orders to ship”.  I also suggest reading the following article, that explains the fulfillment statuses you get on the sales order lines.

In POS the sales orders will show up like this, and you may start the process of clicking on the fulfill button.

Then the lines show up, and you can accept all or some of the lines. 

After you have accepted the lines, you may Pick, Pack and Ship the order.

As you move through the pick, pack, and ship steps, you will see the sales order fulfillment status being updated in real time in the sales order screen. Packing slip and invoice are updated just like when handling an HQ sales order.

Configure modes of delivery including shipments, pick up, and carry out

In the modes of delivery you may specify which modes of delivery are applicable to which retail channels, products, and addresses. You may also add a transport delivery.

On each store/channel you may specify which modes of delivery are applicable to that store.

In the commerce parameters you may specify which modes of delivery are pickup modes, meaning that the customer comes into the store to pick up the goods ordered online or by calling in.

Configure curbside customer order pickup

You have to turn on the Support for multiple pickup delivery modes feature. Keep in mind that DOM ignores any sales lines that are marked for store pickup.  Also check out the order pickup Time Slot Management to ensure that customers can select the timeslot that is the most convenient for them, and to ensure that the store does not become overcrowded. There are elements in the roadmap that are worth taking a look at on this matter. Also check out how you can set up customer check-in notifications in POS, where a check-in confirmation task is created in POS.

If you create the sales order using the call center, you will see that you can select the pickup time range per line, and also see how many available slots there are.

In POS this order will show up under the order fulfillment with the curbside pickup date/time. 

Also take a look at the Commerce Teams integration that allows POS tasks to be published to Teams.  This means that tasks like curbside pickups will appear in the user’s Teams app, and the tasks will also be available in Microsoft Planner.

Configure charge codes, charge groups, and automatic charges

Charges are often used to add freight charges and handling fees to customer orders or sales transactions created in the POS, call center, or e-Commerce channels.  If you use the Use advanced auto-charges parameter, the behavior can be automated.  Take a look at the documentation for more examples and how best to set up automatic calculation of charges.

One thing that I feel is missing is the ability to define the charges based on product weight.  Most often, freight is calculated based on weight, volume, and distance.  I think there is room for improvement in this area. Let’s hope the Microsoft Commerce product team has this on their radar for future releases.

Configure and assign order fulfillment groups

Fulfillment groups are used in DOM, but also in POS, to define whether the warehouses or warehouse and store combinations that are defined in fulfillment groups can be used for shipping, for pickup, or for both shipping and pickup. This allows added flexibility for the business to determine which warehouses can be selected when creating a customer order for items to ship vs. which stores can be selected when creating a customer order for items to pick up.

To use these configuration options, turn on the Ability to specify locations as “Shipping” or “Pickup” enabled within Fulfillment group from feature management. If a warehouse that’s linked to a fulfillment group isn’t a store, it can be configured only as a shipping location. It can’t be used when orders for pickup are configured in POS.

Configure products, prices, discounts, loyalty, and affiliations (25-30%)

Product information is the backbone of supply chain and commerce applications across all industries. It’s crucial that shared product definitions, documentation, attributes, and identifiers be used correctly.  Do not take this process too lightly; it is essential that quality is put into these processes.  I have seen many projects struggle because they think just integrating a third-party PIM solution solves all problems.  You need to understand and correctly set up the structure and architecture of products to get the best possible outcome.  It is my experience that D365 product management is more than sufficient to describe products and prices.

Configure products and merchandising

Before you can offer products for resale in your commerce channels, you must create and configure the products. Commerce creates organization-wide products in the product master. You can create the products, define the product properties and attributes, and assign the products to commerce category hierarchies. To make the products available to your channels and add them to an active assortment, you must release the products to the legal entities where they are available. The documentation from Microsoft describes the steps quite well.

Configure product category hierarchies

For product hierarchies there are 3 essential types that should be set up:

  • Product hierarchy – Use this hierarchy type to define the overall product hierarchy for your organization. You can use this hierarchy type for merchandising, pricing and promotions, reporting, and assortment planning. Only one product hierarchy can be assigned this hierarchy type.
  • Supplemental hierarchy – Use this hierarchy type for any additional category hierarchies that you want to create. For example, in the spring, you have a promotion for swimwear. Therefore, you include your swimwear products in a separate category hierarchy and apply the promotional pricing to the various product categories.
  • Navigation hierarchy – Use this hierarchy type to group and organize products into categories so that the products can be browsed online or in POS.

When I’m implementing Commerce, I usually create quite a few supplemental hierarchies.  I often also mix in elements like a brand hierarchy, product-vendor hierarchy, and season and campaign hierarchies.  This makes it simpler and easier to navigate through products or to group them in Power BI. The great thing is that hierarchies allow you to create strong relations to commerce prices and discounts.

Configure product attributes and attribute groups

Product attributes describe the product and are visible in HQ, POS, and eCommerce.  Here are some examples of how they look in the different channels:



Dynamics 365 HQ (several forms are available, but here I show the channel categories and attributes screen to show that attributes can actually be defined per retail channel or organizational hierarchy)

Configure assortments and product catalogs

Assortments let you manage product availability across channels.  Keep in mind that organizational hierarchies and category hierarchies can be used to create more dynamic product assignments to stores.

Catalogs are currently mostly used in the call center. Initially the catalog features were created to support third-party e-Commerce integrations.  I think in future releases catalogs will come back to ensure that you can create B2B-based catalogs and use them for restricting products for different customers.

The following capabilities are planned as part of future releases:

  • Associate a default catalog to the e-commerce website.
  • Associate one or multiple catalogs to the customer hierarchies associated with a business partner organization and set a default catalog.
  • Associate a default catalog to the customers in the customer hierarchy.
  • Render the e-commerce site with products as per the catalog associated with the customer that’s logged in.
  • Provide capability for a customer to change the default catalog to another valid catalog on the e-commerce site.

Manage product labels and shelf labels

Shelf and product labels are not a solution to be proud of.  It is a dinosaur that I think needs some additional investment.  If you start on this path, expect to do some extensions.  What I have been waiting for is a BarTender integration that allows for a much smoother design and printout of product and shelf labels.

Describe use cases for recommendation types including product, personalized, and Shop

Product recommendations are available in the following scenarios:

  • On any store page for browsing or landing page in e-Commerce – If customers or store associates visit a store page, the recommendation engine can suggest products in the New, Best Selling, and Trending lists.
  • On the Product details page – If customers or store associates visit a Product details page, the recommendation engine suggests additional items that are also likely to be purchased. These items appear in the People also like list.
  • On the Transaction page or the checkout page – The recommendation engine suggests items based on the whole list of items in the basket. These items appear in the Frequently bought together list.
  • Personalized recommendations – Merchandisers can provide signed-in customers a personalized picks for you list, in addition to new functionality that allows existing list scenarios to be personalized for that customer. To learn more, see Enable personalized recommendations.

What is important to understand is that certain parts of the recommendation solution are NOT included with the Commerce licenses, and have to be purchased as a separate SKU from Microsoft.  Only the algorithmic models are included.  Here is a list of the currently available product recommendations.

  • New (Algorithmic) – This module shows a list of the newest products that have recently been assorted to channels and catalogs.
  • Best selling (Algorithmic) – This module shows a list of products that are ranked by the highest number of sales.
  • Trending (Algorithmic) – This module shows a list of the highest-performing products for a given period, ranked by highest number of sales.
  • Frequently bought together (AI-ML) – This module recommends a list of products that are commonly purchased together with the contents of the consumer’s current cart.
  • People also like (AI-ML) – This module recommends products for a given seed product based on consumer purchase patterns.
  • Picks for you (AI-ML) – This module recommends a personalized list of products based on purchase patterns of the signed-in user. For a guest user, this list will be collapsed.

You can manually seed the product recommendations by creating curated lists.

These curated lists can be added to eCommerce or POS:

Shop similar looks and Shop similar description recommendations

I have actually not managed to get this working with the AI-based product recommendation.  It could be that a separate SKU is required to make this work.

Configure recommendations

Before any AI-ML recommendation starts being effective, you can manually configure the recommendations.  I have seen documentation indicating that AI-ML requires a substantial number of transactions before it becomes effective.

Configure warranty settings

The extended warranties are actually set up as a product that can be sold to customers. At the POS, sales associates are prompted to add an extended warranty when a related product is added to a customer’s cart. Therefore, an upsell or cross-sell opportunity is presented to sales associates as part of the sales flow. Here is an important note; A warranty is a service that is provided for a specific, unique product. In Dynamics 365, a product can be uniquely identified only by a serial number.

Configure inventory buffers and inventory levels

Inventory buffers and inventory levels determine the messaging about inventory availability on Microsoft Dynamics 365 Commerce sites. Instead of showing actual inventory values in e-Commerce storefronts, many retailers prefer just to show messaging about inventory availability status (for example, “Available” or “Out of stock”) to inform customers whether an item is available for purchase or potentially out of stock. For this approach, inventory buffers and inventory levels that determine the inventory availability messaging must be made available and configured.

The inventory profile can be defined on the product as shown here:

The calculation of product availability can be executed on an hourly basis. The default cache is set to 60 seconds. After users post transactions in POS, they should wait 60 seconds before they verify that the on-hand inventory has been reduced. If your business scenario requires a smaller cache time, you should create a support request.
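The 60-second cache behavior can be pictured as a simple time-to-live (TTL) cache: reads within the TTL return the cached on-hand value, so a POS transaction only becomes visible after the cache expires. This is purely an illustration of the concept, not the actual Commerce caching code:

```python
import time

class TtlCache:
    """Minimal TTL cache sketch (illustrative only)."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key, load):
        """Return the cached value, or call `load()` and cache the result."""
        value, expiry = self._store.get(key, (None, 0.0))
        if time.monotonic() < expiry:
            return value                      # still fresh: stale on-hand is returned
        value = load()
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

cache = TtlCache(ttl_seconds=60.0)
onhand = cache.get("ITEM-001", lambda: 42)    # first read loads the value
onhand = cache.get("ITEM-001", lambda: 41)    # within 60s: still returns 42
print(onhand)  # 42
```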

There is a batch job named Populate product attributes with inventory level, which adds the inventory level to the products as a product attribute.

If I then take a look at the product attributes, I see a new attribute containing the on-hand level.

This attribute can now also be shown directly in the POS or in eCommerce.

Configure products and variants including configuring barcodes

Barcodes can be created per released item and per variant/product dimension.

One thing I have always found annoying is that barcodes are legal-entity specific and tied to the released product.  I often see a requirement where barcodes should be the same across legal entities. For the Warehouse Management app, Microsoft has in 10.0.21 made it possible to scan using GS1 format standards.  It would be great if this feature were also made available in commerce/CRT.

Manage pricing

Pricing is a huge topic, and it is also very well documented.  I have blogged extensively on this topic before, and a lot has changed and improved in newer releases.

 You can set the price of a product in three places:

  • Directly on the product (base price)
  • In a sales price trade agreement (A trade agreement price is always used before the base price.)
  • In a price adjustment

Design and create price groups

Price groups are at the heart of price and discount management in Commerce. Price groups are used to assign prices and discounts to Commerce entities (that is, channels, catalogs, affiliations, and loyalty programs). Because price groups are used for all pricing and discounts, it’s very important that you plan how you will use them before you start. By itself, a price group is just a name, a description, and, optionally, a pricing priority.

The main point to remember about price groups is that they are used to manage the many-to-many relationships that discounts and prices have with Commerce entities.

There is a form named channel price groups that shows the relationship between the channels and discounts, which makes it easier to understand the relationships.

Configure pricing priorities

By itself, a pricing priority is just a number and a description. Pricing priorities can be applied to price groups, or they can be applied directly to discounts. When pricing priorities are used, they let a retailer override the principle of the best price by controlling the order in which prices and discounts are applied to products. A larger pricing priority number is evaluated before a lower pricing priority number. Additionally, if a price or discount is found at any priority number, all prices or discounts that have lower priority numbers are ignored.

The price and a discount can come from two different pricing priorities, because pricing priorities apply to prices and discounts independently.
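The priority rule above can be sketched in a few lines: the highest priority number at which any price is found wins, lower priorities are ignored, and among the prices at that priority the best (lowest) one applies. This is a simplified illustration, not the actual pricing engine:

```python
# Illustrative sketch of pricing priorities: each candidate is a
# (pricing_priority, price) pair coming from some price group.

def best_price(candidates):
    """Return the winning price, or None if no price was found."""
    if not candidates:
        return None
    top = max(priority for priority, _ in candidates)      # highest priority evaluated first
    # once a price exists at `top`, every lower priority is ignored
    return min(price for priority, price in candidates if priority == top)

# Priority 5 beats priority 0, even though priority 0 has a lower price.
print(best_price([(0, 79.0), (5, 99.0), (5, 95.0)]))  # 95.0
```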

Configure product pricing including smart rounding

Smart rounding can be used when generating prices to be posted to trade agreements.  It ensures that we get consumer-“interesting” prices, typically ending with .99 or .95. When using the category price rules or working on trade agreements, you can apply smart rounding.
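The idea can be shown with a minimal sketch that snaps a calculated price to the nearest price ending in .99. The real feature supports configurable ending digits per currency and price range; this only illustrates the principle:

```python
# Minimal smart-rounding sketch (illustrative, not the D365 implementation):
# snap a calculated price to the nearest consumer-friendly x.99 price.

def smart_round_99(price):
    """Round a calculated price to the nearest price ending in .99."""
    candidate = round(price) - 0.01   # e.g. 12.30 -> 11.99
    if price - candidate > 0.5:       # the next .99 up is actually closer
        candidate += 1.0
    return round(candidate, 2)

print(smart_round_99(12.30))  # 11.99
print(smart_round_99(12.75))  # 12.99
```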

In the Accounts receivable parameters you can also see that when using the generic currency, smart rounding kicks in.  I have not tested how the generic currency works together with Commerce.

Configure catalog pricing

Catalog pricing uses the same pricing architecture as stores, where you assign products and price groups to a catalog.

In the catalog you can click on the price groups to assign the price groups that belong to the catalog.

And if you click on the price list, you get the prices per product.

PS!  If you do this from the store channel (same feature), you can generate a nice Excel-based price list that is made available in the print archive (at least in my 10.0.21 box).

Configure affiliation pricing

The definition of an affiliation is a link to or association with a group. In Commerce, affiliations are groups of customers. Affiliations are a much more flexible tool for customer pricing and discounts than the core concept of customer groups and discount groups. First, an affiliation can be used for both prices and discounts, whereas non-retail pricing has a different group for each type of discount and price. Next, a customer can belong to multiple affiliations but can belong to only one non-retail pricing group of each type. Finally, although affiliations can be set up so that they are linked to a customer, they don’t have to be. An ad-hoc affiliation can be used for anonymous customers at the POS. A typical example of an anonymous affiliation discount is a senior or student discount, where a customer can receive a discount just by showing a group membership card.

Although affiliations are most often associated with discounts, you can also use them to set differential pricing. For example, when a retailer sells to an employee, it might want to change the selling price instead of applying a discount on top of the regular price. As another example, a retailer that sells to both consumer customers and business customers might offer business customers better prices, based on their purchasing volume. Affiliations enable both these scenarios.

Affiliations are just a list, and they can be connected to price groups.

On the customer, you can add multiple affiliations to this customer.

If you want to apply customer-specific prices, we recommend that you not set price groups directly on the customer. Instead, you should use affiliations.

Configure category pricing rules

The category price rules feature in Commerce gives you an easy way to create new trade agreements for all the products in a category. This feature also lets you automatically find existing trade agreements for the products in the category and expire them.

It creates trade agreement journal lines that can be posted.

Manage discounts and promotions

There are many types of discounts:

  • Simple discount – A single percentage or amount.
  • Quantity discount – A discount that is applied when two or more products are purchased.
  • Mix and match discount – A discount that is applied when a specific combination of products is purchased.
  • Threshold discount – A discount that is applied when the transaction total is more than a specified amount.
  • Tender-based discount – A discount that is applied when the transaction total is more than a specified amount and a specific payment type (for example, cash, credit, or debit card) is used for payment.
  • Shipping discount – A discount that is applied when the transaction total is more than a specified amount and a specific mode of delivery (for example, two day shipping or overnight shipping) is used on the order.

When you set up a price adjustment or a discount, be sure to confirm that price groups are assigned to the correct channels, catalogs, affiliations, or loyalty programs that you want the discount to apply to.

Configure discount parameters

Calculating prices and discounts can be extremely heavy from a computational aspect, especially if you have many items in a sales basket and many promotions and discounts.  The following parameters allow you to tune the calculation to your needs.

Microsoft has also made the following flow schema available, which explains some of the complexity involved in the sales basket pricing that happens in CRT.

Configure channel or customer-specific discounts

Channel- and customer-specific prices are defined with price groups.  As mentioned earlier, I recommend using affiliations to create customer-specific discounts, and then applying the affiliations to the respective price groups.

Configure quantity, shipping, tender-based, and threshold-based discounts

Quantity discounts

A quantity discount is a discount that is given to customers when they purchase a particular quantity of a product. For example, you can set up a 5 percent discount for the purchase of two products of a particular category or brand.

Buy 5 items get 20%, Buy 10 items get 40%
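A tier lookup like the one above ("buy 5 get 20%, buy 10 get 40%") can be sketched as follows. The thresholds are the ones from the example; in the product they are configured per product, category, or brand:

```python
# Sketch of a tiered quantity discount: (minimum quantity, discount percentage),
# checked from the highest tier down. Illustrative only.

TIERS = [(10, 0.40), (5, 0.20)]

def quantity_discount(qty):
    """Return the discount percentage for a purchased quantity."""
    for min_qty, pct in TIERS:   # highest tier first
        if qty >= min_qty:
            return pct
    return 0.0

print(quantity_discount(7))   # 0.2
print(quantity_discount(12))  # 0.4
```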

Shipping discounts

Free or discounted shipping is one of the highly influencing factors driving the customers’ online purchase decisions. Many retailers also leverage the free shipping benefit to motivate the customers to increase their basket size, thus increasing the revenue per transaction. With the 10.0 release of Retail, retailers can use “Retail shipping threshold discount” to define the thresholds, which once met, will qualify the customers for discounted or free shipping. For example, spend $50 or more to get free ‘Overnight shipping’ or sign up for the loyalty program and get free ‘Two-day shipping’.

This feature leverages the advanced auto charges capability that was available in the call center and e-Commerce modules but has now been made available in POS.

Unlike product discounts, the shipping discount does not create discount lines. Instead, the shipping discount edits the shipping charge directly and appends the name of the discount to the charge description.

Configure discount concurrency rules

When you have multiple discounts, the pricing algorithm loops through the discounts across the various priorities. The discount concurrency control model setting affects how all discounts compete and compound together.

On a discount you will find the discount concurrency mode.

When the value is Exclusive or Best price, only one discount can be applied to a transaction line. The only difference between Exclusive and Best price is the order that the discounts are considered and applied in. Exclusive discounts are always evaluated and applied before Best price and Compound discounts, if all other settings are the same. Therefore, Exclusive and Best price discounts never compete for the best price. Two or more Exclusive discounts will compete for the best price, as will two or more Best price discounts.

When the value is Compound, the discount can be compounded with any other discount that is also set to Compound. Therefore, two or more Compound discounts will all be applied to a transaction line. When multiple Compound discounts are applied to a transaction line, they are applied in the following order:

  • Discount price discounts
  • Amount-off discounts
  • Percentage-off discounts

Compound discounts compete with Best price discounts when both types apply to a transaction line. Therefore, the Compound setting is used to determine which discounts are combined. Depending on the discount concurrency control mode used, two or more Compound discounts can be combined and compete with the Best price discounts that apply to the same products. The discount or discounts that have the largest total discount amount are applied.
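The compounding order described above can be sketched roughly like this: Compound discounts stack in a fixed order (discount price, then amount off, then percentage off), and the stacked result then competes against the best single "Best price" discount. This is a strong simplification of the real CRT pricing engine, for intuition only:

```python
# Rough sketch of Compound vs Best price concurrency. All logic simplified.

def apply_compound(price, discount_price=None, amount_off=0.0, pct_off=0.0):
    if discount_price is not None:
        price = discount_price        # 1. discount price discounts first
    price -= amount_off               # 2. then amount-off discounts
    price *= (1 - pct_off)            # 3. then percentage-off discounts
    return round(price, 2)

def final_price(price, compound_kwargs, best_price_pcts):
    compounded = apply_compound(price, **compound_kwargs)
    best = min((price * (1 - p) for p in best_price_pcts), default=price)
    return round(min(compounded, best), 2)   # largest total discount wins

# Compound (10.00 off, then 10%) vs a single 25% Best price discount on 100.00:
# compounded = (100 - 10) * 0.9 = 81.00, best price = 75.00 -> 75.00 wins.
print(final_price(100.0, {"amount_off": 10.0, "pct_off": 0.10}, [0.25]))  # 75.0
```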

Manage coupons

Coupons are codes and bar codes that are used to add discounts to transactions. Each coupon can have multiple codes, and each code can have its own effective dates. Each coupon is related to one discount. The price groups that are associated with the discount define the customers that can use a coupon or the channels where a coupon is valid.

In the following example I have a coupon number that gives a 20% discount.  Each coupon may have multiple coupon code IDs to ensure that they are only used once.  I have also created a Code 39 barcode that will trigger the coupon in the POS, and I may distribute the coupon code to customers as a QR code if needed.
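The one-coupon-many-codes idea can be sketched like this: one coupon carries the discount, and each code ID can be redeemed only once. All names are hypothetical; this is not the Commerce implementation:

```python
# Illustrative single-use coupon sketch: one discount, many one-time code ids.

class Coupon:
    def __init__(self, discount_pct, codes):
        self.discount_pct = discount_pct
        self.unused_codes = set(codes)

    def redeem(self, code):
        """Return the discount if the code is valid and unused, else 0."""
        if code in self.unused_codes:
            self.unused_codes.remove(code)   # each code id can be used only once
            return self.discount_pct
        return 0.0

c = Coupon(0.20, {"SPRING-001", "SPRING-002"})
print(c.redeem("SPRING-001"))  # 0.2 - first use succeeds
print(c.redeem("SPRING-001"))  # 0.0 - second use is rejected
```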

Manage customers, loyalty, and affiliations

The documentation in the link above is quite up to date and to the point, and covers how customers are handled from a commerce perspective.  Customers can be created in D365 HQ, POS, and in the eCommerce solution.   What is also important to consider is that customers are one of the areas where there is good Dynamics 365 Sales synchronization through Dataverse, enabling a very good understanding of customers through Dynamics 365 Customer Insights.

Configure client books

Building a long-term relationship with your customers can ensure long-lasting loyalty. If you know the customer’s preferences, purchase history, and other relevant information, it is easier to target activities and promotions at the right buying customers.  A client book is a customer card where the preferences and activity log of the customer are shown.   Through the client book, each sales associate can have a list of “their” customers that they follow up on.  This is most relevant for scenarios where you have specialty retailing and close customer relationships.

Clicking on the client card opens the customer card, where the timeline, recent purchases, etc. are available.

Configure customer attributes

The client book includes customer cards that show contact information for each customer, together with three more properties that are defined by the retailer and configured in Headquarters.

Configure customer affiliations

Customer affiliations have been discussed earlier in this document.  See Configure affiliation pricing.

Configure loyalty programs, loyalty schemes, and reward points

The documentation on this is quite detailed, but the main process of setting up loyalty can be presented as this flow. What is very nice is that the loyalty features are true omni-channel and work on all channels. One aspect is that loyalty cards, levels, and points can be integrated with D365 Customer Service.

Manage loyalty tier calculations and processing

The calculation of loyalty tiers can be processed in a set of batch jobs.

Manage Point of Sale (POS) in Dynamics 365 Commerce (15-20%)

Configure retail stores

Create a retail store

A retail store requires a one-to-one (1:1) relationship with a warehouse and an operating unit. The warehouse must be configured before the store is created. The operating unit is created automatically when the store record is created. If you have specific requirements on what the retail channel ID and operating unit number should be, you have the option to set the number sequences to manual and then create them manually.

Each store has some important store configurations to consider:

  • Legal entity – The legal entity where the store’s transactions will post.
  • Time zone – The time zone in which the store operates.
  • Language – The language that is used for the store data.
  • Currency – The default currency for the store.
  • Warehouse – The warehouse that is used for the store inventory. The POS can only sell out of a single warehouse location.
  • Functionality profile – Contains configurations for how the registers should operate.
  • Profile configurations – Configurations that are used to define technical architecture details, such as retail server and Cloud POS URLs, offline database schema, and more.
  • Sales tax configurations – A grouping of configurations that is used to determine sales tax configurations, such as the store’s tax group, and whether the store should use destination or customer-based taxes.
  • Default customer ID – Assigned for transactions where no customers are specified. All transactions will be aggregated in a sales order by using this default customer ID.
  • Screen layout ID – The default screen layout ID that is used for all registers and users, unless it is overridden.
  • Post as business day – Offsets the end-of-day time. For example, a store might close after midnight and wants transactions that happen until 2:00 AM to be posted as the previous day’s sales.
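The "Post as business day" offset in the list above is easy to illustrate: with an end-of-day time of 02:00, a transaction at 01:30 posts to the previous day's sales. A minimal sketch, assuming a simple hour-based cutoff:

```python
# Sketch of the "Post as business day" offset: transactions before the
# configured end-of-day time (02:00 here) post to the previous day.

from datetime import datetime, timedelta, date

def posting_date(txn_time, end_of_day_hour=2):
    """Return the business day a transaction should post to."""
    if txn_time.hour < end_of_day_hour:
        return (txn_time - timedelta(days=1)).date()
    return txn_time.date()

print(posting_date(datetime(2022, 3, 12, 1, 30)))   # 2022-03-11 (previous day)
print(posting_date(datetime(2022, 3, 12, 14, 0)))   # 2022-03-12
```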

Configure POS registers and devices

Commerce supports two types of POS deployments:

  • Modern POS (MPOS) – Locally installed deployment on a Windows OS, iOS, or Android device and supports offline mode.
  • Cloud POS (CPOS) – A web application with nothing installed locally.

Microsoft is now also working on a third option, called Store Commerce, that provides the benefits of both Modern Point of Sale (MPOS) and Cloud Point of Sale (CPOS).  Benefits of Store Commerce:

  • Simplified Application lifecycle management (ALM) using Microsoft Store.
  • Extension or ISV code developed for MPOS or CPOS can be reused in Store Commerce.
  • Store Commerce provides the benefits of both MPOS and CPOS.
  • Better performance with the use of Microsoft Edge WebView2.
  • Easier POS and extension upgrades.
  • Support for dedicated hardware station (HWS).
  • Support for offline, in the future.

Registers contain information such as the assigned profiles.

The device contains information on the activation.

Configure retail profiles

The functionality profile differs from register and device configurations in that it specifies functionalities that aren’t tied to hardware or devices. In most cases, all devices can and should operate in the same way for a consistent experience for customers and employees across the entire store. Functionality profiles are defined at the store level, and you assign the profile to the store.

A visual profile will define the overall branding and theme for a register. For example, the sign-in background for a large monitor would require a different sign-in or lock screen than a phone or tablet would require. The theme might also be important. If it’s a customer-facing monitor, it might require more branding than one that is employee-facing only. The visual profile can help account for these requirements.

A receipt profile is a group of form layouts that can be assigned to point of sale (POS) printers via a hardware profile. A receipt profile provides a set of receipt templates for the printers at your registers. After you set up the receipt profiles, you must assign them to the hardware profile, so that the POS register can print the receipts.

The hardware profile is where you define peripherals such as printers, scanners, etc.

Screen layouts define how the POS screen should look, with buttons and different layouts based on screen sizes.  In the following example I have 2 layouts: one for using POS as a mobile device, and one for a larger screen.

Configure sales tax overrides

Sales tax groups can be used to override taxes for specific items that belong to the group. For example, food items are typically taxed differently from hard goods, and would likely have their own sales tax group. Sales tax groups are groups of taxes that are applicable to a particular channel. For example, if a channel sells both retail and business-to-business, different item sales tax groups may be used. All the applicable taxes would be mapped to the sales tax group.

Tax overrides can be seen in POS as override buttons, and in this example I'm choosing whether there is a high or medium tax on the line.

For tax in general related to D365 commerce, I recommend the following documentation.

Configure Task Management lists and parameters

In a retail environment, it’s always difficult to make sure that tasks are performed by the right person at the right time. Retailers must be able to notify workers about upcoming tasks and provide related business context, so that the tasks can be completed correctly and on time.

Task management is a productivity feature in Dynamics 365 Commerce that lets managers and workers create task lists, manage assignment criteria, track task status, and integrate these operations between Commerce back office and point of sale (POS) applications.

In D365 you will find the following menus:

The task management process allows you to see all active tasks – in this case, holiday season preparation to be done in all stores.

And you can define additional tasks in the administration:

In the POS, the tasks will appear like this.

Define cash management processes

Cash management from a physical store perspective covers complete traceability and accountability of cash and its movement across the different registers and cashiers in a store. Stores must be able to reconcile any differences and determine accountability. On the functionality profile you will find two settings that manage this:

Enabling cash traceability introduces the Safe entity, and you will have safe transactions where you can reconcile and approve cash management transactions. If there are exceptions, you can tag the transaction with a reason code.

Cash reconciliation is always for a ‘Shift’. It is not for a Terminal / Register / Safe. With ‘shared shift’, a shift can be across multiple Registers. The Safe entity can be managed with a regular register or it can also be managed with a dedicated register.

The ‘Cash traceability’ feature by itself does not introduce any new GL postings – having all safe transactions post to the general ledger when money moves in or out of the safe is not supported. Keep in mind that this parameter only supports GL postings for the Safe drop transaction. Statement posting is the only way to get financial transactions for retail transactions created/posted in the store.

The flow of cash goes from the Safe to the Register and then back from the Register to the Safe. With that:

1. At the end of any given day or shift, the Register is closed with a Tender declaration. Before that, the cashier performs the Safe drop operation, which takes the cash from the Register and moves it to the Safe. This transaction does create a GL entry, which increases the balance in the GL account linked to the Safe drop transaction. It is common to keep some money in the Safe that is moved to the Register every day – for example, a Safe-to-Register operation every morning – rather than moving money from the Safe to the Bank daily. This is money circulating inside the store, and this move is not posted to GL. GL entries are only posted with the use of the Bank drop operation.

2. When the balance in the Safe (Shift) goes above a particular limit, at which it is no longer good practice to keep that much cash in the Safe, retailers typically transfer the cash from the safe to the bank, and in the system they perform a Bank drop operation for the same. This also creates a GL posting, in which the balance in the GL account linked to the Safe is reduced and the balance in the GL account linked to the Bank is increased (essentially, cash is deposited in the bank account).
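The two movements above can be sketched as simple double-entry bookings. The account names are hypothetical, and this only illustrates the flow of cash, not how Commerce posts internally:

```python
# Illustrative double-entry sketch of the cash flow described above.
# Account names are hypothetical. Only the Safe drop and Bank drop
# operations create GL entries; the daily Safe-to-Register float
# circulates inside the store without posting.

ledger = {"Cash in register": 500.0, "Cash in safe": 0.0, "Cash in bank": 0.0}

def safe_drop(amount: float) -> None:
    # Register -> Safe: increases the GL account linked to the Safe drop.
    ledger["Cash in register"] -= amount
    ledger["Cash in safe"] += amount

def bank_drop(amount: float) -> None:
    # Safe -> Bank: reduces the safe account, increases the bank account.
    ledger["Cash in safe"] -= amount
    ledger["Cash in bank"] += amount

safe_drop(500.0)  # end-of-shift: move the drawer contents to the safe
bank_drop(400.0)  # safe balance above its limit: deposit cash in the bank
assert ledger == {"Cash in register": 0.0, "Cash in safe": 100.0, "Cash in bank": 400.0}
```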

The following steps are an example of how this works, by having an exclusive shift for safe management:

To reconcile the cash transactions within a shift or across shifts, select the shift to reconcile, and then select Reconcile.

The view that is opened shows the list of reconciled and unreconciled transactions on separate tabs. From this view, users can either select unreconciled transactions and reconcile them, or select previously reconciled transactions and unreconcile them.

During reconciliation, if the selected transaction doesn’t balance, the user must enter a description of the reason for the unbalanced reconciliation. Users can select a single transaction and reconcile it with the relevant reason description as they require.

Users can continue to reconcile and unreconcile transactions until the shift is closed. After a shift is closed, the transactions can’t be unreconciled.

When a user chooses to close a shift, Commerce validates that there are no unreconciled cash management transactions in the shift. Users can’t close a shift if there are unreconciled transactions.

For safes that are defined in a store, users can manage operations such as declaring the start amount, doing a float entry, doing a tender removal, and making a bank drop.

Define shifts and shift management processes

The term shift describes the collection of POS transactional data and activities between two points in time. For each shift, the amount of money that is expected is compared against the amount that was counted and declared.

Typically, shifts are opened at the start of the business day. At that point, a user declares the starting amount that the cash drawer contains. Sales transactions are then performed throughout the day. Finally, at the end of the day, the drawer is counted, and the closing amounts are declared. The shift is closed, and a Z report is generated. The Z report indicates whether there is an overage or shortage.

In the posted statements form you have the record of the statement, declarations, and transactions.

You can also print out the statement and use it for booking if you have a manual external financial booking system (and I hope you don't have this!).

Configure channel return policies

The channel return policy enables retailers to set enforcements on which payment tenders can be allowed for processing a return on a point of sale (POS) device. The scope of the policy is currently limited to setting the payment tenders that can be allowed for a channel. The “allowed” list is based on the payment methods used to make the purchase. For example:

  • If a purchase was made using a gift card, the store policy is to process refunds only to a new gift card or to give store credit.
  • If a sale is made using cash, the options allowed for refund are cash, gift card, and customer account, but not credit card.
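The two bullet examples can be expressed as a simple tender mapping. The tender names are illustrative; the actual policy is configured on the channel, not coded:

```python
# Sketch of a channel return policy: which refund tenders are allowed,
# based on the payment method used for the original purchase. The
# mapping below mirrors the two examples in the text.

ALLOWED_REFUND_TENDERS = {
    "gift card": {"gift card", "store credit"},
    "cash": {"cash", "gift card", "customer account"},
}

def refund_allowed(original_tender: str, refund_tender: str) -> bool:
    """True if the policy permits refunding to this tender."""
    return refund_tender in ALLOWED_REFUND_TENDERS.get(original_tender, set())

assert refund_allowed("gift card", "store credit")
assert not refund_allowed("cash", "credit card")
```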

Describe offline capabilities and limitations

There is a set of data that requires real-time direct access:

  • Issuing and redeeming gift cards
  • Redeeming loyalty points
  • Issuing and redeeming credit memos
  • Creating and updating customer records
  • Creating, updating, and completing sales orders
  • Receiving inventory against a purchase order or transfer order
  • Performing inventory counts
  • Retrieving sales transactions across stores and completing return transactions

To be more precise, here are the operations that are NOT available in offline scenarios:

  • 707 – Activate device: Activate the current device by allowing an authenticated user to provide connection information and assign a device and register ID.
  • 134 – Add affiliation: Add a preselected affiliation to a transaction. Select the affiliation on the Button properties page.
  • 135 – Add affiliation from list: Add an affiliation to a transaction by selecting it in a list.
  • 137 – Add affiliation to customer: Add an affiliation to a customer on the Customer details page.
  • 138 – Remove affiliation from customer: Remove an affiliation on the Customer details page.
  • 643 – Add coupon code: Add a coupon by entering its code in the POS.
  • 141 – Add header charges: Add a misc charge to the order header.
  • 141 – Add line charges: Add a misc charge to a selected sales line.
  • 117 – Add loyalty card: Prompt the user to enter a loyalty card number that will be added to the current transaction.
  • 136 – Add serial number: This operation lets the user specify a serial number for the currently selected product.
  • 1214 – Add shipping address: This operation isn't supported.
  • 519 – Add to gift card: Add money to the specified gift card.
  • 6000 – Allow skip fiscal registration: This operation isn't supported.
  • 1212 – Bank drop: Record the amount of money that is sent to the bank and other information, such as the number of the bank bag.
  • 923 – Bank totals verification: This operation isn't supported.
  • 915 – Blank operation: This operation represents a customizable button that a software developer can programmatically change for any specialized operation that the business requires.
  • 1053 – Blind close shift: Set the current shift to blind closed, and sign the user out. A blind-closed shift is closed to additional transactions but is still open to drawer operations, such as tender removal and tender declaration.
  • 310 – Calculate total: When discount calculation is delayed, this operation initiates the calculation for the current transaction.
  • 642 – Carry Out All Products: Set the mode of delivery for all lines to Carryout.
  • 641 – Carry Out Selected Products: Set the mode of delivery for the selected lines to Carryout.
  • 647 – Change mode of delivery: Change the mode of delivery for preconfigured shipping sales lines.
  • 1215 – Change password: This operation lets the POS user change their password.
  • 123 – Change unit of measure: Change the unit of measure for the selected line item.
  • 639 – Clear default sales representative on transaction: Remove the commission sales group (sales rep) from the transaction.
  • 106 – Clear quantity: Reset the quantity on the currently selected line to 1.
  • 640 – Clear sales representative on line: Remove the commission sales group (sales rep) from the currently selected line.
  • 121 – Clear salesperson: This operation isn't supported.
  • 1055 – Close shift: Close the current shift, print a Z report, and sign the user out of the system.
  • 139 – Conclude transaction: Prompts the user to select a payment method.
  • 620 – Create customer order: Convert the POS transaction to a customer order.
  • 925 – Copy the bank check: This operation isn't supported.
  • 621 – Create quotation: Convert the POS transaction to a sales quotation.
  • 636 – Create retail transaction: This operation lets the user create a standard sales transaction when the default POS behavior is to create customer orders.
  • 600 – Customer: Add the specified customer to the transaction.
  • 1100 – Customer account deposit: Make a payment to a customer’s account.
  • 612 – Customer add: This operation lets the user create a new customer record.
  • 603 – Customer clear: Remove the customer from the current transaction.
  • 602 – Customer search: This operation lets the user search for a customer record by navigating to the customer search page in the POS.
  • 609 – Customer transactions: This operation isn't supported.
  • 917 – Database connection status: This operation lets the user view the current connection settings, and switch between online and offline modes.
  • 1200 – Declare start amount: Declare the amount that is in the cash drawer when the day or shift starts.
  • 132 – Deposit override: Override the default deposit for customer orders.
  • 913 – Design mode disable: This operation isn't supported.
  • 912 – Design mode enable: This operation isn't supported.
  • 1217 – Disassemble kits: Disassemble a kit into its component products.
  • 624 – Display refund amounts: This operation isn't supported.
  • 513 – Display total: Show the balance of the transaction on the customer display.
  • 623 – Edit customer: Edit the current customer’s details.
  • 614 – Edit customer order: Recall the selected order so that it can be modified in the POS.
  • 615 – Edit quotation: Recall the selected quotation so that it can be modified in the POS.
  • 518 – Expense accounts: Record money that is removed from the cash drawer for occasional expenses.
  • 919 – Extended log on: Assign or remove permission to sign in by scanning a bar code or swiping a card.
  • 1201 – Float entry: This operation lets the user add additional money to the current drawer or shift.
  • 1218 – Force unlock peripheral: The system uses this operation internally to unlock POS peripherals.
  • 520 – Gift card balance: Show the balance of a gift card.
  • 708 – Inactivate device: Inactivate the current device, so that it can’t be used as a POS register.
  • 804 – Inbound operation: Access the features of inbound store inventory management.
  • 517 – Income accounts: Record money that is put into the cash drawer for a reason other than a sale.
  • 801 – Inventory lookup: Look up available, on order, and available-to-promise (ATP) quantities for the current store and other available locations.
  • 122 – Invoice comment: This operation lets the user enter a comment about the current transaction.
  • 511 – Issue credit memo: Issue a credit memo to provide a voucher instead of a refund.
  • 512 – Issue gift card: Issue a new gift card for the specified amount.
  • 625 – Issue loyalty card: Issue a loyalty card to a customer, so that the customer can participate in the store’s loyalty program.
  • 300 – Line discount amount: Enter a discount amount for a line item in the transaction. This operation is used only for discountable items and only within specified discount limits.
  • 301 – Line discount percent: Enter a discount percentage for a line item in the transaction. This operation is used only for discountable items and only within specified discount limits.
  • 703 – Lock register: Lock the current register, so that it can’t be used, but don’t sign the current user out.
  • 701 – Log off: Sign the current user out of the register.
  • 521 – Loyalty card points balance: Show the balance of points for the specified loyalty card.
  • 142 – Manage charges: View and manage misc charges applied to the transaction.
  • 918 – Manage shifts: Show a list of active, suspended, and blind-closed shifts.
  • 914 – Minimize POS window: This operation isn't supported.
  • 1000 – Open drawer: Perform a “no sale” operation, and open the currently selected cash drawer.
  • 928 – Order fulfillment: This operation allows users to pick, pack, ship, or recall orders for store pickup.
  • 805 – Outbound operation: Access features for managing shipments of outbound transfer orders.
  • 129 – Override line product tax: Override the tax on the selected line item, and use a different specified tax.
  • 130 – Override line product tax from list: Override the tax on the selected line item, and use the tax that the user selects in a list.
  • 127 – Override transaction tax: Override the tax on the transaction, and use a different specified tax.
  • 128 – Override transaction tax from list: Override the tax on the transaction, and use the tax that the user selects in a list.
  • 131 – Packing slip: Create a packing slip for the selected order.
  • 710 – Pair hardware station: This operation isn't supported.
  • 201 – Pay card: Accept a card such as a credit card or a debit card as payment.
  • 200 – Pay cash: Accept cash as payment.
  • 206 – Pay cash quick: Complete the transaction in one touch, and accept the amount that is due in cash (exact change).
  • 204 – Pay check: Accept a check as payment.
  • 213 – Pay credit memo: Accept a credit memo (voucher) that the store issued.
  • 203 – Pay currency: Accept payment in various currencies.
  • 202 – Pay customer account: Charge the transaction to the customer’s account. This payment method isn’t valid for customer order deposits.
  • 214 – Pay gift card: Accept a gift card that the store issued.
  • 207 – Pay loyalty: Accept a loyalty card for payment, and redeem points toward qualified products.
  • 634 – Payments history: Show the customer’s payment history for the current customer order.
  • 803 – Picking and receiving: Open the Picking and receiving page, where you can select orders to pick or receive in the store.
  • 632 – Pickup all products: Set the fulfillment method to Store pickup for all lines.
  • 631 – Pickup selected products: Set the fulfillment method to Store pickup for selected lines.
  • 400 – Popup menu: This operation isn't supported.
  • 101 – Price check: This operation lets the user look up the price for a specified product.
  • 104 – Price override: Override the price of a product, if the product has been set up to allow for price overrides.
  • 1058 – Print fiscal X: This operation isn't supported.
  • 1059 – Print fiscal Z: This operation isn't supported.
  • 927 – Print item label: This operation isn't supported.
  • 926 – Print shelf label: This operation isn't supported.
  • 1056 – Print X: Print an X report for the current shift.
  • 103 – Product comment: Add a comment to the selected line item in the transaction.
  • 100 – Product sale: Add a specified product to the transaction.
  • 108 – Product search: This operation lets the user search for a product by navigating to the product search page in the POS.
  • 633 – Quote expiration date: This operation lets the user view or modify the expiration date on a sales quotation.
  • 627 – Recalculate: Recalculate all customer order lines and taxes, based on the current configuration.
  • 143 – Recalculate charges: Recalculate the auto-charges applied to the order.
  • 515 – Recall order: This operation lets the user search for and recall customer orders and sales quotations.
  • 504 – Recall transaction: This operation lets the user recall a previously suspended transaction from the current store.
  • 305 – Redeem loyalty points: This operation isn't supported.
  • 635 – Refund shipping charges: This operation lets the user refund shipping charges on a canceled order.
  • 644 – Remove coupon code: Prompt the user to remove coupons by selecting them in a list of coupons that are currently associated with the transaction.
  • 1057 – Reprint Z: Reprint the Z report for the previous shift or a selected shift.
  • 1216 – Reset password: This operation lets a user who has the password-reset permission reset another employee’s password by using a temporary password.
  • 1219 – Open URL in POS: This operation lets a user open an admin-configured URL in POS.
  • 109 – Return product: Perform a return of individual products. The next scanned product is shown as a returned product that has a negative quantity and price.
  • 114 – Return transaction: Recall a previous transaction by its receipt number to return some or all of the products.
  • 1211 – Safe drop: Perform a safe drop to move money from the register to a safe.
  • 516 – Sales invoice: This operation lets the customer make payments toward the selected sales invoice.
  • 502 – Salesperson: This operation lets the user set the Sales taker value on a sales order for customer orders in the POS.
  • 2000 – Schedule management: This operation is not yet supported.
  • 2001 – Schedule requests: This operation is not yet supported.
  • 622 – Search: This operation lets users preconfigure POS buttons to perform searches by item, customer, or category.
  • 1213 – Search shipping address: This operation isn't supported.
  • 709 – Select hardware station: This operation lets the user select a hardware station in a list of available hardware stations.
  • 637 – Set default sales representative on transaction: This operation lets the user select one of the eligible commission sales groups (sales reps) as the default sales rep for lines that are added later.
  • 105 – Set quantity: Change the quantity of a line item in the transaction.
  • 638 – Set sales representative on line: This operation lets the user select one of the eligible commission sales groups (sales reps) for the currently selected line.
  • 630 – Ship all products: Set the fulfillment mode to Shipping for all line items.
  • 629 – Ship selected products: Set the fulfillment mode to Shipping for the selected lines.
  • 115 – Show journal: Show the store’s journal. You can view transactions, reprint receipts and gift receipts, and recall for return.
  • 802 – Stock count: This operation lets the user create or modify stock counting journals for physical inventory or cycle counts.
  • 401 – Sub menu: This operation takes the user to another linked button grid.
  • 1054 – Suspend shift: Suspend the current shift, so that a new or different shift can be activated on the current register.
  • 503 – Suspend transaction: Suspend the current sales transaction, so that it can be recalled later in the store.
  • 1004 – Task recorder: Open Task recorder to record procedural steps in the POS.
  • 1052 – Tender declaration: This operation lets the user specify the amount of money in the drawer for each counted payment method.
  • 1210 – Tender removal: This operation lets the user remove money from the current drawer or shift.
  • 920 – Time clock: This operation lets users punch in and punch out of work shifts and breaks.
  • 302 – Total discount amount: Enter a discount amount for the transaction. This operation applies only to discountable items and only within specified discount limits.
  • 303 – Total discount percent: Enter a discount percentage for the transaction. This operation applies only to discountable items and only within specified discount limits.
  • 501 – Transaction comment: Add a comment to the current transaction.
  • 922 – View product details: Open the product details page for the currently selected line item.
  • 1003 – View reports: Show the reports that have been configured for the current user.
  • 921 – View time clock entries: Show the time clock entries for all workers at the store.
  • 211 – Void payment: Void the currently selected payment line from the transaction.
  • 102 – Void product: Void the currently selected line item from the transaction.
  • 500 – Void transaction: Void the current transaction.
  • 916 – Windows workflow foundation: This operation isn't supported.
  • 924 – X report for bank cards: This operation isn't supported.
  • 311 – Remove system discounts from transactions: Remove all the system-applied discounts, including coupon-based discounts, from the transaction. This does not remove manual discounts.
  • 312 – Reapply system discounts: Reapply system discounts on the transaction if they were removed using the Remove system discounts from transactions operation.

Manage store inventory

The POS provides functionality to manage store inventory, including store inventory replenishment capabilities, inbound and outbound store inventory operations, store stock counts, and store inventory lookup capabilities. The following Microsoft Learn course gives a good overview, and touches on cross-docking, buyer's push, receiving store inventory from POS, and other inbound operations.

I really like the simplicity of the process of requesting goods from the central warehouse or from another store. It's fast and easy to use. The following screen shows how it looks when requesting goods from another store: scan the barcode and specify the quantity.

Configure availability calculations for products

It is important to understand that in a commerce architecture you have both the HQ database and the channel database. This is done for performance and latency reasons, and data is often synchronized back and forth in an asynchronous way. This can affect how the on-hand and availability calculations are handled. Channel-side inventory calculation is a mechanism that takes the last-known channel inventory data in Commerce headquarters as a baseline, and then factors in additional inventory changes that occurred on the channel side that aren't included in that baseline, to calculate a near-real-time estimated on-hand inventory. The following post is important for understanding how to use the presented on-hand calculations. If you are using product variations, like color and size, the on-hand can be presented as a grid in POS.
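As a rough sketch of this mechanism (the names are illustrative, not actual Commerce fields), the estimate is simply the HQ baseline plus the channel-side changes that aren't yet reflected in it:

```python
# Hedged sketch of channel-side inventory calculation: the last-known
# baseline from Commerce headquarters plus channel-side inventory changes
# that are not yet included in that baseline.

def estimated_on_hand(hq_baseline: int, unsynced_channel_deltas: list[int]) -> int:
    """Near-real-time estimate = HQ baseline + deltas recorded in the channel DB."""
    return hq_baseline + sum(unsynced_channel_deltas)

# Baseline of 20 units; since the last sync the store sold 3 and took 1 return:
assert estimated_on_hand(20, [-3, +1]) == 18
```

Once the deltas are synchronized back to HQ and a new baseline is established, the delta list is effectively empty again and the estimate equals the baseline.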

When clicking on a variant, or for pure SKU-based products without variants, you will see the inventory at other sites in the fulfillment group, and also at the central warehouse.

Manage inbound and outbound inventory operations

Store employees might want to transfer some of their store inventory out of their store and then send it to another warehouse (either a distribution center or another store). This transfer might be necessary in scenarios where the store contains overstock that another location can use. Store employees can initiate this process by first creating the transfer order through the Outbound inventory operation in POS.

Process customer pick-up and shipment orders

Customer orders can be used to capture sales where shoppers want to pick up products on a later date, pick up products from a different location, or have items shipped to them. I recommend reading the following docs. An important element, if you want to demonstrate the process of a customer going into the store, buying some products (with cash) and then wanting them sent home, is to follow the process below. Pay special attention to the red line, as this is the process when you want to demo both the order capture and the fulfillment/shipment of the order from the same POS.

  1. On the POS transaction screen, add a customer to the transaction.
  2. Add products to the cart.
  3. Select Ship selected or Ship all to ship the products to an address on the customer account.
  4. Select the option to create a customer order.
  5. Confirm or change the “ship from” location, confirm or change the shipping address, and select a shipping method.
  6. Enter the customer’s desired order shipment date.
  7. Use the payment functions to pay for any calculated amounts that are due, or use the Deposit override operation to change the amounts that are due, and then apply payment.
  8. If the full order total wasn’t paid, enter a credit card that will be captured for the balance that is due on the order when it’s invoiced.

This allows you to get through the process of picking, packing and shipping directly from the POS.

What I can think of here, is that it would be great to have extensions for printing delivery labels and even ensure that ASN is being sent to the freight forwarder.  Let’s hope some ISV picks this up.

Manage inventory processes including stock counts

Count journals are used to update and adjust physical inventory counts for a specific item within a specific warehouse. While the Commerce inventory logic is constantly tracking inventory that comes into and out of the warehouse, situations will occur where the physical inventory count that is being tracked by the application no longer matches the physical count of inventory on the shelf. This scenario can occur for a variety of situations, such as receipt of inventory was accidentally not processed, theft or breakage or other loss was not previously adjusted, and so on.

One quite new feature is the ability to perform stock adjustments, as this allows for adjusting an exact quantity up or down. Let's say someone was hungry and had to eat 😊. I can then adjust out one tyrkisk peber.

Then this becomes available as an inventory adjustment journal.

Even the notes become available as notes on the transaction.

Look up product inventory

I think this has been covered earlier in this blog post.

Process serialized items

Many retailers sell products that need to be serialized. These products are called serialized items. For inventory tracking purposes, some retailers may want to keep serial numbers in store or warehouse inventory. For service and warranty purposes, other retailers may want to capture serial numbers during the sales process.

Perform POS operations

Perform sales and order processes

Cash and carry transactions are the most common POS transactions where items are scanned. A customer might, or might not, be identified on the order, and all products are paid in full. When tendering out of the transaction, a customer leaves with the products. There are also more advanced processes of creating orders from POS, where the customer then picks the products up or has them shipped. The order is then created and sent to Commerce Headquarters (HQ) for processing. The creation of the customer order in HQ typically occurs through an async process between the Commerce engine and HQ. You can also configure it to be created in real time, if preferred.

Perform end of day processes

Much of the end-of-day processing is related to closing the shift (if you are using shifts). The process often starts with performing safe drops and bank drops of cash. Then you would perform a tender declaration to specify the total amount of money that is currently in the cash drawer. Users most often perform this operation before they close a shift. The specified amount is compared against the expected shift amount to calculate the overage/shortage amount. The last thing is to close the shift. This operation calculates shift totals and overage/shortage amounts, and then finalizes an active or blind-closed shift. Depending on the user's permissions, a Z report is also printed for the shift. Closed shifts can't be resumed or modified.
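The overage/shortage comparison itself is simple arithmetic: declared minus expected, per payment method. A hedged sketch, not the actual Commerce implementation:

```python
# Sketch (not the actual Commerce implementation) of the overage/shortage
# calculation: the declared amounts from the tender declaration compared
# against the system-expected shift amounts, per payment method.

def over_short(expected: dict[str, float], declared: dict[str, float]) -> dict[str, float]:
    """Positive values are overages, negative values are shortages."""
    return {tender: declared.get(tender, 0.0) - expected[tender] for tender in expected}

expected = {"Cash": 1250.00, "Card": 3400.00}   # what the shift totals say
declared = {"Cash": 1245.00, "Card": 3400.00}   # what the cashier counted
assert over_short(expected, declared) == {"Cash": -5.00, "Card": 0.0}
```

This is essentially the figure the Z report surfaces when the shift is closed.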

Blind-close can be used to free up a register for a new user or shift without first having to fully count, reconcile, and close the shift, and then later reopen the shift to perform closure. 

The way that shifts and cash drawer reconciliation are used in the POS differs from the way that transaction data is summarized during statement calculation. It’s important to understand this difference. Depending on your configuration and your business processes, the shift data in the POS (the Z report) and a calculated statement in the back office can give you different results. This difference doesn’t necessarily mean that either the shift data or the calculated statement is incorrect, or that there is a problem with the data. It just means that the parameters that are provided might include additional transactions or fewer transactions, or that the transactions have been summarized differently.

Although every retailer has different business requirements, it is recommended that you set up your system in the following way to avoid situations where differences of this type occur. The best way to have your statements is by shifts. Period! This setup helps guarantee that back-office statements include the same transactions as shifts in the POS, and that the data is summarized by that shift.

Reconcile store cash

This has been covered earlier.

Monitor store productivity by using task management and reporting features

Task management has been discussed before in this blog, but related to the reporting capabilities in POS there are some standard reports, as shown here:

But my general recommendation is to invest in a Power BI solution. Combined with the P-job and the recurring trickle-feed feature, channel transactions are imported quite frequently. Then use the Azure Data Lake integration to feed your Power BI report with updated and refreshed data. Take a look at the following presentation I did in 2018 about how to manually structure a cube for Power BI based on existing data entities.

Configure and Manage Dynamics 365 Commerce call centers (10-15%)

The first time I took the exam, I felt that the call center functionality was overrepresented in the questions. So I guess this exam requires that we cover this topic in more depth. Orders that are created in a call center channel can take advantage of specific Commerce capabilities such as payment processing and retail pricing and promotions. Defining a call center channel also allows the organization to define specific order processing settings and data defaults for the sales orders that are created by call center users. Some of the features include:

  • Full payment processing capabilities (Also with credit cards)
  • Use of catalog source code IDs to track marketing efforts
  • Ability for upsell/cross-sell prompts as sales lines are created
  • Ability to create and manage subscription orders by using continuity program features
  • Use of Commerce pricing and promotion configurations

A user can be linked to only one call center channel at a time, and if you don't associate the user with a call center, the Commerce-related features will not be triggered.

Sales orders that are created in the call center are also part of all Commerce omnichannel capabilities and can be used by the point of sale (POS) application to support cross-channel order fulfillment scenarios. This feature allows a call center user to create an order that can be picked up by the customer at a store location. Additionally, a customer order that is created in the POS or e-Commerce application can be further reviewed, edited, or managed by a call center user in Commerce HQ.

Configure call centers

During the configuration of a call center, three processing options have a great impact on the features that are available for call center orders, depending on whether they are enabled on the call center channel:

  • Enable order completion – enforces a set of validation rules that the order must pass before it can be successfully submitted to processing.
  • Enable direct selling – enables upsell and cross-sell functionality.
  • Enable order price control – enables call center users to change the price of an item on a sales order if that item has been configured to allow price adjustment. A specific Commerce sales line workflow must also be created in Commerce Workflows to enable the price override approval process.

Create a call center

Configuration of a call center follows much the same steps as creating a store channel.  You have to set up delivery modes and payment methods.  One thing that differs from traditional sales orders is the order completion process that can be enabled. When the Enable order completion setting is turned on for the call center channel, if line items are entered on a sales order and the channel user tries to close or navigate away from the sales order form without first selecting Complete, the system enforces the order completion process by opening the sales order recap page and requiring that the user correctly submit the order. If the order can’t be correctly submitted together with payment, the user can use the order holds functionality to put the order on hold. If the user is trying to cancel the order, he or she must correctly cancel it by using either the Cancel function or the Delete function, depending on what the user’s security allows.

After a call center channel has been created, users must be linked to that call center to take advantage of the additional order processing features that are exclusively available for call center order processing. Here is a screenshot of the “complete” feature, combined with a script asking if the customer knew there was a discount on teddy bears right now 😊

Another important aspect is the customer service form, where the call center sales process often starts.  Here it is easy to get a quick overview of all customers and orders.

Configure and publish product catalogs

Product catalogs have been discussed earlier in the blog post.  Personally, I think catalogs are on their way out, and there are very few that actually still use them.  It has all shifted toward online.

Create product catalog scripts

Not much to say, other than that you can assign scripts to the catalog, or to the products that are in the catalog.

Configure fraud conditions, rules, and variables to trigger order holds

To use the call center order hold features, you must first define hold codes. To create a set of user-defined hold codes, based on your business requirements, go to Sales and marketing > Setup > Sales orders > Order hold codes.

These hold codes can then be assigned to the order.

There are settings and rules that allow for validating and checking orders.  Here I have created a condition that puts the order on hold if the quantity on the sales line is above 10.

I can then add the condition to a rule:

In the call center parameters page I can then setup the score of when the order should be flagged as fraud.
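Conceptually, the condition/rule/score setup works like a small scoring model: each matching condition contributes to a fraud score, and when the total reaches the threshold from the call center parameters, the order is put on hold. Here is an illustrative Python sketch of that idea; the condition name, score, and threshold are hypothetical, not the actual Commerce data model.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FraudCondition:
    name: str
    score: int
    matches: Callable[[dict], bool]  # predicate over one sales line

def fraud_score(order_lines: List[dict], conditions: List[FraudCondition]) -> int:
    """Sum the scores of all conditions matching any line on the order."""
    return sum(c.score for c in conditions
               if any(c.matches(line) for line in order_lines))

# The condition from the example above: sales line quantity above 10.
qty_above_10 = FraudCondition("High quantity", score=50,
                              matches=lambda line: line["qty"] > 10)

FRAUD_THRESHOLD = 40  # hypothetical score limit set in call center parameters

order = [{"item": "TeddyBear", "qty": 12}]
on_hold = fraud_score(order, [qty_above_10]) >= FRAUD_THRESHOLD
print(on_hold)  # True
```

The point of the rule/condition split is that one rule can aggregate many conditions, each contributing a partial score instead of a hard yes/no.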

When the rules start to kick in, I get the “Order is on hold” message:

I can also see all order holds and then clear them if it turns out that everything is OK:

In the sales order overview I also get some visual indicators/color coding of the sales orders that are marked as suspicious or fraudulent:

The last thing here is that you can also build up static fraud data containing variables that are known to be used in call-center fraud.

Configure fraud alerts

I covered this in the previous chapter.

Configure continuity orders and installment billing

In a continuity program, also known as a recurring order program, customers receive regular product shipments according to a predefined schedule. Continuity programs provide the ability to create continuity schedules that will have a scheduled shipment and payment.

Set up continuity programs and parameters

To create a continuity program, you specify details such as the payment schedule, the timing of the shipments, and whether billing is up front. You must also add a list of products that are included in the continuity program. Each product receives an event ID number that is assigned sequentially, beginning with 1. The event IDs determine the order that products are sent in. If customers receive a different product in each shipment, the products are sent in sequential order, based on their event IDs and beginning with the current event. If customers receive the same product in each shipment, the list contains only one product that has one event ID, and the same event occurs repeatedly. You can specify how many times each event is repeated.

Next, create a parent product that represents the continuity program that you created. If you add this product to a sales order, the Continuity page opens, and you can then use that page to create the actual continuity order. The parent product doesn’t specify the individual products that the customer receives in each shipment.
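The event ID sequencing described above can be sketched as follows; this is my own simplified Python illustration of the rules (sequential event IDs, optional repeats per event, and same-product programs), not actual Commerce code.

```python
def product_for_event(products, current_event, repeats_per_event=1):
    """Return the product shipped in a given 1-based continuity event.

    products are ordered by event ID (index 0 = event ID 1); each event
    can be repeated repeats_per_event times before moving to the next,
    and the program wraps around after the last event ID.
    """
    index = ((current_event - 1) // repeats_per_event) % len(products)
    return products[index]

# A program where the customer gets a different product each shipment:
program = ["Issue 1", "Issue 2", "Issue 3"]
print(product_for_event(program, 2))  # Issue 2

# A program where the customer gets the same product every shipment
# (a single product / single event ID, repeated):
print(product_for_event(["Coffee subscription"], 7))  # Coffee subscription
```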

Here I have a continuity order containing two lines.

On the release product you specify the Continuity schedule ID:

When creating the sales order, the continuity form pops up, where I can make adjustments to quantity, price, dates, etc.

Configure continuity order batch jobs

After setting up a continuity program as described above, you can create a continuity order for a customer. You might also have to perform the following additional maintenance tasks.

  • Update the current continuity event period – Set up a batch job that tells the system what the current event period is.
  • Create continuity child orders – Create child orders from the parent continuity order.
  • Process continuity payments – Process billing and notifications for payments that are associated with continuity sales orders.
  • Extend continuity lines (if required) – Extend the number of times that a continuity event can be repeated. The repetition of shipments can then extend beyond the limit that was set in the Continuity repeat threshold field in the call center parameters.
  • Perform a continuity update (if required) – Synchronize changes between the continuity program and the continuity parent sales orders.
  • Close continuity parent lines and orders – Close continuity orders.

The end result is that child sales orders are created and will then be processed in the normal fashion.  

Manage continuity child orders

To manage continuity orders there is a form under Retail and Commerce > Inquiries and reports > Continuity orders that lists all sales order lines that are parent continuity lines.

In the Summary tab, you will see all child continuity lines, as they are being generated.

Manage call centers

Create, modify, and process sales orders

I think this area has been covered well earlier in the blog post.

Process call center payments

In D365 Commerce, the configuration of the call center channel includes a setting that is named Enable order completion. This setting helps guarantee that all orders that users of the channel create are released to order processing only if they have a prepaid or pre-authorized payment that is within approved tolerances. If the Enable order completion setting is turned on, call center users can enter payments against sales orders for customers by using the payment processing features of Call center. If the setting is turned off, call center users can’t use the Call center payment processing features, but they can still apply prepayments to sales orders by using standard Accounts receivable functionality.

On the call center you can specify the payment methods that can be used.  Here you can also specify that cards are allowed:

Also remember to evaluate whether the electronic payment setup should specify that expiration date and PIN are required.

When I then go through the complete and payment process and add the payment method card, the following happens:

An iframe from Adyen pops up (in this case I’m missing some setup on Adyen), and you can perform a “card not present” payment flow.  Even Vipps is available here.

Manage order holds

During order entry, but before order submission and confirmation, call center users might want to manually put an order on hold to prevent it from being released to the warehouse for further processing. For example, the customer who is placing the order might not be ready to commit to it, or critical data that is required in order to process the order might be missing.  The order hold form and color coding have been discussed earlier in the post.

Create return merchandise authorizations (RMAs)

Process returns, exchanges, and replacements

When a return order is issued, the Replacement order function can be used to generate a new sales order for the customer. This approach can be used in exchange scenarios. The Replacement order function creates another sales order for the new items that must be sent, but a cross-reference link on the RMA/Return tab of the Call center parameters page links the replacement order, the RMA, and the returned sales order.

The process can be described with the following flow:

The actual return order screen looks like this.  Also check out the following docs page for more information:

Manage e-commerce (15-20%)

Finally we have come to the “latest and greatest” part of this very long post.  E-commerce is actually an additional SKU that is not included with the traditional D365 Commerce SKU, so you have to pay extra.  There are even further SKUs that you can select, such as AI/ML recommendations, Ratings & Reviews, and fraud protection solutions. The following learn site also gives a good overview.  A lot of very good tech-talk videos have also been released, and here are some links to them.  I have also made available a list of known eCommerce sites running, so you may actually try it out, and even purchase 😊

Configure an e-commerce channel

To create an online store in Commerce, you must first create an online channel. Before you create a new online channel, ensure that you have completed the channel setup prerequisites. Apart from what has already been discussed for other channels, you need to set up the online functionality profile.

Create an online store

To configure a functioning online store, you need to set up multiple components so that transactions can be successfully processed for that online store. When you have configured the online store and its components, you can link the channel to one or multiple Commerce sites or any other solution for a storefront that is compatible with Commerce. To finalize the online channel configuration and ensure that the products are discoverable in the e-Commerce store, you should also create a channel navigation hierarchy and sync the data to the online store database. Each channel can have a unique channel hierarchy.

If you wonder where you can define the category hierarchy for stores, here it is: Retail and Commerce > Channel setup > Channel categories and product attributes:

Configure an e-commerce site

Much of the configuration of the e-commerce site is performed in the site builder.  This is a tool deployed and available from LCS, if you have purchased the e-commerce SKU.

The site builder has a set of site settings:

Configure channel assignments for an e-commerce site

To establish a new site and associate an online store with it, in LCS, select the link for the site authoring environment. Then, on the page for the site authoring environment, select New site. In the New site dialog box, you must provide some basic information about your site. Here are some additional details of the information you need to fill in.

Configure ratings and reviews

Also check out the topic on Microsoft Learn. The ratings and reviews solution in Dynamics 365 Commerce uses Azure Cognitive Services to offer automatic moderation of profane words in 40 languages. Because human approval isn’t required, moderation costs are reduced. The system also offers moderator tools that can be used to respond to customer concerns, feedback, and take-down requests, and to address data requests from users.  But keep in mind that ratings and reviews is an additional SKU from Microsoft.

Here is the moderation screen where moderators can respond and take input.  There is also an export to Power BI for deeper analysis.

The ratings and reviews can also be synced into HQ and to the POS.  See the following article.

Manage e-commerce content

Building and maintaining content will be an ongoing process.  You therefore should invest in having dedicated content builders to create the best possible presentation and experience for the online users.

Configure URLs and aliases

When creating new pages, you won’t be required to specify a page URL. If you leave the URL field blank, the page is created in an unlinked state. In this case, customers won’t be able to access the page, even if it’s published. To make the page accessible, you must manually create the URL and link it to the page.

Configure product detail pages and category pages

The product details page is a very central page, and it can be customized for specific products.

The category page is the page that most often is used for browsing through the products.

Manage site themes, page fragments, templates, layouts, and pages

There is a lot of setup required when setting up the content.  We are using a DevOps template where we have all the elements defined and in place, and here is a view of some elements of setting this up.

And the setup of each element is documented in DevOps.

Upload and manage digital assets including videos and images

The media library is where you store your pictures and videos. The naming convention of your files is very important to make sure you link the pictures to the right products.

Set focal points and attribute values for media assets

When an image is uploaded to the Commerce site builder Media Library, the system attempts to determine the focal point of the image. For example, if the image has a person on it, the system will set the focal point to the face of the person by default. In most cases the automatically set focal point works well for all viewports, but sometimes you may want to adjust the focal point to ensure that a specific part of the image is always visible. In the picture below, you see that I have a focal point on the head.

I can also change the view by module to only show parts of the image in certain settings.

Configure publish groups

E-Commerce websites are constantly updated with new content throughout the year. Updates are often published in batches around busy e-Commerce events such as holidays, seasonal marketing campaigns, or promotional launches. These updates often require that groups of website content (for example, pages, images, fragments, and templates) be staged, validated, and published concurrently in a single action.

So you can set up publish groups that change the e-commerce site on a specific date.

Operate an e-commerce channel

Create e-commerce orders

The Microsoft documentation here is excellent. The shopping experience is fast and intuitive.  Just what is expected from an eCommerce site.

Synchronize e-commerce orders

To view the transaction in Commerce Headquarters (HQ), run the P-0001 job and Synchronize Orders to pull in the orders from the Commerce scale units.

Moderate ratings and reviews

This is covered earlier.  But check out the following learn session.

Configure business-to-business (B2B) e-commerce

Business-to-business (B2B) e-commerce sites provide some key capabilities that optimize the workflow for a B2B user.

Describe differences between B2B and business-to-consumer (B2C) solutions

It is possible to implement both B2C and B2B scenarios.  The main capabilities relevant for B2B scenarios are the ability to purchase on account, and tools to improve B2B account relations and partner management.  The capabilities available for B2B scenarios are:

  • Business partner onboarding
  • Order templates
  • Quantity thresholds (minimum, maximum, multiple)
  • On account payment method
  • Salesperson for business partner
  • Handling of customer deposits
  • Account statement and invoice printing
  • Payment of sales invoice
  • Quick order entry
  • Dynamics 365 Sales integration
  • Return order and return merchandise authorization (RMA)
  • Order cancellation
  • Matrix control for order entry

Allowing B2B customers to pay on account is crucial for a B2B e-commerce solution. D365 Commerce enables customers to buy within their pre-set credit limits. Invoices are generated after the order is placed, and the customer can pay them directly from the eCommerce site with a credit card. Order templates allow users who buy many of the same items in each order to have preconfigured lists of items they want to add to their cart. They can also access a quick order entry screen to adjust the quantities and SKUs they’re purchasing.

Describe use cases for organizational modeling hierarchies

Two new customer records are created in the system: a Type Organization customer record for the business partner organization and a Type Person customer record for the requestor (that is, the business partner user who submitted the request).

Manage business partners and business partner users

B2B e-commerce websites require that organizations register to become business partners. After an organization submits registration details to a B2B e-commerce website, it goes through a qualification process. If the organization is successfully qualified, it’s onboarded as a business partner.  New business partners signing up will show up as prospects.  On the next two screens, you see the signup form in eCommerce, and a screen of all the prospects where they end up:

Configure product quantity limits

Most products have a unit of measure that defines their grouping. The grouping affects how the products can be sold. Some products might have an additional grouping for quantities. This grouping determines whether the products can be sold as individual units or multiples, and whether there is a minimum or maximum order quantity limit that must be followed.

The quantity limiting feature ensures that the minimum, maximum, multiple, and standard quantities that are configured in Microsoft Dynamics 365 Commerce (in the default order settings or the Commerce site builder site settings) are applied to customer orders that are placed on an e-commerce site. Product quantity limits aren’t currently supported for the point of sale (POS) and call centers. On each product you may specify:

  • Multiple – The quantity that the product can be bought in multiples of.
  • Minimum Order Quantity – The minimum number of products that must be purchased.
  • Maximum Order Quantity – The maximum number of products that can be purchased.
  • Standard Order Quantity – The default quantity that is automatically entered when the product is selected.
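These four settings can be sketched as a simple validation over an order-line quantity. This is my own hedged illustration of the logic, not the actual Commerce validation code; the function and messages are hypothetical.

```python
from typing import List, Optional

def validate_quantity(qty: int, minimum: int = 1,
                      maximum: Optional[int] = None,
                      multiple: int = 1) -> List[str]:
    """Return a list of violated quantity rules (empty if qty is valid)."""
    errors = []
    if qty < minimum:
        errors.append(f"Below minimum order quantity of {minimum}")
    if maximum is not None and qty > maximum:
        errors.append(f"Above maximum order quantity of {maximum}")
    if qty % multiple != 0:
        errors.append(f"Not a multiple of {multiple}")
    return errors

# Example: a product sold in multiples of 6, between 6 and 60 units.
print(validate_quantity(7, minimum=6, maximum=60, multiple=6))
# ['Not a multiple of 6']
```

The standard order quantity is not a validation rule; it is only the default value that is pre-filled when the product is selected.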

In the site builder > extensions, you can define which types of customers this setting affects.

Buy Commerce Scale Units, and get device licenses included

There is a small but interesting element that retailers should be aware of. If you buy Dynamics 365 Commerce Scale Units, you get device licenses included.

It is available in the licensing guide.

What does this mean? As you scale up your installation and deploy to multiple Azure geo zones to achieve the best possible latency, remember to reduce your number of device licenses according to the number of Commerce Scale Units you deploy.

No need to buy both Commerce Scale units AND devices.

Take care

*PS! Always read the licensing guide thoroughly, as there are conditions that need to be followed.

D365 eCommerce sites live and running

Through search and connections, I have compiled a list of public sites that have implemented the Dynamics 365 eCommerce parts. This is by no means a complete list, and only represents a small subset. I have not been sponsored by any, and I only want to share with the D365 community sites that are live and running. Hopefully more can see the benefits of having a truly integrated omnichannel solution, and start investing knowledge in the capabilities. I hope also this can convince more to start enabling the eCommerce capabilities in the stack that they already have.

So here is the list, without any more comments:




Clothing and sports





Home and Kids





Food and Wine

Demo sites:






Take care

D365 – CustTable – fast – faster – fastest – WOW!

I wanted to look deeper into an area that has troubled me for some time: why are some forms very fast in D365, while other forms do not have the expected start-up time? At the end of this article you can see my findings, and I hope this will have a positive effect on user-experienced performance.

The form I wanted to take a deeper look into is the custTable form, as this is one of the most used forms at customers. Over time we have seen that this form has increased in size as additional features and code have been added. New features are great, but they come at a cost.

I wanted a simple test, where we are looking at a warm system, timing how long it takes to open the CustTable form. I wanted to test the opening of custTable on a Cloud Hosted Tier-1 (DS12 V2), a Tier-2, and PROD. This is benchmarked with a stopwatch, timing from when I click the menu item until the form is drawn and responsive. I will be using Google Chrome with F12 and measure until all network activity has completed, and the main measurement will be TTFB (Time To First Byte), as seen in the picture below. The actual waiting time tends to be beyond this, but it is the most concrete KPI I have found. The timing is therefore not the actual or experienced performance, but a KPI that can be used for comparing scenarios.

The KPI represents the time the AOS/IIS is using to render and return the form object to the browser. Each “warm test” will be conducted 3 times, and the data is an extremely small dataset (just a few customers), as the purpose of this test is NOT to test the database, indexes or queries. It is about testing how the execution of code and caching on a form is performing.
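For completeness, the measurement procedure itself can be sketched in a few lines of Python: three timed runs per scenario, summarized by average/best/worst. The form-open action is stubbed out here; in the real test the timing comes from the browser's TTFB reading, not from code.

```python
import statistics
import time

def benchmark(open_form, runs=3):
    """Time repeated invocations of an action, like the three warm runs
    per environment in the tables below. Returns seconds per run."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        open_form()  # stub for "open the form and wait for first byte"
        timings.append(time.perf_counter() - start)
    return timings

def summarize(timings):
    """Condense a set of runs into the figures used for comparison."""
    return {"avg": round(statistics.mean(timings), 2),
            "best": min(timings), "worst": max(timings)}

# Example with the Tier-1 warm numbers from the table below:
print(summarize([1.50, 1.49, 1.49]))  # {'avg': 1.49, 'best': 1.49, 'worst': 1.5}
```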

Below is a screenshot showing where to find my performance KPI in the Google Chrome F12 developer tools.

Test of architecture

In this test I’m testing how fast the custTable form opens on Tier-1, Tier-2, and a PROD environment. The PROD/Tier-2 environments are on Service Fabric (self-service), and the databases seem to be elastic pool based.

As seen in the table below, the fastest execution happens on Tier-1, which is a one-box SQL installation, while Tier-2 and PROD are slower.

| Customer form | Warm execution (s) | Cold execution (s) |
|---|---|---|
| Tier-1 (DS12 V2) | 1.50, 1.49, 1.49 | 2.20, 2.32, 2.20 |
| Prod (6 AOSs) | 3.22, 3.25, 3.10 (20:00 CET) / 2.37, 2.46, 2.40 (22:00 CET) | Not measured |
What we see here is that a cold execution of the CustTable form is extreme, with a dramatic increase in execution time. What we also see is that PROD execution times differ. This can be because of connections to different AOSs, or the effect of a “noisy neighbor” caused by the switch to the Azure SQL elastic pool architecture.

On a simpler form like the “customer reason code” form, without much code, we see a very nice execution time on all tier-levels, and even cold executions are within acceptable range.

| Customer reason code form | Warm execution (s) | Cold execution (s) |
|---|---|---|
| Tier-1 (DS12 V2) | 0.11, 0.11, 0.12 | 0.26, 0.27, 0.26 |
| Prod (6 AOSs) | 0.27, 0.28, 0.23 | Not measured |

The conclusion seems to be that complex forms like custTable are much more affected when opened in a cold state.

The complexity of the CustTable

As seen below, the CustTable form contains 12 data sources, and quite a few of them are joins. There are also 4 extensions to the form.

We also see that the code in CustTable is heavily governed by code that controls features, country-specific/regulatory elements, and display items. Opening the customer form on a Tier-2 environment with 5 customers takes between 2-3 seconds. In total there are 16,413 method calls, of which 1,330 are unique.

I did not get any meaningful information out of the recorded, summarized trace file analysis, so I had to continue looking more manually into the actual execution of the code.

Test of effect when reducing complexity CustTable

My next step in the analysis is to see what is affecting the execution time. In the following section I’m testing in a Tier-1 DS12 V2 environment. I have made several copies of the CustTable form, and in each copy I’m removing more and more code and data sources. I name them:

  1. Standard, but no calls to feature enablement
  2. Fast: All code and data sources removed, except custTable and DirParty
  3. Faster: All code and data sources removed, except custTable; display method on customer name
  4. Fastest: All code is removed except the CustTable data source

To simulate a “cold execution” we can flush the cache by adding the following to the URL: &mi=ACTION%3ASysFlushAOD
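A tiny sketch of building that URL: the menu-item action parameter is simply URL-encoded ("%3A" is the encoded colon in "ACTION:SysFlushAOD"). The base environment URL below is a made-up example.

```python
from urllib.parse import quote

def flush_url(base_url: str) -> str:
    """Append the SysFlushAOD action to a D365FO URL to flush the cache."""
    action = quote("ACTION:SysFlushAOD", safe="")  # -> ACTION%3ASysFlushAOD
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}mi={action}"

print(flush_url("https://myenv.cloud.onebox.dynamics.com/?cmp=USMF"))
```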

| CustTable form type | Warm execution (s) | Cold execution (s) |
|---|---|---|
| Standard 10.0.18 | 1.50, 1.49, 1.49 | |
| 1. Standard, but no calls to feature enablement code | 1.34, 1.43, 1.39 | |
| 2. Fast: all code and data sources removed, except custTable and DirParty | 0.72, 0.72, 0.73 | |
| 3. Faster: all code and data sources removed, except custTable; display method on customer name | 0.56, 0.62, 0.57 | |
| 4. Fastest: all is removed except the CustTable data source | 0.34, 0.34, 0.38 | |
| 5. Customer reason code form | 0.11, 0.11, 0.12 | |
What we see in the table above is that the main thing taking time is the execution of code. The data sources do not affect the user-experienced performance in this scenario. The results show that simpler forms with less code have a huge effect on the execution time and the cold-start scenario.

WOW! – Other findings.

I have found one area that heavily affects the cold startup of forms: the Office button, which is typically initialized when the form is loading. When I tried disabling the Office button code, a cold startup of CustTable went from 23 s to 5 s. And this button is used everywhere.

This “fix” does not seem to have a large effect on a warmed-up system. But keep in mind that with the One Version strategy and the adding of extensions, we are clearing the caches quite often, and the end users then need to rebuild them on each AOS. As there are thousands of forms, multiply the warmup by the number of AOSs, and you realize why manual warmup takes days.

I have informed Microsoft, and hope for a positive response. Let’s continue to dig for code changes that can make the best ERP system even better, and share what you find.

I realized that when debugging line by line, a small gray text pops up showing the actual elapsed execution time per statement. This allowed me to find the lines that actually use a lot of time, by jumping from line to line. The timing here is from when I debugged a cold system. On a warm system it will not show, as then everything is cached.

I’m really proud of finding this, as it has been on my bucket list to find some real good improvements. For more details on the chase for more performance, take a look at the Microsoft Yammer group (if you have access): https://www.yammer.com/dynamicsaxfeedbackprograms/threads/1105410564505600

D365 B2B eCommerce, Things have changed (again)

My colleagues and I have had the privilege the last few weeks to go really deep into the D365 offering, in terms of capabilities, pricing, and roadmap. A few weeks ago, I wrote about the pricing of eCommerce. As this changed per April 1, I have removed that post. I recommend that partners find the session that Microsoft held last night, called “Dynamics 365 Commerce e-Commerce Licensing Changes (DYN963PAL)“. It gives some new insight: Microsoft brings better pricing differentiation, and the pricing can better meet price points in the low-end market.

But the key headline is that Microsoft is now capable of offering a license cost per sale in the range of $0.36-$0.85 for the low-end market (sales order value <$50). And that is a significant price reduction! I suggest you check out the pricing when it becomes available on April 1 (no joke 😊).

In some meetings I attended, I also learned that a significant number of B2B customers are already in implementation (including us). I do believe that D365 eCommerce B2B brings some very nice offerings, and it is in this domain that I think we will see a lot of customers onboarding to D365 eCommerce.

Take care my friends, and I’ll keep you posted on more to learn.

D365 Commerce – tech-talk videos on YouTube

Do you have YouTube available on your TV, and have finished all series on Netflix and HBO? If you want to fill up with additional knowledge and still enjoy your cozy favorite couch, then know that many of the Dynamics 365 Tech-Talks are available on YouTube. You can also find them available here: https://community.dynamics.com/365/b/techtalks?c=Commerce

Here is a compiled list of D365 Commerce videos I recommend if you want to learn more about the eCommerce domain. All of them originate from the following YouTube channel: https://www.youtube.com/channel/UCBoCtfQN1aRB31xnEexj5yQ

Enjoy your couch



Dynamics 365 Commerce Overview Tech Talk


Dynamics 365 Commerce Architecture Overview Tech Talk


Unlock the Power of Dynamics 365 Commerce: B2B e-commerce


Unlock the Power of Dynamics 365 Commerce: Omni-Channel Order Management Flows


Unlock the Power of Dynamics 365 Commerce: Supporting Buy Online Pickup in Store/Curbside with POS


Unlock the Power of Dynamics 365 Commerce: Branding Your E-Commerce Site


Unlock the Power of Dynamics 365 Commerce: E-Commerce Module Library Overview


Unlock the Power of Dynamics 365 Commerce: Support Multiple Languages & Markets on E-Commerce Site


Unlock the Power of Dynamics 365 Commerce: Managing Omni-Channel Ratings & Reviews


Unlock the Power of Dynamics 365 Commerce: Best Practices for E-Commerce Customization Development



Unlock the Power of Dynamics 365 Commerce: Managing E-Commerce Site Settings


Unlock the Power of Dynamics 365 Commerce: Commerce Deployments, Updates and Servicing


Unlock the Power of Dynamics 365 Commerce: Setup a B2C Tenant for e-Commerce Site Authentication


Dynamics 365 Commerce E-Commerce Architecture Deep-dive Tech Talk

Decompiling D365 retail components

Today I got a very nice tip from a colleague on how to better understand and see the source code for the retail components.

I was experiencing that I was not able to post a CPOS sale, and in the event log I got the following cryptic error:

Customer with RECID 5637158076 is non-chargeable account.

at Microsoft.Dynamics.Commerce.Runtime.Services.CustomerPaymentService.ValidateCustomerForOnAccountPayment(Customer customerToPayWith, RequestContext context, Boolean isPositiveAmount)


The thing with the retail server is that we don’t have the source code for all the components. But luckily there is a way around it:

JetBrains dotPeek (just google it, and download)

This tool lets you decompile all the components, and it has an advanced search capability. Just load all the K:\RetailServer\WebRoot\bin\Microsoft.Dynamics * files, and then search for the term ValidateCustomerForOnAccountPayment:


The solution to my problem was that the InvoiceAccount on the customer record was blank.

So now you know how to see all the source code in the Dynamics 365 Commerce/RetailServer components.

D365 smarter search algorithm

Dear fellow community members.

First, a small announcement: as of the first of December, I am in a new job where new ideas and visions for our community will come to life. I am overexcited to share more on this later on LinkedIn, but as a true enthusiast, I chose to celebrate this milestone with a fun knowledge-sharing blog post for the community. The community has always been there for me when I need knowledge, and paying it forward is how the community rewards each other.

Today I would like to show you a way to create smarter search in Dynamics 365 with some minor extensions. As you know, the Relevance Search API is on the horizon, but I expect quite a few release iterations before it materializes in the D365 F&S codebase.

First some context; I hope you have familiarized yourself with the full-text search capabilities that exist under “Sales and marketing”.

This feature materializes itself on the sales order screen, where you can search for products across different and multiple fields. One of the drawbacks of the standard solution is that Microsoft has limited the searchable fields to those on the InventTable and a few other tables. Requirements I often get include the ability to search for barcodes/GTIN, external item numbers, vendors, attributes, classifications etc. I also get the requirement that users do not want to be restricted by the sequence in which the search terms are specified. They want “Google search” capabilities, meaning that [A] + [B] returns the same results as [B] + [A].

With some minor adjustments we can make the product search much more meaningful. First I need to explain how the standard solution performs the item search, so you can see how to adjust it. There is a table named MCRInventTableIndex that consists of two fields: one is a reference to the product, and the other is a concatenation of several fields. As shown below, the SearchText field contains the item number and the item description, simply concatenated together. A full-text index has then been created on this field to speed up searches on this long string.

When you type in your search criteria, a wildcard (“*criteria*”) is added so that the system can match records against the search term. The drawback here is that the search terms must be specified in the right sequence. There is also the issue that the concatenated search text is missing a lot of interesting terms to search for.

So, there are two problems that need to be solved:

  1. Adding additional search terms
  2. Making the search sequence irrelevant

Let’s look into that:

Adding additional search terms

The way I chose to add additional search terms is to create my own SearchIndex table, where I concatenate all the search terms and include fields like barcodes/GTIN, external item numbers, vendors, attributes and classifications. The code here simply concatenates the SearchText into a very large string and references it to the product/NOBB number. It is also fine to add multiple barcodes or vendors.
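As a language-neutral sketch of that concatenation (the real implementation is an X++ job over the product tables; all field names here are hypothetical stand-ins):

```python
def build_search_text(product):
    """Concatenate all searchable terms for one product into a single string.

    `product` is a hypothetical dict; the real solution reads the item,
    barcode, external item and attribute tables in X++.
    """
    parts = [
        product.get("item_id", ""),
        product.get("name", ""),
        *product.get("barcodes", []),          # multiple GTINs are fine
        *product.get("external_item_ids", []),
        *product.get("vendors", []),
        *product.get("attributes", []),
    ]
    # One long space-separated string, ready for a full-text index
    return " ".join(p for p in parts if p)

# Example: a white door with a GTIN and a vendor reference
door = {
    "item_id": "10001",
    "name": "Dør hvit 10X20",
    "barcodes": ["7031234567890"],
    "vendors": ["VEND-42"],
}
print(build_search_text(door))
```

The point is simply that everything worth searching on ends up in one indexed string per product.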

This results in a table that holds much more information; it is visualized here just to show that the string contains many kinds of information.

The next step is to include this string in the standard D365 search algorithm, so that all my search terms are included in the search.

At this point I can search on item number, barcode, vendor, external item etc., just as I wanted. But our goal has not been met yet. Let us move on to the next challenge.

Making the search sequence irrelevant

To visualize how I want the search to be conducted, let us take a concrete example. Here I am searching for a door (“dør”) that is white (“hvit”) and has the size “10X20”. As seen below, this results in two possible hits.

But what we want to achieve is that the sequence of the search criteria should be irrelevant. What we want is, in effect, an inner join of the search criteria so that we only get the relevant items. We therefore have to split up the search string and perform an exists join for each of the separate search elements.

To make this magic happen, I have the following code, an extension to the MCRInventSearch class that builds up the query for the search.
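The effect of that query extension can be illustrated with a small sketch: split the search string into terms and require every term to match, so order no longer matters. This is a hedged Python stand-in for the X++ one-exists-join-per-term logic, not the actual extension code:

```python
def matches(search_text, query):
    """True if every space-separated term in `query` occurs in
    `search_text`, regardless of order -- mimicking one exists
    join per search term."""
    text = search_text.lower()
    return all(term.lower() in text for term in query.split())

# Hypothetical SearchIndex contents: product id -> concatenated search text
products = {
    "10001": "10001 Dør hvit 10X20",
    "10002": "10002 Dør brun 10X20",
}

def search(query):
    return [pid for pid, text in products.items() if matches(text, query)]

print(search("hvit dør"))   # term order differs from the stored string
print(search("dør 10X20"))  # both doors match
```

With AND semantics per term, “hvit dør” and “dør hvit” give the same hit, which is exactly the “Google search” behavior users ask for.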

This allows me to get a much better search result:

This improved search algorithm also kicks in on the MCRSalesQuickQuote form, where I have also added an “iframe” pointing directly to the external product database NOBB, which shows all relevant data.

If you enjoyed this topic, please share it. And if you want to discuss it, please contact me.

Take care and see you all in my next chapter.

D365 X++: I’m using InventABC as my template

Hi fellow developers.

Back in the early days I was at a good old Damgaard conference and attended a technical session. In this session one of the founding fathers of Axapta came up with the phrase “Copy with pride”. What he meant was to look at existing code and patterns in the X++ code, and to feel free to copy these patterns from the Microsoft code and use them in customer customizations and extensions. I have lived by this principle and “copied with pride”, and I encourage others to feel free to take my work and copy what they need.

One of the more common customization/extension requests is a class that does some magic, has a dialog and a query, can run as a recurring batch job etc. I guess all developers have their own approach, and as the blog title suggests, I’m using the InventABC* classes as my “copy with pride” template. I know this code is more than 20 years old, and I guess there are better and more advanced patterns to use. I don’t claim this pattern is the best, but for me it is a fast and easy way to create simple periodic recurring classes. Any feedback is welcome, and feel free to share your approaches.

The InventABCUpdate class contains the code for creating a dialog, executing in batch, handling parameters and most of what I need. When I’m done, my final code does not resemble the original InventABCUpdate code, but for me it is a journey with a fixed starting point, followed by constant adjustments until I have reached the desired solution.

For any of you starting the journey of learning X++ development and wanting to quickly create solutions for our customers, it is well worth the investment to study the InventABC classes to speed up your coding experience.

Happy coding friends.


EDIT: The community responds quickly! A better class to use could be Tutorial_RunBaseBatch.

D365 commerce; The need for speed (replenishment)

A nightmare for retailers is situations like this, where customers are met by empty shelves.

The normal process for handling these situations is to have store employees constantly monitor shelves and back-office stock, and order replenishment when needed. We also see a lot of number-crunching demand-forecasting systems being offered to the market, with a questionable success rate.

Up until now, we have seen systems with a quite slow reaction time. Goods are replenished and received. When selling the products there were delays in getting an updated on-hand, and very often a nightly master planning run created planned orders that were manually firmed and then sent to the vendors or the central warehouse. In essence, getting replenishment signals through the supply chain can take days.

With Dynamics 365 Commerce I now see the maturity needed to speed up the replenishment process, making it possible to dramatically shorten the lead time in each step. Let me explain:

  1. Retail statement trickle feed
    I have covered this topic a few times before, but in essence it means that the sales transactions generated from a POS sale are updated at a much faster rate. Having an updated on-hand is essential, so that the outbound sales transactions reflect the actual situation on the shelves.
  2. Inbound inventory operations in POS/Handheld
    The ability to post the arrival and receipt directly in the POS ensures that inbound transactions are updated in real time. Manually requesting replenishment for processing can now also be done directly in the POS.
  3. Outbound inventory operations in POS/Handheld
    Requesting goods from other places faster can reduce the lead time and prevent stock-out situations. If products are available at another nearby store/warehouse, having processes to make them available across nearby sales channels can speed up replenishment.
  4. Planning Optimization
    Nightly master planning (MRP) is too slow for retailers and is better suited for businesses with longer lead times. Retailers are looking for speed. With the Planning Optimization features we get close to real-time generation of planned purchase and transfer orders. As soon as a need arises, the system can initiate the supply chain process based on the on-hand and future expected transactions. Standard D365 also has automatic firming processes that will generate purchase and transfer orders. These orders can be processed further and automatically sent to vendors or sourced from other storage locations, but they can also be used to automate transfers from back-office storage onto the store shelves.
  5. Connected store
    This solution is still in public preview but promises some exciting capabilities that can visually monitor your store and enable triggers that start executing supply chain processes. The first capabilities are “Display effectiveness”, “Queue management” and “Shopper analytics”, with data collection based on camera technology.


By optimizing the flow, it should be possible within Dynamics 365 to speed up and automate the replenishment process so that it is executed within 10 minutes after the sale has been conducted. Signals from the connected store solution can also be used to automatically adjust the minimum on-hand/shelf levels based on actual observed data. The gains and possibilities in what Dynamics now can offer can bring profit back to modern brick-and-mortar retailers.

I hope this can inspire people to look deeper into the capabilities we now can deliver.



D365 Outsourcing your master data (DaaS)

In Dynamics 365 implementation projects I often say that everything we do can mainly be categorized into three headline topics.

As we know, for Dynamics 365 Microsoft provides the software and the platform needed. It is easy to buy as a service where essentially only a monthly commitment is required. This is the nature of the Software-as-a-Service cloud-based concept.

The implementation partners are the best at structuring an implementation project and guiding you step by step through the jungle. A lot of knowledge is needed to understand the complex processes of an organization. The partners typically work tightly with people, ensuring that the organizational machinery is oiled and running smoothly, and defining processes that cover entire end-to-end flows like procure-to-pay or order-to-cash.

The third element, of equal importance, is master data. I have written some previous blog posts about the subject that are worth checking out. Traditionally, building the master data has been the responsibility of the organization implementing Dynamics 365 and has been regarded as the heart and soul of the organization. The data is often manually built/generated and maintained, and low quality in master data can have catastrophic effects in any organization. If you cannot trust your data, then you do not have the information needed to make good business decisions.

Traditionally this has been identified as an integration requirement, but the main “ownership” of the data has still been handled internally in the company. Here is where I see a change. Instead of maintaining your own master data, the master data is maintained through cloud-based public services operated for a monthly fee. Just like SaaS (Software as a Service), we see mature implementations of DaaS (Data as a Service), where Dynamics 365 customers closely integrate with and outsource much of the maintenance to vertical-specific online services.

One aspect I see is that the data providers are not global actors; they tend to be local, verticalized services for specific domains. To be specific, here in Norway I would like to name-drop some providers I have encountered that offer such services.

BREG – Brønnøysund Register Center

The Brønnøysund Register Center develops and operates digital services that streamline, coordinate and simplify dialogue with the public sector for individuals and businesses. They operate many of Norway’s most important registers, which contain information about companies, roles, tax etc. Many of the services are free, and you can read more about them. If you need validated and confirmed information about any organization in Norway, then these are the registers you need to integrate with. My friend Fredrik S from Microsoft has created many demos showing how easy it actually is to set this up.

BISNode – Integrated credit check and risk management

Knowing the commercial risk is essential for all businesses. By having updated information, the decisions become less risky and less labor intensive.

1881 – search and return person address information

1881 is Norway’s leading provider of personal and business information, providing information on telephone numbers, names and addresses. By doing lookups in databases like 1881 you instantly get address information that enriches your data and simplifies transaction handling.

GS1 – The Global Language of Business

GS1 is the main provider of a lot of supply-chain-oriented master data. Here you maintain product GTINs/barcodes, and they also provide a GLN (Global Location Number) register. When working with delivery addresses this is a must-have, because it ensures that goods are shipped to and received at the right places. For a small fee you get access to updated addresses directly in D365, where the addresses are also enriched with GPS coordinates. One more relevant aspect of GS1 is the GPC (Global Product Classification), which makes it easier to search for products globally and is also a very good reporting/analytics structure.

TradeSolution – The Norwegian Grocery PIM

If you are going to sell or purchase products through the Norwegian grocery chains, you need a close connection with TradeSolution. I have written about them previously; they make sure you have a reliable source of product master data and product properties. If you are using their services, there is no need for a third-party PIM solution. They also provide a media store for product pictures.

NOBB – The Norwegian Construction PIM

NOBB contains almost 1,000,000 articles from 700 suppliers. You will find a wide range of product information, e.g. lumber, building materials, hardware, tools, fasteners, paints, house and garden, water/plumbing, electrical etc. The database contains basic data, prices, logistics data, images and documentation, streamlining the industry’s need for structured and quality-assured basic data. The quality of the product database is ensured through the industry bodies Quality Forum and the Standardization Committee. The item owner updates and maintains the information based on industry standards (ref. the Quality Forum and Standardization Committee). This is a unique quality assurance and proximity to the industry that no other players can offer.

Elfo – The Norwegian electronics PIM

The Electronics Industry Association (EIF) is an industry association for Norwegian-based companies that run electronics-related activities aimed mainly at the professional market, whether as importer, manufacturer or developer.

Farmalogg – The Norwegian pharmacy industry PIM.

The product register covers, with few exceptions, all goods sold in pharmacies, and it contains the information necessary for safe and efficient handling of the goods throughout the value chain from manufacturer/supplier, through wholesaler and retailer, to end user.

Prisguiden – Compare your prices

A price database that allows you to compare your prices with competitors’. You can also measure popularity and trends in the market. What do customers search for? Integrating tightly with the market makes decision-making easier and more automated.

Consignor – Easy shipping

Delivery management is all about connecting your warehouse to your customers in the most efficient way. By making one standard integration to a service like Consignor, they make sure that no matter what combination of carrier services you choose, customers will get the same high-quality feeling when receiving a delivery from you.

Currency exchange rate

This service is already present in standard Dynamics 365 – Start using it!

There are surely many other master data providers; here I have listed a few actors in the Norwegian market. By outsourcing your master data maintenance, you will get much higher data quality and more return on investment.

Are you ready to outsource your master data ?

DaaS – Leben ist kein Ponyhof (“life is no pony farm”)



D365 – My Covid-19 10-day response story

Hi Friends.

I hope you all are hanging in there and can still work and deliver excellent experiences with Dynamics 365.

I wanted to share my Covid-19 10-day response story about how fast a reduced-scope Dynamics 365 implementation was made available. Some weeks ago, we and Microsoft were contacted by an important player in the health industry that urgently wanted to establish purchasing and supply chain processes for medications and equipment. The key element here was the urgency, because it was unclear what direction the pandemic would take here. What the customer needed were tools that could process information about supply providers and about what kinds of supplies are needed for readiness stockpiling. Our first step was to set up Dynamics 365 (CRM) to store relations, and this was done in a few days. The next step was to set up and go live with a “minimum viable product” of the Dynamics 365 Finance and Supply Chain apps. We had a goal of doing this in 10 working days. This is the story I would like to share.

Day 1: Onboarding, tools, and deployment

In the initiation of a project, I always have a document named “Welcome to the [Customer] project”. This is a great document because it contains all the essential information about onboarding to the project and can be shared with all participants. It is typically a 6-7 page document explaining the onboarding process and the main objectives. It also contains references to LCS, SharePoint/Teams sites, DevOps and URLs to environments. The most valuable element is a full overview of all the people that will somehow be involved in the project. In this project we decided on a small, efficient four-person team (POD) and FastTrack support from Microsoft.

Microsoft quickly processed licenses, and we quickly deployed the LCS project. The first thing we did was deploy the Tier-2 sandbox, which we named ‘UAT’; this was to be used as the master data/golden environment in the beginning. We also deployed the Tier-1 sandbox and named it “Test”; it would be used for access to Visual Studio etc. The initial version we deployed was 10.0.10.

We have ready implementation templates that are imported into DevOps and contain the main structure of requirements and tasks. We scoped this down to the actual processes we needed.

We also have a ready folder structure for the Teams site where we can store and complete all documentation. By the end of the first day we had established the tools needed to start the project.

Day 2: Working with the generic tasks in the backlog

We established a 30-minute daily sprint meeting with the main implementation actors, where the plan is presented and the day’s tasks are prioritized. We did not have the time to create large Word documents, so we decided to document the solution in DevOps and organize all the system setup around the entity templates as they can be extracted from D365. I exported the templates to Excel and then imported them into DevOps using the Azure DevOps Office® Integration, which gave me 419 tasks for setting up as much as possible in standard.

This makes it possible for me to have a step-by-step task list of all the elements I need to build the “golden environment”. Each task is assigned, and the actual setup is documented with a direct URL to the D365 form and a screenshot of the actual setup.

On the first day we were able to process close to 200 tasks, setting up the most generic parts of the system.

Day 3: Working with the finance task backlog

When working on the finance setup we imported a standard chart of accounts, and we had to set up financial dimensions. We also set up the accounting structure, created a few inventory posting profiles and set up tax parameters. Normally this is quite straightforward, and we could reuse much from previous projects.

Day 4-5: Working with products

Now the Excel skills are put to the test. We have an Excel sheet that contains most of the product master data: in total over 33,000 products, and each product has classifications, attributes, properties and vendor/producer information. We quickly decided to use the same item numbering as was present in the Excel sheet. Each column in the sheet was classified by:

  • Is this a field we have in D365?
  • Should the field become a category in a hierarchy?
  • Should the field become an attribute?

Getting the products in required a very advanced copy/paste/merge of data into Excel sheets that we then imported into Dynamics 365. In the end, we realized that all the information we had could be imported without any information loss. It was hard work, but the end result was promising: a list of all available medical supplies, classified into the medical ATC structure.

We also imported barcodes, vendors, producers, employees, address information, external item names/descriptions and attributes.

Day 6: First demo, UAT and deploying the production environment

On day 6 we were ready to show the actual master data and the initial view of the system. The customer was impressed by how fast we were able to build a system and processes that felt familiar to their operation.

We decided to update the system to 10.0.11, and in parallel with the setup of the system we had been working closely with the Microsoft FastTrack solution architect to make the environments ready for production deployment. After a few iterations we got the production environment up and running and performed a DB refresh of the production environment with the master data we had in the Tier-2 sandbox. This meant that we now had an environment available to start performing transactional process testing and trimming the system. I know this is not the normal way of doing it, but thanks to Microsoft’s understanding of the urgency we were allowed to go this “fast-track” route. In DevOps we established the processes we wanted to test and optimize.

Day 7: Testing dual-write, business events and the Power Platform

As described earlier, we implemented some of the “CRM” elements first. Now we could enable dual-write and synchronize vendors, employees and other information into CDS. Our first step was just to validate that it worked as expected in UAT, and it worked like a charm. We can now share these master data across the D365 platform.

The next thing was to test how we could use the business event framework to integrate with a third-party WMS provider. Dynamics 365 has a business event that fires when a purchase order confirmation is performed. We decided to enable purchase order change management to have a strict workflow and to ensure that we could rely on the purchase confirmation process.

This allowed us to create a solution where the business event is caught by a Power Automate flow, which fetches all the lines of the purchase confirmation and then transforms them into the format the WMS provider needs. We can also enrich the data sent to the WMS provider, so that their system has all the master data it needs. The next step is to import receipt lines from the third-party WMS provider. This happens by Power Automate creating an arrival journal; a batch job in D365 then posts it, followed by posting a product receipt. It all ends with a new business event being triggered (purchase order received) that sends a message to the WMS provider that the goods have now been received. What we then achieve is that the on-hand in each system is synchronized, without any major delay caused by processing.
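The outbound transformation step can be sketched like this. Note that this is a hedged Python stand-in for the Power Automate flow, and the event and line field names (PurchaseOrderNumber, ItemNumber etc.) are illustrative assumptions, not the actual payload schema:

```python
def to_wms_message(event, lines):
    """Transform a purchase order confirmation business event plus its
    fetched confirmation lines into a (hypothetical) WMS message format.

    In the real solution this mapping is built in Power Automate against
    the actual business event payload."""
    return {
        "orderId": event["PurchaseOrderNumber"],
        "vendor": event["VendorAccount"],
        "lines": [
            {
                "sku": line["ItemNumber"],
                "qty": line["Quantity"],
                "gtin": line.get("Barcode", ""),  # enrichment with master data
            }
            for line in lines
        ],
    }

# Example event and lines, shapes assumed for illustration
event = {"PurchaseOrderNumber": "PO-0042", "VendorAccount": "V-100"}
lines = [{"ItemNumber": "10001", "Quantity": 5, "Barcode": "7031234567890"}]
msg = to_wms_message(event, lines)
print(msg["orderId"], len(msg["lines"]))
```

The key design point is the same as in the flow: the event only triggers the integration, while the lines are fetched and enriched before the message is sent.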

In total we set up quite a lot of batch jobs that handle everything from cleaning to posting and planning. We used the learnings from the following blog post as a template for the batch jobs.

Day 7: Master planning and Planning Optimization

We expect that quite a lot of requisitions and requirements will be processed through the system, so using the new Planning Optimization engine from Microsoft suited the project well. Calculating the requirements for all products is extremely fast and done within minutes. This allows faster reaction to new requirements and potentially reduces stock-out situations caused by vendor lead time.

On day 7 we also imported all employees and created some approval position hierarchies. This way we can extend the workflow processing for approvals.

Day 8-9: Testing, Testing, Testing in UAT

We started day 8 by refreshing the UAT environment and executing tests according to the key business requirements defined in DevOps. We found 3-4 issues that were reported to Microsoft (index performance etc.) and quickly fixed within hours by the excellent support architects. We also wanted a visually nicer, more presentable purchase order form letter, and decided to import the modern report package from Microsoft. This makes it a bit easier to adjust.

We did try out the configurable business documents, but in this case it would take a bit more time than we had to learn to set them up correctly. Any issues we found were also fixed in the PROD environment.

The main processes we focused on were the procurement processes, with approval steps and manual coordination with vendors.

Day 10: Project closure and training

On day 10 we summarized how far we had come and created a project closure/summary report that also contains next steps and more backlog suggestions. We suggested additional focus on Azure Data Lake, Power BI and the implementation of a vendor portal. We also planned to perform training and make final changes to enable end-user onboarding. What we see is that making a system ready is not just about setting up the system, but about implementing the use of the system in daily operations. This is expected to take more time, and we are ready to respond.

Final words and tips

I really hope this system will show its value and be regarded as a small but valued contribution to the Covid-19 response. Microsoft has published the following page with resources that can help. Microsoft has also launched a program where you can get a 200-seat Dynamics 365 Customer Service system for free for 6 months for Covid-19 response related activities. See https://dynamics.microsoft.com/en-us/covid-19-offer/

If you have any similar stories, please share them. The Dynamics 365 community cares and stands united in this Covid-19 fight!

D365 Importing JSON data the hard way!

I recently created a solution where I import products and all related data for the grocery industry, and I want to share my experience so that others may follow. This is not a “copy-paste” blog post, but more a walkthrough of the approach I use when working with more advanced and complex JSON integrations. Many industries have established vertical-specific databases where producers, distributors and stores cooperate and have established standards for product numbers, product naming, GTIN, Global Location Number (GLN) etc. In Norway we have several, and the most common for the grocery industry here is TradeSolution. Most products are available to the public at VetDuAt.no, but they also have a Swagger API where the JSON data can be fetched and imported into D365.

One of the experiences I had when starting this journey is that D365 is not modelled according to how the data is represented in these industry-specific public databases. Much is different, and the data is often structured differently. We also see that the product databases are quite rich in describing the products, with physical dimensions, attributes, packing structure, allergens, nutrition etc.

To give you a small impression of the complexity you often find, here is a subset of the JSON hierarchy:

I needed to decide how I should import this data. Should I just import what I have fields for in D365? Should I extend D365 with lots and lots of new fields? Or should I model according to how the external database presents the data? I decided on the latter and imported the data as it was presented. This gives the best result and the least information loss in the process. I decided on a model where D365 requests a JSON file from the Swagger API and places the JSON structure in a C# class structure. The data is then extracted from the C# objects and placed in a new module I named EPD. The next step in the process takes these data and populates the standard D365 tables.

The benefit I see is that I’m not overextending the standard Microsoft code. The data is available in D365 and can be used in Power BI etc. I would like to share some of the basic steps for fetching such large data structures from external services.

Fetch the JSON from the service.

To fetch a JSON file, I’m using some .NET references that help handle Active Directory and HTTP connections. The first method shows how to get an access token; this is relevant if the Swagger service requires it. The next method is where the Swagger URL is queried and the JSON file is returned, with some success/error handling in addition.
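The same two-step pattern (token, then authenticated GET) can be sketched in Python with only the standard library. The endpoint, tenant and credential values are all placeholders, and the token flow shown is the Azure AD client-credentials flow that the .NET references wrap:

```python
import json
import urllib.parse
import urllib.request

def get_access_token(tenant, client_id, client_secret, resource):
    """Request an OAuth access token from Azure AD using the client
    credentials flow. All parameter values are placeholders."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    }).encode()
    with urllib.request.urlopen(url, data=body) as resp:
        return json.load(resp)["access_token"]

def auth_header(token):
    """Bearer header sent on every call to the Swagger/REST endpoint."""
    return {"Authorization": f"Bearer {token}", "Accept": "application/json"}

def fetch_json(url, token):
    """GET the JSON document from the service, raising on HTTP errors."""
    req = urllib.request.Request(url, headers=auth_header(token))
    with urllib.request.urlopen(req) as resp:
        if resp.status != 200:
            raise RuntimeError(f"Service returned HTTP {resp.status}")
        return json.load(resp)
```

In the real solution the equivalent calls are made from X++ through .NET interop, but the request/response shape is the same.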

So at this point we have the JSON file, and we want to do something meaningful with it. Visual Studio has a wonderful feature where you can paste a JSON document and convert it into classes. To make this work, you will have to create a C# project.

This will generate the C# classes; in this example the number of sub-objects and properties is in the hundreds, and the properties can be objects and even arrays of objects.

In addition I need a method that takes the JSON file and deserializes the content into instances of these classes.
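In C# this is essentially one JsonConvert call against the generated classes. The same idea in a hedged Python sketch, where the field names are invented to mirror a small product payload:

```python
import json
from dataclasses import dataclass
from typing import List

@dataclass
class Package:
    gtin: str
    units: int

@dataclass
class Product:
    number: str
    name: str
    packages: List[Package]

def deserialize(raw: str) -> Product:
    """Turn raw JSON text into typed objects -- the Python equivalent of
    deserializing into the generated C# classes."""
    d = json.loads(raw)
    return Product(
        number=d["number"],
        name=d["name"],
        packages=[Package(**p) for p in d.get("packages", [])],
    )

raw = ('{"number": "EPD-1", "name": "Melk 1L", '
       '"packages": [{"gtin": "7038010000001", "units": 12}]}')
product = deserialize(raw)
print(product.name, product.packages[0].gtin)
```

Once the payload lives in typed objects, the rest of the import becomes ordinary record handling instead of string parsing.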

Store the JSON object data into D365 tables.

So at this point we have been able to fetch the data, and in the following code I get the access token, get the JSON, deserialize it into a C# object and pass it on for further processing.


Now, let’s start inserting this data into new D365 tables. For simplicity, I have created a D365 table for each data object in the JSON file. This allows me to store the entire hierarchical JSON structure in D365 tables for further processing. As soon as I have the data stored in D365, I can write the code that moves it forward into the more functional tables in D365.

A lesson learned was that when creating sub-tables to store hierarchical JSON data, you sometimes need relationships between records in multiple tables. Sometimes uniqueness is also required, and the best way I have found (so far) is to create a GUID field and use this GUID to relate the data in the different tables. This can easily be accomplished with the following code.
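The GUID-linking idea itself is language neutral. Here is a hedged Python sketch where the two lists stand in for a header staging table and its sub-table (X++ offers newGuid() for the same purpose):

```python
import uuid

# Hypothetical in-memory stand-ins for two D365 staging tables
header_table = []
line_table = []

def insert_product_with_packages(product_json):
    """Insert one JSON product into a header table and its packaging
    hierarchy into a sub-table, related through a shared GUID."""
    record_guid = str(uuid.uuid4())  # one unique key per imported record
    header_table.append({"guid": record_guid,
                         "number": product_json["number"]})
    for pkg in product_json.get("packages", []):
        line_table.append({"headerGuid": record_guid, "gtin": pkg["gtin"]})
    return record_guid

g = insert_product_with_packages(
    {"number": "EPD-1", "packages": [{"gtin": "7038010000001"}]})
# The sub-table rows can now be fetched by joining on the GUID
children = [r for r in line_table if r["headerGuid"] == g]
print(len(children))
```

Because the GUID is generated before any rows are written, parent and child records can be inserted in any order and still be joined reliably afterwards.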

Create the std D365 data using data entities through code.

At this stage I have ALL the data in D365, and I can start processing it. Here is a small section of how I create released products by using standard data entities: a table containing the JSON data is passed in, and I can create the products and all the sub-tables related to products.

This approach has resulted in a solution where it is easy for the end user to fetch data from external systems and import it into D365. Here is a form showing parts of the “staging” information before it is moved into D365 standard tables. (The form is in Norwegian, and shows a milk product.)

I would like to thank the community for all the inspiring information found out there. Especially Martin Dráb (@goshoom), who has been very active in promoting “Paste JSON as classes” in Visual Studio.







D365 : Automatic license disablement and login reminder

When assigning licenses to a Dynamics 365 user, it would be beneficial if the system disabled and removed a license from a user if the user has not used the system for X days. At X minus 5 days, the system should send out a message to the user like this:

“This is a login REMINDER for Dynamics 365. Kurt Hatlevik has not logged in for at least 25 days. Your last login was 2/20/2020 12:10:00 AM. Login to Dynamics 365 is required at least once within a 30-day window or your account may be deactivated without notice. Please login within the next few days to ensure access is maintained.

Reactivation will require user administrator approval and will be dependent upon license availability.”

This would make the system more secure, and it would also free up licenses from users that are not using the system.

If you also think this could be beneficial, please vote on this idea here: https://experience.dynamics.com/ideas/idea/?ideaid=c12972cf-6a6c-ea11-b698-0003ff68dcfc#

D365 and the supply structures in grocery retail industry

Today I will write a bit about the supply chain structure we see in the retail grocery industry, the challenges Dynamics 365 may face, and how to address them. The grocery industry has for many years seen that industry collaboration brings benefits and synergies throughout the value chain. We see industry collaborations that offer a range of services to their owners, customers and partners. In the country where I’m from, the main collaboration initiative is TradeSolution, which is owned by the main grocery chains in Norway. TradeSolution operates and maintains central registers, databases, and various IT, reporting and analysis services in Norway, but we see much of the same pattern in other countries and other industries as well.

One essential element is to have a unification of how to identify products and how the products are packed, ordered and shipped. In Norway we have the term EPD (Electronic Product Database), which makes it easy for the entire Norwegian grocery market to purchase and sell products. Much of the information shown in this blog post originates from TradeSolution’s public pages here.

What is EPD?

In Dynamics 365, one of the most essential SCM elements is products and released products, and the associated master data tables. In the grocery industry it is actually the packaging that is the center of it all; the product is actually a property of a packing structure. It would be an oversimplification to say that EPD is products. EPD describes not only the products, but also the packaging of the products. The EPD standard describes the products in up to 4 levels: basis, inner box, outer box and pallet (with SSCC). Each level is identified with a GTIN. See also my old blog post about SSCC.

So far so good. We can model this in Dynamics 365 by having a product defined as a “Basis”, and using the inner box, outer box and pallet as unit conversions. In D365 we also have the possibility to create barcodes for each unit of measurement (UOM). It would also be tempting to assume that the EPD number is an external item description.

Unfortunately, the grocery industry is a bit more complex. Let’s take a quick look at the EPD numbers of Coca-Cola. There are actually 7 packing structures/EPD numbers, shown to the right (7 digits). All of them represent different packaging of the same basis unit, and can have different properties and attributes.

What we also see is that some boxes are marked with an “F”, which means it is a consumer unit; in D365 language, it can be sold to consumers. Some are also marked with a “B”, which means that this is the unit that the EPD number is purchased in. So if we take a detailed look at EPD 4507224, we see that it defines which units you can sell and which units you can purchase. On a single EPD number there is only one level you can choose to purchase in. Here are 2 examples that describe the complexity. The first example is an EPD where the grocer can sell in the basis unit and in the inner box unit (EPD 4507224).

The next example is one where the grocer can sell the basis unit and another inner box unit type (EPD 2142941).

As you can see here, the conversion between inner boxes to pallet results in different quantities.

To further add complexity, we can add the concept of a mix. The ordering happens on the inner box level, but the box actually contains separate products that are sold through the stores.

One last element is the concept of unmarked variants, like this package of yogurts.

Summary EPD

  • A product is identified by an EPD number (EPDnr)
  • A unit is identified by a GTIN (Global Trade Item Number)
  • A unit is called «pakning» in EPD
  • A product can have up to 4 levels of units (hierarchy)
  • A product can be a mix of multiple «basis» or «mellom/innerbox» units
  • A “basis” unit can be shared by many products
  • The first level of the units is called «basis» in EPD (often referred to as a customer unit or base unit)
  • The top level of the units is called «topp» in EPD (often referred to as a load carrier unit)
  • The levels between «basis» and «topp» (if any) are called « mellom/innerbox/outerbox » units
  • A basis unit can consist of units without identification called unmarked variants («umerkede varianter»)
  • Within an EPD structure, only one of the packings is used for ordering.
  • Multiple packings can be used for sale.

Some key issues we have faced with Dynamics 365 on how the industry is modelling products is the following:

  1. Cost: As seen, a product can be sold in many different UOMs, and the industry can have different purchase prices depending on which EPD number you choose to order. This means that a 4 pcs pack has a different cost than a 24 pcs pack. As the product can be purchased in multiple UOMs with different prices, it is difficult to model the cost correctly, because the inventory transaction costing is based on the lowest level, meaning basis. This costing problem is the reason why I suggest FIFO in retail grocery implementations.
  2. On-hand: Keeping track of how many basis units, or other consumer units, are on hand is difficult, because you do not always know when the consumer is breaking up a Coca-Cola inner box. Where should the cost come from when there are multiple purchasing units, as shown in the figure? This makes it difficult in Dynamics 365 to model the revenue per pcs sold 100% correctly.
  3. Unit conversion: As shown in the example, the same unit (like pallet) can contain different numbers of basis products. This means that it is insufficient to unify the UoM per product; UoM conversion is EPD dependent. Clear relationships between the UoMs must also be modelled. A product may have multiple definitions of an inner box, outer box and pallet.
  4. External item descriptions: Dynamics 365 external item description cannot be used, because it only supports one external item description per vendor. UoM is not taken into consideration.
  5. Attributes: In the grocery industry, there may be different attributes per EPD number, and also different attributes per UoM.

How to model this in Dynamics 365?

To solve the distribution requirements we see in the grocery industry, some front-end remodeling of how products are represented is required. The grocery industry is focused on packaging, while Dynamics 365 is product oriented. The key here is that EPD is object oriented: a product can be represented in several packaging structures.

The entities we have at our disposal in Dynamics 365 are the following:

  1. Products and released products
  2. Unit of measurement and conversion
  3. Barcodes
  4. External item descriptions
  5. BOM’s

But Dynamics 365 is what it is, and any change to the architecture of how products and transactions are handled is not on the near-term roadmap. We must try to model this structure in a way such that the EPD standard and the Dynamics 365 standard work jointly together.

First, let’s try to model the EPD (only a subset) from a grocery supply perspective (not D365!). An EPD can consist of multiple packaging structures, and a package may contain packages. At the bottom of the packing structure there is a reference to a basic package that describes the product.



When importing EPD based products I see the following as a solution:

  1. EPD will be a separate entity/table, modelled as the grocery industry has it (new tables in D365 that feed the standard D365 tables)
  2. D365 products will be defined as the “Basic Package”
  3. The EPD package structure populates the barcode table and the product-specific unit of measurement table. Because there are several packagings, the traditional naming of the units of measurement cannot be used. The unit of measurement conversion is actually dependent on the EPD number. In essence, this means having units of measurement named:

    PCS – Basic unit for the lowest basis product
    IB-4507224 – Unit for the inner box
    OB-4507224 – Unit for the outer box
    LC-4507224 – Unit for the load carrier

    With this we can create the unit of measurement conversion between the different types.
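A hedged sketch of creating one such conversion in X++ (1 IB-4507224 = 4 PCS); the product number is arbitrary, and the table/method names are from memory and should be verified in your own environment:

```xpp
// Sketch: create a product-specific unit conversion (1 IB-4507224 = 4 PCS).
// Field and method names are assumptions - verify before use.
UnitOfMeasure           fromUnit = UnitOfMeasure::findBySymbol('IB-4507224');
UnitOfMeasure           toUnit   = UnitOfMeasure::findBySymbol('PCS');
UnitOfMeasureConversion conversion;

conversion.FromUnitOfMeasure = fromUnit.RecId;
conversion.ToUnitOfMeasure   = toUnit.RecId;
conversion.Numerator         = 4;   // 4 pcs per inner box
conversion.Denominator       = 1;
conversion.Product           = EcoResProduct::findByDisplayProductNumber('100001').RecId;
conversion.insert();
```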

Let’s say we have the following simple product:

This would be modelled in D365 with a released product:

I would here have to define 4 unit of measurements:

I would then have to define the following unit conversions to describe the unit conversions between the different EPD packing structures.

The more EPD packing structures present, the more unit conversions needs to be defined. (In the coca cola example there will be 6 more conversions)

We also need to store GTIN per packing unit per EPD:

We also have the Physical dimensions menu item, which now lets us describe the physical dimensions of the product per EPD unit.


In Dynamics 365 we can only select one suggested purchasing unit. So if you have multiple EPDs associated with a product, you will have to choose one, and this is the unit that is suggested.

The purchase order would then look like this, where the unit describes the EPD number.

Keeping track of all unit conversions, GTINs/barcodes etc. would be an impossible manual job. Since EPD is an industry standard, all of this data is imported through web services.

TradeSolution has web services that offer the possibility to send EPD structures to D365. This way, all packing structures of products can be automatically imported, distributed into standard D365 and adjusted when needed.

The suggestion is not 100%, but it would make sure that grocery retailers can procure and sell the products, while also having the concept of packing structures in place.

Let’s conquer the grocery industry also








D365 – What has changed (pmfTablehasChanged)

This short post is for you hardcore X++ developers that create magic every day. D365 has the following method, which allows you to validate whether any fields on a record have been changed. If it returns true, something has changed; if false, nothing has changed. There are scenarios where you would like to know if there have been any changes to the record before you update/write to the database, to save some round trips.

Then this is nice, and 100% standard.
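A small usage sketch (the CustTable buffer and the account number are arbitrary examples):

```xpp
// Sketch: only update the record when something actually changed.
CustTable custTable = CustTable::find('C-0001', true);   // select for update

custTable.CreditMax = 10000;   // may or may not differ from the stored value

ttsbegin;
if (custTable.pmfTablehasChanged())
{
    custTable.update();   // skip the DB round trip when nothing changed
}
ttscommit;
```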

Happy coding friends.

Batch Jobs; Take control of the executions

Dynamics 365 can be automated quite a lot with the use of batch jobs. With batch jobs, your Dynamics 365 solution becomes “alive”, and we can set up the system to automate many manual processes. Let’s say you have the following “vanilla process”, and want to automate as many steps as possible.

This document covers the batch jobs that need to be set up for this process to be as automated as possible. I wanted to put a structured system on all the batch jobs that are typically used in a production system. But all this also generates a lot of data that you don’t normally need. It is therefore common to create both functional batch jobs that process and execute functionality, and cleanup jobs that remove irrelevant data.

Batch job Naming conventions

To make it simpler to understand the batch jobs, a simple naming structure has been created. The first character is just “A”, to make sure the batch jobs sort in the best possible way and can be sorted according to name. Next is a 3-digit number, and last there is a description that explains the batch job.




System administration batch jobs


Data management batch jobs


General ledger batch jobs


Procurement and sourcing batch jobs


Sales and marketing batch jobs


Retail batch jobs


Inventory management batch jobs


Warehouse management batch jobs

Each of these ranges is then set up as a batch group, so you can better control which AOS servers execute which type of batch jobs:

In this blog post more than 87 batch jobs have been specified, which keep the Dynamics 365 system updated and as automated as possible.

Job description
A001 Notification clean-up
A002 Batch job history clean-up
A003 Batch job history clean-up (custom).
A004 Daily Diagnostics rule validation
A005 Weekly Diagnostics rule validation
A006 Monthly Diagnostics rule validation
A007 Named user license count reports processing
A008 Databaselog cleanup
A009 Delete the inactivated addresses
A010 Scan for orphaned document references.
A011 Report data clean up
A012 Cryptography crawler system job that needs to regularly run at off hours.
A014 Updates system notification states.
A015 Deletes non-active and orphaned system notifications.
A016 Database compression system job that needs to regularly run at off hours.
A017 Database index rebuild system job that needs to regularly run at off hours
A018 Deletes expired email history.
A019 Process automation polling system job
A020 Scan for document files that have been scheduled for physical deletion.
A021 System job to clean up expired batch heartbeat records.
A022 System job to seed batch group associations to batch jobs.
A023 System job to clean up unrecovered user session states.
A024 Change based alerts
A025 Due date alerts
A026 Email distributor batch
A027 Email attachment distributor
A103 Entity Store Deploy measurement
A103 Refresh data entity
A200 Clean up ledger journals
A201 Import currency exchange rates
A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.
A204 Update purchase and sales budget
A206 Source document line processing
A207 Source document line processing queue cleanup
A208 Ledger journal monitor
A300 Purchase update history cleanup
A301 Delete request for quotation
A302 Draft consignment replenishment order journal cleanup
A303 Run Forecast planning
A304 Run Master planning
A305 Post product receipt
A403 Sales update history cleanup
A405 Order packing slip
A406 Order invoice
A407 Calculate sales totals
A500 All retail distribution jobs (9999)
A501 Upload all channel transactions (P-0001)
A502 Process Assortment
A503 Update listing status
A504 Product availability
A505 Generate related products based on customer transactions
A506 Process delivery modes
A507 Synchronize orders job
A508 Update search Product data
A509 Update search Customer data
A510 DOM batch job
A511 DOM fulfillment data deletion job
A512 Default channel database batch job
A513 Recommendation batch job
A514 Retail scheduler history data removal batch job
A515 Create customers from async mode
A516 Retail transaction consistency checker orchestrator
A517 Retail transactional statement calculate batch scheduler
A518 Retail transactional statement post batch scheduler
A519 Retail financial statement calculate batch scheduler
A520 Retail financial statement post batch scheduler
A521 Process loyalty schemes
A522 Post earned points in batches
A523 Process loyalty lines for other activities
A524 Retail time zone information job
A600 Calculation of location load
A601 Inventory journals clean-up
A602 Inventory settlements clean up
A605 On-hand entries cleanup
A606 Warehouse management on-hand entries cleanup
A607 On-hand entries aggregation by financial dimensions
A608 Cost calculation details
A609 CDS – Post integration inventory journals
A700 Work creation history purge
A701 Containerization history purge
A702 Wave batch cleanup
A703 Cycle count plan cleanup
A705 Work user session log cleanup
A706 Wave processing history log cleanup
A707 WMS Replenishment
A708 Automatic release of sales orders

I will not go into detail on all the jobs, but here I at least refer to where you can find the menu item or what class is used in the batch job tasks. Also take a look at the blog post by the D365 solution architecture team, which covers a subset of the batch jobs presented in this blog post.

System administration batch jobs

These are general system batch jobs that can perform cleanups and other general executions.


Name, path and recurrence

Description and recurrence

A001 Notification clean-up

System administration > Periodic tasks > Notification clean up


This job is used to periodically delete records from the tables EventInbox and EventInboxData. If you don’t use the alert functionality, the recommendation is to disable the alerts from the batch job.

A002 Batch job history clean-up

System administration > Periodic tasks > Batch job history clean-up


The regular version of batch job history clean-up allows you to quickly clean all history entries older than a specified timeframe (in days). Any entry created prior to that timeframe will be deleted from the BatchJobHistory table, as well as from linked tables with related records (BatchHistory and BatchConstraintsHistory). This form has improved performance because it doesn’t have to execute any filtering.

A003 Batch job history clean-up (custom)
System administration > Periodic tasks > Batch job history clean-up (custom)


The custom batch job clean-up form should be used only when specific entries need to be deleted. This form allows you to clean up selected types of batch job history records, based on criteria such as status, job description, company, or user. Other criteria can be added using the Filter button.

A004 Daily Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation


Incorrect configuration and setup of a module can adversely affect the availability of features, system performance, and the smooth operation of business processes. The quality of business data (for example, the correctness, completeness, and cleanliness of the data) also affects system performance, and an organization’s decision-making capabilities, productivity, and so on. The Optimization advisor workspace is a tool that lets you identify issues in module configuration and business data. Optimization advisor suggests best practices for module configuration and identifies business data that is obsolete or incorrect.
A005 Weekly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation


Performs a weekly validation and diagnostics.
A006 Monthly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation


Performs a monthly validation and diagnostics based on the rules.
A007 Named user license count reports processing

Class : SysUserLicenseMiner


Batch job that counts the number of users that have been using the system. The data is used in the Named user license count report. D365 creates this execution automatically, but you have to rename it to fit this structure.
A008 Database log cleanup

System administration > Inquiries > Database > Database Log


This job cleans up the database log, and makes sure that only (let’s say) 100 days of history remain. In the query criteria I set created date/time less than “d-100”, to ensure that I keep 100 days of database log. This is general housekeeping and dusting, keeping the system nice and tidy.
A009 Delete the inactivated addresses

Organizational administration > Periodic >Delete inactivated addresses


Deletes addresses that have been set to inactive.
A010 Scan for orphaned document references

Class : DocuRefScanOrphansTask


Batch job that is set up automatically by the system, and scans for document references where the source record is deleted.
A011 Report data clean up

Class: SrsReportRunRdpPreProcessController


Cleans up any data generated for SSRS reports.
A012 Cryptography crawler system job that needs to regularly run at off hours

Class: SysCryptographyCrawlerTask

Every 3 days

Auto-created at D365 setup. Not sure what this is, yet…
A013 Data cache refresh batch

System administration > Setup >

Data cache >Data cache parameters

Every 10 minutes

The data cache framework is used to cache data sets and tiles. Enabling of the data cache framework will redirect certain queries against a cache table instead of executing them against the underlying source tables.
A014 Updates system notification states

Class : SystemNotificationUpdateBatch

Every minute

Updates notifications.
A015 Deletes non-active and orphaned system notifications

Class : SystemNotificationScanDeletionsBatch


Deletes non-active and orphaned system notifications
A016 Database compression system job that needs to regularly run at off hours

Class: SysDatabaseCompressionTask


Compresses the database
A017 Database index rebuild system job that needs to regularly run at off hours

Class: SysDatabaseIndexRebuildTask


Rebuilds indexes to ensure good index performance
A018 Deletes expired email history

Class: SysEmailHistoryCleanupBatch


Deletes expired email history
A019 Process automation polling system job

Class: ProcessAutomationPollingEngine

Every minute

Using business events, the polling use case can be re-designed to be asynchronous if it is triggered by the business event. Data will be processed only when it is available. The business logic that makes the data available triggers the business event, which can then be used to start the data processing job/logic. This can save thousands of batch executions from running empty cycles and wasting system resources.
A020 Scan for document files that have been scheduled for physical deletion

Class: DocuDeletedFileScanTask


Scan for document files that have been scheduled for physical deletion
A021 System job to clean up expired batch heartbeat records

Class : SysCleanupBatchHeartbeatTable


Cleans up the new internal monitoring BatchHeartbeatTable table (Only after PU32), and used for priority-based batch scheduling.
A022 System job to seed batch group associations to batch jobs



See priority-based batch scheduling.
A023 System job to clean up unrecovered user session states



Cleans up sessions that are unrecovered.
A024 Change based alerts

System administration > Periodic tasks > Alerts > Change based alerts

Hourly (or faster)

Alerts that are triggered by change-based events, also referred to as create/delete and update events.

See also Microsoft docs.

A025 Due date alerts

System administration > Periodic tasks > Alerts > Due date alerts

Hourly (or faster)

Events that are triggered by due dates.

See also Microsoft docs.

A026 Email distributor batch

System administration > Periodic tasks > Email processing > Email distributor batch

Send emails. See also Microsoft docs.
A027 Email attachment distributor

Sends emails with attachments. Used for workflow.

Data management batch jobs

Data management executions can generate a lot of data, and to maintain performance and avoid data growth, it is relevant to clean up staging tables and job executions. Also document any of your recurring executions to make it easy and simple to maintain an overview of your recurring data imports and exports.


Name, path and recurrence



[Cannot be executed in batch]

Data management workspace > “Staging cleanup” tile


The data management framework makes use of staging tables when running data migration. Once data migration is completed, this data can be deleted using the “Staging cleanup” tile.


A101 Job history cleanup

Data management workspace > Job history cleanup


The cleanup job will execute for the specified amount of time. If more history remains to be cleaned up after the specified amount of time has elapsed, the remaining history will be cleaned up in the next recurrence of the batch job, or it can be manually scheduled again.


A102 BYOD Data management export

Data management workspace >export in batch


If you have a data management export to BYOD, it can be executed in batch. There are other options that can also be evaluated for this purpose.


A103 Refresh data entity

System administration > Setup > Entity Store


To refresh the entity store (the built-in embedded Power BI). The refresh updates the aggregated measurements, and is only relevant if there are updates or changes that affect these.

General ledger batch jobs


Name, path and recurrence



A200 Clean up ledger journals

Periodic tasks > Clean up ledger journals


It deletes general ledger, accounts receivable, and accounts payable journals that have been posted. When you delete a posted ledger journal, all information that’s related to the original transaction is removed. You should delete this information only if you’re sure that you won’t have to reverse the ledger journal transactions.


A201 Import currency exchange rates

Currencies > Import currency exchange rates


Automatically imports exchange rates from the bank.


A202 Purchase budget to ledger

Inventory management > Periodic tasks > Forecast updates > Purchase budget to ledger


Posts the purchase budget to ledger


A203 Sales budget to ledger

Inventory management > Periodic tasks > Forecast updates > Sales budget to ledger


Posts sales budget to ledger


A204 Update purchase and sales budget

Inventory management > Periodic tasks > Forecast updates > Update purchase and sales budget


Updates the purchase and sales budget.


A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.

General Ledger > Periodic tasks > Batch transfer for subledger journals


Batch transfer for subledger journals


A206 Source document line processing

Class: SourceDocumentLineProcessingController

Every 10 minutes

Used for accounting distribution. See Microsoft docs.


A207 Source document line processing queue cleanup

Class: SourceDocumentLineProcessingQueueCleanupController


Used for cleaning up accounting distribution. See Microsoft docs.


A208 Ledger journal monitor

Class: LedgerJournalTableMonitorController

Every 6 hours

Monitors if ledger journals should be blocked or opened.

Procurement and sourcing batch jobs


Name, path and recurrence



A300 Purchase update history cleanup

Periodic tasks > Clean up > Purchase update history cleanup


All updates of confirmations, picking lists, product receipts, and invoices generate update history transactions; this job deletes the old ones.


A301 Delete request for quotation

Periodic tasks > Clean up > Delete requests for quotations


It is used to delete requests for quotation (RFQs) and RFQ replies. The corresponding RFQ journals are not deleted, but remain in the system.


A302 Draft consignment replenishment order journal cleanup

Periodic tasks > Clean up > Draft consignment replenishment order journal cleanup


It is used to clean up draft consignment replenishment order journals.


A303 Run Forecast planning

Master planning > Forecasting > Forecast planning


Demand forecasting is used to predict independent demand from sales orders and dependent demand at any decoupling point for customer orders. See also Microsoft docs, where additional Azure services are used to perform the calculation.


A304 Run Master planning

Master planning > Master planning > Run > Master planning


Master planning is used to generate planned (purchase) orders, based on the coverage settings. We expect this service to be enhanced with a more real-time oriented planning engine. Also check out the Microsoft docs on this (large) subject.


A305 Post product receipt

Procurement and Sourcing > Purchase orders > Receiving products > Post product receipt

Automatically posts the product receipt when all lines have been registered.

Sales and marketing batch jobs


Name, path and recurrence



A400 Delete sales orders

Periodic tasks > Clean up > Delete sales orders


It deletes selected sales orders.


A401 Delete quotations

Periodic tasks > Clean up > Delete quotations


It deletes selected quotations.


A402 Delete return orders

Periodic tasks > Clean up > Delete return orders


It deletes selected return orders.


A403 Sales update history cleanup

Periodic tasks > Clean up > Sales update history cleanup


It deletes old update history transactions. All updates of confirmations, picking lists, packing slips, and invoices generate update history transactions. These transactions can be viewed in the History on update form.


A404 Order events cleanup

Periodic tasks > Clean up > Order events cleanup


Cleanup job for order events. The next step is to remove the unneeded order event check-boxes in the Order event setup form.


A405 Order packing slip

Sales order > Ordershipping > Post Packingslip


Set up automatic packing slip posting when the sales order is completely picked (if this is the process). This means that as soon as the WMS has picked the order, it gets packing slip updated.


A406 Order invoice

Accounts payable > Invoices > Batch invoicing > Invoice


Set up automatic invoice posting when the sales order is completely packing slip updated (if this is the process).


A407 Calculate sales totals

Periodic tasks > Calculate sales totals

Recalculates the totals for the sales order. This is typically used in scenarios where the sales order is part of a “Prospect to cash” flow. See docs.

Retail batch jobs


Name, path and recurrence



A500 All retail distribution jobs (9999)

Retail and Commerce > Retail and Commerce IT > Distribution schedule


This batch job sends all distribution jobs to the retail channel database. This is data like products, prices, customers, stores, registers etc. The distribution job is a “delta” distribution, meaning that only new and changed records are sent. There is a lot more to discuss on how to optimize the 9999 distribution job, and for really large retail installations some deep thinking is required. For smaller installations it should be OK to just use the setup that is automatically generated when initializing D365 Retail/Commerce.

A501 Upload all channel transactions (P-0001)

Retail and Commerce > Retail and Commerce IT > Distribution schedule


The P-0001 is sending the retail transactions back from the POS to the D365 HQ, where the retail transactions can be posted and financially updated.

A502 Process Assortment

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process Assortment


This job processes the assortment based on the assortment categories set on an item, and based on the assortment setup, puts the items in the relevant stores’ assortments. When defining an assortment, you have in D365 the possibility to connect organization hierarchies to retail category hierarchies. The process assortment job will perform the granulation of this, so that D365 has a detailed list of each product that is present in each store. The assortment is set up under Retail and Commerce > Catalogs and assortments > Assortments, and more details are available on Microsoft docs.

A503 Update listing status

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Update listings


The listing status is related to publishing a retail catalog to an online store. The Microsoft documentation is not the best in this area, and the closest explanation I have is that it is related to the listing status on the catalog.

A504 Product availability

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Product availability


The product availability batch job calculates whether a product is available in the online store. Check out this blog post for further details. SiteCore eCommerce integrations can benefit from this; in essence it populates the data needed for distribution job 1130, which maintains the corresponding tables in the channel database.

A505 Generate related products based on customer transactions

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Generate related products


This job automatically populates related products based on sales transaction purchase history. The two relationship types created are ‘customers who bought this item also bought’ and ‘frequently bought together’. This data can then be used further in eCommerce scenarios. For deeper details, take a look at the class ‘RetailRelatedProductsJob’.

A506 Process delivery modes

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process delivery modes


This job sets up delivery modes on a new store when it is added to the organization hierarchy ‘retail store by department’. On the modes of delivery you can assign an organizational hierarchy, and this batch job assigns the specific modes of delivery to each store. The modes of delivery are used in omnichannel scenarios where the customer can have their products sent home, etc.

A507 Synchronize orders job

Retail and Commerce > Retail and Commerce IT > Synchronize orders


If you have set up your channels to create sales orders asynchronously, this job will create the sales orders and post payments. Also take a look at the following Microsoft docs on how sales orders and payments are synchronized from an online store.

A508 Update search Product data

Sales and marketing > Setup > Search > Search criteria


Creates an indexed search of products that makes it faster and easier to search for products in the call center.

A509 Update search Customer data

Sales and marketing > Setup > Search > Search criteria


Creates an indexed search of customers that makes it faster and easier to search for customers in the call center.

A510 DOM batch job

Workspace > Distributed Order Management > Dom processor job setup


Runs distributed order management on retail sales orders to determine which warehouse should deliver each sales order.

A511 DOM fulfillment data deletion job

Workspace > Distributed Order Management > DOM fulfillment data deletion job setup


Cleans up DOM data from calculations that are no longer valid.

A512 Default channel database batch job

Class : RetailCdxChannelDbDirectAccess

Every 3 minutes

This job’s main duty is to check all download and upload sessions with status “Available” and then apply the data to the respective target databases (AX or channel DB). See also this blog.

A513 Recommendation batch job

Class FormRunConfigurationRecommendationBatch


See Microsoft docs.

A514 Retail scheduler history data removal batch job

Retail and Commerce > Headquarters setup > Parameters > Retail scheduler parameters

Class: RetailCdxPurgeHistory


Deletes CDX history, typically keeping only 30 days of it.

A515 Create customers from async mode

Retail and Commerce > Retail and Commerce IT > Customer > Create customers from async mode


If customers should be created asynchronously (a parameter), this job creates them.

A516 Retail transaction consistency checker orchestrator

Retail and Commerce > Retail and Commerce IT > POS posting > Validate store transactions


Performs validation on the unposted POS transactions. See Microsoft docs.

A517 Retail transactional statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional calculation. Creates the transactional statement. See the following blog post.

A518 Retail transactional statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional posting. Creates and posts sales orders. See the following blog post.

A519 Retail financial statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate financial statement in batch


Retail statement trickle feed financial statement calculation. Creates the financial statement. See the following blog post.

A520 Retail financial statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post financial statement in batch


Retail statement trickle feed financial posting. Posts the shift declaration. See the following blog post.

A521 Process loyalty schemes

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty schemes

Processes loyalty schemes. See Microsoft docs.

A522 Post earned points in batches

Retail and Commerce > Retail and Commerce IT > Loyalty > Post earned points in batches

Loyalty points should be posted in batch. See Microsoft docs.

A523 Process loyalty lines for other activities

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty lines for other activities

Other Loyalty points in batch. See Microsoft docs.

A524 Retail time zone information job


Generates time zone information up until 2054. Ensures that the time zone used in the store does not cause inconsistent dates.

Inventory management batch jobs


Name, path and recurrence



A600 Calculation of location load

Inventory management > Periodic tasks > Clean up > Calculation of location load


The WMSLocationLoad table is used to track the weight and volume of items and pallets. The summation of load adjustments job can be run to reduce the number of records in the WMSLocationLoad table and improve performance.


A601 Inventory journals clean-up

Inventory management > Periodic tasks > Clean up > Inventory journals cleanup


It is used to delete posted inventory journals.


A602 Inventory settlements clean up

Inventory management > Periodic tasks > Clean up > Inventory settlements cleanup



It is used to group closed inventory transactions or delete canceled inventory settlements. Cleaning up closed or deleted inventory settlements can help free system resources.

Do not group or delete inventory settlements too close to the current date or fiscal year, because part of the transaction information for the settlements is lost.

Closed inventory transactions cannot be changed after they have been grouped, because the transaction information for the settlements is lost.

Canceled inventory settlements can no longer be reconciled with finance transactions once they are deleted.


A603 Inventory dimensions cleanup

Inventory management > Periodic tasks > Clean up > Inventory dimensions cleanup


This is used to maintain the InventDim table. To maintain the table, delete unused inventory dimension combination records that are not referenced by any transaction or master data. The records are deleted regardless of whether the transaction is open or closed.

An inventory dimension combination record that is still referenced cannot be deleted, because when an InventDim record is deleted, related transactions cannot be reopened.


A604 Dimension inconsistency cleanup

Inventory management > Periodic tasks > Clean up > Dimension inconsistency cleanup


This is used to resolve dimension inconsistencies on inventory transactions that have been financially updated and closed. Inconsistencies might be introduced when the multisite functionality was activated during or before the upgrade process. Use this batch job only to clean up the transactions that were closed before the multisite functionality was activated. Do not use this batch job periodically.


A605 On-hand entries cleanup

Inventory management > Periodic tasks > Clean up > On-hand entries cleanup


This is used to delete closed and unused entries for on-hand inventory that is assigned to one or more tracking dimensions. Closed transactions contain the value of zero for all quantities and cost values, and are marked as closed. Deleting these transactions can improve the performance of queries for on-hand inventory. Transactions will not be deleted for on-hand inventory that is not assigned to tracking dimensions.


A606 Warehouse management on-hand entries cleanup

Inventory management > Periodic tasks > Clean up > Warehouse management on-hand entries cleanup


Deletes records in the InventSum and WHSInventReserve tables. These tables are used to store on-hand information for items enabled for warehouse management processing (WHS items). Cleaning up these records can lead to significant improvements of the on-hand calculations.


A607 On-hand entries aggregation by financial dimensions

Inventory management > Periodic tasks > Clean up > On-hand entries aggregation by financial dimensions


Tool to aggregate InventSum rows with zero quantities.

This basically extends the previously mentioned cleanup tool by also cleaning up records that have the field Closed set to True.

The reason why this is needed is basically because in certain scenarios, you might have no more quantities in InventSum for a certain combination of inventory dimensions, but there is still a value. In some cases, these values will disappear, but current design does allow values to remain from time to time.

If you for example use Batch numbers, each batch number (and the combined site, warehouse, etc.) creates a new record in InventSum. When the batch number is sold, you will see quantity fields are set to 0. In most cases, the Financial/Physical value field is also set to 0, but in Standard cost revaluation or other scenarios, the value field may show some amount still. This is valid, and is the way Dynamics 365 for Finance and Operations handles the costs on Financial inventory level, e.g. site level.

Inventory value is determined in Dynamics 365 for Finance and Operations by records in InventSum, and in some cases by inventory transactions (InventTrans) when reporting inventory values in the past. In the above scenario, this means that when you run inventory value reports, Dynamics 365 for Finance and Operations (initially) looks at InventSum, aggregates all records to site level, and reports the value for the item per site. The data from the individual records on batch number level is never used. The tool therefore goes through all InventSum records and finds the ones where there is no more quantity (the No open quantities field is True). There is no reason to keep these records, so Dynamics 365 for Finance and Operations finds the record in InventSum for the same item with the same site, copies the values from the batch number level to the site level, and deletes the record. When you now run inventory value reports, Dynamics 365 for Finance and Operations still finds the same correct values. This reduces the number of InventSum records significantly in some cases and can have a positive impact on the performance of any function that queries this table.
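To make the roll-up concrete, here is a simplified Python model of the logic as described above. This is my own illustrative sketch, not the actual X++ job; the record layout (`item`, `site`, `batch`, `qty`, `value`) is a stand-in for InventSum, and it assumes a site-level record (batch = None) already exists for each item/site.

```python
# Simplified model of the InventSum zero-quantity roll-up described above.
# Illustrative sketch only -- NOT the actual X++ implementation in D365.
# Assumes a site-level record (batch=None) exists for each (item, site).

def aggregate_zero_qty_records(records):
    """Roll residual values of zero-qty batch records up to site level."""
    site_totals = {}  # (item, site) -> residual value to move up
    kept = []
    for r in records:
        if r["batch"] is not None and r["qty"] == 0:
            key = (r["item"], r["site"])
            site_totals[key] = site_totals.get(key, 0.0) + r["value"]
            # the zero-quantity batch-level record itself is deleted
        else:
            kept.append(dict(r))
    for r in kept:
        if r["batch"] is None:  # site-level record receives the residual value
            r["value"] += site_totals.pop((r["item"], r["site"]), 0.0)
    return kept
```

After the run, the report still aggregates to the same site-level value, but the zero-quantity batch-level rows are gone.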


A608 Cost calculation details

Inventory management > Periodic tasks > Clean up > Cost calculation details


Used to clean up cost calculation details.


A609 CDS – Post integration inventory journals

Inventory management > Periodic tasks > CDS integration > Post integration inventory journals

Fetches journals from the CDS (Common Data Service) and posts them. This applies only if the CDS is in use.

Warehouse management batch jobs


Name, path and recurrence



A700 Work creation history purge

Warehouse management > Periodic tasks > Clean up > Work creation history purge


This is used to delete work creation history records from the WHSWorkCreateHistory table, based on the number of days of history to keep, as provided in the dialog.


A701 Containerization history purge

Warehouse management > Periodic tasks > Clean up > Containerization history purge


This is used to delete containerization history from the WHSContainerizationHistory table, based on the number of days of history to keep, as provided in the dialog.



A702 Wave batch cleanup

Warehouse management > Periodic tasks > Clean up > Wave batch cleanup


This is used to clean up batch job history records related to Wave processing batch group.


A703 Cycle count plan cleanup

Warehouse management > Periodic tasks > Clean up > Cycle count plan cleanup


This is used to clean up batch job history records related to Cycle count plan configurations.


A704 Mobile device activity log cleanup

Warehouse management > Periodic tasks > Clean up > Mobile device activity log cleanup


This is used to delete mobile device activity log records from the WHSMobileDeviceActivityLog table, based on the number of days of history to keep, as provided in the dialog.


A705 Work user session log cleanup

Warehouse management > Periodic tasks > Clean up > Work user session log cleanup


This is used to delete work user session records from the WHSWorkUserSessionLog table, based on the number of hours to keep, as provided in the dialog.


A706 Wave processing history log cleanup

Warehouse management > Periodic tasks > Clean up > Wave processing history log cleanup


This is used to clean up history records related to Wave processing batch group.


A707 WMS Replenishment

Warehouse management > Replenishment > Replenishments

Calculate location replenishments on the warehouse locations.


A708 Automatic release of sales orders

Warehouse management > Automatic release of sales orders

Releases sales orders to the warehouse so that the picking can start.

Monitoring Distribution jobs

The Retail IT workspace is specifically created to monitor all distribution jobs sending data to RCSU and POS. If there are failed sessions, they will be seen here. The current download (to RCSU) and upload (from RCSU) sessions are also shown here.

Monitoring Batch jobs

The best place to monitor all current batch jobs is through the system administration workspace. Here all failed, running, waiting and withheld batch jobs are shown. This workspace also has additional system administration features.

D365 – To exist or not, that is the question!(part 2)

Some years ago I created a free community solution for “Not-Exists Join“. Not-exists join means that we can filter and search on data that does not have any related records. This answers questions like:

– Show me all customers that have no sales orders the last X days

– Show me all items with no inventory transactions, or with no movement in the last 30 days.

– Show me all items that have no price.

Countless community friends have used this for AX 2012, but since Dynamics 365 was released the solution could not be applied. To do it properly, I have decided to push a request through CDE (Community Driven Engineering), hopefully making it available to all D365 customers as part of the standard solution. All code is ready and checked in, and I’m just waiting for the Microsoft review.

The way the CDE works is that partners and customers that have code or bug fixes can work together with Microsoft on implementing changes. Microsoft has the final decision, and the changes will also become part of their IP. But for all you community friends, here is a sneak peek of what I’m working on together with Microsoft.

The advanced filter and query in Dynamics 365 is a very powerful tool. Here you can search and filter on most fields and add join relations to the query.

But there is one area the advanced query screen does not handle: the “not-exists-join”. Let’s say I want a list of all the customers that don’t have sales orders. Standard D365 will not help here. The purpose of this document is to show how to implement “not-exists-join” into the standard.

Functional Solution

In the joins form, a new section of relations has been added that represents the tables that can be “not-exist-join” added:

In this sample the customers with no sales orders will be in the query result/form. But the feature is generic, and any 1:n relation can also be selected as a “Not exists” relation.
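Under the hood, this query shape is the classic SQL NOT EXISTS. As an illustration of the concept only (my own sketch using sqlite3 with hypothetical CustTable/SalesTable layouts, not the actual D365/X++ implementation):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE CustTable (AccountNum TEXT PRIMARY KEY, Name TEXT);
    CREATE TABLE SalesTable (SalesId TEXT PRIMARY KEY, CustAccount TEXT);
    INSERT INTO CustTable VALUES ('C001', 'Contoso'), ('C002', 'Fabrikam');
    INSERT INTO SalesTable VALUES ('SO-1', 'C001');
""")

# Customers with no sales orders: the "not-exists join" as plain SQL
rows = conn.execute("""
    SELECT c.AccountNum, c.Name
    FROM CustTable c
    WHERE NOT EXISTS (
        SELECT 1 FROM SalesTable s WHERE s.CustAccount = c.AccountNum
    )
""").fetchall()
print(rows)  # [('C002', 'Fabrikam')]
```

The feature in the joins form essentially lets the query framework generate this kind of predicate for any 1:n relation.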

When will you have this in standard? Maybe 10.0.10? It depends on Microsoft and the final approval of the code and feature. But hopefully it should not be far in the future. So “cheer and share”, and maybe we as the community can accelerate this much-requested feature.

The D365 community ROCKs, and Happy DAX’ing!!

Microsoft Bookings and Microsoft Graph

One common piece of feedback we get when implementing Dynamics 365 concerns the ability to handle appointments and bookings. There are many very good 3rd-party solutions, but did you know that Microsoft has an easy-to-use booking system that works online and integrates with Outlook? It’s called Microsoft Bookings, and it is worth taking a small look at, especially if you need to book your customers for appointments and simple services. Microsoft Bookings provides online and mobile apps that make appointment scheduling simple and efficient for small businesses and their customers. Any small business that provides service on an appointment basis, such as auto repair shops, hair salons, and law firms, can benefit from having their bookings managed so as to free up time for the more important task of growing their business. Microsoft Bookings is available to businesses that have an Office 365 Business Premium subscription.

Here is a small live demo for you my friends: https://outlook.office365.com/owa/calendar/DXCCommerce1@dxccommerce.onmicrosoft.com/bookings/

The first page an online customer arrives at is the following screen, that can be published on Facebook or any social media sites. Here I choose to order my haircut from my favorite hairdresser. (Full manual is available here)


When booking I will get a confirmation email, and the booking coordinator will also get an email. The booking is also available on my phone:


On the back-office side, Microsoft has created a simplified view for managing and setting up your bookings:

Here you manage the calendar, customers and staff.

Here is the calendar for a specific day showing all appointments and bookings for today. Drag and drop of appointments between staff and dates is of course possible.

You can also manage your staff.

And the services you offer, mapping them to your staff.


If you are a functional person, then just stop reading here, because here comes the good part: there is a complete API interface for you to integrate with Bookings. (See also this link.) Connecting this to Dynamics 365 or commerce apps can be done by a developer, and makes it possible to expose booking services to POS and the call center, with tight integration to your Dynamics 365 solution.

Check out Microsoft Booking and Microsoft Bookings API in Microsoft Graph.

Here are some sample pictures of how to access the Bookings system using Microsoft Graph. First, I list all the booking sites in my tenant:

Pay attention to the fact that it returns an “id” that identifies my booking business for a specific store. I can then query for bookings at that ID like this:

https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments (You will not get access to this link, but you are welcome to click it.)

I get the following, where the service lists all bookings posted into Microsoft Bookings. Consent must be set up through the Azure portal. And the great thing is that it actually is a two-way service: I can post bookings in.
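For developers, calling this endpoint is a straightforward authenticated GET. A minimal Python sketch (assuming you have already acquired an OAuth access token, for example via MSAL, with the Bookings permissions consented in the Azure portal; the business id is the example from this post):

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/beta"  # Bookings lives in the beta endpoint


def appointments_url(business_id: str) -> str:
    """Build the appointments URL for one booking business."""
    return f"{GRAPH}/bookingBusinesses/{business_id}/appointments"


def list_appointments(token: str, business_id: str) -> dict:
    """GET all appointments; 'token' is an OAuth bearer token (e.g. from MSAL)."""
    req = urllib.request.Request(
        appointments_url(business_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Posting a booking back in is the same pattern with an HTTP POST and a JSON body, since the service is two-way.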

BOOM! Take that! We now have a complete interface towards all services that Microsoft Graph can expose, letting us integrate on a completely new level.

If I wanted, I could now connect my bookings to any planning engine that would add more value to the service. Like picking me up in a golden limo-cab when I book my hairdressing hour. The possibilities are endless. Also remember that this is not restricted to bookings, but applies to all services that Azure may provide. We in the Dynamics partner community have just scratched the surface of the possibilities that Microsoft now provides.

Happy DAX’ing friends.

Dynamics 365 Branding and Commerce (Preview) Firsthand experience

PS! Remember to read the last lines in this blogpost

As I hope you have seen in your never-ending twitter/news feed, Microsoft is again adding lots of new apps and features to Dynamics 365. Microsoft is delivering on the communicated vision of Dynamics 365. We now have apps with a holistic approach to business processes. To solve business requirements, users will be using a combination of apps that work natively together. We see how the entire solution is being connected, and further split up into specific areas. In the old days we had large ERP suites and sold functional modules. We are now implementing connected apps that enable business processes per user. If anyone wonders what the “new” hashtag is, it is easy: “#MSDyn365“, and get used to it. We no longer need to put things into additional silos to explain the legacy, and to succeed we must embrace and deliver the right combination of apps that solves the business requirements per business process.

One of the most exciting pieces of news in the current wave 2 release is the delivery of Dynamics 365 Commerce (preview). I have been privileged to validate and try out this solution over the last days. My current experience: this rocks! Microsoft can finally deliver a complete suite that gives a true omnichannel experience. One interesting finding is that Microsoft will rebrand their “Dynamics 365 for Retail” offering to “Dynamics 365 Commerce”. Why? Because what is now being offered extends the boundaries of traditional retail solutions. As seen in the following figure, you get a complete integrated end-to-end system. And this is not just for retailers; all companies that want to digitalize their processes and offer true omnichannel can benefit from this.

1 : Picture from Microsoft presentations

To try out this new solution, you can request a preview here. When/if accepted, you will receive an email from Microsoft containing instructions on how to deploy the preview. The guide is also available here, and it is important that it is followed very carefully. To complete the guide, you need some assistance from your Azure AD tenant administrator. Also, the preview is currently only deployable to US Azure datacenters, and this puts some latency into the commerce experience.

One interesting thing with Commerce is that even though this is a tier-1 environment, you get the possibility to deploy RCSU and the e-Commerce server. The data set is basically standard Dynamics 365 for Retail, with the configuration key for retail essentials enabled. So we can showcase that Dynamics 365 Commerce can also be delivered as a standalone app, or be extended with the finance and supply chain management apps.

The preview commerce solution is what you expect an e-commerce solution to be:

The back-end editor is easy to use, and it is easy to configure your site.

To get a full understanding of the solution also head over to Microsoft DOC’s to learn more : https://docs.microsoft.com/en-us/dynamics365/commerce/

But I’ll do something better for you; you can check out the preview solution yourself right now: https://d365commerceus2ecom-eval.commerce.dynamics.com/DXCCommerce (I expect that the site will be available for only a few days, so hurry)

If you want to buy something, use card number 4111-1111-1111-1111, expiration 10/20, CVV 737. Also remember that this is a US-based Azure datacenter and NOT a production-grade scaled system.

Happy DAXing and DXCuLater!

D365 Retail – Buzz Alert !

Microsoft is launching several new product lines for retailers.

Dynamics 365 Commerce

Empower your business to create exceptional, insightful shopping experiences for every customer with Dynamics 365 Commerce—built on our proven Dynamics 365 Retail solution.


Microsoft Connected Store

Empower retailers with real-time observational data to improve in-store performance. From customer movement to the status of products and store devices, Dynamics 365 Connected Store will provide a better understanding of your retail space. (Check out the video)




D365F&O, Lots of new high value content on DOC’s

The Microsoft Dynamics team has been quite busy after the vacation, producing a lot of valuable content for Dynamics 365. I would like to highlight some of the latest additions that are worth checking out and sharing in the Dynamics 365 ecosystem. Just this year alone, 714 articles have been published, and in just the last 2 months close to 300 articles have been made available. With this amount of information, I do get questions about whether there are some hidden gems on docs. Here are some of them:

1. Learning catalog

There are now more tailored learning paths for customers and partners, with references to free, self-paced online learning paths, Tech Talks, and formal instructor-led training. Here you will find articles, videos, and all you need to start learning Dynamics 365.

2. Test recorder and Regression suite automation tool for Retail Cloud POS

Now we can start creating regression tests for Retail POS. Cool stuff, and in my mind this is where we actually see the true value of regression testing. Retail is detail, and this delivers quality.

3. Master planning setup wizard

Setting up master planning involves making many decisions, and here you can read how this is done in 10.0.5.

4. One Version service updates FAQ

This page answers a lot of questions on the One Version strategy and what it means for you. At many customers I see extensive, time-consuming and costly testing processes being manually executed each time Microsoft releases a new monthly update. Why? I do not see the need to perform full testing on all modules on a monthly basis. Yes, it is a fact that nobody releases flawless code (not even Microsoft), but if you follow the procedures and guidelines from Microsoft, the monthly updates should be safe to deploy. There are several release rings and programs in place ensuring that quality is in place at GA (General Availability). Please align to the release cadence updates, and focus on your essential core processes. If you find painful bugs, report them ASAP.

5. Environment planning

I have seen several projects where the focus is to save costs on implementation environments. This page explains a lot about Microsoft’s take on this. My simple advice is to use a Tier-1/one-box environment for development on a cloud-hosted CSP subscription, and the rest of the environments as Tier-2 or higher (my recommendation is to have 2 additional Tier-2 environments for larger projects). The benefit of using self-service processes is priceless. Also keep in mind that Azure costs are very cheap compared to consultancy hours spent trying to maintain and manually transfer databases between environments. Also take a look at Denis Trunin’s great blog post on development VM performance.

6. Business events overview

This is the future; start adopting this feature into your business processes. It is also a key enabler for working closer with the Dynamics Power Platform.

7. Regulatory updates

Here you find localized information for your country, and how to comply with specific local requirements. This is updated very often.

8. Unified product experience

Do you want to keep the products from D365F&O synced with D365 Sales? This article explains how to achieve a near-real-time bidirectional integration with CDS. Great stuff, also explaining dual-write capabilities.

9. Adyen payment processing with omnichannel experience

Payment connector is far more versatile than just for retail. Also check out the FAQ.

10. Asset management

Great stuff on the horizon. Keep track of your stuff

11. Franchising

No longer in the official 2019 Wave 2 release. So, we must keep waiting for this in the future.


Take care, and

DXC you later



Analyzing Cloud POS performance in Dynamics 365 for Retail

It is a constant requirement that the systems retailers directly interact with should be Bigger, Better, Faster, Stronger (BBFS). In this blog post, I will dig into how POS performance can be analyzed to better understand the transactional performance of the Dynamics 365 POS. What I’m especially interested in is how perceived performance compares to actual performance. What we think is good performance is relative to the observer. The average human reaction time to a visual stimulus is 250 ms, but newer studies show that we can identify visual stimuli down to 13 ms. Your screen has a refresh rate of about 17 ms. As time is relative and the expected performance is close to real time, this can sometimes lead to performance expectations that are actually irrelevant to what needs to be achieved. We as humans cannot go below a 250 ms visual response time, so this is important to keep in mind.

As you can see in the following video, 4 items are scanned and then a quick cash payment is made. The total time taken to complete this example transaction in CPOS is approx. 5 s.

But as you can see on the screen, there is a lot happening as the user interface is being redrawn. I wanted to go deeper to understand exactly what happens when scanning, more specifically when adding the sales lines in the POS.

As CPOS is a 100% web-based application, we can use Google Chrome to take a deeper look at exactly what is happening. By pressing F12 (or Ctrl+Shift+I), you get the developer tools.

Then start the recording (CTRL-E), add a line in POS, and stop the recording. Then you will see:

1. CPU load, Activity bars, Network calls
2. The actual animation on the POS display each millisecond
3. Exactly how long calls to the Retail Server are taking.
4. The entire REST-call stack being executed on the CPOS client.

Here you see an example where I added one line to the POS basket, and this resulted in 2 calls to the retail server.

If we look at one of the calls happening:

ScanResults() (*.dynamics.com/Commerce/ScanResults(‘07100’)?$expand=Customer&api-version=7.3) – This scans the product/item barcode and sends it to the retail server. In the Chrome developer tools, we can analyze exactly what takes place in this call. Here we see that the total time was 559.54 ms, but the actual waiting time for the RCSU to respond was 263.69 ms (Waiting (TTFB)): the browser is waiting for the first byte of a response. TTFB stands for Time To First Byte; this timing includes 1 round trip of latency plus the time the server took to prepare the response. I have measured the network latency to this Tier-2 system with RCSU to be 40 ms.

If I scan the item again, caching of DNS etc. kicks in, and the TTFB drops to 132.80 ms.

As you can see, you can really go deep and analyze everything that is happening, from client execution to server execution, down to the millisecond, without any debugging tools, and better understand the performance. The profile created can be exported and imported for deeper analysis. We can see that there are many factors that influence performance, from network delays to form refresh. Microsoft could have the pleasure of shaving milliseconds off the animations, server calls and JavaScript, but this is an ongoing investment from an R&D perspective.
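Exported network captures can also be analyzed programmatically. As an illustrative sketch (assuming a standard HAR 1.2 export from the DevTools Network panel, nothing D365-specific), the per-request “Waiting (TTFB)” figure discussed above is the `wait` timing of each entry:

```python
import json


def load_har(path: str) -> dict:
    """Load a HAR file exported from the DevTools Network panel."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)


def ttfb_per_request(har: dict):
    """Return (url, ttfb_ms) per entry; 'wait' is the Waiting (TTFB) timing."""
    return [
        (entry["request"]["url"], entry["timings"]["wait"])
        for entry in har["log"]["entries"]
    ]
```

This makes it easy to compare TTFB across many scans, or to spot the slowest Retail Server calls in a longer recording.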

My honest opinion is that the cloud-based Dynamics 365 for Retail POS performs well. Network elements and the speed of light are fundamental restrictions. The use of animations also seems to affect how performance is perceived, but it does not affect the general performance and usability. Legacy systems that are on-premises have the benefit of no latency, but the cloud solution brings so many other positive elements. If you choose MPOS instead, these tools are not available, but you can use Fiddler for analysis. A small tip is to have a CPOS client available when performance testing, as the findings also apply to MPOS.

Bigger, Better, Faster, Stronger !

Retail statement trickle feed (public preview)

Retail statements are one of the most important (and complex) processes a retailer has. It's where the retail sales and transactions are transformed into physical and financial transactions, so you can see the sales in finance and in inventory. Retail statement calculation and posting have been covered many times in my blog posts, and Microsoft has a large set of articles on docs on the matter. The volume of transactions that retail statements calculate and post is, to my knowledge, THE most complex and intense feature and business process in the entire Dynamics 365 solution. Imagine that every sale, in every store, is being processed. For larger retailers, Dynamics 365 for Retail processes millions of transactions daily. This area really puts computational pressure on the system and is also one of the areas where Microsoft is investing heavily.

Since the start of D365, hundreds of improvements have been made to retail statement posting, and the next “big thing” is retail statement trickle feed. One of the pains in today's solution is a significant delay between when the retail sale is conducted and when the inventory transactions are financially posted; in short, when the inventory transaction gets a financial status like “Sold”. Why is this important? Because the inventory transactions define on-hand values, which in turn define how master planning/replenishment is calculated. We want this to be as accurate and up to date as possible. Any delay in having accurate on-hand values influences planned purchase orders. Also, the ability to spread the processing of transactions throughout the day will reduce the number of “spikes” in the Azure SQL load, leaving the nightly timeslot more open for other transaction-intensive processing tasks.

Another critical benefit of trickle feed is the decoupling of transactional statements and financial statements. Now you can post transactional statements without even posting a financial statement, and the other way around. Together with the increased posting frequency that produces small bundles of transactional statements, it addresses the main reason for the compounding effect that prevents a series of statements from being posted due to a single invalid transaction. Right now, the only validation that impacts financial statements is that all retail transactions for a given shift must be present in HQ in order for a financial statement to be posted. However, the transactions don't need to be successfully posted for a financial statement to be posted.

There is also a new aggregation strategy, where unnamed transactions are always aggregated and named (customer) transactions are never aggregated. There is no longer an option to turn aggregation on or off.
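As a rough illustration of this strategy (my own Python sketch, not Microsoft's implementation), unnamed lines collapse per item while named lines survive individually:

```python
# Sketch of the new aggregation strategy: unnamed (anonymous) transactions
# are always aggregated per item; named (customer-linked) transactions are
# never aggregated. Field names here are illustrative, not D365 schema.
from collections import defaultdict

def aggregate_transactions(transactions):
    """transactions: list of dicts with 'item', 'qty' and optional 'customer'."""
    aggregated = defaultdict(float)
    named = []
    for trans in transactions:
        if trans.get("customer"):                 # named: kept as-is
            named.append(trans)
        else:                                     # unnamed: summed per item
            aggregated[trans["item"]] += trans["qty"]
    unnamed = [{"item": item, "qty": qty} for item, qty in aggregated.items()]
    return unnamed + named

sales = [
    {"item": "HOTDOG", "qty": 1},
    {"item": "HOTDOG", "qty": 2},
    {"item": "HOTDOG", "qty": 1, "customer": "CUST-001"},
]
print(aggregate_transactions(sales))
# One aggregated unnamed HOTDOG line (qty 3) plus the named line
```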

Microsoft has made the following improvements to the statement posting process:

  1. Deprecate the “inventory job” that creates temporary reservations.
  2. Create a new job that will, at a predefined schedule, create sales orders, invoice them, and create, post, and apply payments for all the transactions that are synchronized to the HQ at that point of time. In addition, it will also create any ledger journals that need to be created for discounts, gift cards, and so on.
  3. The statement document that gets created at the end of the day will only be used to calculate and post any counting variances.
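My own simplified sketch (not Microsoft's code) of the new scheduled job described in step 2: on each run, every transaction already synchronized to HQ is turned into an invoiced sales order, and ledger journals are created for discounts, gift cards, and so on.

```python
# Simplified, illustrative model of the trickle-feed job in step 2 above.
# The dict keys ('id', 'discounts', 'gift_cards') are my own placeholders,
# not actual D365 table fields.
def run_trickle_feed_job(synced_transactions):
    """Return (invoiced_orders, ledger_journals) for the synced transactions."""
    invoiced_orders = []
    ledger_journals = []
    for trans in synced_transactions:
        # Create the sales order, invoice it, and apply the payment.
        invoiced_orders.append({"transaction_id": trans["id"], "status": "invoiced"})
        # Discounts, gift cards etc. produce ledger journals.
        for journal_type in ("discounts", "gift_cards"):
            if trans.get(journal_type):
                ledger_journals.append({"transaction_id": trans["id"],
                                        "type": journal_type})
    return invoiced_orders, ledger_journals

orders, journals = run_trickle_feed_job([
    {"id": 1},
    {"id": 2, "discounts": [5.0]},
])
print(len(orders), len(journals))
```

The point of the sketch is the decoupling: only transactions already in HQ are processed, and the end-of-day statement is left with counting variances only.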

To enable the new preview (10.0.5) trickle feed solution you have to enable the Retail Statement (trickle feed) – preview configuration key. Also remember to disable the other retail statement configuration keys, and make sure you don't have any open statements when doing this.

When it is finally released (GA), I hope that the new feature management experience is used for enabling this.

When this is done, you will see a set of new menu items under the menu \Retail\Retail IT\POS Posting.

The sequence of these batch jobs is designed to financially post most of the transactions, so the financial statement posting will only be used to calculate and post any counting variances. There is no need to run the “Post inventory” job anymore. But in reality there is a decoupling, and the transactional statement and financial statement can post independently even if the other has not been posted. The only actual requirement is that the P-job has fetched the retail transactions from the retail channel database.

If we look into the Retail Statement form, we now have the possibility to manually create a transactional posting and a financial reconciliation (which in essence is the financial statement).

When creating a “Transactional posting”, we see that the form has changed a bit compared to how it was before. There are no lines related to payments.

When posting the transactional statement, the following steps are performed:

When calculating and posting a financial statement, you see the more traditional statement posting screen, where you have the payment lines:

The steps in the posting are the following:

To summarize: with trickle feed, Dynamics 365 will support a much faster update frequency, giving proper on-hand values and better scalability. Since the transactional statement will be running more frequently, there will also be less retail statement posting in the evening/night. The statements will be smaller and therefore also easier to post. But there are a few things to keep in mind. If you trickle feed too often, you will miss out on the transaction aggregation of the unnamed transactions and will have to process more sales order invoicing per day. This can again slightly increase the load on your system.

This feature will also improve the scaling of the system, as the posting of transactions can be better load balanced across multiple AOS batch servers. I also have a feeling that more features are coming in this area, further enabling close to real-time master planning, inventory services, and close to real-time Power BI reporting.


Next on customers' wish list is a super-fast invoicing service for (retail) sales orders, as this is still the most resource-demanding task in the processing of retail transactions. Also on the roadmap is the ability for the store manager to generate the financial statement when a shift is closed in POS. The financial statement in HQ will in this case post whatever the financial statement generated in POS defines, removing the requirement of having all transactions uploaded to the HQ database. Beyond this, Microsoft is, as always, improving general performance by working closely with customers and partners. We see that the data distribution and different usage of retail statements require different indexes, and Microsoft invests heavily in improving how queries are executed.


Great work to the Microsoft team working on the retail statement processing.


Here is a small joke for all of you who don't care about retail statement posting.



D365: Search for code with Agent Ransack

When supporting customers, we often get only small fragments of information on an issue, like a form not performing as expected, or an error message. The procedure is then often to log into LCS and find traces of the issue. Often we end up with a query that is the source of the issue. But to better understand and analyze how to fix it, we often need to find exactly where in the source code the query is executed. By being more exact and precise towards Microsoft support, you also get a quicker response.

Searching through the code in Visual Studio can be time consuming, and the built-in cross-reference is not always up to date, but there is an alternative I can recommend. Agent Ransack is a free file searching utility that can quickly scan most of the D365 source code (the *.XML files placed in K:\AosService\PackagesLocalDirectory\).

Let's say I see in LCS that this is the query, and I need to find out from where it is executed.

From the query I can then search for the text “Join RetailEODTransactionTable”, and I get 25 results, including hits where the exact table is not specified.

I can then open the file in Explorer and validate whether I need to go into Visual Studio for further analysis.

This speeds up the process of finding the source code that you are looking for. It is free; download it from https://www.mythicsoft.com/agentransack/ and install it in your development environment.
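If you prefer scripting the same kind of search, here is a small Python sketch of my own (the path and search string are taken from the examples above):

```python
# Minimal Python alternative to the Agent Ransack search described above:
# recursively scan the D365 metadata XML files for a text fragment.
from pathlib import Path

def search_metadata(root: str, needle: str):
    """Yield (file, line_number, line) for every XML line containing needle."""
    needle = needle.lower()
    for xml_file in Path(root).rglob("*.xml"):
        try:
            for no, line in enumerate(
                    xml_file.read_text(errors="ignore").splitlines(), 1):
                if needle in line.lower():
                    yield xml_file, no, line.strip()
        except OSError:
            continue  # skip unreadable files

# Example usage on a development VM:
# for hit in search_metadata(r"K:\AosService\PackagesLocalDirectory",
#                            "Join RetailEODTransactionTable"):
#     print(hit)
```

A dedicated indexer like Agent Ransack will be faster on the full PackagesLocalDirectory tree, but a script like this is handy when you want to post-process the hits.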


Take care, Daxers.

Meetings: Every minute counts, and snooze to 1 minute before meeting starts

As a consultant I'm used to having a lot of “back-to-back” meetings, and when the next meeting is near, I typically get an Outlook reminder 15 minutes prior to the meeting.

Then the “Snooze” button is handy. If I snooze until 5 minutes before, I am too early; 0 minutes before and I am too late. Did you know that in the drop-down, the minimum selection is 5 minutes? That is too much for me. I would like a new reminder 1 minute before the meeting starts. But did you also know that you can type into the field? You can actually write “1 minute”, and this will remind you when it is 1 minute to the meeting start.

A slightly more advanced way is to set the default reminder to 16 minutes prior to the meeting.

And then when the reminder pops up, you can select “Snooze” and choose to be reminded in 15 minutes. That is exactly 1 minute before the meeting starts.

Now I have just “earned” 4 more minutes where I can create D365 customer value before the meeting starts.

D365F&O – Address performance tips

Sometimes the smallest thing can make a huge difference. At a customer we experienced a huge load (DTU +70% average), and LCS showed that a single SQL query was the reason for the load. The data composition here was that there were close to half a million customers in the customer table, and most of them had addresses, email and phone numbers assigned to them. Except for the customers used for retail statement processing.

In LCS environment monitoring you can see this as spikes in the overview.


The query you typically see looks like this:


By downloading the query plan, we see that there is an index seek on the table LOGISTICSELECTRONICADDRESS.


The result is that the index doesn't get a good “hit” on LogisticsElectronicAddress.Type.

The solution was surprisingly easy. Add Phone, Email address and URL to the customers.


Then the DTU usage drops drastically, and normal expected performance is achieved.


Conclusion: when you have many customers, remember to fill in contact information.

This just must be shared

D365F&O – Community Driven Engineering

I have previously blogged about the importance of reporting new ideas, issues and bugs to Microsoft, and also why the community will benefit from sharing. I see that experienced engineers have the solution available and are more than willing to give it for free to get the fixed-up code into the standard solution to benefit customers and future projects.


But the formalized support path requires time and energy, and remember that not all Microsoft support consultants are engineers you can discuss X++ topics with. So how can the process of contributing to the D365 community be made easier?

Did you know that Microsoft has a program for Community Driven Engineering with Dynamics 365 F&O? This covers not only bugs, but also new features. Community driven engineering (CDE) is a Microsoft effort to make external engineers more efficient at providing recommended bug fixes and minor features to Microsoft, as well as to make Microsoft more efficient in accepting fixes from the community. If the fix is accepted, it will be merged into the main Dynamics 365 F&O branch. I have tried the program and submitted a fix for auto-report as finished; the fix was accepted, and hopefully in the near future the entire community can benefit from it.

How to start?

If you have the right skills and the willingness to share and give away your fixes (or features), you can sign up at https://aka.ms/Communitydrivenengineering. You need to be accepted into the program, and your user must be whitelisted before you can access it. The CDE also has a private Yammer group that you get access to when accepted. But I must warn you: this program is meant for the most experienced and technical people we have in our community, who are deep into X++ and Azure DevOps. You must have approval from CxO level in your organization to share code with Microsoft. (Lawyer stuff.)

Here is the overall flow for the external engineer:

  1. You create a bug or a Feature in CDE Azure DevOps
  2. The bug or Feature is reviewed by the MS team and accepted or rejected
  3. You create a branch for this work and commit in this branch
  4. When done you create a Pull Request
  5. The Pull Request is reviewed by the MS team and feedback is provided
  6. After some iterations the Pull Request will be approved and completed
  7. The MS team will take over the code and include in a future release

Here are the more technical details of how it works.

The following text is copied from the onboarding documentation of the CDE.

It takes approximately one hour to get started with CDE, the majority of which is the initial build time.

  1. Obtain a development VM from LCS with build (app 8.1, platform update 22) or later. The latest branch I have access to basically is 10.0.2 (PU 26).
  2. Make sure you have opened Visual Studio at least once on the VM to sign in and pick default settings.
  3. Install Git on the machine from https://git-scm.com/downloads . The default installation options should work fine.
  4. From an administrator command line instance, clone this repo to a location on the machine.
    pushd k:\
    mkdir git
    cd git
    git clone https://dev.azure.com/msdyncde/_git/cde

  5. Define your user name and email in Git
    git config --global user.name "John Doe"
    git config --global user.email johndoe@example.com

  6. Mount the git repo into the F&O deployment
    pushd K:\git\cde
    powershell .\Mount.ps1
  7. Open Visual Studio as administrator and rebuild the following models

At this point you can start development (in the SYS layer, actually).

How to submit a change?

Changes submitted by the community are committed to the same REL branch matching the version on the dev VM. Once the pull request (PR) is completed, that signals that Microsoft has officially accepted the change and it will show up in a future official release, usually the next monthly release (depending on what day of the month the release closes). The change will only enter the master branch of msdyncde through a future official release. Syncing to the tip of a REL branch will pull in other community changes submitted from that version.

  1. Create a Bug or Feature depending on whether the change is related to incorrect behavior of existing code, or new behavior.
    New work item > bug
    Fill in the title, assign it to yourself, and set the Area to your best guess as to where the behavior belongs (will help us review appropriately)
    In repro steps and system info, provide information on why this change is necessary
  2. In Git, create a topic branch to work on. Branches are usually named by username/bug number.
    git checkout -b johndoe/482
    git push --set-upstream origin johndoe/482

  3. In Visual Studio make changes to Application Suite SYS code as normal. Changes are actually being made directly in the Git folder.
  4. Push your changes to VSTS.
    git add -A
    git commit -m "Message explaining what is being changed"
    git push

  5. Send a pull request from VSTS
    New pull request
    Source branch = johndoe/482, Destination branch = Rel_8.0.30.8022 (or whatever version you have)
    Fill in the title and description, link the work item > Create

Any feedback from Microsoft reviewers (or other community reviewers) will show up in the PR. Changes can be made to the PR by editing in Visual Studio and doing git add / commit / push again. Once Microsoft has signed off, all comments have been resolved, a work item is linked, and all other policies have been met, you can click Complete to complete the pull request. When a PR is completed, that is official acceptance by Microsoft that the change will become part of a future official release, usually the next monthly release.

Behind the scenes

  • The PowerShell script starts by checking what version of source code exists on the VM by examining the K:\AosService\PackagesLocalDirectory\ApplicationSuite\Descriptor\Foundation.xml file.
  • It then checks out the REL branch associated with that version, which matches the platform and other model versions currently on the machine.
  • The development config files are updated to allow changes to SYS models, which is normally disallowed on dev VMs.

In addition to having an accelerated approach to getting fixes into the main branch, participants also get some other benefits. You will have access to the latest and greatest code changes through all the code branches that Microsoft makes available. You can search through the code and see if there are code changes that affect extensions or code that is local to your installations. You can also see how the Microsoft code is evolving and how improvements are made available in the standard application. You will also gradually build a very valuable network with the best developers in the world, where you will discuss technical topics with the actual people creating the world's best ERP system.

One final joke for those considering going into this program: Git and sex are a lot alike. Both involve a lot of committing, pushing and pulling. Just don't git push --force

D365F&O – Auto-report as finished in a Retail scenario

For many years I have had the opportunity to work on Dynamics 365 topics involving kitting, value added services (VAS) and bills of materials (BOM). Today I would like to write about the released product parameter “Auto-report as finished” in a retail scenario; you can read more about report as finished on Microsoft docs. To explain the business scenario, let's take hot dogs. A hot dog is normally assembled as the customer wants, but in this scenario we have a standardized hot dog with 4 ingredients.

As a retailer, I would like to sell the finished product but keep track of the raw materials. To do this you need to create a BOM, and when the hot dog is sold, Dynamics 365 will automatically report a hot dog as finished and draw the ingredients from the store warehouse. It is possible to use a production order, but for retailers this is overkill; something much easier is needed. Instead of exact BOMs, average BOMs can also be used, since knowing exactly how much onion or mustard the customer will apply is not an exact science.

Dynamics 365 has a nice feature for this: Auto-report as finished.

What this parameter does is that when the product is physically deducted (or sold), a BOM journal will be created and posted. This creates issue transactions (sold) from your inventory.

Here I have created a BOM for my hot-dog:

When creating a sales order and posting a packing slip, you will see that a BOM journal is automatically created and posted.

The posted BOM journal looks like this, and here we see that a hot dog is added to the warehouse, while the ingredients are subtracted from the warehouse.

For retailers, this means that we can sell goods in the POS, and when the statement posting process creates and posts sales orders, the auto-report as finished functionality will be triggered. So, no need for any production order or manually posted report as finished journals. Dynamics 365 thus has an alternative to retail kits, if more standardized BOMs are used. The BOM can then also be used for cost calculations on food and retail-produced items. Comparing the counting and the actual transactions will also help you know how accurate the BOMs are at describing the cost picture of the products. Master planning will also catch this, and you can get replenishment to work on ingredients.

BUT!!! There are some issues.
As a workaround, and to make this work, you will have to specify a default warehouse per site per item in the default order settings. (I know this is an impossible task if you have 500 products and 500 stores, as it would mean creating 250,000 default order settings.) I have a support request going with Microsoft to change this, so that it is not needed and the warehouse can be inherited from the parent transaction. So if you get an error like this, you have done nothing wrong, and hopefully it will be fixed in future releases.

STOP HERE, unless you like X++

Here is something for the “technical” guys. The code that automatically triggers this auto-report as finished is the class InventUpd_Physical.updatePhysicalIssue(). For those of us who have worked quite some time with Dynamics, we understand that this class is very central, because all physical inventory transactions are posted through this class. The behavior of auto-posting BOMs will therefore influence all places where a physical transaction is posted.

Microsoft has created a method on the movement classes named canBeAutoRepAsFinished(), which lets them refuse this behavior on certain transaction types.

If you don't want to wait until Microsoft fixes the feature so that the warehouse dimension is inherited from the parent BOM, you do have the option to extend the class BOMReportFinish.initInventDimFromBOMRoute() and set the InventLocationId from the parent there. Here is at least my suggestion for fixing the issue in the standard code (without an extension):

Here is the code for validating that the warehouse storage dimension is used on the BOM line, and for sending this back to the report as finished class.

Take care and I’ve got to get back to work. When I stop rowing, the mothership just goes in circles.

Dynamics 365 F&O – Selecting the correct Tier level on your sandboxes

When purchasing Dynamics 365 F&O, you get a set of Microsoft-managed (but self-service) environments included with the standard offer: Production, a Tier-2 Standard Acceptance Testing environment, and a Tier-1 Develop/Build-and-test environment. Microsoft has described this in the environment planning docs. I will not discuss Tier-1 environments here, as these environments are optimized for the development experience; do not perform performance testing on a Tier-1 environment. Tier-2+ environments are based on the same architecture as a production environment and use the Azure SQL Database service.

When running an implementation project, it is common to purchase additional Tier-2+ environments that are used for different purposes, as shown in the table below (from Microsoft Docs).

Selecting the correct level is important and depends on what the environment is going to be used for. As guidance, Microsoft has the following baseline recommendation:

On the projects where I have been involved, we most often have 3 or 4 Tier-2+ environments, and their purpose changes through the project.

The flow of data between these environments can be included in a sprint cycle. The process starts with defining the general parameters in the golden configuration environment (1). Here all system setup, number sequences, and master data will be uploaded/entered from the legacy systems. The Test/Stage/Migration environment (2) will be created based on the golden environment plus transactional data packages/initial startup data. Then there will be a database refresh from Test (2) to UAT (3), where all test scripts will be run and approved. The results and configuration changes/master data are then fed back into the golden environment, ready for the next data movement cycle. The reason we do this is to ensure that the golden environment and the migration environment are not corrupted through testing. At go-live, when the UAT is approved (after a few iterations), the migration environment will be copied to the production environment. This can only happen once; subsequent updates to the production environment must be done manually or using data packages.

(1)- Tier-2 golden environment (before PROD has been deployed). This environment is often changed to become a staging environment that contains an exact replica of the production environment. I prefer golden environments as Tier-2, as this simplifies the transfer of data using the LCS self-service database refresh.

(2)- Tier-2 data migration. This environment is used for making transactional data ready for being imported to the production environment at Go-Live.

(3)- Tier-2/3 User acceptance. Here the system is really tested: lots of regression testing and running of test scripts. The focus is functionality. If there are concerns about performance, a Tier-5 environment can be purchased for a shorter period to validate that the system can handle the full load of a large-scale production environment. For performance testing, it is recommended to also invest in automation of the test scripts (unless you ask the entire organization to participate in a manual test).

The performance of a system is a combination of the raw computing power of the VMs hosting the AOS and the sizing of the Azure SQL database. With Dynamics 365 we don't have any way of influencing this sizing. It is all managed by Microsoft, and they will size the production environment according to the number of users and transactions per hour. But the Azure SQL capacity that Microsoft provides is most often related to the following sizing steps.

I don't exactly understand how Microsoft maps Tier-2..5 to these steps, but I have experienced that a Tier-2 level in some cases is a P1, P2, P4 or P6. More information on the DTU capacity can be found here, and the summary is that we can expect 48 IOPS per DTU. So, a P6 will provide 48,000 IOPS. If you want to check your DTU limit, open SQL Server Management Studio against the Azure SQL database and execute the following script:

SELECT * FROM sys.dm_db_resource_stats ORDER BY end_time DESC;

The DTU limit should then be shown here. This is from a Tier-2 environment belonging to the initial subscription, and it seems to have 250 DTUs (P2).

But what puzzles me is that if I go into another Tier-2 add-on environment, I have 500 DTUs (P4).

And in the third Tier-2 add-on environment I have 1000 DTUs (P6).

So there seems to be no consistency between the DTUs provided and the Tier-2 add-on purchased. As far as I know, the production environment is 1000 DTUs (or P6) for some of my customers.

The AOSes on the Tier-2 environments seem to mostly be D12/DS12/DS12_v2 with 4 CPUs, 28 GB RAM and 8x500 GB storage, capable of delivering 12,800 IOPS.
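Putting the 48 IOPS-per-DTU rule of thumb from above together with the DTU counts observed in these environments gives the following rough throughput figures:

```python
# Rough throughput estimates using the 48 IOPS-per-DTU rule of thumb
# quoted in the post. The premium-tier DTU counts are standard Azure SQL
# figures; treat the resulting IOPS as ballpark numbers, not guarantees.
IOPS_PER_DTU = 48

def estimated_iops(dtus: int) -> int:
    return dtus * IOPS_PER_DTU

for tier, dtus in {"P1": 125, "P2": 250, "P4": 500, "P6": 1000}.items():
    print(f"{tier}: {dtus} DTUs ~ {estimated_iops(dtus)} IOPS")
# P6: 1000 DTUs ~ 48000 IOPS, matching the figure quoted in the text
```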

What also puzzles me is the number of Tier-2 AOSes that are deployed. Some environments have one AOS and one BI server.

While other Tier-2 environments have two AOSes and one BI server.

I assume that the differences are related to how the subscription estimator has been filled out, and that this may have an impact on what is deployed as sandbox Tier-2 environments.

Dynamics 365 does have some performance indicators under the system administration menu that give some numbers, but I cannot see a clear correlation between the environments and the performance. Maybe some of you smart guys can explain how to interpret these performance test results? What is good, and what is not?

If we take “LargeBufferReads”, how do your environments perform?

Dynamics 365F&O – Enabling new hidden functionality (SYSFlighting)

With Dynamics 365 version 10, the innovation wave from Microsoft continues to accelerate. All customers will use the same base source code of the Dynamics 365 solution, and it will be maintained and updated every month. But for many customers, stability also has its value. New functionality every month is not always what existing customers want to implement; new functionality could mean new training and new testing. I, on the other hand, love new features, because they enable new possibilities and solutions.

Microsoft has a solution for this: not all new functionality is enabled by default. Instead, the new functionality must be manually enabled, based on a support request through LCS. Two specific functionalities that are already documented are the new functionality in the Data Management Framework and Business Events. In the documentation pages you can see how to enable this hidden functionality, but the essence is that you have to run a SQL command (only possible in non-production environments):
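For reference, the documented example for the Data Management features looks like the following; the flight name and service ID come from Microsoft's documentation and may differ for other features, so verify against the article for the feature you are enabling:

```sql
-- Documented example for enabling a Data Management flight on a
-- non-production environment (flight name and service ID per the docs):
INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID)
VALUES ('DMFEnableAllCompanyExport', 1, 12719367);
```

Remember that a restart of the AOS (IIS) is typically needed before the flight takes effect.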


PS! This is NOT something you can enable by yourself in a production system.

A small tip: search docs.microsoft.com for the term “SYSFLIGHTING”, and you will see the articles on documented hidden features.

There are more, but undocumented, features in two categories: Application and Platform. These can be seen as two macros in the source code, named ApplicationPlatformFlights and ApplicationFoundationFlights. I have taken a snapshot of them here, and based on the names we get some indication of what they are used for. What they are and how to use them will, I expect, be documented in the future.

PS! I look forward to exploring “AnalyticsRealTimeReporting”, “DMFEnableAllCompanyExport”, “AnalyticsReportWebEditor”, “BusinessEventsMaster” and “ApplicationPlatformPowerAppsPersonalization”.

Happy Flighting

Near real-time replenishment in Dynamics 365 F&O

There is a lot of good stuff on the horizon for Dynamics 365. I highly recommend that you check out the following article on some new planning services that will come in the April 2019 release.


To make this happen, I would expect the planning to go deeper into the SQL stack, and also to maximize the utilization of in-memory processing of the transactions.

For retailers this will be highly appreciated, as limited space in the stores means that shelf replenishment several times each day is common, especially for perishable products with limited shelf life. Keeping things fresh and presentable is a necessity for the customer to buy. The ability to react more quickly to customer demand ensures that customers actually find the products in your store. And in the same vein, when sales are slower, the ability to adjust replenishment down according to activity saves cost and increases profit. In retail, it is the small improvements that in sum create the big results.

For the planning service to work, it needs transactions to act on. In Dynamics 365 for Retail we must choose between aggregating the transactions coming from the POS/channel databases and posting the statements more quickly. I'm looking forward to many good discussions in this area.

The future is faster

Retail Enterprise Architecture mapping using ArchiMate and ARDOQ

A warning: this blog post is high level, but the benefits can be mind-blowing.

Enterprise architecture is about understanding and change. In today's business, change is everywhere and essential to survival. But change is not easy. Insight into and understanding of your own organization is essential for change and risk assessment. Understanding how people, processes and technology are connected gives focus to achieving high-value benefits. In my profession we use the Microsoft Dynamics technology stack as a main driver for implementing improvements. But we also acknowledge that Dynamics 365 is not the only system at work. Even though Dynamics 365 is a central component, there will always be many other systems, processes and technologies included in the enterprise architecture (EA). We need a way to describe all these connections in a uniform way that allows us to communicate a model of the enterprise dynamically.

But why should EA mapping be a central part of your business? Here are six business motivators and benefits of having a structured approach to EA mapping:

Increased stability and availability. It is vital that all central systems have near 100% availability. POS and back-end systems must always work, and the supporting processes must be streamlined to ensure that risks related to business improvements and changes are minimized and understood. The EA mapping documents the relationships and shows the consequences of changes.
Guaranteed performance. Acceptable system response 24/7, able to deal with business peaks, must be planned and built around the system. Systems must handle a variable load, where a sudden event changes the transaction volume. Any disruption quickly results in customers walking away. The EA mapping must document the components central to performance compliance, and the business actors involved.
Scalable capacity. New stores or changes in the business model can quickly change the requirements for transaction and processing capacity. To be cost effective, capacity must scale dynamically according to the actual need, both up and down. The EA mapping documents the components central to scalability, and the business actors involved.
Strong security. Cyberattacks are increasing, and it is vitally important to secure information and transactions. Being GDPR compliant puts demands on systems and internal processes for how to handle one’s own and customers’ information. Security, traceability and an audit trail build trust in the system and document compliance. The EA mapping documents governance and role compliance, and the business actors involved.
Right focus. There are always new business opportunities and process improvements. Keeping track of where to focus leads to better and faster implementation of changes in a secure and stable manner. New ideas must be analyzed and risk assessed, and their implications understood. The EA mapping can assist in focusing on the changes with the highest priorities and benefits.
Cost control. Being a retailer involves large investments in technology like POS, mobile apps, customer portals and enterprise systems. Moreover, there may be large fluctuations in system usage throughout the year. By purchasing these capabilities as subscriptions, it is possible to even out operating costs so that you only pay for what is needed. Good liquidity is achieved by balancing costly investments against the revenue stream and securing an actual return on these investments.

To move forward, a “language” is needed to describe an enterprise architecture model in which you can visualize, plan, implement and maintain all relationships that exist today, in transitions, and in the final vision.

Architecture Layers using ArchiMate

The overall mapping can be modelled in 5 main layers. Here I would like to focus on the symbolism used to identify them. The notation is ArchiMate, an open and independent enterprise architecture modeling language that supports the description, analysis and visualization of architecture within and across business domains in an unambiguous way.

Motivation elements define the overall drivers and goals that the enterprise has. Much of the vision is located here. The motivation elements can also be seen as a vertical layer, in close relationship to all other layers.

The Strategy layer defines the overall course of action and a mapping towards resource and business capabilities.

The Business layer defines the business processes and the services the enterprise provides. To simplify the modeling, it is relevant to start with Business objects, Business processes, Business roles, Business actors, Business events, Business services, and Business rules and logic.

The Application layer contains application services and capabilities, their interactions, and application processes. Here Dynamics 365 and much of the Power Platform are located. To simplify the modeling, it is relevant to start with Data objects, Application functions and Application components.

The Technology and physical layer describes the software and hardware (physical or virtual) capabilities that are required to support the deployment of business, data and application services; this includes IT infrastructure, middleware, networks, communications, processing, standards, etc. The underlying structure of Microsoft Azure would typically be described here. To simplify the modeling, it is relevant to start with Artifacts, System software, Technology services, Devices and Communication networks.

Architecture Relationships using ArchiMate

The real beauty comes when the relationships between architecture elements are defined. But to do this, a set of predefined relationship types is needed. The most commonly used are the following:

Putting this together in a combined setup, I get the following relationship diagram of what is relevant to document.

(*Credits to Joon for this visualization)

As seen here, the business processes are a realization of the application functions, and this clarifies how a proper enterprise architecture model is documented. With this model, we can see which business actors are assigned to which business roles. This again shows the business processes assigned to each role. The business processes are there to realize business services.

Building the Architecture model using Ardoq

The architecture relationships can be challenging to describe using tools like Visio. Often we see that great work is done, but not used to its potential. An alternative is to use a cloud-based mapping tool such as Ardoq, which covers most aspects of documenting relationships between business processes, applications, roles, risks and transitions. This is not a commercial for this tool, but I find it great. So, I decided to try to use Ardoq to model the Contoso demo data.

Here I will focus on the Application Layer, as this is the layer where the application functionality and data are located. First, I create the application components:

Then I create the Application Functions, and I also import the Business Roles that are available in the Contoso demo dataset.

The next job is to build the relationships between the application functions (D365), business processes (vertical processes) and business roles. This allows me to visualize and trace dependencies across the entire EA mapping. Let’s take an example by looking into the responsibilities of an employee named April Mayer.

Here I can see that she is related to the business roles Accounts payable clerk and Accounts payable manager. If I click on “Accounts payable clerk” I jump into the view of this business role, and I can see that it is related to the accounts payable business processes, with an association to April Mayer.

Jumping to accounts payable allows me to see the business processes involved.

I can also visualize the entire enterprise architecture map with all objects and relations.

And zoom in on specific relations. This graph shows me that April Mayer belongs to the role Employee, as well as Accounts payable manager and Accounts payable clerk. The Accounts payable clerk role is associated with the business process “Accounts payable”, and with the Financial management modules in Dynamics 365.
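The kind of traversal behind these views can be pictured with a small toy model. This is a hypothetical Python sketch (not Ardoq’s data model or API), reusing the role and process names from the Contoso example:

```python
# Toy enterprise-architecture graph as (source, relation, target) triples.
# Names follow the Contoso example above; the model itself is hypothetical.
triples = [
    ("April Mayer", "assigned_to", "Employee"),
    ("April Mayer", "assigned_to", "Accounts payable clerk"),
    ("April Mayer", "assigned_to", "Accounts payable manager"),
    ("Accounts payable clerk", "associated_with", "Accounts payable"),       # business process
    ("Accounts payable clerk", "associated_with", "Financial management"),   # D365 module
]

def related(source):
    """Return everything directly related to an element, with the relation type."""
    return [(rel, target) for src, rel, target in triples if src == source]

def reachable(source):
    """Trace every element reachable from an element across the EA map."""
    seen, stack = set(), [source]
    while stack:
        node = stack.pop()
        for _, target in related(node):
            if target not in seen:
                seen.add(target)
                stack.append(target)
    return seen
```

Calling `reachable("April Mayer")` walks from actor to roles to processes and modules, which is exactly the kind of dependency tracing the graph views give you.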

Here is another visualization that shows how the business objective of “Marketing” can be achieved, and which business roles, business processes, application functions and application components are involved.

Knowing the relations, and being able to communicate them, is the key to happy enterprise architecture mapping.

Give it a try; the result can be very powerful.

Additional information

1. A high value blogger on Enterprise Architecture is http://theenterprisingarchitect.blogspot.com/.

2. Homepage of ArchiMate: http://pubs.opengroup.org/architecture/archimate3-doc/toc.html.

3. Homepage of ARDOQ : https://ardoq.com/ Give it a try !

MPOS – Open full (kiosk) screen mode when having dual display

For a retailer, every saved “click” is appreciated, and the ability to remove any noise is appreciated too.

When starting MPOS in maximum mode, you will often see that you have a title bar at the top, and the app-bar at the bottom.

In windows 10 you can also use the “tablet-mode” to get the MPOS into full screen mode.

BUT! If you have a dual-display setup, the tablet mode does not work.

If you want to remove them, there is a smart keyboard shortcut (Win + Shift + Enter):


This will put the MPOS in full screen mode, giving a nicer appearance without the bars.

Then the question is how to make this happen every time MPOS starts. This was actually not an easy task, but a colleague of mine (Espen) made it possible by using a PowerShell script.

The following page contains a small powershell script, that opens a UWP app in full (kiosk) screen mode:

Add this to a start-up folder, and create a new PowerShell script containing:



Then create a shortcut to this new PowerShell script.

Initial investigations (by Sven Erik) show that the MPOS app ID is Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App; let’s hope this ID stays permanent.

Then the MPOS looks nicer for the user, without noise.




Retail assortments and planned orders extensions

Microsoft have created an excellent description of this in the Assortment management doc-page. Retail assortments are essential to define which products should be available across retail channels. Assigning a product to an assortment will assign the product to the stores that have the assortment. This makes it possible to sell the product in the store.

But there is something missing, and that is using assortments for replenishment and procurement. Retailers want master planning to only suggest procurement and transfers for products that belong to the stores’ assortments. You can control requirement parameters per warehouse, but there is no standard way to use the assortment to control the generation of planned orders.

This blog post shows how to make a very small extension that ensures that only products belonging to a store’s assortment will generate planned orders. The solution I will use is to look into how the product lifecycle state functionality works, and extend it with an assortment planning parameter. I have called this parameter “Is lifecycle state active for assortment procurement”.

What it does is validate whether a product is in the assortment of the store. If it is, the product will be requirement calculated and will generate planned orders. If the product is not in the assortment of the store, no planned orders will be generated.
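Before diving into the X++, the decision rule itself can be sketched in a few lines. This is a hypothetical Python illustration of the include/exclude logic, not the actual implementation:

```python
def is_assortment_planned(product, store, assortments):
    """A product should generate planned orders for a store only if some
    assortment assigned to the store includes it, and no assortment assigned
    to the store excludes it. `assortments` maps assortment id to a dict with
    the stores it is assigned to and its include/exclude product lines."""
    store_assortments = [a for a in assortments.values() if store in a["stores"]]
    included = any(product in a["includes"] for a in store_assortments)
    excluded = any(product in a["excludes"] for a in store_assortments)
    return included and not excluded
```

The X++ extension below expresses the same rule with exists/notexists joins against the assortment lookup tables.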

To make this happen, I needed to create 4 extensions. The first three add a new field on the product lifecycle state form. For an experienced developer this is easy to create, so there is no need to spend time on it in this blog post.

The real fun is how to manage and control the planned order generation. This happens in an extension to the ReqSetupDim class. Here there is a lookup of the assortments on the product, and a check of whether the store has one of these assortments. If not, master planning will not generate any planned orders. I therefore create an extension class and use the new method wrapping/Chain of Command (CoC) feature to add some additional code.

/// <summary>
/// Contains extension methods for the ReqSetupDim class.
/// </summary>
[ExtensionOf(classStr(ReqSetupDim))]
final class ReqSetupDim_Extension
{
    /// <summary>
    /// Validates if a product should be assortment planned.
    /// </summary>
    /// <param name="_inventDimComplete">The inventory dimensions being planned.</param>
    /// <returns>false if the product is not assortment planned; otherwise, the default value.</returns>
    public boolean mustReqBeCreated(InventDim _inventDimComplete)
    {
        boolean ret = next mustReqBeCreated(_inventDimComplete);

        if (ret && _inventDimComplete.InventLocationId)
        {
            InventTable                 inventtable;
            EcoResProductLifecycleState ecoResProductLifecycleState;

            // Fetching fields from inventtable
            select firstonly ProductLifecycleStateId, Product from inventtable
                where inventtable.ItemId == this.itemId();

            // Validating that the product is active for planning and that assortment planning is enabled
            select firstonly RecId from ecoResProductLifecycleState
                where   ecoResProductLifecycleState.IsActiveForAssortmentPlanning == true
                    &&  ecoResProductLifecycleState.IsActiveForPlanning == true
                    &&  ecoResProductLifecycleState.StateId == inventtable.ProductLifecycleStateId;

            if (ecoResProductLifecycleState.RecId)
            {
                RetailStoreTable                    store;
                EcoResProduct                       product;
                RetailAssortmentLookup              assortmentLookupInclude;
                RetailAssortmentLookup              assortmentLookupExclude;

                RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;
                RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupExclude;

                // Finding OMOperatingUnitID from the InventLocationId
                select firstonly OMOperatingUnitID from store
                    where store.inventlocation == _inventDimComplete.InventLocationId;

                // Check if the product is in the assortment of the store in question
                select firstonly RecId from product
                    where product.RecId == inventtable.Product
                exists join assortmentLookupInclude
                    where   assortmentLookupInclude.ProductId == product.RecId
                        &&  assortmentLookupInclude.lineType == RetailAssortmentExcludeIncludeType::Include
                exists join assortmentLookupChannelGroupInclude
                    where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                        &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId
                notexists join assortmentLookupExclude
                    where   assortmentLookupExclude.ProductId == product.RecId
                        &&  assortmentLookupExclude.lineType == RetailAssortmentExcludeIncludeType::Exclude
                exists join assortmentLookupChannelGroupExclude
                    where   assortmentLookupChannelGroupExclude.OMOperatingUnitId == store.OMOperatingUnitID
                        &&  assortmentLookupChannelGroupExclude.AssortmentId == assortmentLookupExclude.AssortmentId;

                if (!product.RecId)
                {
                    ret = false; // The product does NOT belong to the store's assortment, and should not be planned
                }
            }
        }
        return ret;
    }
}
I also have code to restrict the creation of manual purchase orders, where similar code can be used, but let’s hope that Microsoft can further extend standard Dynamics 365 with assortment-based procurement planning.

Copy with pride, and let’s hope next year will give us 365 more opportunities.

POS Invoice Pay – #Dyn365F&O

A very nice omnichannel capability made available in Dynamics 365 version 8.1 is the ability for customers to pay their invoices directly in the POS. A scenario is that a customer is allowed to purchase “on-account” and then later pay all the invoices. Let’s say the customer is at a hotel that allows customers to buy food, drinks and services throughout their stay. At the end of the stay, the customer pays for all the services at the reception. Like “pay-before-you-leave”.

There is no requirement that the goods have to be sold on a POS. It is fully omnichannel capable, so the orders can be created in the call center, on the web, or in stores. I would like to show you how to set it up in the Contoso demo data set. If you open the functionality profiles, you will find the options to enable paying:

  • Sales order invoice
  • Free text invoice
  • Project invoice (Yes! Even project invoices!)
  • Sales order credit note

The next thing you need to do is to add a “Sales invoice” – button to the transaction screen. (I’m using Houston store, and button grid F2T2)

This will add a sales invoice button to the POS design, that allows for paying invoices in POS.

The next thing is to create a POS transaction/order. First select a customer (like Karen), and then use the on-account button to sell the goods.

On the payment screen you can specify how much you would like to put on account, and you can also see that the credit limit and balance are available.

The next step requires that some periodic batch jobs run:

1. Run the “P-job”, to fetch the transactions from the channel database.

2. Run the “Calculate statement” (manually or in batch)

3. Run the “Post statement” (This process will create the sales order and the invoice)

!Make sure the statement is posted and invoiced before continuing!

The option you now have is to continue to the process in Dynamics 365, and create an automatic sending of the invoice to the customer through print management, or have the customer come to the “reception” and pay for the goods directly.

To pay the order, select the Karen customer, and use the Sales Invoice button.

If you have done everything right, you should now find the invoice in the list. (If you have enabled aggregation in the parameters, you will have a single invoice per customer.)

I can then select the invoice (or multiple), and pay it using cash, card, loyalty (And even on-account again)

This opens up some very nice omnichannel processes, and I hope that Microsoft invests further in this. It would be nice to actually see the lines on the invoices that are being paid, and even to print out the invoice if the customer requires it. I also suggest that retailers use the modern report capabilities to make the invoice look awesome.

Take care friends, and thanks for all your support and encouragement!

Retail category managers, Simplify your import of released products in #Dyn365FO

It is a category manager’s job to try to maximize profit from selling products within a specific category. This may be looking after a broad category such as ‘confectionery’ or they may focus closely on a more specific category, such as ‘snacking’. A category manager will analyze complex data collected on shopper behavior from a range of different sources, and then translate it into meaningful information. The category manager’s duty is to ensure that their company is providing the market with the products that consumers desire.

Retail category managers love Excel. It is used for almost everything, and they perform much of their analysis, lookup, data collection and decision making in Excel. When implementing Dynamics 365 we are often faced with large sets of Excel spreadsheets that need to be imported. I have seen users import 8 different Excel spreadsheets just for importing products. This blog post is about how to simplify the process of keeping retail master data in a single Excel sheet and easily importing and updating products. For this, the Dynamics 365 data management framework is used. One of the problems I often see users struggling with is that the source is a single spreadsheet, but it needs to be imported into several data entities. For a retailer, some of the most common master data entities are:

  • Products V2 – Contains product number, product name and dimension groups
  • Released products V2 – Contains most fields on the released product
  • Item – bar code – Contains the item barcodes used for scanning
  • Default order settings – Contains information like minimum purchase quantity etc.
  • External item descriptions for vendors – Vendors’ item numbers and descriptions
  • Product category assignments – The connection to the retail category hierarchy

It is possible to create a single Excel spreadsheet that covers all of these entities, and in a single run import or update the retail products.

So how to exactly do this?

Create an Excel spreadsheet with exactly the following.

I recommend creating two sheets. First one is a “read me” sheet, that explains the “template” sheet.

Use exactly the column names as described here. This will make the mapping between the columns and the data entity automatic. Here I also use color coding to show what entity each column mainly belongs to.


Each column is listed below with the data entities it belongs to, and an example value where one was given:

  • Product number – Released Products, Products
  • Product number – Released Products
  • Item name – Released Products, Products (example: Jalla Coffee 500G)
  • Search name – Released Products, Products (example: 4001392 Jalla Coffee FILTER 500G)
  • Search name – Released Products (example: Jalla Coffee FILTER 500G)
  • Full item description – Released Products, Products (example: Jalla Coffee Original is a useful coffee that can be enjoyed on most occasions. A carefully selected mix of coffee types, mainly from Brazil, guarantees a round and full-bodied coffee with a long aftertaste)
  • Should always be “product” – Released Products, Products
  • Item or Service – Released Products, Products
  • Name of the storage dimension group – Released Products, Products
  • Should last purchase price be updated automatically – Released Products
  • Should cost price be updated automatically – Released Products
  • WHI (warehouse controlled) or SRV (service) – Released Products
  • Inventory unit – Released Products
  • Purchase unit – Released Products
  • Sales unit – Released Products
  • Latest purchase price in local currency – Released Products
  • Latest cost price in local currency – Released Products
  • Default sales price in local currency – Released Products
  • Weight of the product – Released Products
  • Primary vendor – Released Products
  • Purchase item tax group – Released Products
  • Sales item tax group – Released Products
  • Grouping related to buyer group – Released Products
  • Tracking dimension – Released Products, Products
  • Base sales price on purchase price? – Released Products
  • Default values – Released Products
  • Item model group – Released Products
  • Coverage group – Released Products
  • Count group – Released Products
  • Purchase price quantity – Released Products
  • Cost price quantity – Released Products
  • Financial dimensions (formula: ="-D30-320---"&B34) – Released Products
  • Product dimension – just a helping column for DefaultLedgerDimension
  • Retail hierarchy name – Product category assignments (example: Retail category)
  • Category node – Product category assignments
  • Vendor’s item number – External item descriptions for vendors
  • Vendor’s item name – External item descriptions for vendors (example: Jalla Coffee FILTER 500G)
  • Vendor number – External item descriptions for vendors
  • Barcode type – Item – Bar Code, Released products
  • Barcode – Item – Bar Code
  • Barcode unit – Item – Bar Code
  • Scanning yes/no – Item – Bar Code
  • Barcode quantity – Item – Bar Code
  • Purchase underdelivery percentage allowed – Released Products
  • Purchase overdelivery percentage allowed – Released Products
  • Minimum purchase quantity – Default Order Settings
  • Maximum purchase quantity – Default Order Settings
  • Standard purchase quantity – Default Order Settings
  • Multiple purchase quantity – Default Order Settings

The template Excel spreadsheet should contain exactly the columns listed above.

Then start building the Excel spreadsheet (this is the time-consuming part). This can also be regarded as the “master file” for products, and mass update and mass import of products are done using this file. Remember that you can add more columns and also include calculated fields. In this case, the default dimension column (used for financial dimensions) has a formula like ="-D30-320---"&B34, making sure that cell B34 is merged into the financial dimension string.

Create the data management import project.

In the data management workspace, create an import project and use “+ Add file”, selecting the Excel file with “upload and add”. Then select each entity and which sheet in the Excel file should be imported:

– Select file
– Select entity name
– Select sheet lookup
– Then repeat selecting entity name and sheet lookup until all data entities needed are selected

After doing this correctly, you should have an import project with the following entities:

You should also click on the “view map” symbol if there is a warning, and just delete the lines where no mapping was generated, like I have done here for the “Products V2” entity.

The mapping will be done automatically for you, and will only select the fields that are relevant for each data entity.

Your import project is now ready to be used. I recommend using the data management workspace: select the import project and then “run project”.

Then, for each data entity, I upload exactly the same Excel spreadsheet:

And then click on the “import”. If there are any errors, then fix them in the excel sheet or make changes to the staging.

What we have then accomplished is a single Excel spreadsheet that the category manager can maintain and work with, and it can be uploaded (several times) into the import project. For trade agreement sales and purchase prices, I normally recommend creating a separate Excel spreadsheet.

Then the Excel-loving category managers will be happy, and they can import thousands of products in a very short time.








D365F&O Retail: Combining important retail statement batch jobs

The Retail statement functionality in D365F&O is the process that puts everything together and makes sure transactions from POS flows into D365F&O HQ. Microsoft have made some improvements to the statement functionality that you can read here : https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/statement-posting-eod. I wanted to show how to combine these 3 processes into a single batch job.

The following drawing is an oversimplification of the process, but here the process starts with the opening of a shift in the POS (with start amount declaration), and then start selling in POS. Each time the job P-0001 upload channel transaction is executed, the transactions are fetched from the channel databases, and imported to D365F&O. If you are using shift-based statements, a statement will be calculated when the shift is closed. Using shift-based closing can be tricky, but I highly recommend doing this! After the statement is calculated and there are no issues, the statement will be posted, and an invoiced sales order is created. Then you have all your inventory and financial transactions in place.


What I often see is that customers use 3 separate batch jobs for this. The result is that the retail statement form contains many calculated statements waiting for statement posting. Some customers say they only want to see statements where there are issues (like cash differences after a shift is closed).

By combining the batch jobs into a sequenced batch job, the calculated statements will be posted right away, instead of waiting until the post statement batch job is executed. Here is how to set this up:

1. Manually create a new “blank” batch job


2. Click on “View Tasks”.

3. Add the following 4 classes:

RetailCDXScheduleRunner – Upload channel transaction (also called P-job)

RetailTransactionSalesTransMark_Multi – Post inventory

RetailEodStatementCalculateBatchScheduler – Calculate statement

RetailEodStatementPostBatchScheduler – Post statement

Here I choose to include upload of transactions, post inventory, calculate statement and post statement into a single batch-job.

Also remember to ignore task failures.

And remember to click on the “parameters” to set the parameters on each task, like which organization nodes should be included.

On each batch task I also add conditions, so that the previous step needs to be completed before the batch-job starts on the next.
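Conceptually, these conditions turn the batch job into a simple dependency chain. Here is a hypothetical Python sketch of that behavior (not the D365FO batch framework itself): each task runs only if the previous one reached “Ended”, so a failure skips the remaining tasks rather than posting on bad data.

```python
def run_sequenced_batch(tasks):
    """Run (name, callable) batch tasks in order. Each task's condition is
    that the previous task reached 'Ended'; if a task fails, the remaining
    tasks are skipped because their condition is never met."""
    results = {}
    previous_ok = True
    for name, task in tasks:
        if not previous_ok:
            results[name] = "Not run"
            continue
        try:
            task()
            results[name] = "Ended"
        except Exception as exc:
            results[name] = f"Error: {exc}"
            previous_ok = False
    return results
```

With task names like “P-job”, “Post inventory”, “Calculate statement” and “Post statement”, a failure in the calculate step leaves the post step untouched, which mirrors what the batch task conditions achieve.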

Then I have 1 single batch job, and when executing it spawns subsequent tasks nicely.

The benefit of this is that when you open the statements workspace, you mostly see statements where there are cash differences or issues with master data.

Take care and post your retail statements.




A quick look at download Retail distribution jobs (CDX)

Commerce Data Exchange (CDX) is a system that transfers data between the Dynamics 365 F&O headquarters database and the retail channel databases (RSSU/offline databases). The retail channel databases can be the cloud-based “default” channel database, the RSSU database, and the offline databases on the MPOS devices. Looking at the following figure from Microsoft docs, this blog post explains how to understand it in practice.

What data is sent to the channel/offline databases?

In the retail menus you will find 2 menu items: Scheduler jobs and Scheduler subjobs. Here the different data that can be sent is defined.

When setting up Dynamics 365 the first time, Microsoft have defined a set of ready-to-use scheduler jobs that get automatically created by the “initialize” menu item, as described here.

Scheduler jobs are collections of the tables that should be sent, and subjobs contain the actual mapping between D365 F&O and channel database fields. As seen in the next picture, the fields on the table CustTable in D365 are mapped to AX.CUSTTABLE in the channel database.

To explore what is/can be transferred, then explore the Scheduler jobs and scheduler subjobs.

Can I see what data is actually sent to the channel/offline databases?

Yes you can! In the retail menu, you should be able to find a Commerce Data Exchange, and a menu item named “Download sessions”.

Here you should see all data that is sent to the channel databases, and there is a menu item named “Download file”.

This will download a Zip file containing CSV files that correspond to the scheduler jobs and subjobs.

You can open this file in Excel to see the actual contents. (I have hidden a few columns and formatted the Excel sheet to look better.) This means you can see the actual data being sent to the RSSU/offline channel database.

All distribution jobs can be set up as batch jobs with different execution recurrences. If you want to keep it simple, execute download distribution job 9999 every 30 minutes. If you have a more complex setup and need better control of when data is sent, create separate distribution batch jobs so that you can send new data to the channel databases in periods when there is less load in the retail channels.

Too much data is sent to the channel databases/offline database and the MPOS is slow?

Retail uses change tracking, which makes sure that only new and updated records are sent, minimizing the amount of data. There is an important parameter that controls how often a FULL distribution should be executed. By default it is 2 days. If you have lots of products and customers, we see that this generates very large distribution jobs with millions of records to distribute. By setting this parameter to zero, that will not happen. Very large distributions can cripple your POSes, and your users will complain that the system is slow, or they get strange database errors. In version 8.1.3 the default is expected to change to zero, meaning that full datasets will not be distributed automatically.

Change tracking seems not to be working?

As you may know, Dynamics 365 has also added the possibility to enable change tracking on data entities when using BYOD. I have experienced that adjusting this affects the retail requirements for change tracking. If this happens, please use the Initialize retail scheduler to set it right again.

Missing upload transactions from your channel databases?

In some rare cases, transactions may be missing in D365 compared to what the POS shows. The trick to resend all transactions is the following:

Run the script “delete crt.TableReplicationLog” in the RSSU DB, and the next P-job will sync all transactions from the RSSU DB (including missing ones).


Using Cloud POS as your retail mobile device

Handheld functionality for retailers is a question I get a lot, typically in the areas of counting, replenishment, receiving and daily POS operations. In version 8.1 Microsoft have taken a small step forward to make it easier to use any handheld device that supports a common browser. Because Cloud POS (CPOS) runs in a browser, the application isn’t installed on the device. Instead, the browser accesses the application code from the CPOS server. CPOS can’t directly access POS hardware or work in an offline state.

What Microsoft have done is to make the CPOS change according to the screen size, to work more effectively on your device. To make it simple, I just want to show you how it looks on my iPhone.

Step 1: Direct your browser towards the URL of where the CPOS is located. In LCS you will find the URL here:

Step 2: Activate your POS on mobile device by selecting store and register, and log in

Step 3: Log into CPOS and start using it. Here are some sample screens from my iPhone, where I count an item using CPOS.

You can also “simulate” this in your PC browser by just reducing the size of your browser window before you log into CPOS. Here I’m showing the inventory lookup in CPOS.

What I would love to see more of is:

– Barcode scanning support using camera

– The ability to create replenishment/purchase orders in CPOS

– More receive capabilities like ASN/Pallet receive etc.

– Improved browser functionality (like back-forward browsing etc)

To me it seems clear that we will see additional improvements in CPOS, making it the preferred mobile platform for Dynamics 365 for Retail, as Microsoft is definitely investing in this area.