D365 Outsourcing your master data (DaaS)

In Dynamics 365 implementation projects, I often say that everything we do can mainly be categorized into three headline topics.

For Dynamics 365, Microsoft provides the software and the platform needed. It is easy to buy as a service, where essentially only a monthly commitment is required. This is the nature of the cloud-based Software-as-a-Service concept.

The implementation partners are best at structuring an implementation project and guiding the customer step by step through the jungle. A lot of knowledge is needed to understand the complex processes of an organization. The partners typically work tightly with people, ensuring that the organizational machinery is oiled and running smoothly, and defining processes that follow the entire end-to-end flows like procure-to-pay or order-to-cash.

The third element, of equal importance, is master data. I have written some previous blog posts about the subject that are relevant to check out. Traditionally, building the master data has been the responsibility of the organization implementing Dynamics 365, and the data has been regarded as the heart and soul of the organization. The data is often manually built/generated and maintained, and low quality in master data can have catastrophic effects in any organization. If you cannot trust your data, then you do not have the information needed to make good business decisions.

Traditionally this has been identified as an integration requirement, but the main “ownership” of the data has still been handled internally in the company. Here is where I see a change. Instead of maintaining your own master data, the master data is maintained through cloud-based public services operated for a monthly fee. Just like SaaS (Software as a Service), we see mature implementations of DaaS (Data as a Service), where Dynamics 365 customers closely integrate with, and outsource much of the maintenance to, vertical-specific online services.

One aspect I see is that the data providers are not global actors, but tend to be local, verticalized services for specific domains. To be specific, I would like to name-drop some providers here in Norway that I have encountered that offer such services.

BREG – Brønnøysund Register Center

The Brønnøysund Register Center develops and operates digital services that streamline, coordinate and simplify dialogue with the public sector for individuals and businesses. They operate many of Norway’s most important registers, which contain information about companies, roles, tax etc. Many of the services are free, and you can read more about them. If you need validated and confirmed information about any organization in Norway, then these are the registers you need to integrate with. My friend Fredrik S from Microsoft has created many demos showing how easy it actually is to set this up.

BISNode – Integrated credit check and risk management

Knowing the commercial risk is essential for all businesses. With updated information, decisions become less risky and less labor-intensive.

1881 – search and return person address information

1881 is Norway’s leading provider of personal and business information, providing information on telephone numbers, names and addresses. By looking up databases like 1881, you instantly get address information that enriches your data and simplifies transaction handling.

GS1 – The Global Language of Business

GS1 is the main provider of a lot of supply-chain oriented master data. Here you maintain product GTINs/barcodes, and they also provide a GLN (Global Location Number) register. When working with delivery addresses, this is a must-have, because it ensures that goods are shipped and received to the right places. For a small fee, you get access to updated addresses directly in D365, where the addresses are also enriched with GPS coordinates. One more relevant aspect of GS1 is the GPC (Global Product Classification), which makes it easier to search for products globally and is also a very good reporting/analytics structure.

TradeSolution – The Norwegian Grocery PIM

If you are going to sell or purchase products through the Norwegian grocery chains, you need a close connection with TradeSolution. I have written about them previously; they make sure you have a reliable source of product master data and product properties. If you are using their services, there is no need for a third-party PIM solution. They also provide a media store for product pictures.

NOBB – The Norwegian Construction PIM

NOBB contains almost 1,000,000 articles from 700 suppliers. You will find a wide range of product information, e.g. lumber, building materials, hardware, tools, fasteners, paints, house and garden, water/plumbing, electrical etc. The database contains basic data, prices, logistics data, images and documentation, streamlining the industry’s need for structured and quality-assured basic data. The quality of the product database is ensured through the industry bodies the Quality Forum and the Standardization Committee. The item owner updates and maintains the information based on industry standards (ref. the Quality Forum and the Standardization Committee). This is a unique quality assurance and proximity to the industry that no other players can offer.

Elfo – The Norwegian electronics PIM

The Electronics Industry Association (EIF) is an industry association for Norwegian-based companies that run electronics-related activities aimed mainly at the professional market, either as importers, manufacturers or developers.

Farmalogg – The Norwegian pharmacy industry PIM.

The product register, with few exceptions, covers all goods that are sold in pharmacies, and it contains information that is necessary for the safe and efficient handling of the goods throughout the value chain from manufacturer / supplier, through wholesaler and retailer, to end-user.

Prisguiden – Compare your prices

A price database that allows you to compare your prices with competitors’. You can also measure popularity and trends in the market. What do customers search for? Tightly integrating with the market makes decision-making easier and more automated.

Consignor – Easy shipping

Delivery Management is all about connecting your warehouse to your customers in the most efficient way. By making one standard integration with services like Consignor, they make sure that no matter what combination of carrier services you choose, the customer will get the same high-quality experience when receiving a delivery from you.

Currency exchange rate

This service is already present in standard Dynamics 365 – Start using it!

There are surely many other master data providers; here I have listed a few actors in the Norwegian market. By outsourcing your master data maintenance, you will get much higher data quality and more return on investment.

Are you ready to outsource your master data?

DaaS Leben ist kein Ponyhof (a play on the German saying “Das Leben ist kein Ponyhof” – life is no pony farm)


D365 – My Covid-19 10-day response story

Hi Friends.

I hope you all are hanging in there and can still work and deliver excellent experiences with Dynamics 365.

I wanted to share my Covid-19 10-day response story on how fast a reduced-scope Dynamics 365 implementation was made available. Some weeks ago, we and Microsoft were contacted by an important player in the health industry that urgently wanted to establish purchasing and supply-chain processes for medications and equipment. The key element here was urgency, because it was unclear what direction the pandemic would take here. What the customer needed were tools that could process information about supply providers and what kind of supplies are needed for readiness stockpiling. Our first step was to set up Dynamics 365 (CRM) to store relations, and this was done in a few days. The next step was to set up and go live with a “minimum viable product” of the Dynamics 365 finance and supply-chain apps. We had a goal of doing this in 10 working days. This is the story I would like to share.

Day 1: Onboarding, tools, and deployment

In the initiation of a project, I always have a document named “Welcome to the [Customer] project”. This is a great document because it contains all the essential information about onboarding to the project and can be shared with all participants. It is typically a 6-7 page document explaining the onboarding process and the main objectives. It also contains references to LCS, SharePoint/Teams sites, DevOps and URLs to environments. The most valuable element is a full overview of all the people that will somehow be involved in the project. In this project we decided on a small, efficient 4-person team (POD), and FastTrack support from Microsoft.

Microsoft quickly processed the licenses, and we quickly deployed the LCS project. The first thing we did was deploy the Tier-2 sandbox, which we named the ‘UAT’ environment; this was to be used as the master data/golden environment at the start. We also deployed the Tier-1 sandbox and named it “Test”; it would be used for access to Visual Studio etc. The initial version we deployed was 10.0.10.

We have a ready-made implementation template that is imported into DevOps and contains the main structure of requirements and tasks. We scope this down to the actual processes we need.


We also have a ready-made folder structure for the Teams site where we can store and complete all documentation. By the end of the first day we had established the tools needed for starting the project.

Day 2: Working with the generic tasks in the backlog

We established a 30-minute daily sprint meeting with the main implementation actors, where the plan is presented and the day’s tasks are prioritized. We did not have the time to create large Word documents, so we decided to document the solution in DevOps, organizing all the system setup around the entity templates as they can be extracted from D365. I exported the templates to Excel and then imported them to DevOps using the Azure DevOps Office® Integration; this gave me 419 tasks to set up as much as possible in standard.

This makes it possible for me to have a step-by-step task list of all the elements I need to build the “Golden environment”. Also, each task is assigned, and the actual setup is documented with a direct URL to the D365 form and a screen dump of the actual setup.

On the first day we were able to process close to 200 tasks, setting up the most generic parts of the system.

Day 3: Working with the finance task backlog

When working on the finance setup we had a standard chart of accounts that we imported, and we had to set up financial dimensions. We also set up the account structure, created a few inventory posting profiles and set up tax parameters. Normally this is quite straightforward, and we can reuse much from previous projects.

Day 4-5: Working with products

Now the Excel skills were put to the test. We had an Excel sheet that contained most of the product master data: in total over 33,000 products, and each product has classifications, attributes, properties, and vendor/producer information. We quickly decided to use the same item numbering as was present in the Excel sheet. Each column in the sheet was classified by:

  • Is this a field we have in D365?
  • Should the field become a category in a hierarchy?
  • Should the field become an attribute?

Getting the products in required a very advanced copy/paste/merge of data into Excel sheets that we then imported into Dynamics 365. In the end, we realized that all the information we had could be imported without any information loss. It was hard work, but the end result was promising, containing a list of all medical supplies available, classified into the medical ATC structure.



We also imported barcodes, vendors, producers, employees, address information, external item names/descriptions, and attributes.

Day 6: First demo, UAT and production environment deployment

On day 6 we were ready to show the actual master data and the initial view of the system. The customer was impressed by how fast we were able to build a system and processes that were familiar to their operation.

We decided to update the system to 10.0.11, and in parallel with the setup of the system we had been working closely with the Microsoft FastTrack solution architect to make the environments ready for production deployment. After a few iterations we got the production environment up and running and performed a DB refresh of the production environment with the master data we had in the Tier-2 sandbox. This meant that we now had an environment available to start performing transactional process testing and trimming the system. I know that this is not the normal way of doing this, but thanks to Microsoft’s understanding of the urgency we were allowed to go this “fast-track” route. In DevOps we established the processes we wanted to test and optimize.

Day 7: Test dual write, business events and power platform

As earlier described, we also implemented some of the “CRM” elements first. Now we could enable dual-write and synchronize vendors, employees, and other information into CDS. Our first step was just to validate that it was working as expected in UAT, and it worked like a charm. We can now share this master data across the D365 platform.

The next thing was to test how we could use the business event framework to integrate with a 3rd-party WMS provider. Dynamics 365 has a business event that kicks in when a purchase order confirmation is performed. We decided to enable purchase order change management to have a strict workflow and ensure that we could rely on the purchase confirmation process.

This allowed us to create a solution where the business event is caught by a Power Automate flow that fetches all the lines of the purchase confirmation and then transforms them into the format that the WMS provider needs. We can also enrich the data sent to the WMS provider, so that their system has all the master data it needs. The next step is to import receipt lines from the 3rd-party WMS provider. This happens by Power Automate creating an arrival journal; a batch job in D365 then posts it, and then posts a product receipt. It all ends with a new business event being triggered (purchase order received) that sends a message to the WMS provider that the goods have now been received. What we then achieve is that the on-hand in each system is synchronized, without any major delay caused by processing.

In total we have set up quite a lot of batch jobs that handle everything from cleaning to posting and planning. We used the takeaways from the following blog post as a template for the batch jobs.

Day 7: Master planning and Planning Optimization

We expect that quite a lot of requisitions and requirements will be processed through the system, so using the new Planning Optimization engine from Microsoft suited the project well. Calculating the requirements for all products is extremely fast and done within minutes. This will allow for faster reaction time to new requirements and potentially reduce stockout situations caused by vendor lead time.

On day 7 we also imported all employees and created some approval position hierarchies. This way we can extend the workflow processing for approvals.

Day 8-9: Testing, Testing, Testing in UAT

We started day 8 by refreshing the UAT environment and executing testing according to the key business requirements defined in DevOps. We found 3-4 issues, which were reported to Microsoft (index performance etc.) and quickly fixed within hours by the excellent support architects. We also wanted a visually nicer, more presentable purchase order form letter, and decided to import the modern report package from Microsoft. This makes it a bit easier to adjust.

We did try out the configurable business documents, but in this case it would take a bit more time (which we did not have) to learn properly and set up correctly. Any issues we found were also fixed in the PROD environment.

The main processes we focused on were the procurement processes, with approval steps and manual coordination with vendors.

Day 10: Project closure and training

On day 10, we summarized how far we had come and created a project closure/summary report that also contains next steps and more backlog suggestions. We suggested additional focus on Azure Data Lake, Power BI and the implementation of a vendor portal. We also planned to perform training and make final changes to enable end-user onboarding. What we see is that making a system ready is not just setting up the system, but implementing the use of the system in daily operations. This is expected to take more time, and we are ready to respond.

Final words and tips

I really hope this system will show its value and will be regarded as a small but valued contribution to the Covid-19 response. Microsoft has published the following page with resources that can help. Microsoft has also launched a program where you can get a 200-seat Dynamics 365 Customer Service system for free for 6 months for Covid-19 response related activities. See https://dynamics.microsoft.com/en-us/covid-19-offer/

If you have any similar stories, please share them. The Dynamics 365 community cares and stands united in this Covid-19 fight!

D365 Importing JSON data the hard way!

I recently created a solution where I’m importing products and all related data for the grocery industry, and I wanted to share my experience so that others may follow. This is not a “copy-paste” blog post, but more a showcase of my approach to the process, which can be used when working with more advanced and complex JSON integrations. Many industries have established vertical-specific databases where producers, distributors and stores cooperate and have established standards for product numbers, product naming, GTIN, Global Location Number (GLN) etc. In Norway we have several, and the most common for the grocery industry here is TradeSolution. Most products are available to the public at VetDuAt.no, but they also have a Swagger API where the JSON data can be fetched and imported into D365.

One of the experiences I had when starting this journey is that D365 is not modelled according to how the data is organized in these industry-specific public databases. Much is different, and the data is often structured differently. We also see that the product databases are quite rich in terms of describing the products, with physical dimensions, attributes, packing structure, allergens, nutrition etc.

To give you a small picture of the complexity you can often find, here is a subset of the JSON hierarchy:

I needed to decide how I should import this data. Should I just import what I have fields for in D365? Should I extend D365 with lots and lots of new fields? Or should I model according to how the external database presents the data? I decided on the latter and imported the data as it was presented. This gives the best result and the least information loss in the process. I decided to go for a model where D365 requests a JSON file from the Swagger API and then places the JSON structure in a C# class structure, then extracts the data from the C# objects and places it into a new module I named EPD. The next step in the process is to take this data and populate standard D365.

The benefit I see is that I’m not overextending the std Microsoft code. The data is available in D365, and can be used in Power BI etc. I would like to share some of the basic steps when fetching such large data structures from external services.

Fetch the JSON from the service.

To fetch a JSON file, I’m using some .NET references that help handle Azure Active Directory and HTTP connections. The first method shows how to get an access token, which is relevant if the Swagger service requires it. The next method is where the Swagger URL is queried and the JSON file is returned, in addition to some success/error handling.
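
Below is a minimal sketch of how this can look in X++ through .NET interop, assuming an ADAL-style client-credential flow; the tenant, client id/secret, resource and URL values are placeholders, and the real code in the solution may differ.

    // Hedged sketch: acquire a bearer token and fetch the JSON document from the Swagger URL.
    public str getAccessToken()
    {
        Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext authContext;
        Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential      clientCredential;
        Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationResult  authResult;

        authContext      = new Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext('https://login.microsoftonline.com/<tenant>');
        clientCredential = new Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential('<clientId>', '<clientSecret>');
        authResult       = authContext.AcquireTokenAsync('<resource>', clientCredential).Result;

        return authResult.AccessToken;
    }

    public str getJson(str _url, str _accessToken)
    {
        System.Net.Http.HttpClient          httpClient = new System.Net.Http.HttpClient();
        System.Net.Http.HttpResponseMessage response;

        // Attach the bearer token and query the Swagger URL
        httpClient.DefaultRequestHeaders.Authorization =
            new System.Net.Http.Headers.AuthenticationHeaderValue('Bearer', _accessToken);

        response = httpClient.GetAsync(_url).Result;

        if (!response.IsSuccessStatusCode)
        {
            throw error(strFmt('The service returned status %1', response.StatusCode));
        }

        return response.Content.ReadAsStringAsync().Result;
    }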

So at this point we have the JSON file, and we want to do something meaningful with it. Visual Studio has a wonderful feature where you can paste a JSON document and convert it into classes. To make this work, you will have to create a C# project.

This will generate the C# classes; in this example the number of sub-objects and the number of properties is in the hundreds, and the properties can be objects and even arrays of objects.

In addition I need a method that takes the JSON file and deserializes the content into the class structure.
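
A minimal sketch of such a method, assuming the generated classes live in a referenced C# project (here called EpdIntegration with a root class ProductRoot; both names are placeholders):

    // Hedged sketch: deserialize the JSON payload into the generated C# classes using Newtonsoft.Json.
    public EpdIntegration.ProductRoot deserializeProduct(str _json)
    {
        // Use the generated root class to get the CLR type for the non-generic deserializer
        System.Type rootType = new EpdIntegration.ProductRoot().GetType();

        return Newtonsoft.Json.JsonConvert::DeserializeObject(_json, rootType) as EpdIntegration.ProductRoot;
    }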

Store the JSON object data into D365 tables.

So at this point we have been able to fetch the data, and in the following code I’m getting the access token, getting the JSON, deserializing it into a C# object, and passing it forward for more processing.


Now, let’s start inserting this data into new D365 tables. For simplicity reasons, I have created a D365 table for each data object in the JSON file. This allows me to store the entire hierarchical JSON structure in D365 tables for further processing. As soon as I have the data stored in D365, I can create the code that moves it forward into the more functional tables in D365.

A lesson learned was that when creating sub-tables to store hierarchical JSON data, it is sometimes necessary to create relationships between the records in multiple tables. Sometimes uniqueness is also required, and the best way I have found (so far) is to create a GUID field and use this GUID to relate the data in the different tables. This can easily be accomplished with the following code.
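
A minimal sketch of the idea, where EPDPackagingTable and EPDPackagingLine are hypothetical staging tables in the EPD module:

    // Hedged sketch: relate the "header" record to its child records with a shared GUID.
    EPDPackagingTable packagingTable;
    EPDPackagingLine  packagingLine;
    guid              relationGuid = newGuid();    // newGuid() is a standard X++ global function

    packagingTable.clear();
    packagingTable.RelationGuid = relationGuid;    // unique key on the header record
    packagingTable.insert();

    packagingLine.clear();
    packagingLine.PackagingRefGuid = relationGuid; // child record points back to the header
    packagingLine.insert();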

Create the std D365 data using data entities through code.

At this stage I have ALL the data in D365, and I can start processing it. Here is a subsection of how I create released products by using standard data entities, where a table containing the JSON data is passed in, and I can create the products and all sub-tables related to the products.
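
A minimal sketch of the pattern, inserting into the standard released product creation entity from X++. The entity and enum names are standard, but the exact field names are from memory and may need adjusting; EPDBasisPackage and the group ids are placeholders:

    // Hedged sketch: create a released product from the imported EPD staging data.
    public void createReleasedProduct(EPDBasisPackage _epdBasis)
    {
        EcoResReleasedProductCreationV2Entity productEntity;

        productEntity.clear();
        productEntity.ItemNumber                 = _epdBasis.ItemNumber;
        productEntity.ProductNumber              = _epdBasis.ItemNumber;
        productEntity.ProductName                = _epdBasis.ProductName;
        productEntity.SearchName                 = _epdBasis.ProductName;
        productEntity.ProductType                = EcoResProductType::Item;
        productEntity.ProductSubtype             = EcoResProductSubtype::Product;
        productEntity.ItemModelGroupId           = 'FIFO';      // placeholder setup values
        productEntity.ItemGroupId                = 'Grocery';
        productEntity.StorageDimensionGroupName  = 'SiteWH';
        productEntity.TrackingDimensionGroupName = 'None';
        productEntity.insert();   // the entity creates the product and releases it to the company
    }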

This approach has resulted in a solution where it is easy for the end user to fetch data from external systems and import it into D365. Here is a form showing parts of the “staging” information before it is moved into the D365 standard tables. (This form is in Norwegian and shows a milk product.)

I would like to thank the community for all the inspiring information found out there, especially Martin Dráb (@goshoom), who has been very active in promoting “Paste JSON As Classes” in Visual Studio.


D365 : Automatic license disablement and login reminder

When assigning licenses to Dynamics 365 users, it would be beneficial if the system disabled and removed the license from a user who has not used the system for X days. At X minus 5 days, the system should send out a message to the user like this:

“This is a login REMINDER for Dynamics 365. Kurt Hatlevik has not logged in for at least 25 days. Your last login was 2/20/2020 12:10:00 AM. Login to Dynamics 365 is required at least once within a 30-day window or your account may be deactivated without notice. Please login within the next few days to ensure access is maintained.

Reactivation will require user administrator approval and will be dependent upon license availability.”

This would make the system more secure, and it will also free up licenses for users that are not using the system.

If you also think this could be beneficial, please vote on this idea here: https://experience.dynamics.com/ideas/idea/?ideaid=c12972cf-6a6c-ea11-b698-0003ff68dcfc#

D365 and the supply structures in the grocery retail industry

Today I will write a bit about the supply chain structure we see in the retail grocery industry, the challenges Dynamics 365 may face, and how to address them. The grocery industry has for many years seen that industry collaboration brings benefits and synergies throughout the value chain. We see industry collaborations that offer a range of services to their owners, customers and partners. In the country where I’m from, the main collaboration initiative is TradeSolution, which is owned by the main grocery chains in Norway. TradeSolution operates and maintains central registers, databases, and various IT, reporting and analysis services in Norway, but we see much of the same pattern in other countries and other industries as well.

One essential element is to have a unification of how to identify products and how the products are packed, ordered and shipped. In Norway we have the term EPD (Electronic Product Database), which makes it easy for the entire Norwegian grocery market to purchase and sell products. Much of the information shown in this blog post originates from TradeSolution’s public pages here.

What is EPD?

In Dynamics 365, one of the most essential SCM elements is products and released products, and the associated master data tables. In the grocery industry it is actually the packaging that is the center of it all; the products are actually properties of a packing structure. It would be an oversimplification to say that EPD is products. EPD describes not only the products, but also the packaging of the products. The EPD standard describes the products in up to 4 levels: basis, inner box, outer box and pallet (with SSCC). Each level is identified with a GTIN. See also my old blog post about SSCC.

So far so good. We can model this in Dynamics 365 by defining a product as a “basis” unit and using the inner box, outer box and pallet as unit conversions. In D365 we also have the possibility to create barcodes for each unit of measurement (UOM). It would also be tempting to assume that the EPD number is an external item description.

Unfortunately, the grocery industry is a bit more complex. Let’s take a quick look at the EPD numbers for Coca-Cola. There are actually 7 packing structures/EPD numbers, and these are shown to the right (7 digits). All of them represent different packaging of the same basis unit, and they can have different properties and attributes.

What we also see is that some boxes are marked with an “F”, which means this is a consumer unit. So, to talk in D365 language, it can be sold to consumers. Some are also marked with a “B”, which means that this is the unit that the EPD number is purchased in. If we take a detailed look at EPD 4507224, we see that it defines which units you can sell and which units you can purchase. For a single EPD number there is only one level you can choose to purchase. Here are 2 examples that describe the complexity. The first example is an EPD where the grocer can sell in the basis unit and in the inner box unit (EPD 4507224).

The next example is one where the grocer can also sell in the basis unit and in another inner box unit type (EPD 2142941).

As you can see here, the conversion between inner boxes to pallet results in different quantities.

To add further complexity, we can add the definition of a mix to the element. The ordering happens at the inner box level, but the box actually contains separate products that are sold through the stores.

One last element is the concept of unmarked variants, like this package of yogurts.

Summary EPD

  • A product is identified by an EPD number (EPDnr)
  • A unit is identified by a GTIN (Global Trade Item Number)
  • A unit is called «pakning» in EPD
  • A product can have up to 4 levels of units (hierarchy)
  • A product can be a mix of multiple «basis» or «mellom/innerbox» units
  • A “basis” unit can be shared by many products
  • The first level of the units is called «basis» in EPD (often referred to as a customer unit or base unit)
  • The top level of the units is called «topp» in EPD (often referred to as a load carrier unit)
  • The levels between «basis» and «topp» (if any) are called « mellom/innerbox/outerbox » units
  • A basis unit can consist of units without identification called unmarked variants («umerkede varianter»)
  • Within an EPD structure, only one of the packings is used for ordering.
  • Multiple packings can be used for sale.

Some key issues we have faced with Dynamics 365, regarding how the industry models products, are the following:

  1. Cost: As seen, a product can be sold in many different UoMs, and the industry can have different purchase prices depending on which EPD number you choose to order, meaning that a 4-pcs pack has a different cost than a 24-pcs pack. As the product can be purchased in multiple UoMs with different prices, it is difficult to model the costing correctly, because the inventory transactions are recorded at the lowest level, meaning basis. This costing problem is the reason why I suggest FIFO in retail grocery implementations.
  2. On-hand: Keeping track of how many basis units, or other consumer units, are on hand is difficult, because you do not always know when the consumer is breaking up a Coca-Cola inner box. Where should the cost come from when there are multiple purchasing units, as shown in the figure? This makes it difficult in Dynamics 365 to model the revenue per pcs sold 100% correctly.
  3. Unit conversion: As shown in the example, the same unit (like pallet) can contain a different number of basis products. This means that it is insufficient to unify the UoM per product; the UoM conversion is EPD dependent. Clear relationships between the UoMs must also be modelled. A product may have multiple definitions of an inner box, outer box and pallet.
  4. External item descriptions: The Dynamics 365 external item description cannot be used, because it only supports one external item description per vendor, and the UoM is not taken into consideration.
  5. Attributes: In the grocery industry, there may be different attributes per EPD number, and also different attributes per UoM.

How to model this in Dynamics 365?

To solve the distribution requirements we see in the grocery industry, it is necessary to do some front-end remodeling of how products are represented. The grocery industry is focused on packaging, while Dynamics 365 is product oriented. The key here is that EPD is object oriented: a product can be represented in several packaging structures.

The entities we have at our disposal in Dynamics 365 are the following:

  1. Products and released products
  2. Unit of measurement and conversion
  3. Barcodes
  4. External item descriptions
  5. BOM’s

But Dynamics 365 is what it is, and any change to the architecture of how products and transactions are handled is not on the near-term roadmap. We must try to model this structure in a way such that the EPD standard and the Dynamics 365 standard work jointly together.

First, let’s try to model the EPD (only a subset) from a grocery supply perspective (not D365!). An EPD can consist of multiple packaging structures, and a package may contain packages. At the bottom of the packing structure there is a reference to a basic package that describes the product.


When importing EPD-based products I see the following as a solution:

  1. EPD will be a separate entity/table, modelled as the grocery industry has it (new tables in D365 that feed the std D365 tables).
  2. D365 products will be defined as the “Basic Package”.
  3. The EPD package structure populates the barcode table and the product-specific unit of measurement conversion table. Because there are several packagings, the traditional naming of the units of measurement cannot be used; the unit of measurement conversion is actually dependent on the EPD number. In essence, this means having units of measurement named:

    PCS – Basic unit for the lowest basis product
    IB-4507224 – Unit for the inner box
    OB-4507224 – Unit for the outer box
    LC-4507224 – Unit for the load carrier

    With this we can create the unit of measurement conversions between the different types; a sketch of how these records could be created from code follows below.
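
Below is a minimal X++ sketch of how such an EPD-specific unit and a product-specific conversion could be created. The table and method names are standard, but the exact field names, the item number and the quantities are from memory/placeholders and may need adjusting:

    // Hedged sketch: create the inner box unit IB-4507224 and a product-specific conversion to PCS.
    UnitOfMeasure           pcsUnit = UnitOfMeasure::findBySymbol('PCS');
    UnitOfMeasure           boxUnit = UnitOfMeasure::findBySymbol('IB-4507224');
    UnitOfMeasureConversion conversion;
    EcoResProduct           product = EcoResProduct::findByDisplayProductNumber('100212'); // placeholder item

    if (!boxUnit)
    {
        // Create the EPD-specific inner box unit if it does not already exist
        boxUnit.clear();
        boxUnit.Symbol             = 'IB-4507224';
        boxUnit.UnitOfMeasureClass = UnitOfMeasureClass::Quantity;
        boxUnit.insert();
    }

    // Product-specific conversion: 1 inner box = 4 pcs for this EPD structure
    conversion.clear();
    conversion.FromUnitOfMeasure = boxUnit.RecId;
    conversion.ToUnitOfMeasure   = pcsUnit.RecId;
    conversion.Product           = product.RecId;
    conversion.Numerator         = 4;
    conversion.Denominator       = 1;
    conversion.insert();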

Let’s say we have the following simple product:

This would be modelled in D365 with a released product:

Here I would have to define 4 units of measurement:

I would then have to define the following unit conversions to describe the relationships between the different EPD packing structures.

The more EPD packing structures are present, the more unit conversions need to be defined (in the Coca-Cola example there will be 6 more conversions).

We also need to store GTIN per packing unit per EPD:
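
A minimal sketch of how these GTINs could be populated in the standard barcode/GTIN table from the imported EPD data; field names are from memory, and the item number, unit and GTIN values are placeholders:

    // Hedged sketch: one GTIN per packing unit, stored in the standard InventItemGTIN table.
    InventItemGTIN itemGTIN;

    itemGTIN.clear();
    itemGTIN.ItemId      = '100212';
    itemGTIN.InventDimId = InventDim::inventDimIdBlank();
    itemGTIN.UnitId      = 'IB-4507224';              // the EPD-specific inner box unit
    itemGTIN.GTIN        = '07040512345678';          // GTIN from the EPD packing structure
    itemGTIN.Description = 'Inner box, EPD 4507224';
    itemGTIN.insert();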

We also have the Physical dimensions menu item, which now lets us describe the physical dimensions of the product per EPD unit.


In Dynamics 365 we can only select one suggested purchasing unit. So if you have multiple EPDs associated with a product, you will have to choose one, and this is the unit that is suggested.

The purchase order would then look like this, where the unit describes the EPD number.

Keeping track of all unit conversions, GTINs/barcodes etc. would be an impossible manual job. Since EPD is an industry standard, all of this data is imported through web services.

TradeSolution has web services that offer the possibility to send EPD structures to D365. This way, all packing structures of products can be automatically imported, distributed into std D365 and adjusted when needed.

The suggestion is not 100% complete, but it would make sure that grocery retailers can procure and sell the products, while also having the concept of packing structures in place.

Let’s conquer the grocery industry also


D365 – What has changed (pmfTablehasChanged)

This short post is for you hardcore X++ developers that create magic every day. D365 has the following method, which allows you to validate whether any fields on a record have been changed. If it returns true, then something has changed; if false, then nothing has changed. There are scenarios where you would like to know if there have been any changes to the record before you update/write to the database, to save some round trips to the DB.

Then this is nice, and 100% std
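
Since the original screenshot is not reproduced here, the snippet below is a minimal usage sketch under the assumption that the helper takes the record buffer as its argument and compares it with the buffer’s original values:

    // Hedged sketch: skip the database round trip when nothing on the buffer has changed.
    ttsbegin;

    SalesTable salesTable = SalesTable::find('000001', true);   // selected for update
    salesTable.SalesName = 'New sales name';

    if (pmfTablehasChanged(salesTable))
    {
        // something actually changed, so write it back
        salesTable.update();
    }

    ttscommit;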

Happy coding friends.

Batch Jobs; Take control of the executions

Dynamics 365 can be automated quite a lot with the use of batch jobs. With batch jobs, your Dynamics 365 solution becomes “alive”, and we can set up the system to automate many manual processes. Let’s say you have the following “vanilla” process and want to automate as many steps as possible.



This document covers the batch jobs that need to be set up for this process to be as automated as possible. I wanted to put a structured system on all the batch jobs that are typically used in a production system. But this also generates a lot of data that you don’t normally need. It is therefore common to create both functional batch jobs that process and execute functionality, and cleanup jobs that remove irrelevant data.

Batch job Naming conventions

To make it simpler to understand the batch jobs, a simple naming structure has been created. The first character is just “A”, to make sure that the batch jobs sort in the best possible way and can be sorted by name. Next is a 3-digit number, and last there is a description that explains the batch job.

ID – Description

A001-A099 – System administration batch jobs
A100-A199 – Data management batch jobs
A200-A299 – General ledger batch jobs
A300-A399 – Procurement and sourcing batch jobs
A400-A499 – Sales and marketing batch jobs
A500-A599 – Retail batch jobs
A600-A699 – Inventory management batch jobs
A700-A799 – Warehouse management batch jobs

Each of these ranges is then set up as a batch group, and you can better control which AOS servers execute which type of batch jobs:


In this blog post more than 87 batch jobs have been specified, which keep the Dynamics 365 system updated and as automated as possible.

Job description
A001 Notification clean-up
A002 Batch job history clean-up
A003 Batch job history clean-up (custom).
A004 Daily Diagnostics rule validation
A005 Weekly Diagnostics rule validation
A006 Monthly Diagnostics rule validation
A007 Named user license count reports processing
A008 Databaselog cleanup
A009 Delete the inactivated addresses
A010 Scan for orphaned document references.
A011 Report data clean up
A012 Cryptography crawler system job that needs to regularly run at off hours.
A014 Updates system notification states.
A015 Deletes non-active and orphaned system notifications.
A016 Database compression system job that needs to regularly run at off hours.
A017 Database index rebuild system job that needs to regularly run at off hours
A018 Deletes expired email history.
A019 Process automation polling system job
A020 Scan for document files that have been scheduled for physical deletion.
A021 System job to clean up expired batch heartbeat records.
A022 System job to seed batch group associations to batch jobs.
A023 System job to clean up unrecovered user session states.
A024 Change based alerts
A025 Due date alerts
A026 Email distributor batch
A027 Email attachment distributor
A103 Entity Store Deploy measurement
A103 Refresh data entity
A200 Clean up ledger journals
A201 Import currency exchange rates
A204 Update purchase and sales budget
A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.
A206 Source document line processing
A207 Source document line processing queue cleanup
A208 Ledger journal monitor
A300 Purchase update history cleanup
A301 Delete request for quotation
A302 Draft consignment replenishment order journal cleanup
A303 Run Forecast planning
A304 Run Master planning
A305 Post product receipt
A403 Sales update history cleanup
A405 Order packing slip
A406 Order invoice
A407 Calculate sales totals
A500 All retail distribution jobs (9999)
A501 Upload all channel transactions (P-0001)
A502 Process Assortment
A503 Update listing status
A504 Product availability
A505 Generate related products based on customer transactions
A506 Process delivery modes
A507 Synchronize orders job
A508 Update search Product data
A509 Update search Customer data
A510 DOM batch job
A511 DOM fulfillment data deletion job
A512 Default channel database batch job
A513 Recommendation batch job
A514 Retail scheduler history data removal batch job
A515 Create customers from async mode
A516 Retail transaction consistency checker orchestrator
A517 Retail transactional statement calculate batch scheduler
A518 Retail transactional statement post batch scheduler
A519 Retail financial statement calculate batch scheduler
A520 Retail financial statement post batch scheduler
A521 Process loyalty schemes
A522 Post earned points in batches
A523 Process loyalty lines for other activities
A524 Retail time zone information job
A600 Calculation of location load
A601 Inventory journals clean-up
A602 Inventory settlements clean up
A605 On-hand entries cleanup
A606 Warehouse management on-hand entries cleanup
A607 On-hand entries aggregation by financial dimensions
A608 Cost calculation details
A609 CDS – Post integration inventory journals
A700 Work creation history purge
A701 Containerization history purge
A702 Wave batch cleanup
A703 Cycle count plan cleanup
A705 Work user session log cleanup
A706 Wave processing history log cleanup
A707 WMS Replenishment
A708 Automatic release of sales orders

I will not go in detail of all the jobs, but here I at least refer to where you can find the menu item or what class is used in the batch job tasks. Also take a look at blog post by the D365 Solution architecture team, that is a subset of the batch jobs presented in this blog post.

System administration batch jobs

These are general system batch jobs that can perform cleanups and other general executions.

ID

Name, path and recurrence

Description and recurrence

A001 A001 Notification clean-up

System administration > Periodic tasks > Notification clean up

Daily

This is used to periodically delete records from the tables EventInbox and EventInboxData. A recommendation would also be, if you don’t use the alert functionality, to disable the alert batch jobs.

A002 A002 Batch job history clean-up

System administration > Periodic tasks > Batch job history clean-up

Daily

The regular version of batch job history clean-up allows you to quickly clean all history entries older than a specified timeframe (in days). Any entry that was created prior to that cutoff will be deleted from the BatchJobHistory table, as well as from linked tables with related records (BatchHistory and BatchConstraintsHistory). This form has improved performance optimization because it doesn’t have to execute any filtering.

A003 A003 Batch job history clean-up (custom).
System administration > Periodic tasks > Batch job history clean-up (custom)

Manually

The custom batch job clean-up form should be used only when specific entries need to be deleted. This form allows you to clean up selected types of batch job history records, based on criteria such as status, job description, company, or user. Other criteria can be added using the Filter button.

A004 A004 Daily Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Daily

Incorrect configuration and setup of a module can adversely affect the availability of features, system performance, and the smooth operation of business processes. The quality of business data (for example, the correctness, completeness, and cleanliness of the data) also affects system performance, and an organization’s decision-making capabilities, productivity, and so on. The Optimization advisor workspace is a tool that lets you identify issues in module configuration and business data. Optimization advisor suggests best practices for module configuration and identifies business data that is obsolete or incorrect.
A005 A005 Weekly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Weekly

Performs a weekly validation and diagnostics.
A006 A006 Monthly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Monthly

Performs a monthly validation and diagnostics based on the rules.
A007 A007 Named user license count reports processing

Class : SysUserLicenseMiner

Daily

Batch job that counts number of users that have been using the system. The data is used in the Named user license count report. D365 creates this execution automatically, but you have to rename it to fit this structure.
A008 A008 Databaselog cleanup

System administration > Inquiries > Database > Database Log

Weekly

This job cleans up the database log and makes sure that only (let’s say) 100 days of history remain. In the query criteria I set created date time less than “d-100” to ensure that I keep 100 days of database log. This is general housekeeping and dusting, keeping the system nice and tidy.
A009 A009 Delete the inactivated addresses

Organizational administration > Periodic >Delete inactivated addresses

Weekly

Deletes addresses that have been set to inactive.
A010 A010 Scan for orphaned document references.

Class : DocuRefScanOrphansTask

Daily

Batch job that is setup automatically by the system, and scans for document references where the source record is deleted.
A011 A011 Report data clean up

Class: SrsReportRunRdpPreProcessController

Daily

Cleans up any data generated for SSRS reports.
A012 A012 Cryptography crawler system job that needs to regularly run at off hours.

Class: SysCryptographyCrawlerTask

Every 3 days

Auto-created at D365 setup. Not sure what this is yet…
A013 A013 Data cache refresh batch

System administration > Setup >

Data cache >Data cache parameters

Every 10 minutes

The data cache framework is used to cache data sets and tiles. Enabling of the data cache framework will redirect certain queries against a cache table instead of executing them against the underlying source tables.
A014 A014 Updates system notification states.

Class : SystemNotificationUpdateBatch

Every minute

Updates notifications,
A015 A015 Deletes non-active and orphaned system notifications.

Class : SystemNotificationScanDeletionsBatch

Daily

Deletes non-active and orphaned system notifications
A016 A016 Database compression system job that needs to regularly run at off hours.

Class: SysDatabaseCompressionTask

Daily

Compresses the database
A017 A017 Database index rebuild system job that needs to regularly run at off hours

Class: SysDatabaseIndexRebuildTask

Daily

Rebuilds indexes to ensure good index performance
A018 A018 Deletes expired email history

Class: SysEmailHistoryCleanupBatch

Daily

Deletes expired email history
A019 A019 Process automation polling system job

Class: ProcessAutomationPollingEngine

Every minute

Using business events, the polling use case can be re-designed to be asynchronous if it is triggered by the business event. Data will be processed only when it is available. The business logic that makes the data available triggers the business event, which can then be used to start the data processing job/logic. This can save thousands of batch executions from running empty cycles and wasting system resources.
A020 A020 Scan for document files that have been scheduled for physical deletion.

Class: DocuDeletedFileScanTask

Hourly

Scan for document files that have been scheduled for physical deletion
A021 A021 System job to clean up expired batch heartbeat records.

Class : SysCleanupBatchHeartbeatTable

Daily

Cleans up the new internal monitoring BatchHeartbeatTable table (Only after PU32), and used for priority-based batch scheduling.
A022 A022 System job to seed batch group associations to batch jobs.

Class:
SysMigrateBatchGroupsForPriorityBasedScheduling

Daily

See priority-based batch scheduling.
A023 A023 System job to clean up unrecovered user session states.

Class:
SysUnrecoveredUserSessionStateCleanup

Daily

Cleans up sessions that is unrecovered.
A024 A024 Change based alerts

System administration > Periodic tasks > Alerts > Change based alerts

Hourly (or faster)

Events that are triggered by change-based events. These events are also referred to as create/delete and update events.

See also Microsoft docs.

A025 A025 Due date alerts

System administration > Periodic tasks > Alerts > Due date alerts

Hourly (or faster)

Events that are triggered by due dates.

See also Microsoft docs.

A026 A026 Email distributor batch

System administration > Periodic tasks > Email processing > Email distributor batch

Send emails. See also Microsoft docs.
A027 A027 Email attachment distributor

Sends emails with attachments. For workflow.

Data management batch jobs

Data management executions can generate a lot of data, and to maintain performance and avoid data growth, it is relevant to clean up staging tables and job executions. Also, document any of your recurring executions to make it easy and simple to maintain an overview of your recurring data imports and exports.

ID

Name, path and recurrence

Description

A100

[Cannot be executed in batch]

Data management workspace > “Staging cleanup” tile

Manually

The data management framework makes use of staging tables when running data migration. Once data migration is completed, this data can be deleted using the “Staging cleanup” tile.

A101

A101 Job history cleanup

Data management workspace > Job history cleanup

Daily

The cleanup job will execute for the specified amount of time. If more history remains to be cleaned up after the specified amount of time has elapsed, the remaining history will be cleaned up in the next recurrence of the batch job, or it can be manually scheduled again.

A102

A102 BYOD Data management export

Data management workspace >export in batch

Hourly

If you have a data management export to BYOD, then this can be executed in batch. There are other options that can also be evaluated for this purpose. See A102 BYOD Data management export.

A103

A103 Refresh data entity

System administration > Setup > Entity Store

Monthly

To refresh the entity store (the built-in embedded Power BI). The refresh updates the aggregated measurements and is only relevant if there are updates or changes that affect these.

General ledger batch jobs

ID

Name, path and recurrence

Description

A200

A200 Clean up ledger journals

Periodic tasks > Clean up ledger journals

Weekly

It deletes general ledger, accounts receivable, and accounts payable journals that have been posted. When you delete a posted ledger journal, all information that’s related to the original transaction is removed. You should delete this information only if you’re sure that you won’t have to reverse the ledger journal transactions.

A201

A201 Import currency exchange rates

Currencies > Import currency exchange rates

Daily

Automatically imports exchange rates from the bank.

A202

A202 Purchase budget to ledger

Inventory management > Periodic tasks > Forecast updates > Purchase budget to ledger

Monthly

Posts the purchase budget to ledger

A203

A203 Sales budget to ledger

Inventory management > Periodic tasks > Forecast updates > Sales budget to ledger

Monthly

Posts sales budget to ledger

A204

A204 Update purchase and sales budget

Inventory management > Periodic tasks > Forecast updates > Update purchase and sales budget

Monthly

Updates the purchase and sales budget.

A205

A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.

General Ledger > Periodic tasks > Batch transfer for subledger journals

Daily

Batch transfer for subledger journals

A206

A206 Source document line processing

Class: SourceDocumentLineProcessingController

Every 10 minutes

Used for accounting distribution. See Microsoft docs.

A207

A208 Source document line processing queue cleanup

Class: SourceDocumentLineProcessingQueueCleanupController

Weekly

Used for cleaning up accounting distribution. See Microsoft docs.

A208

A208 Ledger journal monitor

Class: LedgerJournalTableMonitorController

Every 6 hours

Monitors if ledger journals should be blocked or opened.

Procurement and sourcing batch jobs

ID

Name, path and recurrence

Description

A300

A300 Purchase update history cleanup

Periodic tasks > Clean up > Purchase update history cleanup

Weekly

All updates of confirmations, picking lists, product receipts, and invoices generate update history transactions. This job is used to delete the old ones.

A301

A301 Delete request for quotation

Periodic tasks > Clean up > Delete requests for quotations

Manually

It is used to delete requests for quotation (RFQs) and RFQ replies. The corresponding RFQ journals are not deleted, but remain in the system.

A302

A302 Draft consignment replenishment order journal cleanup

Periodic tasks > Clean up > Draft consignment replenishment order journal cleanup

Weekly

It is used to clean up draft consignment replenishment order journals.

A303

A303 Run Forecast planning

Master planning > Forecasting > Forecast planning

Weekly

Demand forecasting is used to predict independent demand from sales orders and dependent demand at any decoupling point for customer orders. See also Microsoft docs about using additional Azure services to perform the calculation.

A304

A304 Run Master planning

Master planning > Master planning > Run > Master planning

Daily

Master planning is used to generate planned (purchase) orders based on the coverage settings. We expect this service to be enhanced with a more real-time oriented planning engine. Also check out the Microsoft docs on this (large) subject.

A305

A305 Post product receipt

Procurement and Sourcing > Purchase orders > Receiving products > Post product receipt

Automatically posts the product receipt when all lines have been registered.

Sales and marketing batch jobs

ID

Name, path and recurrence

Description

A400

A400 Delete sales orders

Periodic tasks > Clean up > Delete sales orders

Manually

It deletes selected sales orders.

A401

A401 Delete quotations

Periodic tasks > Clean up > Delete quotations

Manually

It deletes selected quotations.

A402

A402 Delete return orders

Periodic tasks > Clean up > Delete return orders

Manually

It deletes selected return orders.

A403

A403 Sales update history cleanup

Periodic tasks > Clean up > Sales update history cleanup

Weekly

It deletes old update history transactions. All updates of confirmations, picking lists, packing slips, and invoices generate update history transactions. These transactions can be viewed in the History on update form.

A404

A404 Order events cleanup

Periodic tasks > Clean up > Order events cleanup

Weekly

Cleanup job for order events. The next step is to remove the unneeded order event check-boxes from the Order event setup form.

A405

A405 Order packing slip

Sales order > Ordershipping > Post Packingslip

Hourly

Sets up automatic packing slip posting when the sales order is completely picked (if this is the process). This means that as soon as the WMS has picked the order, it gets packing slip updated.

A406

A406 Order invoice

Accounts receivable > Invoices > Batch invoicing > Invoice

Hourly

Sets up automatic invoice posting when the sales order is completely packing slip updated (if this is the process).

A407

A407 Calculate sales totals

Periodic tasks > Calculate sales totals

Recalculates the totals for the sales order. This is typically used when the sales order is part of a “Prospect to cash” scenario. See docs.

Retail batch jobs

ID

Name, path and recurrence

Description

A500

A500 All retail distribution jobs (9999)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

This batch job sends all distribution jobs to the retail channel database. This is data like products, prices, customers, stores, registers etc. The distribution job is a “delta” distribution, meaning that only new and changed records are sent. There is a lot more to be discussed on how to optimize the 9999 distribution job, and for really large retail installations some deep thinking is required. For smaller installations it should be OK to just use the setup that is automatically generated when initializing D365 Retail/Commerce.
A501

A501 Upload all channel transactions (P-0001)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

The P-0001 is sending the retail transactions back from the POS to the D365 HQ, where the retail transactions can be posted and financially updated.
A502

A502 Process Assortment

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process Assortment

Hourly

This job processes the assortment based on the assortment categories set on an item and, based on the assortment setup, puts the items in the relevant stores’ assortment. When defining an assortment, you have in D365 the possibility to connect organization hierarchies to retail category hierarchies. The process assortment job performs the granulation of this, so that D365 has a detailed list of each product that is present in each store. The assortment is set up under Retail and Commerce > Catalogs and assortments > Assortments, and more details are available on Microsoft docs.
A503

A503 Update listing status

Retail and Commerce > Retail and Commerce > Products and Inventory > Update listings

Daily

The listing status is related to publishing a retail catalog to an online store. The Microsoft documentation is not the best in this area, and the closest explanation I have is that it is related to the listing status on the catalog.
A504

A504 Product availability

Retail and Commerce > Retail and Commerce > Products and Inventory > Product availability

Daily

The batch job for product availability calculates whether a product is available in the online store. Check out this blog post for further details. Sitecore eCommerce integrations can benefit from this; in essence it populates the data needed for distribution job 1130, which maintains the following tables in the channel database.
A505

A505 Generate related products based on customer transactions

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Generate related products

Daily

This job will automatically populate related products based on sales transaction purchase history. The two relationships created are the ‘customers who bought this item also bought’ and ‘frequently bought together’ relation types. This data can then further be used in eCommerce scenarios. For deeper details, take a look at the class ‘RetailRelatedProductsJob’.
A506

A506 Process delivery modes

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process delivery modes

Daily

This job sets up delivery modes on a new store when it is added to the organization hierarchy 'retail store by department'. On the modes of delivery you can assign an organizational hierarchy, and this batch job assigns the specific modes of delivery to each store. The modes of delivery are used in omnichannel scenarios where the customer can have their products sent home etc.
A507

A507 Synchronize orders job

Retail and Commerce > Retail and Commerce IT > Synchronize orders

Hourly

If you have set up your channels to create sales orders asynchronously, this job will create the sales orders and post payments. Also take a look at the following Microsoft docs on how sales orders and payments are synchronized from an online store.
A508

A508 Update search Product data

Sales and marketing > Setup > Search> Search criteria

Daily

Create an indexed search of products, that makes it faster and easier to search for products in the call center.
A509

A509 Update search Customer data

Sales and marketing > Setup > Search> Search criteria

Daily

Create an indexed search of customers, that makes it faster and easier to search for customers in the call center.
A510

A510 DOM batch job

Workspace > Distributed Order Management > Dom processor job setup

Hourly

Runs distributed order management on retail sales orders to determine which warehouse should deliver the sales order.
A511

A511 DOM fulfillment data deletion job

Workspace > Distributed Order Management > DOM fulfillment data deletion job setup

Daily

Cleans up DOM data from calculations that are no longer valid.
A512

A512 Default channel database batch job

Class : RetailCdxChannelDbDirectAccess

Every 3 minutes

This job's main duty is to check all download and upload sessions with status "Available", and then apply the data to the respective target databases (AX or channel DB). See also this blog.
A513

A513 Recommendation batch job

Class FormRunConfigurationRecommendationBatch

Weekly

See Microsoft docs.
A514

A514 Retail scheduler history data removal batch job

Retail and Commerce > Headquarters setup > Parameters > Retail scheduler parameters

Class: RetailCdxPurgeHistory

Daily

Deletes CDX history. Typically only 30 days of CDX history is kept.
A515

A515 Create customers from async mode

Retail and Commerce > Retail and Commerce IT > Customer > Create customers from async mode

Hourly

If customers are set to be created asynchronously (parameter), this job will create the customers.
A516

A516 Retail transaction consistency checker orchestrator

Retail and Commerce > Retail and Commerce IT > POS posting > Validate store transactions

Hourly

Performs validation on the unposted POS transactions. See Microsoft docs.
A517

A517 Retail transactional statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional calculation. Creates the transactional statements. See the following blog post.
A518

A518 Retail transactional statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional posting. Creates and posts sales orders. See the following blog post.
A519

A519 Retail financial statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate financial statement in batch

Daily

Retail statement trickle feed financial statement calculation. Creates the financial statements. See the following blog post.
A520

A520 Retail financial statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post financial statement in batch

Daily

Retail statement trickle feed financial posting. Posts the shift declaration. See the following blog post.
A521

A521 Process loyalty schemes

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty schemes

Processes loyalty schemes. See Microsoft docs.
A522

A522 Post earned points in batches

Retail and Commerce > Retail and Commerce IT > Loyalty > Post earned points in batches

Loyalty points should be posted in batch. See Microsoft docs.
A523

A523 Process loyalty lines for other activities

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty lines for other activities

Other Loyalty points in batch. See Microsoft docs.
A524

A524 Retail time zone information job

Monthly

Generates time zone information up until 2054. Ensures that the time zone used in the store does not cause inconsistent dates.

Inventory management batch jobs

ID

Name, path and recurrence

Description

A600

A600 Calculation of location load

Inventory management > Periodic tasks > Clean up > Calculation of location load

Daily

The WMSLocationLoad table is used for tracking the weight and volume of items and pallets. The summation of load adjustments job can be run to reduce the number of records in the WMSLocationLoad table and improve performance.

A601

A601 Inventory journals clean-up

Inventory management > Periodic tasks > Clean up > Inventory journals cleanup

Weekly

It is used to delete posted inventory journals.

A602

A602 Inventory settlements clean up

Inventory management > Periodic tasks > Clean up > Inventory settlements cleanup

Manually/Yearly

 

It is used to group closed inventory transactions or delete canceled inventory settlements. Cleaning up closed or deleted inventory settlements can help free system resources.

Do not group or delete inventory settlements too close to the current date or fiscal year, because part of the transaction information for the settlements is lost.

Closed inventory transactions cannot be changed after they have been grouped, because the transaction information for the settlements is lost.

Canceled inventory settlements cannot be reconciled with finance transactions if canceled inventory settlements are deleted.

A603

A603 Inventory dimensions cleanup

Inventory management > Periodic tasks > Clean up > Inventory dimensions cleanup

Manually/Yearly

This is used to maintain the InventDim table. To maintain the table, delete unused inventory dimension combination records that are not referenced by any transaction or master data. The records are deleted regardless of whether the transaction is open or closed.

An inventory dimension combination record that is still referenced cannot be deleted, because once an InventDim record is deleted, related transactions cannot be reopened.

A604

A604 Dimension inconsistency cleanup

Inventory management > Periodic tasks > Clean up > Dimension inconsistency cleanup

Manually/Yearly

This is used to resolve dimension inconsistencies on inventory transactions that have been financially updated and closed. Inconsistencies might be introduced when the multisite functionality was activated during or before the upgrade process. Use this batch job only to clean up the transactions that were closed before the multisite functionality was activated. Do not use this batch job periodically.

A605

A605 On-hand entries cleanup

Inventory management > Periodic tasks > Clean up > On-hand entries cleanup

Monthly

This is used to delete closed and unused entries for on-hand inventory that is assigned to one or more tracking dimensions. Closed transactions contain the value of zero for all quantities and cost values, and are marked as closed. Deleting these transactions can improve the performance of queries for on-hand inventory. Transactions will not be deleted for on-hand inventory that is not assigned to tracking dimensions.

A606

A606 Warehouse management on-hand entries cleanup

Inventory management > Periodic tasks > Clean up > Warehouse management on-hand entries cleanup

Weekly

Deletes records in the InventSum and WHSInventReserve tables. These tables are used to store on-hand information for items enabled for warehouse management processing (WHS items). Cleaning up these records can lead to significant improvements of the on-hand calculations.

A607

A607 On-hand entries aggregation by financial dimensions

Inventory management > Periodic tasks > Clean up > On-hand entries aggregation by financial dimensions

Weekly

Tool to aggregate InventSum rows with zero quantities.

This basically extends the previously mentioned cleanup tool by also cleaning up records which have the field Closed set to True!

The reason this is needed is that in certain scenarios you might have no more quantities in InventSum for a certain combination of inventory dimensions, but there is still a value. In some cases these values will disappear, but the current design does allow values to remain from time to time.

If you for example use batch numbers, each batch number (and the combined site, warehouse, etc.) creates a new record in InventSum. When the batch number is sold, you will see the quantity fields set to 0. In most cases the financial/physical value fields are also set to 0, but in standard cost revaluation or other scenarios, the value field may still show some amount. This is valid, and is the way Dynamics 365 for Finance and Operations handles costs on the financial inventory level, e.g. site level.

Inventory value is determined in Dynamics 365 for Finance and Operations by records in InventSum, and in some cases inventory transactions (InventTrans) when reporting inventory values in the past. In the above scenario, this means that when you run inventory value reports, Dynamics 365 for Finance and Operations initially looks at InventSum, aggregates all records to site level, and reports the value for the item per site. The data from the individual records on batch number level are never used. The tool therefore goes through all InventSum records and finds the ones where there is no more quantity (the No open quantities field is True). There is no reason to keep these records, so Dynamics 365 for Finance and Operations finds the record in InventSum for the same item which has the same site, copies the values from the batch number level to the site level, and deletes the record. When you now run inventory value reports, Dynamics 365 for Finance and Operations still finds the same correct values. This reduces the number of InventSum records significantly in some cases, and can have a positive impact on the performance of any function which queries this table.

A608

A608 Cost calculation details

Inventory management > Periodic tasks > Clean up > Cost calculation details

Monthly

Used to clean up cost calculation details.

A609

A609 CDS – Post integration inventory journals

Inventory management > Periodic tasks > CDS integration > Post integration inventory journals

Fetches journals from CDS (Common Data Service) and posts them. This applies only if CDS is in use.

Warehouse management batch jobs

ID

Name, path and recurrence

Description

A700

A700 Work creation history purge

Warehouse management > Periodic tasks > Clean up > Work creation history purge

Weekly

This is used to delete work creation history records from the WHSWorkCreateHistory table, based on the number of days to keep the history, as specified in the dialog.

A701

A701 Containerization history purge

Warehouse management > Periodic tasks > Clean up > Containerization history purge

Weekly

This is used to delete containerization history from the WHSContainerizationHistory table, based on the number of days to keep the history, as specified in the dialog.

 

A702

A702 Wave batch cleanup

Warehouse management > Periodic tasks > Clean up > Wave batch cleanup

Weekly

This is used to clean up batch job history records related to Wave processing batch group.

A703

A703 Cycle count plan cleanup

Warehouse management > Periodic tasks > Clean up > Cycle count plan cleanup

Weekly

This is used to clean up batch job history records related to Cycle count plan configurations.

A704

A704 Mobile device activity log cleanup

Warehouse management > Periodic tasks > Clean up > Mobile device activity log cleanup

Weekly

This is used to delete mobile device activity log records from the WHSMobileDeviceActivityLog table, based on the number of days to keep the history, as specified in the dialog.

A705

A705 Work user session log cleanup

Warehouse management > Periodic tasks > Clean up > Work user session log cleanup

Weekly

This is used to delete work user session records from the WHSWorkUserSessionLog table, based on the number of hours to keep, as specified in the dialog.

A706

A706 Wave processing history log cleanup

Warehouse management > Periodic tasks > Clean up > Wave processing history log cleanup

Weekly

This is used to clean up history records related to Wave processing batch group.

A707

A707 WMS Replenishment

Warehouse management > Replenishment > Replenishments

Calculates location replenishment for the warehouse locations.

A708

A708 Automatic release of sales orders

Warehouse management > Automatic release of sales orders

Releases sales orders to the warehouse so that the picking can start.

Monitoring Distribution jobs

The Retail IT workspace is specifically created to monitor all distribution jobs sending data to RCSU and POS. If there are failed sessions, they will be seen here. The current download (to RCSU) and upload (from RCSU) sessions are also shown here.


Monitoring Batch jobs

The best place to monitor all current batch jobs is through the system administration workspace. Here all failed, running, waiting and withheld batch jobs are shown. This workspace also has additional system administration features.



D365 – To exist or not, that is the question!(part 2)

Some years ago I created a free community solution for “Not-Exists Join“. Not exists join means that we can filter and search on data that does not have any relational records. This answers questions like;

– Show me all customers that have no sales orders the last X days

– Show me all items with no inventory transaction. Show me items with no movement last 30 days.

– Show me all items that have no price.

Countless community friends have used this for AX 2012, but since Dynamics 365 was released this solution could no longer be applied. To do it properly, I have decided to push a request through CDE (Community Driven Engineering), hopefully making it available to all D365 customers as part of the standard solution. All code is ready and checked in, and I'm just waiting for the Microsoft review.

The way the CDE works is that partners and customers that have code or bug fixes can work together with Microsoft on implementing changes. It is Microsoft that has the final decision, and they will also make it part of their IP. But for all you community friends, here is a sneak peek of what I'm working on together with Microsoft.

The advanced filter and query in Dynamics 365 is a very powerful tool. Here you can search and filter on most fields and add join relations to the query.

But there is one area that the advanced query screen is not handling. That is “not-exist-join”. Let’s say I want a list of all the customers that don’t have sales orders. The standard D365 will not help here. The purpose of this document is to show how to implement “not-exists-join” into standard.
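For context, X++ developers have always been able to express this programmatically through the query framework. The sketch below is only a minimal illustration of the concept (not the CDE change itself), showing a not exists join between CustTable and SalesTable:

// Minimal X++ sketch: list customers that have no sales orders, using JoinMode::NoExistsJoin.
// The CDE change is about exposing this same capability in the advanced filter/query dialog.
public static void findCustomersWithoutSalesOrders()
{
    Query                   query = new Query();
    QueryBuildDataSource    qbdsCust;
    QueryBuildDataSource    qbdsSales;
    QueryRun                queryRun;
    CustTable               custTable;

    qbdsCust  = query.addDataSource(tableNum(CustTable));
    qbdsSales = qbdsCust.addDataSource(tableNum(SalesTable));
    qbdsSales.relations(false);
    qbdsSales.addLink(fieldNum(CustTable, AccountNum), fieldNum(SalesTable, CustAccount));
    qbdsSales.joinMode(JoinMode::NoExistsJoin);

    queryRun = new QueryRun(query);

    while (queryRun.next())
    {
        custTable = queryRun.get(tableNum(CustTable));
        info(strFmt("Customer %1 has no sales orders", custTable.AccountNum));
    }
}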

Functional Solution

In the joins form, a new section of relations has been added that represents the tables that can be added with a "not exists join":

In this sample the customers with no sales orders will be in the query result/form. But the feature is generic, and all 1:n relations can also be selected as a "Not exists" relation.

When will you have this in standard? Maybe 10.0.10? It depends on Microsoft and the final approval of the code and feature, but hopefully it is not far in the future. So "cheer and share", and maybe we as the community can accelerate this much-requested feature.

D365 community ROCK’s and Happy DAX’ing!!

Microsoft Bookings and Microsoft Graph

One common piece of feedback we get when implementing Dynamics 365 is the need to handle appointments and bookings. There are many very good 3rd party solutions, but did you know that Microsoft has an easy-to-use booking system that works online and is integrated with Outlook? It's called Microsoft Bookings, and it is worth taking a look at, especially if you need to book your customers for appointments and simple services. Microsoft Bookings provides online and mobile apps that make appointment scheduling simple and efficient for small businesses and their customers. Any small business that provides services on an appointment basis, such as auto repair shops, hair salons, and law firms, can benefit from having their bookings managed so as to free up time for the more important task of growing their business. Microsoft Bookings is available to businesses that have an Office 365 Business Premium subscription.

Here is a small live demo for you my friends: https://outlook.office365.com/owa/calendar/DXCCommerce1@dxccommerce.onmicrosoft.com/bookings/

The first page an online customer arrives at is the following screen, that can be published on Facebook or any social media sites. Here I choose to order my haircut from my favorite hairdresser. (Full manual is available here)

 

When booking I will get a confirmation email, and the booking coordinator will also get an email. The booking is also available on my phone:

 

On the back-office side, Microsoft have created a simplified view of managing and setting up your bookings:

Here you manage the calendar, customers and staff.

Here is the calendar for a specific day showing all appointments and bookings for today. Drag and drop of appointments between staff and dates is of course possible.

You can also manage your staff.

And the services you offer and map them towards your staff.

 

If you are a functional person, then just stop reading here, because here comes the good part: there is a complete API interface for you to integrate towards Bookings. (See also this link.) Connecting this to Dynamics 365 or commerce apps can be done by a developer, making it possible to expose booking services to POS and call center, with tight integration to your Dynamics 365 solution.

Check out Microsoft Booking and Microsoft Bookings API in Microsoft Graph.

Here are some sample pictures on how to access the Booking system using Microsoft Graph. First, here I list all the booking sites listed in my tenant:

Pay attention to the fact that it returns an "id" that identifies my booking business for a specific store. If I now query for bookings at that ID like this:

https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments (You will not get access to this link, but you are welcome to click it )

I get the following, where the service lists all bookings posted into Microsoft Bookings. A consent must be set up through the Azure portal. And the great thing is that it actually is a two-way service; I can also post bookings in.
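If you want to experiment with this from code, here is a rough sketch (my own illustration, not an official sample) of reading the appointments endpoint from X++ through .NET interop. It assumes you have already acquired an Azure AD bearer token and granted the app the relevant Bookings permissions; the method and parameter names are just placeholders.

// Rough sketch only: read Microsoft Bookings appointments from Microsoft Graph via .NET interop.
// Assumes _bearerToken is a valid Azure AD access token with Bookings permissions consented.
public static str readBookingAppointments(str _bearerToken, str _bookingBusinessId)
{
    System.Net.Http.HttpClient          client = new System.Net.Http.HttpClient();
    System.Net.Http.HttpResponseMessage response;
    str url = strFmt('https://graph.microsoft.com/beta/bookingBusinesses/%1/appointments', _bookingBusinessId);

    // Attach the OAuth bearer token to every request from this client
    client.DefaultRequestHeaders.Add('Authorization', 'Bearer ' + _bearerToken);

    response = client.GetAsync(url).Result;

    // The response body is JSON containing an array of appointments that can be mapped to D365 entities
    return response.Content.ReadAsStringAsync().Result;
}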

BOOM! Take that! We now have a complete interface towards all services that Microsoft Graph can expose and can let us integrate on a completely new level.

If I wanted to, I could now connect my bookings to any planning engine that would add more value to the service. Like picking me up in a golden limo-cab when I book my hairdressing hour. The possibilities are endless. Also remember that this is not restricted to bookings, but covers all services that Azure may provide. We in the Dynamics partner community have just scratched the surface of the possibilities that Microsoft now provides.

Happy DAX’ing friends.

Dynamics 365 Branding and Commerce (Preview) Firsthand experience

PS! Remember to read the last lines in this blogpost

As I hope you have seen in your never-ending Twitter/news feed, Microsoft is again adding lots of new apps and features to Dynamics 365. Microsoft is delivering on the communicated vision of Dynamics 365. We now have apps with a holistic approach to business processes. To solve business requirements, users will be using a combination of apps that work natively together. We see how the entire solution is being connected, and further split up into specific areas. In the old days we had large ERP suites and sold functional modules; we are now implementing connected apps that enable business processes per user. If anyone wonders what the "new" hashtag is, it is easy: "#MSDyn365", and get used to it. We no longer need to put things into additional silos to explain the legacy, and to succeed we must embrace and deliver the right combination of apps that solves the business requirements per business process.

One of the most exciting news items in the current wave 2 release is the delivery of Dynamics 365 Commerce (preview). I have been privileged to validate and try this solution over the last days. My current experience is: this rocks! Microsoft can finally deliver a complete suite that gives a true omnichannel experience. One interesting finding is that Microsoft will rebrand their "Dynamics 365 for Retail" offering to "Dynamics 365 Commerce". Why? Because what is now being offered extends the boundaries of traditional retail solutions. As seen in the following figure, you get a completely integrated end-to-end system. And this is not just for retailers; all companies that want to digitalize their processes and offer a true omnichannel experience can benefit from this.

1 : Picture from Microsoft presentations

To try out this new solution, you can request a preview. You ask for a preview here. When/if accepted, you will receive an email from Microsoft containing instructions on how to deploy this preview. This guide is also available here, and it is important that the guide is followed very carefully. To complete the guide, you need some assistance from your Azure AD tenant administrator. Also, the preview is currently only deployable to US Azure datacenters, and this puts some latency into the commerce experience.

One interesting thing with Commerce is that even though this is a tier-1 environment, you get the possibility to deploy RCSU and the e-Commerce server. The data set is basically standard Dynamics 365 for Retail, with the Retail essentials configuration key enabled. So we can showcase that Dynamics 365 Commerce can also be delivered as a standalone app, or be extended with the Finance and Supply Chain Management apps.

The preview commerce solution is what you expect an e-commerce solution to be:

The back-end editor is easy to use, and it is easy to configure your site.

To get a full understanding of the solution also head over to Microsoft DOC’s to learn more : https://docs.microsoft.com/en-us/dynamics365/commerce/

But I'll do something better for you; you can check out the preview solution yourself right now: https://d365commerceus2ecom-eval.commerce.dynamics.com/DXCCommerce (I expect that the site will be available for only a few days, so hurry).

If you want to buy something, use card number 4111-1111-1111-1111, expiration 10/20, CVV 737. Also remember that this is a US-based Azure datacenter and NOT a production-grade scaled system.

Happy DAXing and DXCuLater!

D365 Retail – Buzz Alert !

THIS IS COOL !
Microsoft is launching several new product lines for retailers.

Dynamics 365 Commerce

Empower your business to create exceptional, insightful shopping experiences for every customer with Dynamics 365 Commerce—built on our proven Dynamics 365 Retail solution.

https://dynamics.microsoft.com/en-us/commerce/overview/

Microsoft Connected Store

Empower retailers with real-time observational data to improve in-store performance. From customer movement to the status of products and store devices, Dynamics 365 Connected Store will provide a better understanding of your retail space. (Check out the video)

https://dynamics.microsoft.com/en-us/ai/connected-store/

 

 

D365F&O, Lots of new high value content on DOC’s

The Microsoft Dynamics team has been quite busy after the vacation, producing a lot of valuable content for Dynamics 365. I would like to highlight some of the latest additions that are worth checking out and sharing in the Dynamics 365 ecosystem. Just this year alone, 714 articles have been published, and in just the last 2 months close to 300 articles have been made available. With this amount of information, I do get questions about whether there are some hidden gems on Docs. Here some of them are:

1. Learning catalog

There are now more tailored learning paths for customers and partners, with references to free, self-paced online learning paths, TechTalks, and formal instructor-led training. Here you will find articles, videos, and all you need to start learning Dynamics 365.

2. Test recorder and Regression suite automation tool for Retail Cloud POS

Now we can start creating regression tests for the Retail POS. Cool stuff, and in my mind this is where we actually see the true value of regression testing. Retail is detail, and this delivers quality.

3. Master planning setup wizard

Setting up master planning involves making many decisions, and here you can read how this is done in 10.0.5.

4. One Version service updates FAQ

This page answers a lot of questions on the One Version strategy, and what this means for you. At many customers I see that extensive, time-consuming and costly testing processes are being manually executed each time Microsoft releases a new monthly update. Why? I do not see the need to perform full testing of all modules on a monthly basis. Yes, it is a fact that nobody releases flawless code (not even Microsoft), but if you follow the procedures and guidelines from Microsoft, the monthly updates should be safe to deploy. There are several release rings and programs in place ensuring that quality is in place at GA (General Availability). Please align to the release cadence, and focus on your essential core processes. If you find painful bugs, report them ASAP.

5. Environment planning

I have seen several projects where the focus is to save costs on implementation environments. This page explains a lot about Microsoft's take on this. My simple advice is to use Tier-1/one-box environments for development on a cloud-hosted CSP subscription, and the rest of the environments as Tier-2 or higher (my recommendation is to have 2 additional Tier-2 environments for larger projects). The benefit of using self-service processes is priceless. Also keep in mind that Azure costs are very cheap compared to consultancy hours spent trying to maintain and manually transfer databases between environments. Also take a look at Denis Trunin's great blog post on development VM performance.

6. Business events overview

This is the future and start adopting this feature into your business processes. This is also a key enabler for working closer with the Dynamics Power platform.

7. Regulatory updates

Here you find localized information for your country, and how to comply to specific local requirements. This is being updated very often.

8. Unified product experience

Do you want to keep the products from D365F&O synced with D365Sales ? This article explains how to achieve a near real time bi-directional integration with CDS. Great stuff also explaining dual write capabilities.

9. Adyen payment processing with omnichannel experience

Payment connector is far more versatile than just for retail. Also check out the FAQ.

10. Asset management

Great stuff on the horizon. Keep track of your stuff

11. Franchising

No longer in the official 2019 Wave 2 release. So, we must keep waiting for this in the future.

 

Take care, and

DXC you later

 

 

Analyzing Cloud POS performance in Dynamics 365 for Retail

It is a constant requirement that the systems retailers interact with directly should be Bigger, Better, Faster, Stronger (BBFS). In this blog post, I will dig into how POS performance can be analyzed to better understand the transactional performance of the Dynamics 365 POS. What I'm especially interested in is how perceived performance compares to actual performance. What we think is good performance is relative to the observer. The average human reaction time to a visual stimulus is 250 ms, but newer studies show that we can identify visual stimuli down to 13 ms. A typical screen has a refresh interval of about 17 ms. As time is relative and the expected performance is close to real-time, this can sometimes lead to performance expectations that are actually irrelevant to what needs to be achieved. We as humans cannot go below a 250 ms visual response time, so this is important to keep in mind.

As you can see in the following video, 4 items are scanned and then a quick cash payment is done. The total time taken to complete this example transaction in CPOS is approx. 5 s.

But as you can see on the screen, there is a lot happening when the user interface is being redrawn. I wanted to go deeper to understand exactly what happens when scanning, more specifically what happens when adding the sales lines in the POS.

As the CPOS is a 100% web-based application, we can use Google Chrome to take a deeper look into exactly what is happening. By pressing F12 (or Ctrl+Shift+I), you bring up the developer tools.

Then start the recording (CTRL-E), add a line in POS, and stop the recording. Then you will see:

1. CPU load, Activity bars, Network calls
2. The actual animation on the POS display each millisecond
3. Exactly how long calls to the Retail Server is taking.
4. The entire REST-call stack being executed on the CPOS client.

Here you see an example where I added one line to the POS basket, and this resulted in 2 calls to the retail server.

If we look at one of the calls happening:

ScanResults() (*.dynamics.com/Commerce/ScanResults('07100')?$expand=Customer&api-version=7.3) – This scans the product/item barcode and sends it to the retail server. In the Chrome developer tools, we can analyze exactly what takes place on this call. Here we see that the total time was 559.54 ms, but the actual waiting time for the RCSU to respond is 263.69 ms (Waiting TTFB). The browser is waiting for the first byte of a response; TTFB stands for Time-To-First-Byte. This timing includes 1 round trip of latency and the time the server took to prepare the response. I have measured the network latency to this Tier-2 environment with RCSU to be 40 ms.

If I scan the item again, we see that caching (DNS etc.) kicks in and the TTFB lowers to 132.80 ms.
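A rough way to break these numbers down (my own interpretation of the trace, not an official breakdown): of the 559.54 ms total, about 263.69 ms is Waiting (TTFB); subtracting the measured 40 ms of network latency leaves roughly 224 ms of actual server-side processing in the RCSU, and the remaining roughly 296 ms is spent on content download, queuing and client-side rendering in the browser.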


As you can see, you can really go deeeep and analyze everything that is happening, from client execution to server execution, without any debugging tools, down to the millisecond, and better understand the performance. The profile created can be exported and imported for deeper analysis. We can see that there are many factors that influence performance, from network delays to form refresh. Microsoft could have the pleasure of shaving milliseconds off the animations, server calls and JavaScript, but this is an ongoing investment from an R&D perspective.

My honest opinion is that the cloud-based Dynamics 365 for Retail POS is performing well. Network elements and the speed of light are fundamental restrictions. The use of animations also seems to affect how performance is perceived, but it does not affect the general performance and usability. Legacy systems that are on-prem have the benefit of not having latency, but the cloud solution brings so many other positive elements. If you choose MPOS instead, these tools are not available and you can use Fiddler for analysis. But a small tip is to have a CPOS client available when performance testing, as what you find there will also affect MPOS.

Bigger, Better, Faster, Stronger !

Meetings: Every minute counts, and snooze to 1 minute before meeting starts

As a consultant I'm used to having a lot of "back-to-back" meetings, and when the next meeting is near, I typically get an Outlook reminder 15 minutes prior to the meeting.

Then using the "Snooze" button is good. If I snooze until 5 minutes before, I am too early. 0 minutes before and I am too late. Did you know that in the drop-down, the minimum selection is 5 minutes? That is too much for me. I would like to have a new reminder when it is 1 minute before the meeting starts. But did you know that you can type into the field? You can actually write "1 minute", and this will remind you when it is 1 minute to the meeting start.

A smarter, more advanced way is to set the default reminder to 16 minutes prior to the meeting.

And then when the reminder pops up, you can select "Snooze" and choose to be reminded in 15 minutes. That is exactly 1 minute before the meeting starts.

Now I have just "earned" 4 more minutes where I can create D365 customer value before the meeting starts.

D365F&O – Address performance tips

Sometimes the smallest thing can make a huge difference. At a customer we experienced a huge load (DTU +70% average), and LCS showed that a single SQL query was the reason for the load. The data composition here was that there were close to half a million customers in the customer table, and most of them had addresses, email and phone numbers assigned to them, except the customers used for retail statement processing.

In LCS environment monitoring you can see this as spikes in the overview.

 

The query you typically see looks like this:

(@P1 int,@P2 nvarchar(256),@P3 int,@P4 bigint)SELECT TOP 1 T1.COUNTRYREGIONCODE,T1.DESCRIPTION,T1.ISINSTANTMESSAGE,T1.ISMOBILEPHONE,T1.ISPRIMARY,T1.ISPRIVATE,T1.LOCATION,T1.LOCATOR,T1.LOCATOREXTENSION,T1.PRIVATEFORPARTY,T1.TYPE,T1.ELECTRONICADDRESSROLES,T1.MODIFIEDBY,T1.RECVERSION,T1.PARTITION,T1.RECID FROM LOGISTICSELECTRONICADDRESS T1 WHERE ((T1.PARTITION=5637144576) AND ((T1.TYPE=@P1) AND (T1.LOCATOR<>@P2))) AND EXISTS (SELECT TOP 1 'x' FROM LOGISTICSLOCATION T2 WHERE ((T2.PARTITION=5637144576) AND (T2.RECID=T1.LOCATION)) AND EXISTS (SELECT TOP 1 'x' FROM DIRPARTYLOCATION T3 WHERE ((T3.PARTITION=5637144576) AND (((T3.LOCATION=T2.PARENTLOCATION) AND (T3.ISPOSTALADDRESS=@P3)) AND (T3.PARTY=@P4)))))

By downloading the query plan, we see that there is an index seek on the table LOGISTICSELECTRONICADDRESS.

 

This results in the indexes not getting a good "hit" on LogisticsElectronicAddress.Type.

The solution was surprisingly easy. Add Phone, Email address and URL to the customers.

 

Then the DTU usage drastically went down, and normal expected performance was achieved.

 

Conclusion: when you have many customers, remember to fill in contact information.
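If you want to find the offending customers before the load hits you, a simple check can be scripted. The sketch below is only an illustration and assumes the primary contact references on DirPartyTable (PrimaryContactPhone/PrimaryContactEmail); adapt it to how your contact information is actually modelled.

// Illustrative sketch (assumption: primary contact info is referenced from DirPartyTable):
// list customers missing a primary phone or e-mail, so the data can be filled in.
public static void listCustomersMissingContactInfo()
{
    CustTable       custTable;
    DirPartyTable   dirPartyTable;

    while select AccountNum, Party from custTable
        join dirPartyTable
            where dirPartyTable.RecId == custTable.Party
               && (dirPartyTable.PrimaryContactPhone == 0 || dirPartyTable.PrimaryContactEmail == 0)
    {
        info(strFmt("Customer %1 is missing primary contact information", custTable.AccountNum));
    }
}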

This just must be shared

D365F&O – Community Driven Engineering

I have previously blogged about the importance of reporting new ideas, issues and bugs to Microsoft, and also why the community will benefit from sharing. I see that experienced engineers have the solution available and are more than willing to give it for free to get the fixed-up code into the standard solution to benefit customers and future projects.

 

But the formalized support path does require time and energy, and remember that not all Microsoft support consultants are engineers you can discuss X++ topics with. So how can the process of contributing to the D365 community be made easier?

But did you know that Microsoft has a program for Community Driven Engineering with Dynamics 365 F&O? This covers not only bugs, but also new features. Community Driven Engineering (CDE) is a Microsoft effort to make external engineers more efficient at providing recommended bug fixes and minor features to Microsoft, as well as to make Microsoft more efficient in accepting fixes from the community. If a fix is accepted, it will be merged into the main Dynamics 365 F&O branch. I have tried the program and reported a fix for auto-report as finished; the fix was accepted, and hopefully in the near future the entire community can benefit from it.

How to start?

If you have the right skills and the willingness to share and give away your fixes (or features), you can sign up at https://aka.ms/Communitydrivenengineering. You need to be accepted into the program, and your user must be whitelisted before you get access. The CDE also has a private Yammer group that you get access to when accepted. But I must warn you: this program is meant for the most experienced and technical people we have in our community, who are deep into X++ and Azure DevOps. You must have approval from CxO level in your organization to share code with Microsoft (lawyer stuff).

Here is the overall flow for the external engineer:

  1. You create a bug or a Feature in CDE Azure DevOps
  2. The bug or Feature is reviewed by the MS team and accepted or rejected
  3. You create a branch for this work and commit in this branch
  4. When done you create a Pull Request
  5. The Pull Request is reviewed by the MS team and feedback is provided
  6. After some iterations the Pull Request will be approved and complete
  7. The MS team will take over the code and include in a future release

Here are the more technical details of how it works.

The following text is copied from the onboarding documentation of the CDE.

It takes approximately one hour to get started with CDE, the majority of which is the initial build time.

  1. Obtain a development VM from LCS with build 8.1.195.20001 (app 8.1, platform update 22) or later. The latest branch I have access to is 10.0.80.19, that basically is 10.0.2 PU 26.
  2. Make sure you have opened Visual Studio at least once on the VM to sign in and pick default settings.
  3. Install Git on the machine from https://git-scm.com/downloads . The default installation options should work fine.
  4. From an administrator command line instance, clone this repo to a location on the machine.
    pushd k:\
    mkdir git
    cd git
    git clone https://dev.azure.com/msdyncde/_git/cde

  5. Define your user name and email in Git
    git config --global user.name "John Doe"
    git config --global user.email johndoe@example.com

  6. Mount the git repo into the F&O deployment
    pushd K:\git\cde
    powershell .\Mount.ps1
  7. Open Visual Studio as administrator and rebuild the following models
    ApplicationSuite
    ApplicationWorkspaces
    FiscalBooks
    GeneralLedger
    Project
    Retail
    Tax

At this point you can start development(in the SYS layer actually)

How to submit a change?

Changes submitted by the community are committed to the same REL branch matching the version on the dev VM. Once the pull request (PR) is completed, that signals that Microsoft has officially accepted the change and it will show up in a future official release, usually the next monthly release (depending on what day of the month the release closes). The change will only enter the master branch of msdyncde through a future official release. Syncing to the tip of a REL branch will pull in other community changes submitted from that version.

  1. Create a Bug or Feature depending on whether the change is related to incorrect behavior of existing code, or new behavior.
    https://dev.azure.com/msdyncde/cde/_workitems
    New work item > bug
    Fill in the title, assign it to yourself, and set the Area to your best guess as to where the behavior belongs (will help us review appropriately)
    In repro steps and system info, provide information on why this change is necessary
  2. In Git, create a topic branch to work on. Branches are usually named by username/bug number.
    git checkout -b johndoe/482
    git push --set-upstream origin johndoe/482

  3. In Visual Studio make changes to Application Suite SYS code as normal. Changes are actually being made directly in the Git folder.
  4. Push your changes to VSTS.
    git add -A
    git commit -m "Message explaining what is being changed"
    git push

  5. Send a pull request from VSTS
    https://dev.azure.com/msdyncde/_git/cde/pullrequests?_a=mine
    New pull request
    Source branch = johndoe/482, Destination branch = Rel_8.0.30.8022 (or whatever version you have)
    Fill in the title and description, link the work item > Create

Any feedback from Microsoft reviewers (or other Community reviewers) will show up in the PR. Changes can be made to the PR by editing in Visual Studio, and doing git add / commit / push again. Once Microsoft has signed off, all comments have been resolved, a work item is linked, and all other polices have been met, then you can click Complete to complete the pull request. When a PR is completed, that is official acceptance by Microsoft that the change will become part of a future official release, usually the next monthly release.

Behind the scenes

  • The powershell script starts by checking what version of source code exists on the VM by examining the K:\AosService\PackagesLocalDirectory\ApplicationSuite\Descriptor\Foundation.xml file.
  • It then checks out the REL branch associated with that version, which matches the platform and other model versions currently on the machine.
  • The development config files are updated to allow changes to SYS models, which is normally disallowed on dev VM’s.

In addition to having an accelerated approach to getting fixes into the main branch, participants also have some additional benefits. You will have access to the latest and greatest code changes through all the code branches that Microsoft makes available. You can search through the code and see if there are code changes that affect extensions or code that is local to your installations. You can also see how the Microsoft code is evolving and how improvements are made available in the standard application. You will also gradually build a very valuable network towards the best developers in the world, where you will discuss technical topics with the actual people creating the world's best ERP system.

One final joke for those considering going into this program: Git and sex are a lot alike. Both involve a lot of committing, pushing and pulling. Just don't git push --force


D365F&O – Auto-report as finished in a Retail scenario

For many years I have had the opportunity to work on Dynamics 365 topics involving kitting, Value Added Services (VAS) and Bill of Materials (BOM). Today I would like to write about the released product parameter "Auto-report as finished" in a retail scenario; you can read more about report as finished on Microsoft docs. To explain the business scenario, let's take hot-dogs. A hot-dog is normally assembled as the customer wants, but in this scenario we have a standardized hot-dog with 4 ingredients.

As a retailer, I would like to sell the finished product, but keep track of the raw materials. To do this you need to create a BOM, and when the hot-dog is sold, Dynamics 365 will automatically report a hot-dog as finished and draw the ingredients from the store warehouse. It is possible to use a production order, but for retailers this is overkill; something much easier is needed. Instead of exact BOMs, average BOMs can also be used, since knowing exactly how much onion or mustard the customer will apply is not an exact science.

Dynamics 365 has a nice feature for this: Auto-report as finished.

What this parameter does is that when the product is physically deducted (or sold), a BOM journal will be created and posted. This will create issue transactions (sold) from your inventory.

Here I have created a BOM for my hot-dog:

When creating a sales order and posting a packing slip, you will see that a BOM journal is automatically created and posted.

The posted BOM journal looks like this, and here we see that a hot-dog is added to the warehouse, while the ingredients are subtracted from the warehouse.

For retailers, this means that we can sell goods in the POS, and when the statement posting process creates and posts the sales orders, the auto-report as finished functionality will be triggered. So, no need for any production order, or for manually posting report as finished journals. Dynamics 365 thus has an alternative to retail kits, if more standardized BOMs are used. The BOM can then also be used for cost calculations on food and retail-produced items. Comparing the counting and the actual transactions will also help you know how accurately the BOMs describe the cost picture of the products. Master planning will also catch this, and you can get replenishment to work on the ingredients.

BUT!!! There are some issues.
As a workaround, and to make this work, you will have to specify a default warehouse per site per item in the default order settings. (I know this is an impossible task if you have 500 products and 500 stores, as this would mean you have to create 250,000 default order settings.) I have a support request going with Microsoft to change this, so that this is not needed and the warehouse can be inherited from the parent transaction. So, if you get an error like this, you have done nothing wrong, and hopefully it will be fixed in future releases.

STOP HERE, unless you like X++

Here is something for the "technical" guys: the code that automatically triggers auto-report as finished is actually the method InventUpd_Physical.updatePhysicalIssue(). For those of us that have worked with Dynamics for quite some time, we understand that this class is very central, because all physical inventory transactions are posted through it. The behavior of auto-posting BOMs will therefore influence all places where a physical transaction is posted.

Microsoft has created a method on the movement classes named "canBeAutoRepAsFinished()" that lets them refuse this behavior for certain transaction types.

If you don't want to wait until Microsoft fixes the feature where the warehouse dimension is inherited from the parent BOM, you do have the option to extend BOMReportFinish.initInventDimFromBOMRoute() and set the InventLocationId from the parent there. Here is at least my suggestion for fixing the issue in the standard code (without extension):

Here is the code for validating that a warehouse storage dimension is used on the BOM line, and sending this back to the report as finished class.
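Since the screenshots don't translate well to text, here is a conceptual sketch of the idea. Note that this is not the actual standard code and the helper name is made up; the point is simply to fall back to the parent transaction's warehouse when the BOM line carries none, before the report as finished journal line is created.

// Conceptual sketch only (hypothetical helper, not standard code): if the BOM line has no
// warehouse, inherit site and warehouse from the parent transaction's inventory dimensions.
private InventDim inventDimWithParentWarehouse(InventDim _bomLineDim, InventDim _parentDim)
{
    InventDim inventDim;

    inventDim.data(_bomLineDim);

    if (!inventDim.InventLocationId && _parentDim.InventLocationId)
    {
        inventDim.InventSiteId     = _parentDim.InventSiteId;
        inventDim.InventLocationId = _parentDim.InventLocationId;
        inventDim = InventDim::findOrCreate(inventDim);
    }

    return inventDim;
}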

Take care and I’ve got to get back to work. When I stop rowing, the mothership just goes in circles.

Near real-time replenishment in Dynamics 365 F&O

There is a lot of good stuff on the horizon for Dynamics 365. I highly recommend that you check out the following article of some new planning services that will come in the April 2019 release.

https://docs.microsoft.com/en-gb/business-applications-release-notes/April19/dynamics365-finance-operations/planning-service

To make this happen, I would expect the planning to go deeper into the SQL stack, and also to maximize the utilization of in-memory processing of the transactions.

For retailers, this will be highly appreciated: limited space in the stores means that shelf replenishment several times each day is common, especially for perishable products with limited shelf life. Keeping things fresh and presentable is a necessity for customers to buy. The ability to react more quickly to customer demand ensures that customers actually find the products in your store. And the same goes the other way: when sales are slower, the ability to adjust replenishment down according to activity saves cost and increases profit. In retail, it is the small improvements that in sum create the big results.

For the planning service to work, it needs the transactions to act on. In Dynamics 365 for Retail we must choose between aggregating the transactions coming from the POS/channel databases, or posting the statements more quickly. I'm looking forward to many good discussions in this area.

The future is faster

Retail Enterprise Architecture mapping using ArchiMate and ARDOQ

A warning: this blog post is high level, but the benefits can be mind-blowing.

Enterprise architecture is about understanding and change. In today's business, change is everywhere and essential to survive, but change is not easy. Having insight into and understanding of your own organization is essential for change and risk assessment. Understanding how people, processes and technology are connected gives focus to achieving high-value benefits. In my profession we use the Microsoft Dynamics technology stack as a main driver for implementing improvements, but we also acknowledge that Dynamics 365 is not the only system at work. Even though Dynamics 365 is a central component, there will always be many other systems, processes and technologies included in the enterprise architecture (EA). We need a way to describe all these connections in a uniform way that allows us to communicate an enterprise model dynamically.

But why should EA mapping be a central part of your business? Here are 6 business motivators and benefits of having a structured approach to EA mapping:

Increased stability and availability. It is critically important that all central systems have near 100% availability. POS and back-end systems must always work, and the supporting processes must be streamlined to ensure that risks related to business improvements and changes are minimized and understood. The EA mapping documents the relationships and shows the consequences of changes.
Guaranteed performance. Having acceptable system response 24/7 that can deal with business peaks must be planned and built into the system. Systems must deal with variable load, handling sudden events that change the transaction volume. Any disruption quickly results in customers walking away. The EA mapping must document the components central to performance compliance, and the business actors involved.
Scalable capacity. New stores or changes in the business model can quickly change the requirement for transaction and processing capacity. To be cost effective, the capacity scalability must be dynamic according to the actual need, both in terms of scaling up and down. The EA mapping documents the components central to scalability, and the business actors involved.
Strong security. Cyberattacks are increasing and it is vitally important to secure information and transactions. Being GDPR compliant puts demands on systems and internal processes for how to handle your own and customer information. Security, traceability and audit trails build trust in the system and document compliance. The EA mapping documents governance and role compliance, and the business actors involved.
Right focus. There are always new business opportunities and process improvements. Keeping track of where to focus will lead to better and faster implementation of changes in a secure and stable manner. New ideas must be analyzed and risk assessed, and the implications understood. The EA mapping can assist in focusing on which changes have the highest priorities and benefits.
Cost control. Being a retailer involves large investments in technology like POS, mobile apps, customer portals and enterprise systems. Moreover, there may be large fluctuations in system usage throughout the year. By purchasing these capabilities as subscriptions, it is possible to equalize the operating costs so that you only pay for what is needed. Good liquidity is achieved by balancing costly investments against the revenue stream and securing an actual return on these investments.

To move forward, a "language" is needed to describe an enterprise architecture model where you can visualize, plan, implement and maintain all relationships that exist today, in transitions, and in the final vision.

Architecture Layers using ArchiMate

The overall mapping can be modelled in 5 main layers. Here I would like to focus on the symbolism used for identification. The notation here is ArchiMate, which is an open and independent enterprise architecture modeling language supporting the description, analysis and visualization of architecture within and across business domains in an unambiguous way.

Motivation elements define the overall drivers and goals that the enterprise has. Much of the vision is located here. The motivation elements can also be seen as a vertical layer, in close relationship with all layers.

The Strategy layer defines the overall course of action and a mapping towards resource and business capabilities.


The Business layer defines the business processes and the services the enterprise provides, and here the main business processes are defined. To simplify the modeling it is relevant to start with Business objects, Business processes, Business roles, Business actors, Business events, Business services and Business rules and logic.

The Application layer contains application services and capabilities, their interactions and application processes. Here Dynamics 365 and much of the Power Platform are located. To simplify the modeling it is relevant to start with Data objects, Application functions and Application components.


The Technology and physical layer describes the software and hardware (physical or virtual) capabilities required to support the deployment of business, data, and application services; this includes IT infrastructure, middleware, networks, communications, processing, standards, etc. The underlying structure of Microsoft Azure would typically be described here. To simplify the modeling it is relevant to start with Artifacts, System software, Technology services, Devices and Communication networks.

Architecture Relationships using ArchiMate

The real beauty comes when the relationships between architecture elements are defined. But to do this, a set of predefined relationship types is needed. The most commonly used are the following:

Putting this together in a combined setup, I get the following relationship diagram of what is relevant to document.

(*Credits to Joon for this visualization)

As seen here, the business processes are a realization of the application functions, and this clarifies how a proper enterprise architecture model is documented. With this model, we can see which business actors are assigned to which business roles. This again shows the business process assignment to the role. The business processes are there to realize business services.

Building the Architecture model using Ardoq

The architecture relationships can be challenging to describe using tools like Visio. Often we see that great work is done, but not used to its potential. An alternative is to use cloud-based mapping tools like Ardoq, which covers most aspects of documenting relationships between business processes, applications, roles, risks and transitions. This is not a commercial for this tool, but I find it great. So, I decided to try to use Ardoq to model the Contoso demo data.

Here I will focus on the Application Layer, as this is the layer where the application functionality and data are located. First, I create the application components:

Then I create the Application Functions, and I also import the Business Roles that is available in the Contoso demo dataset.

The next job is to build the relationships between the application functions (D365), business processes (vertical processes) and business roles. This will allow me to visualize and trace dependencies across all the EA mappings. Let's take an example, looking into the responsibilities of an employee named April Mayer.

Here I can see that she is related to the business roles Accounts payable clerk and manager. If I click on "Accounts payable clerk" I jump into the view of this business role, and I can see that it is related to the business processes of accounts payable, with an association to April Mayer.

Jumping to accounts payable allows me to see the business processes involved.

I can also visualize the entire enterprise architecture map with all objects and relations,

and zoom in specifically on the relations. This graph shows me that April Meyer belongs to the roles "Employee", Accounts payable manager and Accounts payable clerk. The Accounts payable clerk role is associated with the business process "Accounts payable". The clerk role is associated with the Financial management modules in Dynamics 365.

Here is another visualization that shows how the business objective of "Marketing" can be achieved, and which business roles, business processes, application functions and application components are involved.

Knowing the relations, and having the ability to communicate them, is key to happy enterprise architecture mapping.

Give it a try; the result can be very powerful.

Additional information

1. A high value blogger on Enterprise Architecture is http://theenterprisingarchitect.blogspot.com/.

2. Homepage of archimate: http://pubs.opengroup.org/architecture/archimate3-doc/toc.html .

3. Homepage of ARDOQ : https://ardoq.com/ Give it a try !

MPOS – Open full (kiosk) screen mode when having dual display

For a retailer, every saved "click" is appreciated, as is the ability to remove any noise.

When starting MPOS in maximized mode, you will often see that you have a title bar at the top and the app bar at the bottom.

In Windows 10 you can also use the "tablet mode" to get MPOS into full screen mode.

BUT! If you have a dual-display setup, the tablet mode does not work.

If you want to remove them, there is a smart keyboard shortcut:

Shift-Windows-Enter

This will put MPOS in full screen mode, giving a nicer appearance without the bars.

Then the question is how to make this happen every time MPOS starts. This was actually not an easy task, but a colleague of mine (Espen) made it possible by using a PowerShell script.

The following page contains a small PowerShell script that opens a UWP app in full (kiosk) screen mode:

Add the script to a "startup folder", and create a new PowerShell command containing:

[Path]\StartUWPAppFullScreen.ps1 -app Shell:Appsfolder\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App

 

Then create a shortcut towards this new PowerShell command.

Initial investigations (by Sven Erik) show that the MPOS app ID is Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App, and let's hope this ID stays permanent.

Then MPOS looks nicer for the user, without noise.


Retail assortments and planned orders extensions

Microsoft has created an excellent description of this on the Assortment management docs page. Retail assortments are essential to define which products should be available across retail channels. Assigning a product to an assortment assigns the product to the stores that carry the assortment, which makes it possible to sell the product in those stores.

But there is something missing, and that is using assortments for replenishment and procurement. Retailers want master planning to only suggest procurement and transfers for products that belong to the stores' assortments. You can control requirement parameters per warehouse, but there is no standard way to use the assortment to control the generation of planned orders.

This blog post shows how to make a very small extension that ensures only products belonging to a store's assortment will generate planned orders. The solution I make use of is to look into how the product lifecycle state functionality works, and extend it with an assortment planning parameter. I have called this parameter "Is lifecycle state active for assortment procurement".

What it does is validate whether a product is in the assortment of the store. If it is, the product will be requirement calculated and will generate planned orders. If the product is not in the assortment of the store, no planned orders will be generated.

To make this happen, I needed to create 4 extensions. The first three add a new field on the product lifecycle state form. For an experienced developer this is easy to create, so I will not spend time on it in this blog post.

The real fun is how to manage and control the planned order generation. This happens in an extension to the ReqSetupDim class. Here there is a lookup to the assortment on the product, and a check whether the store has this assortment. If not, master planning will not generate any planned orders. I therefore create an extension class and use the new method wrapping/Chain of Command (CoC) feature to add some additional code.

/// <summary>
/// Contains extension methods for the ReqSetupDim class.
/// </summary>

[ExtensionOf(classStr(ReqSetupDim))]
final class ReqSetupDim_extension
{

    /// <summary>
    /// Validates whether a product should be assortment planned.
    /// </summary>
    /// <param name = "_inventDimComplete">The inventory dimensions to evaluate.</param>
    /// <returns>false if the product is not assortment planned; otherwise, the default value.</returns>
    public boolean  mustReqBeCreated(InventDim _inventDimComplete)
    {
        Boolean ret = next mustReqBeCreated(_inventDimComplete);

        if (ret)
        {
            if (_inventDimComplete.InventLocationId)
            {
                InventTable                 inventtable;
                EcoResProductLifecycleState ecoResProductLifecycleState;

                //Fetching fields from  inventtable
                select firstonly ProductLifecycleStateId, Product from  inventtable where inventtable.ItemId == this.itemId();

                //validating if the product is active for planning and that also assortment planning is enabled.
                select firstonly RecId from ecoResProductLifecycleState
                        where   ecoResProductLifecycleState.IsActiveForAssortmentPlanning == true
                            &&  ecoResProductLifecycleState.IsActiveForPlanning == true
                            &&  ecoResProductLifecycleState.StateId == inventtable.ProductLifecycleStateId;

                if(ecoResProductLifecycleState)
                {
                    RetailStoreTable                    store;
                    EcoResProduct                       product;
                    RetailAssortmentLookup              assortmentLookupInclude;
                    RetailAssortmentLookup              assortmentLookupExclude;

                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;
                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupExclude;

                    //Finding OMOperatingUnitID from the inventlocationId
                    while select firstonly OMOperatingUnitID from store
                        where store.inventlocation == _inventDimComplete.InventLocationId
                    {
                        //Check if the product is in the assortment of the store in question
                        select RecId from product
                            where product.RecId == inventtable.product
                        exists join assortmentLookupInclude
                            where   assortmentLookupInclude.ProductId == product.RecId
                                &&  assortmentLookupInclude.lineType == RetailAssortmentExcludeIncludeType::Include
                        exists join assortmentLookupChannelGroupInclude
                                where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                                    &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId
                        notexists join assortmentLookupExclude
                            where   assortmentLookupExclude.ProductId == product.RecId
                                &&  assortmentLookupExclude.lineType == RetailAssortmentExcludeIncludeType::Exclude
                        exists join assortmentLookupChannelGroupExclude
                            where   assortmentLookupChannelGroupExclude.OMOperatingUnitId == store.OMOperatingUnitID
                                &&  assortmentLookupChannelGroupExclude.AssortmentId == assortmentLookupExclude.AssortmentId;

                        if (!product)
                        {
                            ret = false; //The product does NOT belong to the stores assortment, and should not be planned
                        }
                    }
                }
            }
        }
        return ret;
    }
}

I also have code that restricts the creation of manual purchase orders, where similar logic can be used (a hedged sketch of the idea follows below), but let's hope that Microsoft can further extend standard Dynamics 365 with assortment-based procurement planning.
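
To illustrate the idea, here is a minimal sketch of how a similar check could be hooked into manual purchase order lines. This is not my production code: the class name is made up, and it assumes the same assortment tables as above together with Chain of Command wrapping of PurchLine.validateWrite().

/// <summary>
/// Hypothetical sketch: blocks manual purchase order lines for items outside the store assortment.
/// </summary>
[ExtensionOf(tableStr(PurchLine))]
final class PurchLine_AssortmentCheck_Extension
{
    public boolean validateWrite()
    {
        boolean ret = next validateWrite();

        if (ret && this.ItemId && this.InventDimId)
        {
            InventDim                           inventDim   = InventDim::find(this.InventDimId);
            InventTable                         inventTable = InventTable::find(this.ItemId);
            RetailStoreTable                    store;
            EcoResProduct                       product;
            RetailAssortmentLookup              assortmentLookupInclude;
            RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;

            //Only validate lines that are delivered to a retail store warehouse
            if (inventDim.InventLocationId)
            {
                select firstonly OMOperatingUnitID from store
                    where store.inventlocation == inventDim.InventLocationId;
            }

            if (store)
            {
                //Same include-check as in the ReqSetupDim extension above
                select firstonly RecId from product
                    where product.RecId == inventTable.Product
                exists join assortmentLookupInclude
                    where   assortmentLookupInclude.ProductId == product.RecId
                        &&  assortmentLookupInclude.lineType == RetailAssortmentExcludeIncludeType::Include
                exists join assortmentLookupChannelGroupInclude
                    where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                        &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId;

                if (!product)
                {
                    ret = checkFailed("The item is not in the assortment of the receiving store.");
                }
            }
        }
        return ret;
    }
}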

Copy with pride, and let's hope next year will give us 365 more opportunities.

POS Invoice Pay – #Dyn365F&O

A very nice omnichannel capability made available in Dynamics 365 version 8.1 is the ability for customers to pay their invoices directly in the POS. A scenario is that a customer is allowed to purchase "on account" and then later pay all the invoices. Let's say the customer is at a hotel that allows guests to buy food, drinks and services throughout the stay. At the end of the stay the customer pays for all the services at the reception. Like "pay-before-you-leave".

There is no requirement that the goods have to be sold through a POS; it is fully omnichannel capable, so the orders can be created in the call center, on the web or in stores. I would like to share how you can set it up in the Contoso demo data set. If you open the functionality profiles, you will find the possibility to enable paying:

  • Sales order invoice
  • Free text invoice
  • Project invoice (Yes! Even project invoices!)
  • Sales order credit note

The next thing you need to do is to add a "Sales invoice" button to the transaction screen. (I'm using the Houston store and button grid F2T2.)

This will add a sales invoice button to the POS design, allowing invoices to be paid in the POS.

The next thing is to create a POS transaction/order. First select a customer (like Karen), and then use the on-account button to sell the goods.

On the payment screen you can say how much you would like to put on account, and you also see that the credit limit and balance are available.

The next step requires that some periodic batch jobs are run:

1. Run the “P-job”, to fetch the transactions from the channel database.

2. Run the “Calculate statement” (manually or in batch)

3. Run the “Post statement” (This process will create the sales order and the invoice)

!Make sure the statement is posted and invoiced before continuing!

The option you now have is to continue the process in Dynamics 365 and send the invoice automatically to the customer through print management, or have the customer come to the "reception" and pay for the goods directly.

To pay the order, select the Karen customer, and use the Sales Invoice button.

If you have done everything right, you should find the invoice in the list now. (If you have enabled aggregation in the parameters, you will have a single invoice per customer.)

I can then select the invoice (or multiple invoices) and pay using cash, card, loyalty (and even on account again).

This opens up some very nice omnichannel processes, and I hope that Microsoft invests further in this. It would be nice to see the actual lines on the invoices being paid, and even to print out the invoice if the customer requires it. I also suggest that retailers use the modern report capabilities to make the invoice look awesome.

Take care friends, and thanks for all your support and encouragement!

A quick look at download Retail distribution jobs (CDX)

Commerce Data Exchange (CDX) is a system that transfers data between the Dynamics 365 F&O headquarters database and the retail channel databases (RSSU/offline databases). The retail channel databases can be the cloud-based "default" channel database, the RSSU database and the offline databases on the MPOS devices. Looking at the following figure from Microsoft Docs, this blog post explains how to understand it in practice.

What data is sent to the channel/offline databases?

In the retail menus you will find two menu items: Scheduler jobs and Scheduler subjobs. Here the different data that can be sent is defined.

When setting up Dynamics 365 the first time, Microsoft has defined a set of ready-to-use scheduler jobs that get automatically created by the "initialize" menu item, as described here.

Scheduler jobs are collections of the tables that should be sent, and subjobs contain the actual mapping between D365 F&O and channel database fields. As seen in the next picture, the fields on the table CustTable in D365 are mapped towards AX.CUSTTABLE in the channel database.

To explore what is (or can be) transferred, explore the scheduler jobs and scheduler subjobs.

Can I see what data is actually sent to the channel/offline databases?

Yes, you can! In the retail menu you should be able to find Commerce Data Exchange and a menu item named "Download sessions".

Here you should see all data that is sent to the channel databases, and there is a menu item named "Download file".

This will download a ZIP file containing CSV files that correspond to the scheduler jobs and scheduler subjobs.

You can open this file in Excel to see the actual contents. (I have hidden a few columns and formatted the Excel sheet to look better.) So this means you can see the actual data being sent to the RSSU/offline channel database.

All distribution jobs can be set up as batch jobs with different execution recurrence. If you want to keep it simple, just execute download distribution job 9999 every 30 minutes. If you have a more complex setup and need better control of when data is sent, then create separate distribution batch jobs so that you can send new data to the channel databases in periods when there is less load in the retail channels.

Too much data is sent to the channel databases/offline database and the MPOS is slow?

Retail is using change tracking, which makes sure that only new and updated records are sent, so the amount of data is minimized. There is an important parameter that controls how often a FULL distribution should be executed; by default it is 2 days. If you have lots of products and customers, we see that this generates very large distribution jobs with millions of records to be distributed. By setting the parameter to zero, this will not happen. Very large distributions can cripple your POSes, and your users will complain that the system is slow, or they will get strange database errors. In version 8.1.3 the default is expected to change to zero, meaning that full datasets will not be distributed automatically.

Change tracking seems not to be working?

As you may know, Dynamics 365 has also added the possibility to add change tracking on data entities when using BYOD. I have experienced that adjusting this affects the retail requirement for change tracking. If this happens, please use "Initialize retail scheduler" to set this right again.

Missing upload transactions from your channel databases?

In some rare cases it has been experienced that transactions are missing in D365 compared to what the POS is showing. The trick to resend all transactions is the following:

Run the script "delete crt.TableReplicationLog" in the RSSU database, and the next P-job will sync all transactions from the RSSU database (including the missing ones).

 

Using Cloud POS as your retail mobile device

Handheld functionality for retailers is a question I get a lot, typically in the areas of counting, replenishment, receiving and daily POS operations. In version 8.1 Microsoft has taken a small step forward to make it easier to use any handheld device that supports a common browser. Because Cloud POS (CPOS) runs in a browser, the application isn't installed on the device. Instead, the browser accesses the application code from the CPOS server. CPOS can't directly access POS hardware or work in an offline state.

What Microsoft has done is make CPOS adapt to the screen size, to work more effectively on your device. To keep it simple, I just want to show you how it looks on my iPhone.

Step 1: Direct your browser to the URL where CPOS is located. In LCS you will find the URL here:

Step 2: Activate the POS on your mobile device by selecting store and register, and log in.

Step 3: Log into CPOS and start using it. Here are some sample screens from my iPhone, where I count an item using CPOS.

You can also "simulate" this in your PC browser by just reducing the size of your browser window before you log into CPOS. Here I'm showing the inventory lookup in CPOS.

What I would love to see more of is:

– Barcode scanning support using camera

– The ability to create replenishment/purchase orders in CPOS

– More receive capabilities like ASN/Pallet receive etc.

– Improved browser functionality (like back-forward browsing etc)

To me it seems clear that we will see additional improvements in CPOS, making it the preferred mobile platform for Dynamics 365 for Retail, and Microsoft is definitely investing in this area. In our own customer projects we will be developing more and more functionality using RTS (Real Time Service) calls to add more features to be used together with CPOS; a small, hypothetical sketch of what such a headquarters-side method can look like follows below.
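
The sketch below is only an illustration of the shape such an X++ Real Time Service method typically has: a static method that returns a container where the first element is a success flag and the second an error message. The class name, method name and on-hand lookup are all made up, and the exact way new RTS methods are registered depends on the version you are on.

/// <summary>
/// Hypothetical example of a headquarters-side Real Time Service method.
/// </summary>
class MyRetailTransactionServiceExample
{
    /// <summary>
    /// Returns the available physical quantity for an item in the container
    /// format RTS expects: [success, errorMessage, result].
    /// </summary>
    public static container getAvailablePhysicalQty(ItemId _itemId)
    {
        InventSum   inventSum;

        // Sum the available physical quantity over all open on-hand records for the item.
        select sum(AvailPhysical) from inventSum
            where inventSum.ItemId == _itemId
               && inventSum.Closed == false;

        return [true, '', inventSum.AvailPhysical];
    }
}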

To take this to the next level, please also evaluate creating a hybrid app that incorporates CPOS in an app-friendly way. Sources say that this will also allow us to build extensions like camera barcode scanning.

The direction is right, and my prediction for the future is: mobile retail device = CPOS.

Focus18 – EMEA – London

The User Group road trip for Dynamics 365, AX, CRM, BC/NAV, and Power BI, named Focus, is arriving in Europe and making a stop in London on 5-6 September 2018, featuring deep-dive sessions covering advanced topics on D365 Finance and Operations and Customer Engagement. Additionally, there are topics specific to the Retail space, including modern POS, inventory management, sales orders, e-commerce, credit card processing and more. This is great stuff!

It is a privilege for me to participate and present together with great MVPs, Microsoft experts and the Dynamics 365 community. If you want to check out my sessions, I will be presenting the following:

Deep dive into retail pricing and discounts. 

This session is about which product sales price and discount options exist in Dynamics 365 for Retail "out-of-the-box", with real examples of how to implement and maintain your retail prices.

 

Learn, Try, Buy for Retailers.

"Learn, Try and Buy for Retailers" is an accelerated onboarding approach that enables you to evaluate whether a cloud-enabled Dynamics 365 for Retail is the right direction, and to learn as much as possible prior to performing a business and solution analysis. It suits agile and iterative approaches, and this session shows why buying a small Dynamics 365 license is an affordable investment to make before the scope of the implementation has been defined. Using VSTS (Visual Studio Team Services) is a central topic in this session.

Power BI and Retail.  How to get the numbers.

This session shows how to publish retail transactions into an Azure SQL database or CDS (Common Data Service), and then analyze the retail sales in Power BI.

Check out https://www.focusemea.com/locations/london as there are many other very interesting sessions.

 

See you in London!


MSDYN365FO: Automate repetitive tasks – the easy way

The other day I got the task of posting a few thousand Retail kit orders/BOM journals because they had failed the first time. I started, and managed to manually post 50 journals before my fingers got cramps and I started to feel dizzy. I could not multi-select the journals and post them, so I had to manually click "Post" on each journal.

I did send an SR to Microsoft explaining that this should be easier in standard, and that SR is in process. But it will probably end up in an "As Designed" state, or as "post it to ideas.microsoft.com".

But there is an easier, low-tech way of solving this. Just install a mouse-ghost app, and it will repeat the task for you. I used the app "Mouse Recorder Premium" to post all the 1300 journals, and it went smoothly. Just record the clicks and then repeat them a thousand times.

To make sure I did not "lock" my PC while this was running, I started the task in a Hyper-V VM so it could run in the background.

That's today's small trick to get rid of repetitive tasks.

Measure sales per Retail Category in Power BI

Drilling down on sales per category, employee and department is essential for retailers. Doing this gives a more specific view of what's generating sales and what isn't. Having insight into top categories or departments can help make decisions about purchasing and marketing. A good point of sale comes with reporting and analytics, so you can quickly get the data you need, whenever you need it, without manual calculations.

Power BI is a must-have for all retailers, and this blog post is about creating a retail category hierarchy in Power BI.

If you have worked with retail categories, you know that there is a parent-child relationship between the categories, as illustrated by the following data in the Contoso demo data set.

In Power BI it is also possible to create such hierarchies, but it requires some minor changes. My inspiration came from Power BI Tutorial: Flatten Parent Child Hierarchy. I will not go through how I build a retail Power BI analysis, but I can share that I use OData entities, and here are the entities I'm using:

More information on the data model is available in Docs here.

The "trick" is to create a new column named "Path", and a column named CategoryL[X] for each level in the hierarchy, which for the RetailProductHierarchyCategories entity looks like this:

Here are the column formulas

Path = PATH(RetailProductHierarchyCategories[CategoryName];RetailProductHierarchyCategories[ParentCategoryName])

CategoryL2 = PATHITEM(RetailProductHierarchyCategories[Path];2)

CategoryL3 = PATHITEM(RetailProductHierarchyCategories[Path];3)

CategoryL4 = PATHITEM(RetailProductHierarchyCategories[Path];4)

CategoryL5 = PATHITEM(RetailProductHierarchyCategories[Path];5)

…etc

Then I create a new hierarchy, where I specify the CategoryL[X] columns as the levels:

And I use the Hierarchy Slicer that is available in the Power BI marketplace.

In Power BI I then get a retail category slicer, and can filter and measure sales per category.

Microsoft is in the process of aligning with the future of Power BI and creating a new version of Retail Channel Performance based on the new Common Data Service for Analytics capability coming to Power BI: https://powerbi.microsoft.com/en-us/cds-analytics/

Keep on rocking #MSDYN365FO!

A Practical Guide for Dynamics 365 Iterative Implementation

With the introduction of Dynamics 365 and cloud-enabled tools like Office and VSTS (Visual Studio Team Services), we have accelerators towards iterative ways of performing an implementation.

Digitalization also enables us to go from a document and template approach to a committed, task-driven implementation with sprint-based sub-deliveries, where all parties are involved. This also increases visibility, removes lead times and results in faster deliveries. Adopting digitalization and going iterative in a project is not only about using new tools and processes like VSTS, but also about the practices, principles, values and mindsets of the project participants.

The iterative preparation

As described in earlier blog posts, it is vital to have a clear concept of process modeling where processes are broken down into sub-processes and requirements. A WBS (Work Breakdown Structure) is the tool to plan and execute on deliverables. The traditional solution analysis is transforming into an iterative preparation phase that lets us define clear work packages that can be solved in sprint executions.

The iterative preparation should have a formalized set of workshops, and the main purpose is to generate an approved solution backlog. It is normally recommended to complete the preparation phase before going into the execution phase, but in larger projects the preparation phase can run in parallel with the execution phase, where customer-approved solution backlogs can be planned into sprints and started before the preparation phase has ended.

Please remember that iterative implementation models do not give a detailed picture of scope or costs! The actual deliveries are defined by the customer approved solution backlog.

The following flow chart shows the main activities in the preparation phase.

The granularity and level of detail needed in the deliverable documents is agreed on in the project. A practical middle way is to create the deliverable documents with a minimum set of information and a final conclusion, and then link the content in the documents to a VSTS site via URLs for further information and processing.

The preparation phase is highly customer intensive and requires a detailed plan, invitations, workshops and time to document the findings. Before participating in preparation workshops, it is recommended that the participants have completed a "Learn, Try, Buy" phase. An example project plan for the preparation phase can look like this for a retail customer.

As seen in the example plan, the preparation can have dedicated tracks for the functional areas, and these will vary based on the vertical models that are being used. The granularity of the subtopics is recommended to follow the first and second level of the process models.

Use process models to define scope and topics.

The contents of the preparation workshops should be organized based on the process models. This makes sure that best practices are discussed and taken into account for the execution phase. The value chain shown here is divided into three main epic tracks: management processes, operational processes and support processes. There are different models for each vertical, as seen in the following figure, which I typically use to illustrate the EG retail value chain model.

ProcessModels

Each of the "boxes" in the model represents a topic where business processes are discussed and defined. The model will provide:

  • Workshop agenda templates
  • UAT test script templates and recommended process recordings
  • Stack/technology recommendations
  • Process flows (Visio or BPM in LCS)
  • Solution backlog templates
  • KPI assessment recommendations (APQC)

From Model to solution backlog

Based on the findings from the preparation phase, a solution backlog is created. The most efficient tool to do this in is VSTS (Visual Studio Team Services), set up using the CMMI process template. Here all backlog items are organized in a hierarchy of Epics, Features, Requirements/Backlog items, Tasks and Impediments.

The general consensus of these definitions is:

  • Epic – Something that transcends projects/releases/versions.
  • Feature – Something that cannot be delivered in a single sprint, but can be delivered in a single release.
  • Requirement (CMMI) / Product Backlog Item (Scrum) – Something that can be delivered in a sprint, and has an estimate.
  • Bug – Something that is not working, can be solved in a sprint, and has an estimate.
  • Task – An assigned work element with remaining effort.

To relate the structures to CMMI, the following guideline can also be followed.

More details on how to create a backlog in VSTS can be found here. Best practice is that the VSTS site is located on the customer's tenant, and that external project participants are invited. The VSTS backlog can also be regarded as a WBS (Work Breakdown Structure). In the following example you can see how the backlog is structured according to a business process model.

VSTS also provides dashboards where a complete status can be seen and monitored. Setting up these dashboards is based on defined queries towards tasks and backlog items, and they are easy to tailor to your needs.

How to fill in a backlog item

The product backlog item (and the other work item types) contains a small set of required fields.

What should be regarded as a minimum set of information defined on a backlog is:

  • Name
  • Description
  • Acceptance Criteria
  • Effort estimate

If additional fields are needed, like APQC ID, planning dates or additional names, they are quite easy to add to the form, and it is also easy to extend with new statuses. See https://www.visualstudio.com/en-us/docs/work/customize/customize-work for more information.

In the preparation phase perform these activities:

  • Right-size backlog items by splitting larger items into smaller items. No backlog item should be larger than what can be completed in a single sprint.
  • Identify and fill gaps in the product backlog. Capture new ideas and stories, architecture and design requirements, and other spikes.
  • Reorder the backlog to represent today's priorities and business value focus.
  • Ensure that well-defined acceptance criteria have been added to each item.
  • Revisit the estimates made for backlog items and adjust them upwards or downwards based on the latest understanding of scope and acceptance criteria.
  • Review all potential backlog items for the upcoming sprint to make sure they are well understood and that any additional work required to support their development is well understood by both the product owner and the team.

Mapping a VSTS product backlog to the functional requirement documentation

Most often it is mandatory to also deliver a Functional Requirement Document (FRD). This document is delivered at the end of the preparation phase. The reason why this document is important is that it explicitly defines all requirements and is a commercial document. But instead of writing a document of hundreds of pages, try to link the requirements into the document using URL links. Then the FRD only contains the vital and important information that regulates responsibilities and commercial conditions.

The preparation phase ends when the deliverables from the phase are approved and signed by the customer. After the phase is approved, all information on the VSTS site can be copied into Excel backup sheets that represent a snapshot of the status at the end of the preparation phase.

Roles involved in an iterative preparation phase

The roles needed to run an iterative preparation phase depend on project size and complexity. As a minimum, it is recommended that the following roles are present in this phase:

  • Project manager (planning and facilitating)
  • Solution architect (overall approval of the solution)
  • Technical lead (integrations and migration)
  • Functional consultants (covering training and functional areas)
  • Junior business consultants (assisting in writing and maintaining the solution backlog)

Customer project participants need to match these roles.

Iterative Execution phase

As the solution backlog is filled, sprints may be filled with approved backlog items. The overall purpose of the sprint is to deliver a set of backlog items that have been broken down into specific tasks. The duration of a sprint is determined by the scrum master, the team's facilitator. Once the team reaches a consensus on how many days a sprint should last, all future sprints should be the same. Normally, a sprint lasts between 2 and 4 weeks. During the sprint, the team holds daily stand-up meetings to discuss progress and brainstorm solutions to challenges. The customer may not make requests for changes during a sprint, and only the scrum master or project manager has the power to interrupt or stop the sprint. At the end of the sprint, the team presents its completed work to the customer, and the customer uses the criteria established at the sprint planning meeting to either accept or reject the work.

The following diagram shows the activities involved, and the expected deliverables from a sprint.

Define the Sprint log

To solve a backlog item, several resources may need to be involved. When defining the sprint log, each backlog item is split into tasks that define the sequence, the remaining work and who it is assigned to. This means having tasks for analysis and design of the backlog item, for creating test scripts, for developing, and for performing the test of the backlog item. As seen in the following figure, a backlog item is divided into tasks, and each task must have a clear description and a "remaining work" estimate. If essential resources are needed to solve the task, then the task should also be assigned to that person.

When a task has been assigned to a person, that person commits to the task and agrees to deliver it within the defined sprint.

Conducting a sprint planning meeting

The customer, project manager and scrum master start a sprint by selecting the backlog items that should be solved in the current sprint. This is done in VSTS, where the backlog items are dragged and dropped to the selected sprint, or marked with a specific iteration.

When planning a sprint, also identify what resources are needed in the sprint. In the sprint overview, define the capacity and the resources required. This makes planning easier, and resource/capacity constraints can be identified by the project manager/scrum master.

The daily sprint meeting

This meeting is the most important meeting each day. It should only last 15 minutes, start at the same time every day and be held in the same place every day. The scrum master is responsible for making sure that the meeting is as efficient as possible. It is a team meeting, where each team member explains what they are working on and whether there are any issues. Do NOT use the sprint meeting to try to solve issues; just identify and share, and use other meetings to solve and go deeper into each topic. Any important notes that are identified can be described in the discussion field on the task/backlog item.

Also use "Add tag" to mark tasks and backlog items that need special attention and follow-up.

Reporting status and completion

All backlog items have a state. The meaning of these states can be seen in the following flow charts:

Teams can use the Kanban board to update the status of backlog items, and the sprint task board to update the status of tasks. Dragging items to a new state column updates both the State and Reason fields. If additional intermediate steps and stages are needed, this can be customized in the settings of VSTS.

Documentation

One of the disadvantages of iterative implementations is that there are no clear and sharp end phases, and this often does not fit well with commercial contracts. It is therefore important to make sure the deliverable documents are created/updated according to the progress. Remember to actively document as much as possible in VSTS, and define the creation of deliverable documents as backlog items in the sprint. Expect to use at least 10% of your time in VSTS to give visibility to others.

Conduct Solution testing

Quality is a vital aspect, and everybody in the team owns quality, including developers, managers, product owners, user experience advocates and customer project members. It is vital that the solution testing is a customer responsibility and that the testing is structured and planned accordingly.

VSTS provides rich and powerful tools everyone in the team can use to drive quality and collaboration throughout the implementation process. The easy-to-use, browser-based test management solution provides all the capabilities required for planned manual testing, user acceptance testing, exploratory testing, and gathering feedback from stakeholders.

Creating test plans and performing tests are among the most vital elements in the iterative implementation. Please read the following URL about UAT testing: https://www.visualstudio.com/en-us/docs/test/manual-exploratory-testing/getting-started/user-acceptance-testing . Building a test plan is therefore a mandatory step, and it ensures that the defined acceptance criteria have been met.

The following documents are the input to the solution testing:

  • Flow Test Script
  • UAT Test Script by Function
  • UAT Test Script by Role
  • UAT Test Script Details

Test & Feedback

Visual Studio Team Services Marketplace contains a ton of jewels, and one add-in that can accelerate testing and feedback is the Test & Feedback extension to VSTS.

When installing it, you get a small icon in Chrome, where test and feedback can be given.

When setting it up, you just point it to the VSTS site. Then you are ready to start giving feedback; to collect backlog items, bugs or tests, just click on the play button.

While navigating and taking screenshots, notes and video, it all gets recorded, with URL, time etc.

When done with the recording, create a bug, a task or a test case:

After saving the bug, I see that it has been created in VSTS:

I now have a complete bug report in VSTS that the consultants can start to process, and they can identify whether this is a bug or an "as designed" feature.

Microsoft Tools available for a Dynamics 365 project.

When working in a project, it is important to know that Microsoft tools and services are tightly connected, and that each tool can simplify and enrich the user experience and efficiency. The most common tools can be seen in the following figure. Also note that there are powerful integrations between these tools; this section provides some small tips on how to make them work together.

Having a clear understanding of the tools available can speed up implementations, and also give better visibility to all stakeholders. In the following topics, some of these benefits are discussed.

Microsoft VSTS: Visual Studio Team Services

Create a VSTS site at http://VisualStudio.com. For internal projects, create it using your domain account. For customer projects, it is recommended to create the site on a customer-controlled domain, and then add the domain users as guest users. Other elements in relation to VSTS have been covered earlier in this document.

Who uses it? All implementation project members from EG, the customer and 3rd-party vendors.
When to use it? Every day and in all SCRUM meetings.
Pricing: 5 users free, stakeholders free. A paid user is $6/month.
Members with Visual Studio subscriptions don't need licenses. https://www.visualstudio.com/team-services/pricing/

Microsoft Excel: Upload and Maintain

Microsoft Excel can be used to import and publish the structure into VSTS when the Visual Studio Community edition is installed locally on your PC. This makes it possible to extract all fields and values by using a VSTS-defined query.

Then a process model may be imported, and a best-practice product backlog is ready to be processed. For step-by-step instructions on how to use Excel with VSTS, take a look at https://www.visualstudio.com/en-us/docs/work/office/bulk-add-modify-work-items-excel

Who uses it? Solution architects and vertical responsibles.
When to use it? In the beginning, when uploading process models and WBSs, and when mass updating backlog items and tasks.
Pricing: Office 365 prices. https://products.office.com/en-us/compare-all-microsoft-office-products
Visual Studio Community edition is free.

Microsoft Project: Plan and Breakdown

Microsoft Project is a vital tool for streamlining quotations, WBS and resource planning. Built-in vertical templates, familiar scheduling tools, and access across devices help project managers and teams stay productive and on target. Microsoft Project is also directly integrated with VSTS, and created backlog items and tasks/activities can be exported to Dynamics 365 for Operations, creating a complete end-to-end process covering "from quote to cash".

"Plan the work" and "work the plan" are essential activities where all stakeholders can participate and cooperate, ensuring that we deliver what is planned and that the invoice the customer receives corresponds to the agreement and contract. Having predefined WBS structures in Microsoft Project simplifies project planning, and VSTS is automatically updated according to how the planning is performed.

 

Who uses it? Presales, Sales and Project management.
When to use it? Microsoft Project is excellent for handling WBS structures when planning and quoting a project. Microsoft Project is also used for planning resources and for reaching project deadlines. For more information on how to connect VSTS and Microsoft Project, take a look at https://www.youtube.com/watch?v=GjYu5WmcQXo
Pricing 30$/user/month for Project Online Professional
https://products.office.com/en-us/project/compare-microsoft-project-management-software?tab=tabs-1

Microsoft Outlook: Inform and Alert

Some stakeholders do not want to go deep into VSTS, or to extract information from Excel/Project. Also, when tasks are assigned they want to be informed, and when issues are resolved they want to be notified. Setting up notifications in VSTS solves this requirement and will keep project participants informed of any changes. The email also contains a URL directly to the task/backlog item.

Setting up notifications is done in VSTS, and individual filtering can be defined.

Who uses it? All project participants receive assigned notifications. Project managers and solution architect receive all notifications.
When to use it? When Outlook is used to keep participants informed.
Pricing Outlook included with Office 365 prices. No additional costs.

Microsoft Teams: Discuss and Involve

Informal communication is vital for any project. Tools like Skype for Business take care of meetings and screen sharing, but Microsoft Teams gives flexible communication on all platforms and keeps everyone in the loop. The users can see content and chat history anytime, including team chats with Skype that are visible to the whole team. Private group chats are available for smaller group conversations. Microsoft Teams can also function as the center point, with direct tab pages towards VSTS, Dynamics 365, LCS, SharePoint etc. Since this September Microsoft Teams supports guest users, and since these sites normally are on the customer's tenant, we consultants log in with our company email addresses.

The VSTS Kanban board is easily accessible from Microsoft Teams.

Who uses it? Project participants who need informal communication and the ability to work asynchronously with a discussion history.
When to use it? When more direct communication is needed, and especially for developers.
Pricing Teams normally included with Office 365 prices. No additional costs.

Microsoft SharePoint online: Documents and Archive

Even in a highly interactive and iterative environment, there is a need for documents, especially for deliverable documents. For this, SharePoint Online is used to store, track and develop the documentation. The internal folder structure is optimized for the sales process, and contains commercially binding documents. The SharePoint Online site mentioned here is the site that is the customer's property. The following document structure can be recommended.

After the project is delivered, the SharePoint site will remain as the documentation together with the VSTS site.

Who uses it? Project participants who need to create or use formal documentation and deliverables.
When to use it? When producing specific deliverables.
Pricing SharePoint is included with recommended Office 365 E3 prices.

Microsoft Flow and PowerApps: Workflow and Apps

Microsoft Flow and PowerApps are quite new technologies in the Microsoft Office family. The idea of bringing these tools into scope is to have process and workflow automation in the implementations. PowerApps is also a great tool for data collection in testing and for getting feedback.

Some examples of Microsoft Flow:

  • Streamline approvals by sending files with approval requests
  • "I'm sick" button → inform colleagues and block the calendar

Some examples of PowerApps:

Who uses it? Superusers and Architects
When to use it? For automating tasks and creating fast, simple prototype apps that can assist in the implementation.
Pricing Flow and PowerApps are included in a Dynamics 365 Plan 2 license.

I hope this blog post gives an insight into the digitalization process partners are now using in Dynamics 365 implementations. The Microsoft sites contain tons more information, and I recommend exploring more of the Microsoft technology stack that is available for Dynamics implementations.

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

 

Dynamics 365 for Finance and Operations On-Prem

Now the On-Prem system requirements are available: https://www.microsoft.com/en-us/download/details.aspx?id=55496

To get it running, the minimum recommended requirements are:

Total number of instances (VMs/machines): 21

Total number of CPU cores: 104

Total memory: 408 GB

This minimum configuration is estimated to be able to support 240-1200 users.

To sum it up: go cloud. Much smarter.

Dynamics 365, PowerApps, Flow and Common Data Model

This summer at WPC, Microsoft unveiled the Dynamics cloud strategy by explaining their new initiative named Dynamics 365. Let me put it very briefly: IT ROCKS!

A good Q&A blog post explaining it is this one from James Crowter. The essence is that Dynamics 365 will be available in two editions: Business (the cloud-based NAV edition) and Enterprise (the new Dynamics AX, aka AX '7'). In addition, Microsoft has also launched AppSource, which can help in finding the right business apps available from other ISVs/VARs. This is a great offer to customers, where 3rd-party apps and extensions can be previewed.

As the new name 'Dynamics 365' implies, there will be a tight connection to the Office 365 package. If there is something Microsoft is good at, it is cross-selling and building strong dependencies through the entire stack of Microsoft technology. This will further strengthen the offering. Some concerns are that the total offering could be regarded as an increase in costs. Very often we see customers comparing offers based on the wrong assumptions, where on-premises offers are compared with cloud and SaaS offerings. This gives the wrong perspective, because on-premises offers often don't include all costs related to implementing and running the systems. What looks cheap today may in the longer run actually result in higher costs and the build-up of technological debt. When making the classic trade-off decisions in technology, make sure you understand the implications.

Dynamics 365 is more than just a rebranding, and the introduction of the new Common Data Model (CDM) is the glue (database) that will stick all pieces/entities together. We can expect that in the future, all the components will work together across the ordinary product lines as we know them today. Customers will download an app, and won't care whether they have a Business or Enterprise edition of Dynamics.

CDM will over time make sure that Microsoft PowerApps enables users to create applications for Windows, iOS, and Android mobile devices. Using these apps, you can create connections to common SaaS services, including Twitter, Office 365, Dynamics 365, Dropbox, and Excel. Making all kinds of apps will be easier, and in many cases will not even involve any coding.

My Dynamics friends, please try out Microsoft PowerApps, because this is a central element in the future of Dynamics 365, and also check out Microsoft Flow to understand how the CDM will in the future enable the flow of data and processes between all components in the Dynamics 365 and Office 365 landscape.

Again we have a lot to learn, and I'm amazed how fast the transition to a cloud-first and mobile-first business environment is going. This change will also have ripple effects on the entire ecosystem. New technologies require new organizational approaches and new workforce skills and knowledge. I assume that we will again see consolidations and mergers among the traditional ERP vendors, where the traditional web and .NET consultancy is consolidated under the Dynamics 365 umbrella. We can also assume that smaller ERP vendors are simply too small to master all these new technologies, and will slowly fade away. Soon, most of our business processes will be handled on your mobile phone, backed by the cloud.

And remember, your best bet is to learn!

DAX 2012 end-to-end processes and APQC

With LCS (Lifecycle Services), Microsoft has taken steps to introduce the APQC (American Productivity & Quality Center) Process Classification Framework as one of the business modelers.

Microsoft has also created some flowcharts to visualize the APQC processes in Dynamics AX.

In some Dynamics AX implementation projects it has been decided to use the APQC model, and the participants (including me) were struggling to connect the APQC model to the actual tasks and activities performed in DAX. So I started on a small mental journey to better understand how to use the APQC model, and to be able to connect the dots from the classification framework to Dynamics AX swim lane process maps.

First I thought it was possible to take some level 4 APQC processes and try to draw them. It is possible to draw the small and simple activities, but I was struggling to draw an end-to-end process, like order-to-cash. I quickly realized that I cannot use the LCS/APQC model to explain end-to-end DAX processes, since end-to-end processes often involve many different departments, functions and processes.

To visualize a complex end-to-end process like order-to-cash using the APQC classification model, I could imagine it looking something like this.

Here we see the involved processes visualized with the APQC classification, and it makes sense. But it does not tell you how to use AX in this process, because that is described at level 4 (or 5) in APQC. For example, 8.2.2 Invoice customers would be a specific process that we can visualize in Dynamics AX/LCS.

The other thing I realized was that APQC is not about implementing an ERP system like DAX, but about improving the processes you have (regardless of what systems and applications you use). Some of the activities may be handled in DAX, but many are just about improving the actual steps the company performs. Very often it is the working procedure, involving many systems and internal working processes. So you don't ask "How does this APQC process look in DAX?". You should rather ask "What APQC processes are involved in the selected DAX processes (like order-to-cash)?"

The APQC model will therefore not tell you how to use and set up Dynamics AX. It just helps you to have a common language to classify your processes and to improve them. When it comes to visualizing Dynamics AX processes in swim lane flowcharts, it can be a good idea to use the APQC identifiers as process identification.

My tip when using LCS, is to understand the APQC process identification model, and use it as a repository and building blocks when you create your own processes.

By referring to APQC in your own business model, you can start a process improvement project for that specific process. You can also evaluate implementing the benchmark indicators that APQC exemplifies.

My quick summary is therefore:

1. Do I like LCS, and will I use it? → Yes, and I will use it for implementation and DAX process visualizations.

2. Do I like APQC? → Yes, and I will use it for classification, benchmarking and process improvements.

3. Do I think that the LCS APQC processes reflect what we see at customers? → No.

4. What would be nice to have in the LCS business modeler? → More end-to-end processes, where the APQC process identifiers are used.

Happy DAX’ing !!

 

DAX 2012 R3 – Kick-back, Bonus and market support

We often see that vendors use loyalty marketing in the hope of nurturing customers to become even better customers. Introducing new products in a market can be costly, and success is often rewarded based on performance.

We often encounter requirements where vendors compensate with kick-backs, bonuses and market support on specific products. The agreements can be formulated like: "If you sell more than 1000 units per quarter, we will give you a 10% discount on purchased products for that quarter." But then the original invoice is on the full amount, and each quarter a credit note with the bonus is received. The idea behind kick-backs can be controversial, but that is not the topic here.

Having the cost price on the products determined by the actual purchase price gives a very good foundation for a healthy business model, and also gives much better insight into revenues and margins. But this "delay" in the transactions and payments introduces some issues for customers that rely on inventory models such as FIFO, FEFO etc.

  • What is the actual cost price in a scenario with kickbacks?
  • How can we calculate the actual revenue and margins?
  • Does my system support this feature?

If you have Dynamics AX 2012 R3, then you are lucky. There is a very nice way to handle the kickbacks' effect on product cost prices, making sure that the cost prices are adjusted accordingly.

The name of the feature is charges. With charges we adjust the cost prices on the right dates and on the right transactions. The inventory closing will make sure that the cost prices are settled to the sales order/issue inventory transactions.

The first step is to create a new charge, that will adjust the cost price on specific invoices, and also post on a selected ledger account.

Then let’s say the following received invoice from 2013 of 200 Surface Pro 128:

I see on the inventory transactions that the cost amount is 179,800 USD (200 units at 899 USD each).

The vendor now wants to give me a kick-back for my effort in managing to sell these old notepads. They send us a credit note of 50% of the amount, and I want to post this so that the cost price changes on the inventory transactions.

When this is posted it will adjust the cost of the inventory transaction on the selected posting date:

The voucher transactions look like this.

In AX 2012 R3 the inventory transactions will make the necessary adjustments on the issue transaction.

The next step is to receive the credit note/invoice from the vendor, post it against the selected ledger account on the charge code, and eventually also run the inventory closing procedure.

Conclusion:

There is no need to credit the purchase order to add kickbacks, bonuses, market support etc. It is supported in Dynamics AX 2012 R3.

Happy DAX’ing


Sneak preview of the WMS E&E

Last week I had the first preview of some of the new features in WMS E&E. The development is moving forward, and at the current stage we are 30% finished. We are currently 11 people working on the solution, so this is a major investment for us. We have been lucky to get 4 developers from Russia and Ukraine. We are using Skype and Live Meeting, together with Citrix/Terminal Server, and it really works out.

The Inventory II module from FSB-development is paying off, and ensures that this new solution gets the performance and scalability it needs.

The first clients have looked at the solution and are so far happy with the progress.   

Here are some pictures from the presentation:

Gantt Scheduling of Input port

Here is one of the cool features we are working on: a Gantt-based scheduling tool for inbound logistics. This is the process of directing trucks and containers to specific input ports. The user interface is simple and allows for moving the containers between ports and in time. The Gantt view will also include a scheduling program, so that the user can recalculate the real ETA date/time. When moving a container in time, the expected delivery date for that shipment will update the rest of the system, and the receive process will be visible for the user.

 

There is a requirement to schedule the receipt of containers. Each container can be placed at a specific input port, and is also allocated a timeslot for emptying the container.

Each port has an attached working calendar and a capacity. The capacity can be expressed in cartons/hour.

In the Gantt view, it must be possible to reschedule containers and also to move them between ports.

The scheduling engine must first allocate a container to a free port, and then find the appropriate timeslot for the container. As soon as the container has been given a port, the receive can start.
When double-clicking on a container, the container form must be brought to the front.
At the bottom, the planned capacity must be shown, displaying the number of cartons expected per hour.

E&E WMS extended with license plate

Yesterday we extended the system with a new storage dimension, called LicensePlateId. This is a unique, barcoded ID of a carton that can be used anywhere in the warehouse system, both on put-away and on picking. The ID is also used in the integration with the conveyor and miniload systems. The process of extending the system with a new dimension was quite easy: just search the Dynamics AX AOT for the macro "inventdimdevelop", and all places that needed to be extended come up. The whole process was done in under 3 hours. Now the tricky part is starting, and that is to build all the functionality on top of it. A tiny sketch of what the new dimension looks like in code follows below, and I included a small picture of the license plate.
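
Just to illustrate, here is a minimal throw-away job (the license plate and warehouse values are made up) that stamps the new LicensePlateId field on an InventDim combination. The macro mentioned above is only a search marker; the real work is in all the methods the search points you to.

static void LicensePlateDimensionExample(Args _args)
{
    InventDim inventDim;

    // The new storage dimension added to InventDim (search the AOT for the
    // inventdimdevelop macro to find every method that must be reviewed).
    inventDim.LicensePlateId    = 'LP-000042';  // made-up example value
    inventDim.InventLocationId  = 'MW';         // made-up warehouse

    // findOrCreate returns the existing dimension combination or creates a new one.
    inventDim = InventDim::findOrCreate(inventDim);

    info(strFmt("InventDimId for the combination: %1", inventDim.inventDimId));
}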
 

Module to be included in E&E WMS II

The original guys behind the inventory transaction system (FSB – Flemming, Søren and Benny) are now out with a new module called Inventory II. This module eliminates the need for large periodic inventory closing timeslots and introduces some new technology called Watermark and Snapshot. The module also claims to have fixed one of the major drawbacks in the current system: the reservation mechanism. In Inventory II it is possible to make a reservation on a higher dimension level. You can reserve against the total warehouse and still be able to move the goods around within the warehouse. The exact location or serial number can be specified at a later time, let's say when the items arrive in the picking area. Today I will take a closer look at this new module, which I know Microsoft has also evaluated. My plan is to use this module as the backbone of the E&E WMS module. Here are some details that FSB has officially announced:

Watermark technology

The architecture of Inventory II is based on the new revolutionary Watermark database technology. This technology ensures that even large and complex inquiries can be addressed by the database without performance slow down, even on growing databases.
All database inquiries will target a very small amount of data, as transactions are divided into relevant and non-relevant data in a very efficient way without the use of complex index keys, resulting in second-to-none performance.

What is the impact of this technology?

Having the Inventory II architecture built on this technology gives significant advantages in the daily operation. These include:
Cost prices are calculated real-time
Master scheduling during daily operation
True 24/7 capabilities
No traditional inventory closing

Real-time reliable cost prices

The Inventory II module is monitoring all inventory movements, resulting in immediate action upon arrival of new or changed cost prices. Cost prices are real-time calculated, adjusted and updated throughout item transfers and bills of materials.

No inventory closing function

Adjustment of item consumption is calculated and posted immediately, whenever identified, eliminating the traditional inventory closing function.

Cost price deviation monitor

Cost price deviations are identified and presented real-time in the deviation monitor, giving the finance department a unique tool to act on. Cost price deviation alerts can be defined, reducing the risk that simple errors, like key-in errors in the purchase department or from suppliers, reside in the system for a longer period and result in complicated cleanup tasks.
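
As a rough illustration of how such an alert could work (my own sketch with a made-up 20% tolerance, not FSB's implementation):

// Illustrative sketch only - not Inventory II code. Flags an incoming cost price
// that deviates more than a tolerance from the current average cost.
// The tolerance and the prices are made-up numbers.
static void costDeviationAlertDemo(Args _args)
{
    real currentAvgCost = 10.00;   // current average cost for the item
    real newCostPrice   = 13.50;   // cost price on the incoming purchase line
    real tolerancePct   = 20.0;    // allowed deviation before an alert is raised
    real deviationPct;

    deviationPct = abs(newCostPrice - currentAvgCost) / currentAvgCost * 100;

    if (deviationPct > tolerancePct)
    {
        warning(strFmt("Cost price deviates %1 percent from the current average - please verify.", deviationPct));
    }
}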

Physical cost valuation

Inventory valuation can be fixed on packing slip prices instead of on delayed invoice prices. Upon receipt of invoices, deviations are posted on dedicated General ledger accounts.

New model for weighted average cost

A new simplified periodic average cost price model is introduced. Average cost prices are always calculated as clean weighted average prices for individual periods. Issues related to an uncontrollable number of settlements are eliminated, as settlements are no longer created.
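
To show what a clean weighted average for a single period means in numbers, here is a small sketch of mine (made-up receipts, not FSB's code): the average is simply the total receipt value divided by the total receipt quantity for that period.

// Minimal sketch - not Inventory II code. The clean weighted average cost for a
// period is the total receipt value divided by the total receipt quantity.
// Receipt data is made up.
static void periodWeightedAvgDemo(Args _args)
{
    // [quantity, cost price] for each receipt in the period
    container receipts = [[100.0, 10.00], [50.0, 12.00], [25.0, 11.00]];
    container r;
    real totalQty, totalValue, avgCost, qty, price;
    int  i;

    for (i = 1; i <= conLen(receipts); i++)
    {
        r     = conPeek(receipts, i);
        qty   = conPeek(r, 1);
        price = conPeek(r, 2);

        totalQty   += qty;
        totalValue += qty * price;
    }

    avgCost = totalValue / totalQty;   // 1875 / 175 ≈ 10.71 per unit
    info(strFmt("Period weighted average cost: %1", avgCost));
}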

Intelligent reservation

Rule based reservation levels

With the introduction of reservation levels a new degree of flexibility is applied to the inventory. This will allow for reservation at less detailed levels, e.g. reservation at a warehouse without determination of location or batch at the time of order intake. Reservation levels can automatically be further specified prior to the time of picking to include e.g. what specific location to pick from. The process of picking is now a matter of reporting what specific items were picked: e.g. batch number, serial number, etc.

Transferring reserved items
Reserved inventory items can now be transferred. The warehouse staff is now free to manage the warehouse in an efficient way, without having to struggle with restrictions on reserved items.

FIFO/LIFO reservation and picking order

Inventory II allows for controlling and optimizing the order in which items are reserved. A FIFO or LIFO reservation order can ensure that the oldest or newest items are picked first, while a location reservation order can ensure that the physical process of picking is more optimal according to physical locations.
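
As a small illustration of the FIFO idea (my own sketch with made-up lots, not Inventory II code): the open lots are ordered by receipt date, oldest first, and the required quantity is consumed from the front; reversing the order gives LIFO.

// Illustrative sketch only - not Inventory II code. Lots are already ordered by
// receipt date, oldest first (reverse the order for LIFO), and the quantity to
// reserve is consumed from the front. Lot data is made up.
static void fifoReservationDemo(Args _args)
{
    // [lotId, open quantity], sorted oldest receipt first
    container lots = [["LOT-01", 25.0], ["LOT-02", 50.0], ["LOT-03", 40.0]];
    container lot;
    real qtyToReserve = 60.0;
    real take, openQty;
    int  i;

    for (i = 1; i <= conLen(lots) && qtyToReserve > 0; i++)
    {
        lot     = conPeek(lots, i);
        openQty = conPeek(lot, 2);
        take    = min(openQty, qtyToReserve);

        info(strFmt("Reserve %1 from %2", take, conPeek(lot, 1)));
        qtyToReserve -= take;
    }
}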

Lot reservation

Reservations can be made directly on specific incoming lots, leading to full traceability and transparency on reserved items.

24/7 capability

Run your Master scheduling any time
The Master scheduling can be executed at any time in daily operation and still delivers a consistent profile even though new transactions are created concurrently.

Open slots for Inventory closing not necessary

The traditional Inventory closing function is no longer relevant, as cost prices are calculated real-time. Because of that, there is no need to allocate exclusive time slots for this task during weekends or nights.

No worries regarding blocking locks

Neither Inventory closing (which is now just a matter of changing a date) nor Master scheduling causes blocking locks preventing daily operation from taking place.
Simply run your operation 24/7 if you prefer.

Performance and scalability

The foundation and architecture of the Inventory II solution has been designed with specific regard to high performance and scalability. The unique design patterns such as the Watermark and Snapshot technologies have been chosen for performance reasons.

The result of this architecture is revolutionary:
Blocking lock issues in inventory are nonexistent. The improved concurrency will set new standards for scalability.

Inventory inquiries execute incredibly fast, independent of transaction volume, even on historical data.

Real cost prices

Real cost prices are calculated real-time, but the individual users are not affected by this processing since it is handled by a smart background process.

 

Main components in E&E WMS

The WMS E&E will cover most major components, and here is a list/overview of what lies ahead of us. 

The solution will be wide and deep. As some have commented, many of these solutions already exist, but not with the extensions we plan to include.

One of the major elements is integration with conveyor and storage systems, like Univeyor.

Evaluation of the new Rapid Configuration Tool

I have been using the RCT tool for Axapta 3.0, and now it was time to look at the RCT tool for Dynamics AX 4.0. That was a disappointment. Much has been removed from the tool. Removing the checklist and the help file was a bad decision. The checklists are very user-friendly and give a much easier overview of the progress. Keep in mind also that not everyone will be using MS Project, or have it installed. I know about the hard work in coding the checklists, but my guess is that without them, the use of the tool will be marginalized. Trust me. The concept of having a nice graphical view of the implementation is important. My belief is that this tool will not be used without it, because the concept of a checklist gives an impression of a best-practice way of implementing Dynamics AX. The tool was also nice to have in the background to follow the progress of an implementation. People also returned to the checklists to double-check the setup, and to get access to videos, PowerPoint presentations and documents.

 

Our clients said "WOW!" when they understood and saw the easy visual GUI of the RCT. Now we only have the boring normal presentation with trees, grids, tabs and buttons.

 

I’m also missing the nice help file that was included and integrated into the RCT of Axapta 3.0.

I feel that the RCT in Ax 4.0 is only a shadow of the solution that was available in Ax 3.0. Please include a decent help file and a checklist.

 

Take a look at the following screens, and tell me which is the most user-friendly:

Sorting on the “Release sales order picking”

It seems that the sorting and filtering of the "Release sales order picking" form doesn't work.
That's too bad, because often you want to filter and sort based on date, sales order number or customer.
 
When you try to sort, nothing happens. I think the reason for this is that there is an inner join between the SalesLine table and a temporary table.
 
I'll dig into it and check if it's easy to change.
.
.
.
Actually, the sorting query is set when you open the window, and it cannot be reset before you go out and in again. I had great expectations for the new functionality, but I see now that it's too shallow. More advanced companies can have 20,000+ different items and more than 20,000 open orders. Then it's impossible to do a manual item allocation. Even the form will be extremely slow, because it will take a very long time to traverse the 20,000 orders and place them in the temporary table. This means there is still a need to create a lot of add-ons for advanced distribution and picking.
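
When I dig into it, the first thing I would try is something along these lines (an untested sketch of mine, not a confirmed fix, and with the inner join to the temporary table it may well not be enough): rebuild the sort on the SalesLine form datasource query and re-execute it.

// Untested sketch of a possible workaround - not a confirmed fix. Rebuilds the
// sort on the SalesLine form datasource query and re-executes it; with the
// inner join to the temporary table this may still not be enough.
void reapplySorting(FormDataSource salesLine_ds)
{
    QueryBuildDataSource qbds = salesLine_ds.query().dataSourceTable(tableNum(SalesLine));

    qbds.sortClear();                                             // drop the sort applied at form open
    qbds.addSortField(fieldNum(SalesLine, SalesId), SortOrder::Ascending);

    salesLine_ds.executeQuery();                                  // refresh the grid with the new sort
}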
 
I also see that this form actually works order line by order line, but often a customer primarily wants to see the allocation order by order. When an order line is physically reserved on another warehouse, then the need for allocation should be cleared.
 
Often there is a need to create some kind of "wave" solution, where you also take into account the movement of items from one warehouse/location to another. Often the picking area is too small to handle items for all the orders, so there is a need to set a limit. Therefore the customer often decides that they should be able to deliver, say, 3,000 orders the next day, and the system should create transfer orders to supply the right items. It is important to remember that you need to move the right items, or you will get a large backlog of orders waiting for only a few items.
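
As a rough sketch of the wave idea (my own illustration with made-up data and a made-up cap, not an existing AX feature): orders are added to the wave until the cap is reached, and whenever the picking area is short on an item, a transfer need is reported so the goods can be moved in first.

// Rough illustration of a wave-building idea - my own sketch with made-up data,
// not an existing AX feature. Orders are added until the wave cap is reached;
// if the picking area lacks quantity, a transfer need is reported instead.
static void buildWaveDemo(Args _args)
{
    int maxOrdersInWave = 2;       // wave cap (made up; the customer might say 3000 orders per day)
    // [orderId, itemId, quantity] - made-up open orders
    container openOrders = [["SO-1", "ITEM-A", 10.0],
                            ["SO-2", "ITEM-B", 5.0],
                            ["SO-3", "ITEM-A", 20.0]];
    Map pickOnHand = new Map(Types::String, Types::Real);   // on-hand in the picking area
    container order;
    str  itemId;
    real qty, available;
    int  i, ordersInWave;

    pickOnHand.insert("ITEM-A", 15.0);
    pickOnHand.insert("ITEM-B", 50.0);

    for (i = 1; i <= conLen(openOrders) && ordersInWave < maxOrdersInWave; i++)
    {
        order     = conPeek(openOrders, i);
        itemId    = conPeek(order, 2);
        qty       = conPeek(order, 3);
        available = pickOnHand.exists(itemId) ? pickOnHand.lookup(itemId) : 0;

        ordersInWave++;
        if (available >= qty)
        {
            info(strFmt("%1 added to wave - pick %2 of %3 from the picking area", conPeek(order, 1), qty, itemId));
        }
        else
        {
            info(strFmt("%1 added to wave - transfer %2 of %3 to the picking area first",
                        conPeek(order, 1), qty - available, itemId));
        }
    }
}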
 
I personally would like to add a lot more functionality here.
 

My first error located in Ax 4.0

It isn't my intention to talk negatively about the new Ax 4.0 version, but I will use the blog to find and document bugs in Ax 4.0. Later I will send in a Service Request to Microsoft, so that everyone can benefit from improved software quality.
 
Here is my first bug:
 
A new nice feature in Ax 4.0 is the price simulation in sales and projects. The tool aids sales personnel in calculating and saving various scenarios for sales orders. But if you set the new contribution ratio to 100, you will get a division by zero error, as follows:
 
Error:
Error executing code: Division by zero.
Stack trace
(C)\Forms\SalesQuotationPriceSimulation\Data Sources\SalesQuotationPriceSimTable\Fields\ContributionRatio\Methods\modified
(C)\Classes\FormRealControl\modified
(C)\Classes\FormRealControl\leave
(C)\Classes\FormRun\task
(C)\Classes\SysSetupFormRun\task – line 20
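
The division happens in the ContributionRatio modified() method shown in the trace. My assumption (not verified against the actual form code) is a formula like salesPrice = costPrice / (1 - ratio/100), which divides by zero when the ratio is 100; a simple guard would avoid the error:

// My own sketch of the kind of guard I would expect - the formula is an
// assumption, not the actual SalesQuotationPriceSimulation code.
static void contributionRatioDemo(Args _args)
{
    real costPrice         = 80.0;
    real contributionRatio = 100.0;   // the value that triggers the error
    real salesPrice;

    if (contributionRatio >= 100)
    {
        warning("A contribution ratio of 100 or more would mean dividing by zero.");
        return;
    }

    salesPrice = costPrice / (1 - contributionRatio / 100);
    info(strFmt("Calculated sales price: %1", salesPrice));
}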
 
I expect that it’s an easy thing to correct in future versions.

First hands on Dynamics Ax 4.0

Well... finally I managed to get Ax 4.0 up and running on my Windows XP laptop. Actually it was quite easy and straightforward: just run the installation and you are up and running. The only thing that needed to be present was a connection to AD (Active Directory). After that, you can take the PC with you anywhere. Now I will start digging into the application...