About Kurt Hatlevik

I appreciate that I can be a part of this worldwide blog community—as a consultant working from Norway, the blog lets me share more than 20 years of experience with Microsoft Dynamics 365. Along the way, I participated in developing retail, PDA/RF, barcoding, master data, kitting and WMS solutions for Dynamics. My blog focuses on my deepest interests and expertise: along with a 360-degree view of digital transformation topics, I welcome opportunities to dive into retail and intercompany supply chain automation, logistics, and production—everything that is moving around in a truly connected enterprise. As an Enterprise Architect on Dynamics 365, I specialize in strategic development and planning for corporate vertical solutions and work to build international networks that increase knowledge and understanding of Dynamics 365. As an advocate for both providers and customers, I'm committed to ensuring that customers' constantly changing needs are met, and I see community as key to increasing expertise. I welcome you to connect with me.

D365 X++: I’m using InventABC as my template

Hi fellow developers.

Back in the early days I was at a good old Damgaard conference and attended a technical session. In this session, one of the founding fathers of Axapta came up with the phrase "Copy with pride". What he meant was to look at existing code and patterns in the X++ code, and to feel free to use and copy these patterns from the Microsoft code in customer customizations and extensions. I have lived by this principle and "copied with pride", and I encourage others to feel free to take my work and copy what they need.

One of the more common customization/extension requests is a class that does some magic, has a dialog and a query, can run in a recurring batch, etc. I guess all developers have their own approach, and as the blog title suggests, I'm using the InventABC* classes as a "copy with pride" template for my code. I know this code is more than 20 years old, and I guess there are better and more advanced patterns to use. I don't say this pattern is the best, but for me it is a fast and easy way to create simple periodic recurring classes. Any feedback is welcome, and feel free to share your approaches.

The InventABCUpdate class contains the code for creating a dialog, executing in batch, handling parameters, and most of what I need. When I'm done, my final code does not resemble the original InventABCUpdate code, but for me it is a journey where I have a fixed starting point and then make constant adjustments until I have reached the desired solution.
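To make the idea concrete, here is a minimal sketch of the RunBaseBatch pattern that InventABCUpdate (and the Tutorial_RunBaseBatch class mentioned in the edit at the end of this post) is built on. This is not the actual product code; the class name, the single parameter and the labels are all made up for illustration.

    // Minimal "copy with pride" sketch of the RunBaseBatch pattern (illustrative names only)
    class MyPeriodicJob extends RunBaseBatch
    {
        TransDate   fromDate;       // example parameter shown in the dialog
        DialogField dlgFromDate;

        #define.CurrentVersion(1)
        #localmacro.CurrentList
            fromDate
        #endmacro

        // Serialize the parameters so the batch framework can store and restore them
        public container pack()
        {
            return [#CurrentVersion, #CurrentList];
        }

        public boolean unpack(container _packedClass)
        {
            int version = RunBase::getVersion(_packedClass);

            switch (version)
            {
                case #CurrentVersion:
                    [version, #CurrentList] = _packedClass;
                    break;
                default:
                    return false;
            }
            return true;
        }

        // Build the dialog shown to the user (also reused when scheduling the batch)
        public Object dialog()
        {
            DialogRunbase dialog = super();

            dlgFromDate = dialog.addFieldValue(extendedTypeStr(TransDate), fromDate, "From date");
            return dialog;
        }

        public boolean getFromDialog()
        {
            fromDate = dlgFromDate.value();
            return super();
        }

        // The actual work happens here, both interactively and in batch
        public void run()
        {
            info(strFmt("Processing from %1", fromDate));
            // ...business logic goes here...
        }

        public static void main(Args _args)
        {
            MyPeriodicJob job = new MyPeriodicJob();

            if (job.prompt())
            {
                job.run();
            }
        }
    }

From here I replace the dialog fields, the query and the run() logic step by step until the class does what the customer needs.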


For any of you who are starting the journey of learning X++ development and want to quickly be able to create solutions for your customers, it is well worth the investment to study the InventABC classes to speed up your coding.

Happy coding friends.

 

EDIT: The community responds quickly. A better class to use could be Tutorial_RunBaseBatch.

D365 commerce; The need for speed (replenishment)

A nightmare for retailers is situations like this, where customers are experiencing empty shelves.

The normal process for handling these situations is to have store employees constantly monitor shelves and back-office stock, and order replenishment when needed. We also see a lot of number-crunching demand forecasting systems being offered to the market, with a questionable success rate.

Up until now, we have seen systems with quite slow reaction times. Goods are replenished and received. When selling the products, there were delays in getting updated on-hand figures, and very often a nightly master planning run created planned orders that are manually firmed and then sent to the vendors or the central warehouse. In essence, the process of getting replenishment signals through the supply chain can take days.

With Dynamics 365 Commerce I now see the maturity to speed up the replenishment process, making it possible to dramatically shorten the lead times in each step. Let me explain:

  1. Retail statement trickle feed
    I have covered this topic a few times before, but in essence this means that the sales transactions generated from a POS sale are updated at a much faster rate. Having an updated on-hand is essential, so that the outbound sales transactions reflect the actual situation on the shelves.
  2. Inbound inventory operations in POS/Handheld
    The ability to quickly post the arrival and receipt directly in the POS ensures that inbound transactions are updated in real time. Replenishment can also be quickly requested manually, directly in the POS.
  3. Outbound inventory operation in POS/Handheld
    Requesting goods from other locations faster can reduce lead time and prevent stock-out situations. If products are available at a nearby store or warehouse, having processes to make them available across nearby sales channels can speed up replenishment.
  4. Planning Optimization
    Nightly master planning (MRP) is too slow for retailers and is better suited for businesses with longer lead times. Retailers are looking for speed. With the Planning Optimization features we have close to real-time generation of planned purchase and transfer orders. As soon as it is needed, the system can initiate the supply chain process based on the on-hand and future expected transactions. In standard D365 there are also automatic firming processes that will generate purchase and transfer orders. These orders can be further processed and automatically sent to vendors or sourced from other storage locations. It can also be used to automate transfers from back-office storage onto the store shelves.
  5. Connected store
    The solution is still in public preview, but it promises some exciting capabilities that can visually monitor your store and enable triggers that start executing supply chain processes. The first capabilities are "Display effectiveness", "Queue management" and "Shopper analytics", where the data collection is based on camera technology.

 

By optimizing the flow, it should be possible within Dynamics 365 to speed up and automate the replenishment process so that it is executed within 10 minutes after the sale has been conducted. Also, signals from the connected store solution can be used to automatically adjust the minimum on-hand/shelf levels based on actual observed data. The gains and possibilities in what Dynamics can now offer can bring profit back to modern brick-and-mortar retailers.

I hope this can inspire people to look deeper into the capabilities we now can deliver.

 

 

D365 Outsourcing your master data (DaaS)

In Dynamics 365 implementation projects I often say that all we do can mainly be categorized into 3 headline topics.

As we know for Dynamics 365, Microsoft is providing the software and the platform needed. It is easy to buy as a service where only a monthly commitment is essentially required. This is the nature of the Software-as-a-Service cloud-based concept.

The implementation partners are the best at structuring an implementation project and guiding you step by step through the jungle. A lot of knowledge is needed to understand the complex processes in an organization. The partners typically work tightly with people, ensuring that the organizational machinery is oiled and running smoothly, and define processes that follow the entire end-to-end flows like procure-to-pay or order-to-cash.

The third element of equal importance is master data. I have written some previous blog posts about the subject that are worth checking out. Traditionally, building the master data has often been the responsibility of the organization implementing Dynamics 365 and has been regarded as the heart and soul of the organization. The data is often manually built/generated and maintained, and low quality in master data can have catastrophic effects in any organization. If you cannot trust your data, then you do not have the information needed to make good business decisions.

Traditionally this has been identified as an integration requirement, but the main "ownership" of the data has still been handled internally in the company. Here is where I see a change. Instead of maintaining your own master data, the master data is maintained through cloud-based public services operated for a monthly fee. Just like SaaS (Software as a Service), we see mature implementations of DaaS (Data as a Service), where Dynamics 365 customers are closely integrating with and outsourcing much of the maintenance to vertical-specific online services.

But one aspect I see is that the data providers are not global actors; they tend to be more local, verticalized services for specific domains. To be specific, I would like to name-drop some providers here in Norway that I have encountered offering such services.

BREG – Brønnøysund Register Center

The Brønnøysund Register Center develops and operates digital services that streamline, coordinate and simplify dialogue with the public sector for individuals and businesses. They operate many of Norway's most important registers, which contain information about companies, roles, tax, etc. Many of the services are free, and you can read more about them. If you need validated and confirmed information about any organization in Norway, these are the registers you need to integrate with. My friend Fredrik S from Microsoft has created many demos showing how easy it actually is to set this up.

BISNode – Integrated credit check and risk management

Knowing the commercial risk is essential for all businesses. By having updated information, the decisions become less risky and less labor intensive.

1881 – search and return person address information

1881 is Norway's leading provider of personal and business information, providing information on telephone numbers, names and addresses. By looking up databases like 1881, you instantly get address information that enriches your data and simplifies transaction handling.

GS1 – The Global Language of Business

GS1 is the main provider of a lot of supply-chain-oriented master data. Here you maintain product GTINs/barcodes, and they also provide a GLN (Global Location Number) register. When working with delivery addresses, this is a must-have, because it ensures that goods are shipped to and received at the right places. For a small fee, you get access to updated addresses directly in D365, where the addresses are also enriched with GPS coordinates. One more relevant aspect of GS1 is the GPC (Global Product Classification), which makes it easier to search for products globally and is also a very good reporting/analytics structure.

TradeSolution – The Norwegian Grocery PIM

If you are going to sell or purchase products through the Norwegian grocery chains, you need a close connection with TradeSolution. I have written about them previously, but they make sure you have a reliable source of product master data and product properties. If you are using their services, there is no need for a third-party PIM solution. They also provide a media store for product pictures.

NOBB – The Norwegian Construction PIM

NOBB contains almost 1,000,000 articles from 700 suppliers. You will find a wide range of product information, e.g. lumber, building materials, hardware, tools, fasteners, paints, house and garden, water/plumbing, electrical, etc. The database contains basic data, prices, logistics data, images and documentation, streamlining the industry's need for structured and quality-assured basic data. The quality of the product database is ensured through the industry bodies the Quality Forum and the Standardization Committee. The item owner updates and maintains the information based on industry standards (ref. the quality forum and standardization committee). This is a unique quality assurance and proximity to the industry that no other players can offer.

Elfo – The Norwegian electronics PIM

The Electronics Industry Association – EIF – is an industry association for Norwegian-based companies that run electronics-related activities mainly aimed at the professional market, either as importer, manufacturer or developer.

Farmalogg – The Norwegian pharmacy industry PIM.

The product register, with few exceptions, covers all goods that are sold in pharmacies, and it contains information that is necessary for the safe and efficient handling of the goods throughout the value chain from manufacturer / supplier, through wholesaler and retailer, to end-user.

Prisguiden – Compare your prices

A price database that allows you to compare your prices with competitors'. You can also measure the popularity and trends in the market. What do customers search for? Tight integration with the market makes decision-making easier and more automated.

Consignor – Easy shipping

Delivery management is all about connecting your warehouse to your customers in the most efficient way. By making one standard integration to services like Consignor, they make sure that no matter what combination of carrier services you choose, customers will get the same high-quality feeling when receiving a delivery from you.

Currency exchange rate

This service is already present in standard Dynamics 365 – Start using it!

There are surely many other master data providers; here I have listed a few actors in the Norwegian market. By outsourcing your master data maintenance, you get much higher data quality and more return on investment.

Are you ready to outsource your master data?

DaaS – Leben ist kein Ponyhof (life is no pony farm)

 

 

D365 – My Covid-19 10-day response story

Hi Friends.

I hope you all are hanging in there and can still work and deliver excellent experiences with Dynamics 365.

I wanted to share my Covid-19 10-day response story about how fast a reduced-scope Dynamics 365 implementation was made available. Some weeks ago, we and Microsoft were contacted by an important player in the health industry that urgently wanted to establish purchasing and supply-chain processes for medications and equipment. The key element here was the urgency, because it was unclear what direction the pandemic would take here. What the customer needed were tools that could process information about supply providers and what kind of supplies are needed for readiness stockpiling. Our first step was to set up Dynamics 365 (CRM) to store relations, and this was done in a few days. The next step was to set up and go live with a "minimum viable product" of the Dynamics 365 finance and supply-chain apps. We had a goal of doing this in 10 working days. This is the story I would like to share.

Day 1: Onboarding, tools, and deployment

At the initiation of a project, I always have a document named "Welcome to the [Customer] project". This is a great document because it contains all the essential information about onboarding to the project and can be shared with all participants. It is typically a 6-7 page document explaining the onboarding process and the main objectives. It also contains references to LCS, SharePoint/Teams sites, DevOps, and URLs to environments. The most valuable element is a full overview of all the people that will somehow be involved in the project. In this project we decided on a small, efficient 4-person team (POD) and fast-track support from Microsoft.

Microsoft quickly processed licenses, and we quickly deployed the LCS project. The first thing we did was deploy the Tier-2 sandbox, which we named the 'UAT' environment; it was used as the master data/golden environment in the beginning. We also deployed the Tier-1 sandbox and named it 'Test'; it would be used for access to Visual Studio etc. The initial version we deployed was 10.0.10.

We have ready-made implementation templates that are imported into DevOps, containing the main structure of requirements and tasks. We scoped this down to the actual processes we needed.


We also have a ready-made folder structure for the Teams site where we can store and complete all documentation. By the end of the first day we had established the tools needed for starting the project.

Day 2: Working with the generic tasks in the backlog

We established a 30-minute daily sprint meeting with the main implementation actors, where the plan is presented and today's tasks are prioritized. We did not have the time to create large Word documents, so we decided to document the solution in DevOps and organize all the system setup around the entity templates as they can be extracted from D365. I exported the templates to Excel and then imported them to DevOps using the Azure DevOps Office® Integration, which gave us 419 tasks for setting up as much as possible in standard.

This makes it possible for us to have a step-by-step task list of all the elements needed to build the "golden environment". Each task is assigned, and the actual setup is documented with a direct URL to the D365 form and a screenshot of the actual setup.

On the first day we were able to process close to 200 tasks, setting up the most generic parts of the system.

Day 3: Working with the finance task backlog

When working on the finance setup, we had a standard chart of accounts we imported, and we had to set up financial dimensions. We also set up the accounting structure, created a few inventory posting profiles and set up tax parameters. Normally this is quite straightforward, and we can reuse much from previous projects.

Day 4-5: Working with products

Now the Excel skills are put to the test. We have an Excel sheet that contains most of the product master data: in total over 33,000 products, and each product has classifications, attributes, properties, and vendor/producer information. We quickly decided to use the same item numbering as in the Excel sheet. Each column in the sheet was classified by:

  • Is this a field we have in D365?
  • Should the field become a category in a hierarchy?
  • Should the field become an attribute?

Getting the products in required a very advanced copy/paste/merge of data into Excel sheets that we then imported into Dynamics 365. In the end, we realized that all the information we had could be imported, without any information loss. It was hard work, but the end result was promising, containing a list of all available medical supplies classified into the medical ATC structure.



We also imported barcodes, vendors, producers, employees, address information, external item names/descriptions, and attributes.

Day 6: First demo, UAT and deploy production environment

On day 6 we were ready to show the actual master data and the initial view of the system. The customer was impressed by how fast we were able to build a system and processes that were familiar to their operation.

We decided to update the system to 10.0.11, and in parallel with the setup of the system we had been working closely with the Microsoft FastTrack solution architect to make the environments ready for production deployment. After a few iterations we got the production environment up and running and performed a DB refresh of the production environment with the master data we had in the Tier-2 sandbox. This meant that we now had an environment available to start performing transactional process testing and trimming the system. I know this is not the normal way of doing this, but thanks to Microsoft's understanding of the urgency we were allowed to go this "fast-track" route. In DevOps we established the processes we wanted to test and optimize.

Day 7: Test dual write, business events and power platform

As described earlier, we also implemented some of the "CRM" elements first. Now we could enable dual write and synchronize vendors, employees, and other information into CDS. Our first step was just to validate that it was working as expected in UAT, and it worked like a charm. We can now share this master data across the D365 platform.

The next thing was to test how we could use the business event framework to integrate with a third-party WMS provider. Dynamics 365 has a business event that fires when a purchase order confirmation is performed. We decided to enable purchase order change management to have a strict workflow and ensure that we could rely on the purchase confirmation process.

This allows us to create a solution where the business event is caught by a Power Automate flow that fetches all the lines of the purchase confirmation and then transforms them into the format the WMS provider needs. We can also enrich the data sent to the WMS provider, so that their system has all the master data it needs. The next step is to import receipt lines from the third-party WMS provider. This happens by Power Automate creating an arrival journal, then a batch job in D365 posting it and posting a product receipt. It all ends with a new business event being triggered (Purchase order received) that sends a message to the WMS provider that the goods have now been received. What we then achieve is that the on-hand in each system is synchronized, without any major delay caused by processing.

In total we have set up quite a lot of batch jobs that handle everything from cleaning to posting and planning. We used the learnings from the following blog post as a template for the batch jobs.

Day 7: Master planning and Planning Optimization

We expect that quite a lot of requisitions and requirements will be processed through the system, so using the new Planning Optimization engine from Microsoft suited the project well. Calculating the requirements for all products is extremely fast and done within minutes. This will allow for faster reaction times to new requirements and potentially reduce stockout situations caused by vendor lead time.

On day 7 we also imported all employees and created some approval position hierarchies. This way we can extend the workflow processing for approvals.

Day 8-9: Testing, Testing, Testing in UAT

We started day 8 by refreshing the UAT environment and executing testing according to the key business requirements defined in DevOps. We found 3-4 issues (index performance etc.) that were reported to Microsoft and quickly fixed within hours by the excellent support architects. We also wanted a visually nicer, more presentable purchase order form letter, and decided to import the modern reports package from Microsoft. This makes it a bit easier to adjust.

We did try out the configurable business documents, but in this case it would take a bit more time than we had to learn them properly and set them up correctly. Any issues we found were also fixed in the PROD environment.

The main processes we focused on were the procurement processes, with approval steps and manual coordination with vendors.

Day 10: Project closure and training

On day 10, we summarized how far we had come and created a project closure/summary report that also contains next steps and further backlog suggestions. We suggested additional focus on Azure Data Lake, Power BI, and implementation of a vendor portal. We also planned to perform training and make final changes to enable end-user onboarding. What we see is that making a system ready is not just setting up the system, but implementing the use of the system in daily operations. This is expected to take more time, and we are ready to respond.

Final words and tips

I really hope this system will show its value and be regarded as a small but valued contribution to the Covid-19 response. Microsoft has published the following page with resources that can help. Microsoft has also launched a program where you can get a 200-seat Dynamics 365 Customer Service system for free for 6 months for Covid-19 response related activities. See https://dynamics.microsoft.com/en-us/covid-19-offer/

If you have any similar stories, please share them. The Dynamics 365 community cares and stands united in this Covid-19 fight!

D365 Importing JSON data the hard way!

I recently created a solution where I import products and all related data for the grocery industry, and I wanted to share my experience so that others may follow. This is not a "copy-paste" blog post, but more a showcase of my approach to the process, which can be used when working with more advanced and complex JSON integrations. Many industries have established vertical-specific databases where producers, distributors and stores cooperate and have established standards for product numbers, product naming, GTIN, Global Location Number (GLN), etc. In Norway we have several, and the most common for the grocery industry here is TradeSolution. Most products are available to the public at VetDuAt.no, but they also have a Swagger API where the JSON data can be fetched and imported into D365.

One of the experiences I had when starting this journey is that D365 is not modelled according to how the data is structured in these industry-specific public databases. Much is different, and the data is often structured differently. We also see that the product databases are quite rich in terms of describing the products with physical dimensions, attributes, packing structure, allergens, nutrition, etc.

To give you an idea of the complexity you can often find, here is a subset of the JSON hierarchy:

I needed to decide how I should import this data. Should I just import what I have fields for in D365? Should I extend D365 with lots and lots of new fields? Or should I model according to how the external database presents the data? I decided on the latter and imported the data as it was presented. This would give the best result and the least information loss in the process. I decided to go for a model where D365 requests a JSON file from the Swagger API and then places the JSON structure in a C# class structure. The data is then extracted from the C# objects and placed into a new module I named EPD. The next step in the process takes this data and populates standard D365.

The benefit I see is that I'm not overextending the standard Microsoft code. The data is available in D365 and can be used in Power BI etc. I would like to share some of the basic steps when fetching such large data structures from external services.

Fetch the JSON from the service.

To fetch a JSON file, I'm using some .NET references that help handle Active Directory and HTTP connections. The first method shows how to get an access token, which is relevant if the Swagger service requires it. The next method is where the Swagger URL is queried and the JSON file is returned, in addition to some success/error handling.
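The original code screenshots are not reproduced here, so below is a minimal X++ sketch of the second part, querying the URL. It is not the original code: it assumes the access token has already been acquired elsewhere, and the class name and the _url/_accessToken parameters are hypothetical.

    // Minimal sketch: query the Swagger/REST endpoint and return the JSON payload as a string.
    // Token acquisition is assumed to happen elsewhere.
    internal final class EPDServiceClient    // hypothetical helper class
    {
        public static str fetchJson(str _url, str _accessToken)
        {
            System.Net.Http.HttpClient          httpClient = new System.Net.Http.HttpClient();
            System.Net.Http.HttpResponseMessage response;

            // Add the bearer token only if the service requires authentication
            if (_accessToken)
            {
                httpClient.DefaultRequestHeaders.Add('Authorization', 'Bearer ' + _accessToken);
            }

            // GetAsync returns a task; .Result blocks until the response is available
            response = httpClient.GetAsync(_url).Result;

            if (!response.IsSuccessStatusCode)
            {
                throw error(strFmt("Request to %1 failed: %2", _url, response.ReasonPhrase));
            }

            return response.Content.ReadAsStringAsync().Result;
        }
    }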

So at this point we have the JSON file, and we want to do something meaningful with it. Visual Studio has a wonderful feature where you can paste JSON and convert it into classes. To make this work, you will have to create a C# project.

This will generate the C# classes, and in this example the number of sub-objects and properties is in the hundreds, and the properties can be objects and even arrays of objects.

In addition, I need a method that takes the JSON file and deserializes the content into the class structure.

Store the JSON object data into D365 tables.

So at this point we have been able to fetch the data, and in the following code I'm getting the access token, getting the JSON, deserializing it into a C# object, and passing it on for more processing.
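The referenced code is shown only as screenshots in the original post; as a stand-in, here is a hedged orchestration sketch in which every name (the service client, the C# deserializer helper and the processor class) is hypothetical. The deserializer is assumed to be a small static method in the "Paste JSON as Classes" C# project that wraps the generic JsonConvert.DeserializeObject call for the generated root class.

    // Orchestration sketch with hypothetical names throughout
    public static void importFromEpdService()
    {
        // Hypothetical helpers and URL - replace with the real service details
        str accessToken = EPDServiceClient::getAccessToken();
        str json        = EPDServiceClient::fetchJson('https://example.com/epd/api/products', accessToken);

        // EPDContracts.Deserializer is assumed to be a static helper in the referenced C# class
        // library, returning the typed root object generated by "Paste JSON as Classes"
        EPDContracts.Rootobject rootObject = EPDContracts.Deserializer::Deserialize(json);

        // Move the typed object graph into the EPD staging tables for further processing
        EPDImportProcessor::process(rootObject);
    }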

 

Now, let's start inserting this data into new D365 tables. For simplicity, I have created a D365 table for each data object in the JSON file. This allows me to store the entire hierarchical JSON structure in D365 tables for further processing. As soon as I have the data stored in D365, I can create the code that moves it forward into the more functional tables in D365.

A lesson learned was that when creating sub-tables to store hierarchical JSON data, it is sometimes necessary to create relationships between the records in multiple tables. Sometimes uniqueness is also required, and the best way I have found (so far) is to create a GUID field and use this GUID to relate the data in the different tables. This can easily be accomplished with code like the following.
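A minimal sketch of that idea, with hypothetical staging tables (EPDPackageStaging and EPDPackageLineStaging) and a RelationGuid field added to both: the same GUID is stamped on the parent and all of its children so the hierarchy can be re-joined later.

    // Relate parent and child staging records with a shared GUID (hypothetical tables/fields)
    EPDPackageStaging     packageStaging;   // parent staging table
    EPDPackageLineStaging lineStaging;      // child staging table
    guid                  relationGuid = newGuid();

    ttsbegin;

    packageStaging.clear();
    packageStaging.RelationGuid = relationGuid;   // GUID field added to the table
    // ...populate the rest of the parent fields from the JSON object...
    packageStaging.insert();

    lineStaging.clear();
    lineStaging.RelationGuid = relationGuid;      // same GUID relates the child to its parent
    // ...populate the child fields from the nested JSON object...
    lineStaging.insert();

    ttscommit;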

Create the std D365 data using data entities through code.

At this stage I have ALL the data in D365, and I can start processing it. Here is a subset of how I create released products by using standard data entities, where a table containing the JSON data is sent in, and I can create the products and all sub-tables related to the products.
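Since the original code is only shown as a screenshot, here is a hedged sketch of the idea: inserting directly into a standard data entity from X++ runs the same logic as an import through data management. The staging table and its fields are hypothetical, and the entity and field names are written from memory, so verify them against the entity in your own environment (additional mandatory fields such as item model group and dimension groups will typically also be required).

    // Sketch: create a released product by inserting into a standard data entity
    // (verify entity and field names in your environment before use)
    internal final class EPDReleasedProductCreator
    {
        public static void createFromStaging(EPDPackageStaging _staging)   // hypothetical staging table
        {
            EcoResReleasedProductCreationV2Entity productEntity;

            ttsbegin;

            productEntity.clear();
            productEntity.ItemNumber     = _staging.ItemNumber;      // assumed staging fields
            productEntity.ProductNumber  = _staging.ItemNumber;
            productEntity.ProductName    = _staging.ProductName;
            productEntity.SearchName     = _staging.ProductName;
            productEntity.ProductType    = EcoResProductType::Item;
            productEntity.ProductSubtype = EcoResProductSubtype::Product;

            // Inserting into the entity triggers the same creation logic as a data management import
            productEntity.insert();

            ttscommit;
        }
    }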

This approach has resulted in a solution where it is easy for the end user to fetch data from external systems and import it into D365. Here is a form showing parts of the "staging" information before it is moved into the D365 standard tables. (This form is in Norwegian and shows a milk product.)

I would like to thank the community for all the inspiring information found out there, especially Martin Dráb (@goshoom), who has been very active in promoting the "Paste JSON as Classes" feature in Visual Studio.


D365 : Automatic license disablement and login reminder

When assigning licenses to a Dynamics 365 user, it would be beneficial if the system disabled and removed a license from a user if the user has not used the system for X days. At X minus 5 days, the system should send out a message to the user like this:

"This is a login REMINDER for Dynamics 365. Kurt Hatlevik has not logged in for at least 25 days. Your last login was 2/20/2020 12:10:00 AM. Login to Dynamics 365 is required at least once within a 30-day window or your account may be deactivated without notice. Please login within the next few days to ensure access is maintained.

Reactivation will require user administrator approval and will be dependent upon license availability.”

This would make the system more secure, and it will also free up licenses for users that are not using the system.

If you also think this could be beneficial, please vote on this idea here: https://experience.dynamics.com/ideas/idea/?ideaid=c12972cf-6a6c-ea11-b698-0003ff68dcfc#

D365 and the supply structures in grocery retail industry

Today I will write a bit about the supply chain structure we see in the retail grocery industry, the challenges Dynamics 365 may face, and how to address them. The grocery industry has for many years seen that industry collaboration brings benefits and synergies throughout the value chain. We see industry collaborations that offer a range of services to their owners, customers and partners. In the country where I'm from, the main collaboration initiative is TradeSolution, which is owned by the main grocery chains in Norway. TradeSolution operates and maintains central registers, databases, and various IT, reporting and analysis services in Norway, but we see much of the same pattern in other countries and other industries as well.

One essential element is to have a unified way of identifying products and how the products are packed, ordered and shipped. In Norway we have the term EPD (Electronic Product Database), which makes it easy for the entire Norwegian grocery market to purchase and sell products. Much of the information shown in this blog post originates from TradeSolution's public pages here.

What is EPD?

In Dynamics 365, one of the most essential SCM elements is products and released products, together with the associated master data tables. In the grocery industry it is actually the packaging that is the center of it all; the products are actually properties of a packing structure. It would be an oversimplification to say that EPD is products. EPD describes not only the products, but also the packaging of the products. The EPD standard describes the products in up to 4 levels: basis, inner box, outer box and pallet (with SSCC). Each level is identified with a GTIN. See also my old blog post about SSCC.

So far so good. We can model this in Dynamics 365 by having a product defined as a "basis" and using the inner box, outer box and pallet as unit conversions. In D365 we also have the possibility to create barcodes for each unit of measurement (UoM). It would also be easy to assume that the EPD number is an external item description.

Unfortunately, the grocery industry is a bit more complex. Let's take a quick look at the EPD numbers for Coca-Cola. There are actually 7 packing structures/EPD numbers, shown to the right (7 digits). All of them represent different packaging of the same basis unit and can have different properties and attributes.

What we also see is that some boxes are marked with an "F", which means this is a consumer unit. In D365 language, it can be sold to consumers. Some are also marked with a "B", which means this is the unit the EPD number is purchased in. So if we take a detailed look at EPD 4507224, we see that it defines which units you can sell and which units you can purchase. On a single EPD number there is only one level you can choose to purchase in. Here are 2 examples that describe the complexity. The first example is an EPD where the grocer can sell in the basis unit and in the inner box unit (EPD 4507224).

The next example is where the grocer can sell the basis unit and another inner box unit type (EPD 2142941).

As you can see here, the conversion from inner boxes to pallet results in different quantities.

To add further complexity, there is also the definition of a mix. Ordering happens on the inner box level, but the box actually contains separate products that are sold through the stores.

One last element is the concept of unmarked variants, like this package of yogurts.

Summary EPD

  • A product is identified by an EPD number (EPDnr)
  • A unit is identified by a GTIN (Global Trade Item Number)
  • A unit is called «pakning» in EPD
  • A product can have up to 4 levels of units (hierarchy)
  • A product can be a mix of multiple «basis» or «mellom/innerbox» units
  • A “basis” unit can be shared by many products
  • The first level of the units is called «basis» in EPD (often referred to as a customer unit or base unit)
  • The top level of the units is called «topp» in EPD (often referred to as a load carrier unit)
  • The levels between «basis» and «topp» (if any) are called « mellom/innerbox/outerbox » units
  • A basis unit can consist of units without identification called unmarked variants («umerkede varianter»)
  • Within an EPD structure, only one of the packings is used for ordering.
  • Multiple packings can be used for sale.

Some key issues we have faced with Dynamics 365 regarding how the industry models products are the following:

  1. Cost: As seen, a product can be sold in many different UoMs, and the industry can have different purchase prices depending on which EPD number you choose to order, meaning that a 4-pcs pack has a different cost than a 24-pcs pack. As the product can be purchased in multiple UoMs with different prices, it is difficult to model the cost pricing correctly, because inventory transaction costing is based on the lowest level, meaning basis. This costing problem is the reason why I suggest FIFO in retail grocery implementations.
  2. On-hand: Keeping track of how many basis units or other consumer units are on hand is difficult, because you do not always know when a Coca-Cola inner box is broken up. Where should the cost come from when there are multiple purchasing units, as shown in the figure? This makes it difficult in Dynamics 365 to model the revenue per piece sold 100% correctly.
  3. Unit conversion: As shown in the example, the same unit (like pallet) can contain a different number of basis products. This means that it is insufficient to unify the UoM per product; UoM conversion is EPD-dependent. Clear relationships between the UoMs must also be modelled. A product may have multiple definitions of an inner box, outer box and pallet.
  4. External item descriptions: Dynamics 365's external item description cannot be used, because it supports only one external item description per vendor, and UoM is not taken into consideration.
  5. Attributes: In the grocery industry, there may be different attributes per EPD number, and also different attributes per UoM.

How to model this in Dynamics 365?

To solve the distribution requirements we see in the grocery industry, some front-end remodeling of how products are represented is required. The grocery industry is focused on packaging, while Dynamics 365 is product-oriented. The key here is that EPD is object-oriented: a product can be represented in several packaging structures.

The entities we have at our disposal in Dynamics 365 are the following:

  1. Products and released products
  2. Unit of measurement and conversion
  3. Barcodes
  4. External item descriptions
  5. BOM’s

But Dynamics 365 is what it is, and any change to the architecture of how products and transactions are handled is not on the near-term roadmap. We must try to model this structure in a way such that the EPD standard and the Dynamics 365 standard work jointly together.

First, let's try to model the EPD (only a subset) from a grocery supply perspective (not D365!). An EPD can consist of multiple packaging structures, and a package may contain packages. At the bottom of the packing structure there is a reference to a basic package that describes the product.

 

 

When importing EPD based products I see the following as a solution:

  1. EPD will be a separate entity/table, modelled as the grocery industry has it (new tables in D365 that feed the standard D365 tables).
  2. D365 products will be defined as the "Basic Package".
  3. The EPD package structure populates the barcode table and the product-specific unit of measurement table. Because there are several packagings, the traditional naming of the units of measurement cannot be used; the unit of measurement conversion actually depends on the EPD number. In essence, this means having units of measurement named:

    PCS – Basic unit for the lowest basis product
    IB-4507224 – Unit for the inner box
    OB-4507224 – Unit for the outer box
    LC-4507224 – Unit for the load carrier

    With this we can create the unit of measurement conversion between the different types.

Let’s say we have the following simple product:

This would be modelled in D365 with a released product:

I would here have to define 4 units of measurement:

I would then have to define the following unit conversions to describe the relationships between the different EPD packing structures.

The more EPD packing structures present, the more unit conversions need to be defined. (In the Coca-Cola example there will be 6 more conversions.)
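To show what this means in practice, here is a hedged X++ sketch that creates one product-specific conversion between an EPD inner-box unit and the basic unit (for example 1 IB-4507224 = 4 PCS). The item number and quantities are made up, and the table and field names are written from memory, so verify them in your environment.

    // Sketch: create a product-specific unit of measurement conversion for an EPD packing level
    internal final class EPDUnitConversionCreator
    {
        public static void createConversion(ItemId _itemId, UnitOfMeasureSymbol _fromUnit,
                                            UnitOfMeasureSymbol _toUnit, real _factor)
        {
            UnitOfMeasureConversion conversion;
            InventTable             inventTable = InventTable::find(_itemId);

            conversion.initValue();
            conversion.FromUnitOfMeasure = UnitOfMeasure::findBySymbol(_fromUnit).RecId;
            conversion.ToUnitOfMeasure   = UnitOfMeasure::findBySymbol(_toUnit).RecId;
            conversion.Product           = inventTable.Product;   // makes the conversion product-specific
            conversion.Factor            = _factor;               // e.g. 4: 1 inner box = 4 pcs
            conversion.insert();
        }
    }

    // Usage with made-up data: EPDUnitConversionCreator::createConversion('100045', 'IB-4507224', 'PCS', 4);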

We also need to store GTIN per packing unit per EPD:

We also have the Physical dimensions menu item, which now lets us describe the physical dimensions of the product per EPD unit.

 

In Dynamics 365 we can only select one suggested purchasing unit. So if you have multiple EPDs associated with a product, you will have to choose one, and this is the unit that is suggested.

The purchase order would then look like this, where the unit describes the EPD number.

Keeping track of all unit conversions, GTINs/barcodes etc. manually would be an impossible job. Since EPD is an industry standard, all of this data is imported through web services.

TradeSolution has web services that offer the possibility to send EPD structures to D365. This way, all packing structures of products can be automatically imported, distributed into standard D365 and adjusted when needed.

The suggestion is not 100% complete, but it would make sure that grocery retailers can procure and sell the products while also having the concept of packing structures in place.

Let’s conquer the grocery industry also


D365 – What has changed (pmfTablehasChanged)

This short post is for you hardcore X++ developers who create magic every day. D365 has the following method, which allows you to validate whether any fields on a record have been changed. If it returns true, something has changed; if it returns false, nothing has changed. There are scenarios where you would like to know if there have been any changes to the record before you update/write to the database, to save some round trips.
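A minimal usage sketch (the scenario and the order number are made up, and it assumes the helper takes the record buffer as its parameter): pmfTablehasChanged() compares the buffer with its original values and returns true only if a field actually differs, so the update can be skipped when nothing changed.

    // Skip the database update when nothing actually changed on the record
    SalesLine salesLine;

    ttsbegin;

    select forupdate firstonly salesLine
        where salesLine.SalesId == '000745';   // hypothetical order id

    // ...business logic that may or may not modify fields on salesLine...

    if (pmfTablehasChanged(salesLine))
    {
        salesLine.update();   // only hit the database when something changed
    }

    ttscommit;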

This is nice, and 100% standard.

Happy coding friends.

Batch Jobs; Take control of the executions

Dynamics 365 can be automated quite a lot with the use of batch jobs. With batch jobs, your Dynamics 365 solution becomes "alive", and we can set up the system to automate many manual processes. Let's say we have the following "vanilla process" and want to automate as many steps as possible.



This post covers the batch jobs that need to be set up for this process to be as automated as possible. I wanted to put a structured system on all the batch jobs that are typically used in a production system. But batch jobs also generate a lot of data that you don't normally need. It is therefore common to create both functional batch jobs that process and execute functionality, and cleanup jobs that remove irrelevant data.

Batch job Naming conventions

To make the batch jobs simpler to understand, a simple naming structure has been created. The first character is just "A", to make sure the batch jobs sort nicely by name. Next is a 3-digit number, and last there is a description that explains the batch job.

ID – Description

A001-A099 – System administration batch jobs
A100-A199 – Data management batch jobs
A200-A299 – General ledger batch jobs
A300-A399 – Procurement and sourcing batch jobs
A400-A499 – Sales and marketing batch jobs
A500-A599 – Retail batch jobs
A600-A699 – Inventory management batch jobs
A700-A799 – Warehouse management batch jobs

Each of these ranges is then set up as a batch group, so you can better control which AOS servers execute which type of batch jobs:


In this blog post more than 87 batch jobs are specified, which keep the Dynamics 365 system updated and as automated as possible.

Job description
A001 Notification clean-up
A002 Batch job history clean-up
A003 Batch job history clean-up (custom).
A004 Daily Diagnostics rule validation
A005 Weekly Diagnostics rule validation
A006 Monthly Diagnostics rule validation
A007 Named user license count reports processing
A008 Databaselog cleanup
A009 Delete the inactivated addresses
A010 Scan for orphaned document references.
A011 Report data clean up
A012 Cryptography crawler system job that needs to regularly run at off hours.
A014 Updates system notification states.
A015 Deletes non-active and orphaned system notifications.
A016 Database compression system job that needs to regularly run at off hours.
A017 Database index rebuild system job that needs to regularly run at off hours
A018 Deletes expired email history.
A019 Process automation polling system job
A020 Scan for document files that have been scheduled for physical deletion.
A021 System job to clean up expired batch heartbeat records.
A022 System job to seed batch group associations to batch jobs.
A023 System job to clean up unrecovered user session states.
A024 Change based alerts
A025 Due date alerts
A026 Email distributor batch
A027 Email attachment distributor
A103 Entity Store Deploy measurement
A103 Refresh data entity
A200 Clean up ledger journals
A201 Import currency exchange rates
A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.
A204 Update purchase and sales budget
A206 Source document line processing
A207 Source document line processing queue cleanup
A208 Ledger journal monitor
A300 Purchase update history cleanup
A301 Delete request for quotation
A302 Draft consignment replenishment order journal cleanup
A303 Run Forecast planning
A304 Run Master planning
A305 Post product receipt
A403 Sales update history cleanup
A405 Order packing slip
A406 Order invoice
A407 Calculate sales totals
A500 All retail distribution jobs (9999)
A501 Upload all channel transactions (P-0001)
A502 Process Assortment
A503 Update listing status
A504 Product availability
A505 Generate related products based on customer transactions
A506 Process delivery modes
A507 Synchronize orders job
A508 Update search Product data
A509 Update search Customer data
A510 DOM batch job
A511 DOM fulfillment data deletion job
A512 Default channel database batch job
A513 Recommendation batch job
A514 Retail scheduler history data removal batch job
A515 Create customers from async mode
A516 Retail transaction consistency checker orchestrator
A517 Retail transactional statement calculate batch scheduler
A518 Retail transactional statement post batch scheduler
A519 Retail financial statement calculate batch scheduler
A520 Retail financial statement post batch scheduler
A521 Process loyalty schemes
A522 Post earned points in batches
A523 Process loyalty lines for other activities
A524 Retail time zone information job
A600 Calculation of location load
A601 Inventory journals clean-up
A602 Inventory settlements clean up
A605 On-hand entries cleanup
A606 Warehouse management on-hand entries cleanup
A607 On-hand entries aggregation by financial dimensions
A608 Cost calculation details
A609 CDS – Post integration inventory journals
A700 Work creation history purge
A701 Containerization history purge
A702 Wave batch cleanup
A703 Cycle count plan cleanup
A705 Work user session log cleanup
A706 Wave processing history log cleanup
A707 WMS Replenishment
A708 Automatic release of sales orders

I will not go into detail on all the jobs, but here I at least refer to where you can find the menu item or which class is used in the batch job tasks. Also take a look at the blog post by the D365 solution architecture team, which covers a subset of the batch jobs presented in this blog post.

System administration batch jobs

These are general system batch jobs that can perform cleanups and other general executions.

ID

Name, path and recurrence

Description and recurrence

A001 A001 Notification clean-up

System administration > Periodic tasks > Notification clean up

Daily

This is used to periodically delete records from tables EventInbox and EventInboxData. Recommendation would also be if you don’t use Alert functionality to disable Alert from Batch job.

A002 A002 Batch job history clean-up

System administration > Periodic tasks > Batch job history clean-up

Daily

The regular version of batch job history clean-up allows you to quickly clean all history entries older than a specified timeframe (in days). Any entry created prior to that timeframe will be deleted from the BatchJobHistory table, as well as from linked tables with related records (BatchHistory and BatchConstraintsHistory). This form has improved performance because it doesn't have to execute any filtering.

A003 A003 Batch job history clean-up (custom).
System administration > Periodic tasks > Batch job history clean-up (custom)

Manually

The custom batch job clean-up form should be used only when specific entries need to be deleted. This form allows you to clean up selected types of batch job history records, based on criteria such as status, job description, company, or user. Other criteria can be added using the Filter button.

A004 A004 Daily Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Daily

Incorrect configuration and setup of a module can adversely affect the availability of features, system performance, and the smooth operation of business processes. The quality of business data (for example, the correctness, completeness, and cleanliness of the data) also affects system performance, and an organization’s decision-making capabilities, productivity, and so on. The Optimization advisor workspace is a tool that lets you identify issues in module configuration and business data. Optimization advisor suggests best practices for module configuration and identifies business data that is obsolete or incorrect.
A005 A005 Weekly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Weekly

Performs a weekly validation and diagnostics.
A006 A006 Monthly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Monthly

Performs a monthly validation and diagnostics based on the rules.
A007 A007 Named user license count reports processing

Class : SysUserLicenseMiner

Daily

Batch job that counts number of users that have been using the system. The data is used in the Named user license count report. D365 creates this execution automatically, but you have to rename it to fit this structure.
A008 A008 Databaselog cleanup

System administration > Inquiries > Database > Database Log

Weekly

This job cleans up the database log and makes sure that only (let's say) 100 days of history remain. In the query criteria I set created date/time less than "d-100" to ensure that I keep 100 days of database log. This is general housekeeping and dusting in the system, keeping it nice and tidy.
A009 A009 Delete the inactivated addresses

Organizational administration > Periodic >Delete inactivated addresses

Weekly

Deletes addresses that have been set to inactive.
A010 A010 Scan for orphaned document references.

Class : DocuRefScanOrphansTask

Daily

Batch job that is setup automatically by the system, and scans for document references where the source record is deleted.
A011 A011 Report data clean up

Class: SrsReportRunRdpPreProcessController

Daily

Cleans up any data generated for SSRS reports.
A012 A012 Cryptography crawler system job that needs to regularly run at off hours.

Class: SysCryptographyCrawlerTask

Every 3 days

Auto-created at D365 setup. Not sure what this is yet.
A013 A013 Data cache refresh batch

System administration > Setup >

Data cache >Data cache parameters

Every 10 minutes

The data cache framework is used to cache data sets and tiles. Enabling of the data cache framework will redirect certain queries against a cache table instead of executing them against the underlying source tables.
A014 A014 Updates system notification states.

Class : SystemNotificationUpdateBatch

Every minute

Updates notifications,
A015 A015 Deletes non-active and orphaned system notifications.

Class : SystemNotificationScanDeletionsBatch

Daily

Deletes non-active and orphaned system notifications
A016 A016 Database compression system job that needs to regularly run at off hours.

Class: SysDatabaseCompressionTask

Daily

Compresses the database
A017 A017 Database index rebuild system job that needs to regularly run at off hours

Class: SysDatabaseIndexRebuildTask

Daily

Rebuilds indexes to ensure good index performance
A018 A018 Deletes expired email history

Class: SysEmailHistoryCleanupBatch

Daily

Deletes expired email history
A019 A019 Process automation polling system job

Class: ProcessAutomationPollingEngine

Every minute

Using business events, the polling use case can be re-designed to be asynchronous if it is triggered by the business event. Data will be processed only when it is available. The business logic that makes the data available triggers the business event, which can then be used to start the data processing job/logic. This can save thousands of batch executions from running empty cycles and wasting system resources.
A020 A020 Scan for document files that have been scheduled for physical deletion.

Class: DocuDeletedFileScanTask

Hourly

Scan for document files that have been scheduled for physical deletion
A021 A021 System job to clean up expired batch heartbeat records.

Class : SysCleanupBatchHeartbeatTable

Daily

Cleans up the new internal monitoring BatchHeartbeatTable table (Only after PU32), and used for priority-based batch scheduling.
A022 A022 System job to seed batch group associations to batch jobs.

Class:
SysMigrateBatchGroupsForPriorityBasedScheduling

Daily

See priority-based batch scheduling.
A023 A023 System job to clean up unrecovered user session states.

Class:
SysUnrecoveredUserSessionStateCleanup

Daily

Cleans up unrecovered user sessions.
A024 A024 Change based alerts

System administration > Periodic tasks > Alerts > Change based alerts

Hourly (or faster)

Events that are triggered by change-based events. These events are also referred to as create/delete and update events.

See also Microsoft docs.

A025 A025 Due date alerts

System administration > Periodic tasks > Alerts > Due date alerts

Hourly (or faster)

Events that are triggered by due dates.

See also Microsoft docs.

A026 A026 Email distributor batch

System administration > Periodic tasks > Email processing > Email distributor batch

Send emails. See also Microsoft docs.
A027 A027 Email attachment distributor. Sends emails with attachments. Used for workflow.

Data management batch jobs

Data management executions can generate a lot of data, so to maintain performance and avoid data growth, it is relevant to clean up staging tables and job executions. Also document any of your recurring executions to make it easy to maintain an overview of your recurring data imports and exports.

ID

Name, path and recurrence

Description

A100

[Cannot be executed in batch]

Data management workspace > “Staging cleanup” tile

Manually

The Data management framework makes use of staging tables when running data migration. Once data migration is completed, this data can be deleted using the "Staging cleanup" tile.

A101

A101 Job history cleanup

Data management workspace > Job history cleanup

Daily

The clean-up job will execute for the specified amount of time. If more history remains to be cleaned up after the specified amount of time has elapsed, the remaining history will be cleaned up in the next recurrence of the batch job, or it can be manually scheduled again.

A102

A102 BYOD Data management export

Data management workspace >export in batch

Hourly

If you have a data management export to BYOD, this can be executed in batch. There are other options that can also be evaluated for this purpose. See A102 BYOD Data management export.

A103

A103 Refresh data entity

System administration > Setup > Entity Store

Monthly

To refresh the entity store (the built-in embedded Power BI). The refresh updates the aggregated measurements and is only relevant if there are updates or changes that affect these.

General ledger batch jobs

ID

Name, path and recurrence

Description

A200

A200 Clean up ledger journals

Periodic tasks > Clean up ledger journals

Weekly

It deletes general ledger, accounts receivable, and accounts payable journals that have been posted. When you delete a posted ledger journal, all information that’s related to the original transaction is removed. You should delete this information only if you’re sure that you won’t have to reverse the ledger journal transactions.

A201

A201 Import currency exchange rates

Currencies > Import currency exchange rates

Daily

Automatically imports exchange rates from the bank.

A202

A202 Purchase budget to ledger

Inventory management > Periodic tasks > Forecast updates > Purchase budget to ledger

Monthly

Posts the purchase budget to ledger

A203

A203 Sales budget to ledger

Inventory management > Periodic tasks > Forecast updates > Sales budget to ledger

Monthly

Posts sales budget to ledger

A204

A204 Update purchase and sales budget

Inventory management > Periodic tasks > Forecast updates > Update purchase and sales budget

Monthly

Updates the purchase and sales budget.

A205

A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.

General Ledger > Periodic tasks > Batch transfer for subledger journals

Daily

Batch transfer for subledger journals

A206

A206 Source document line processing

Class: SourceDocumentLineProcessingController

Every 10 minutes

Used for accounting distribution. See Microsoft docs.

A207

A207 Source document line processing queue cleanup

Class: SourceDocumentLineProcessingQueueCleanupController

Weekly

Used for cleaning up accounting distribution. See Microsoft docs.

A208

A208 Ledger journal monitor

Class: LedgerJournalTableMonitorController

Every 6 hours

Monitors if ledger journals should be blocked or opened.

Procurement and sourcing batch jobs

ID

Name, path and recurrence

Description

A300

A300 Purchase update history cleanup

Periodic tasks > Clean up > Purchase update history cleanup

Weekly

All updates of confirmations, picking lists, product receipts, and invoices generate update history transactions. This job is used to delete the old update history transactions.

A301

A301 Delete request for quotation

Periodic tasks > Clean up > Delete requests for quotations

Manually

It is used to delete requests for quotation (RFQs) and RFQ replies. The corresponding RFQ journals are not deleted, but remain in the system.

A302

A302 Draft consignment replenishment order journal cleanup

Periodic tasks > Clean up > Draft consignment replenishment order journal cleanup

Weekly

It is used to clean up draft consignment replenishment order journals.

A303

A303 Run Forecast planning

Master planning > Forecasting > Forecast planning

Weekly

Demand forecasting is used to predict independent demand from sales orders and dependent demand at any decoupling point for customer orders. See also Microsoft docs, which describes using additional Azure services to perform the calculation.

A304

A304 Run Master planning

Master planning > Master planning > Run > Master planning

Daily

Master planning is used to generate planned (purchase) orders, based on the coverage settings. We expect this service to be enhanced with a more real-time oriented planning engine. Also check out the Microsoft docs on this (large) subject.

A305

A305 Post product receipt

Procurement and Sourcing > Purchase orders > Receiving products > Post product receipt

Automatically posts the product receipt when all purchase order lines have been registered.

Sales and marketing batch jobs

ID

Name, path and recurrence

Description

A400

A400 Delete sales orders

Periodic tasks > Clean up > Delete sales orders

Manually

It deletes selected sales orders.

A401

A401 Delete quotations

Periodic tasks > Clean up > Delete quotations

Manually

It deletes selected quotations.

A402

A402 Delete return orders

Periodic tasks > Clean up > Delete return orders

Manually

It deletes selected return orders.

A403

A403 Sales update history cleanup

Periodic tasks > Clean up > Sales update history cleanup

Weekly

It deletes old update history transactions. All updates of confirmations, picking lists, packing slips, and invoices generate update history transactions. These transactions can be viewed in the History on update form.

A404

A404 Order events cleanup

Periodic tasks > Clean up > Order events cleanup

Weekly

Cleanup job for order events. The next step is to remove the unneeded order event check-boxes from the Order event setup form.

A405

A405 Order packing slip

Sales order > Order shipping > Post packing slip

Hourly

Set up automatic packing slip posting when the sales order is completely picked (if this is the process). This means that as soon as the WMS has picked the order, it gets packing slip updated.

A406

A406 Order invoice

Accounts receivable > Invoices > Batch invoicing > Invoice

Hourly

Set up automatic invoice posting when the sales order is completely packing slip updated (if this is the process).

A407

A407 Calculate sales totals

Periodic tasks > Calculate sales totals

Recalculates the totals for the sales order. This is typically used when the sales order is part of a "Prospect to cash" scenario. See docs.

Retail batch jobs

ID

Name, path and recurrence

Description

A500

A500 All retail distribution jobs (9999)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

This batch job sends all distribution jobs to the retail channel database. This includes data like products, prices, customers, stores, registers etc. The distribution job is a "delta" distribution, meaning that only new and changed records are sent. There is a lot more to discuss on how to optimize the 9999 distribution job, and for really large retail installations some deep thinking is required. For smaller installations it should be OK to just use the setup that is automatically generated when initializing D365 Retail/Commerce.
A501

A501 upload all channel transactions (P-0001)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

The P-0001 job sends the retail transactions back from the POS to D365 HQ, where the retail transactions can be posted and financially updated.
A502

A502 Process Assortment

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process Assortment

Hourly

This job processes the assortments based on the assortment categories set on an item and, based on the assortment setup, puts the items in the relevant stores' assortments. When defining an assortment, you have in D365 the possibility to connect organization hierarchies to retail category hierarchies. The process assortment job breaks this down into detail, so that D365 has a detailed list of each product that is present in each store. The assortment is set up under Retail and Commerce > Catalogs and assortments > Assortments, and more details are available on Microsoft docs.
A503

A503 Update listing status

Retail and Commerce > Retail and Commerce > Products and Inventory > Update listings

Daily

The listing status is related to publishing a retail catalog to an online store. The Microsoft documentation is not the best in this area, and the closest explanation I have is that it is related to the listing status on the catalog.
A504

A504 Product availability

Retail and Commerce > Retail and Commerce > Products and Inventory > Product availability

Daily

The batch job for product availability calculates if a product is available in the online store. Check out this blogpost for further details. SiteCore eCommerce integrations can benefit from this, and in essence it populates the data needed for distribution job 1130, which maintains the related tables in the channel database.
A505

A505 Generate related products based on customer transactions

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Generate related products

Daily

This job will automatically populate related products based on sales transaction purchase history. The two relationships created are the 'customers who bought this item also bought' and the 'frequently bought together' relation types. This data can then be used further in eCommerce scenarios. For deeper details, take a look at the class 'RetailRelatedProductsJob'.
A506

A506 Process delivery modes

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process delivery modes

Daily

This job sets up delivery modes on a new store when it is added to the organization hierarchy 'retail store by department'. On the modes of delivery you can assign an organizational hierarchy, and this batch job assigns the specific modes of delivery to each store. The modes of delivery are used in omnichannel scenarios where the customer can have their products sent home etc.
A507

A507 Synchronize orders job

Retail and Commerce > Retail and Commerce IT > Synchronize orders

Hourly

If you have set up your channels to create sales orders asynchronously, this job will create the sales orders and post payments. Also take a look at the following Microsoft docs on how sales orders and payments are synchronized from an online store.
A508

A508 Update search Product data

Sales and marketing > Setup > Search> Search criteria

Daily

Create an indexed search of products, that makes it faster and easier to search for products in the call center.
A509

A509 Update search Customer data

Sales and marketing > Setup > Search> Search criteria

Daily

Create an indexed search of customers, that makes it faster and easier to search for customers in the call center.
A510

A510 DOM batch job

Workspace > Distributed Order Management > Dom processor job setup

Hourly

Run distributed order management on retail sales orders to determine what warehouse should deliver the sales order
A511

A511 DOM fulfillment data deletion job

Workspace > Distributed Order Management > DOM fulfillment data deletion job setup

Daily

Cleans up DOM data that is no longer part of a valid calculation.
A512

A512 Default channel database batch job

Class : RetailCdxChannelDbDirectAccess

Every 3 minutes

This job's main duty is to check all download sessions and upload sessions with status "Available", and then apply the data to the respective target databases (AX/HQ or channel DB). See also this blog.
A513

A513 Recommendation batch job

Class FormRunConfigurationRecommendationBatch

Weekly

See Microsoft docs.
A514

A514 Retail scheduler history data removal batch job

Retail and Commerce > Headquarters setup > Parameters > Retail scheduler parameters

Class: RetailCdxPurgeHistory

Daily

Deletes CDX history. Typically only 30 days of CDX history is kept.
A515

A515 Create customers from async mode

Retail and Commerce > Retail and Commerce IT > Customer > Create customers from async mode

Hourly

If customers should be created async (parameter), then this job will create the customer.
A516

A516 Retail transaction consistency checker orchestrator

Retail and Commerce > Retail and Commerce IT > POS posting > Validate store transactions

Hourly

Performs validation on the unposted POS transactions. See Microsoft docs.
A517

A517 Retail transactional statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional calculation. Creates the transactional statement. See the following blog post.
A518

A518 Retail transactional statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post transactional statement in batch

Hourly (or faster)

Retail statement trickle feed transactional posting. Creates and posts sales orders. See the following blog post.
A519

A519 Retail financial statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate financial statement in batch

Daily

Retail statement trickle feed financial statement calculation. Creates the financial statement. See the following blog post.
A520

A520 Retail financial statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post financial statement in batch

Daily

Retail statement trickle feed financial posting. Posts the shift declaration. See the following blog post.
A521

A521 Process loyalty schemes

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty schemes

Processes loyalty schemes. See Microsoft docs.
A522

A522 Post earned points in batches

Retail and Commerce > Retail and Commerce IT > Loyalty > Post earned points in batches

Loyalty points should be posted in batch. See Microsoft docs.
A523

A523 Process loyalty lines for other activities

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty lines for other activities

Other Loyalty points in batch. See Microsoft docs.
A524

A524 Retail time zone information job

Monthly

Generates time zone information up until 2054. Ensures that the time zone used in the store does not cause inconsistent dates.

Inventory management batch jobs

ID

Name, path and recurrence

Description

A600

A600 Calculation of location load

Inventory management > Periodic tasks > Clean up > Calculation of location load

Daily

The WMSLocationLoad table is used to track the weight and volume of items and pallets. The summation of load adjustments job can be run to reduce the number of records in the WMSLocationLoad table and improve performance.

A601

A601 Inventory journals clean-up

Inventory management > Periodic tasks > Clean up > Inventory journals cleanup

Weekly

It is used to delete posted inventory journals.

A602

A602 Inventory settlements clean up

Inventory management > Periodic tasks > Clean up > Inventory settlements cleanup

Manually/Yearly

 

It is used to group closed inventory transactions or delete canceled inventory settlements. Cleaning up closed or deleted inventory settlements can help free system resources.

Do not group or delete inventory settlements too close to the current date or fiscal year, because part of the transaction information for the settlements is lost.

Closed inventory transactions cannot be changed after they have been grouped, because the transaction information for the settlements is lost.

Canceled inventory settlements cannot be reconciled with finance transactions if canceled inventory settlements are deleted.

A603

A603 Inventory dimensions cleanup

Inventory management > Periodic tasks > Clean up > Inventory dimensions cleanup

Manually/Yearly

This is used to maintain the InventDim table. To maintain the table, delete unused inventory dimension combination records that are not referenced by any transaction or master data. The records are deleted regardless of whether the transaction is open or closed.

An inventory dimension combination record that is still referenced cannot be deleted, because when an InventDim record is deleted, related transactions cannot be reopened.

A604

A604 Dimension inconsistency cleanup

Inventory management > Periodic tasks > Clean up > Dimension inconsistency cleanup

Manually/Yearly

This is used to resolve dimension inconsistencies on inventory transactions that have been financially updated and closed. Inconsistencies might be introduced when the multisite functionality was activated during or before the upgrade process. Use this batch job only to clean up the transactions that were closed before the multisite functionality was activated. Do not use this batch job periodically.

A605

A605 On-hand entries cleanup

Inventory management > Periodic tasks > Clean up > On-hand entries cleanup

Monthly

This is used to delete closed and unused entries for on-hand inventory that is assigned to one or more tracking dimensions. Closed transactions contain the value of zero for all quantities and cost values, and are marked as closed. Deleting these transactions can improve the performance of queries for on-hand inventory. Transactions will not be deleted for on-hand inventory that is not assigned to tracking dimensions.

A606

A606 Warehouse management on-hand entries cleanup

Inventory management > Periodic tasks > Clean up > Warehouse management on-hand entries cleanup

Weekly

Deletes records in the InventSum and WHSInventReserve tables. These tables are used to store on-hand information for items enabled for warehouse management processing (WHS items). Cleaning up these records can lead to significant improvements of the on-hand calculations.

A607

A607 On-hand entries aggregation by financial dimensions

Inventory management > Periodic tasks > Clean up > On-hand entries aggregation by financial dimensions

Weekly

Tool to aggregate InventSum rows with zero quantities.

This basically extends the previously mentioned cleanup tool by also cleaning up records which have the field Closed set to True.

The reason why this is needed is basically because in certain scenarios, you might have no more quantities in InventSum for a certain combination of inventory dimensions, but there is still a value. In some cases, these values will disappear, but current design does allow values to remain from time to time.

If you for example use Batch numbers, each batch number (and the combined site, warehouse, etc.) creates a new record in InventSum. When the batch number is sold, you will see quantity fields are set to 0. In most cases, the Financial/Physical value field is also set to 0, but in Standard cost revaluation or other scenarios, the value field may show some amount still. This is valid, and is the way Dynamics 365 for Finance and Operations handles the costs on Financial inventory level, e.g. site level.

Inventory value is determined in Dynamics 365 for Finance and Operations by records in InventSum, and in some cases inventory transactions (InventTrans) when reporting inventory values in the past. In the above scenario, this means that when you run inventory value reports, Dynamics 365 for Finance and Operations looks (initially) at InventSum, aggregates all records to site level, and reports the value for the item per site. The data from the individual records on batch number level are never used. The tool therefore goes through all InventSum records and finds the ones where there is no more quantity (the "No open quantities" field is True). There is no reason to keep these records, so Dynamics 365 for Finance and Operations finds the record in InventSum for the same item which has the same site, copies the values from the batch number level to the site level, and deletes the record. When you now run inventory value reports, Dynamics 365 for Finance and Operations still finds the same correct values. This reduces the number of InventSum records significantly in some cases, and can have a positive impact on the performance of any function that queries this table. (If you want to check whether such records exist in your data, see the small X++ sketch after this table.)

A608

A608 Cost calculation details

Inventory management > Periodic tasks > Clean up > Cost calculation details

Monthly

Used to clean up cost calculation details.

A609

A609 CDS – Post integration inventory journals

Inventory management > Periodic tasks > CDS integration > Post integration inventory journals

Fetches journals from CDS (Common Data Service) and posts them. This applies only if CDS is in use.
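
Related to A607 above: if you want to check whether you have InventSum records that are closed for quantity but still carry a value before scheduling that job, a minimal X++ sketch like the following can list them. The class name is mine, and ClosedQty is the standard field behind "No open quantities"; run it on a copy first, as it iterates the whole table.

internal final class KHListClosedInventSumWithValue
{
    public static void main(Args _args)
    {
        InventSum inventSum;

        // Records flagged as having no open quantities
        while select inventSum
            where inventSum.ClosedQty == NoYes::Yes
        {
            // ...but still carrying a physical or posted value
            if (inventSum.PhysicalValue + inventSum.PostedValue != 0)
            {
                info(strFmt("Item %1 (%2) has value %3 but no open quantity",
                    inventSum.ItemId,
                    inventSum.InventDimId,
                    inventSum.PhysicalValue + inventSum.PostedValue));
            }
        }
    }
}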

Warehouse management batch jobs

ID

Name, path and recurrence

Description

A700

A700 Work creation history purge

Warehouse management > Periodic tasks > Clean up > Work creation history purge

Weekly

This is used to delete work creation history records from the WHSWorkCreateHistory table, based on the number of days of history to keep, as provided in the dialog.

A701

A701 Containerization history purge

Warehouse management > Periodic tasks > Clean up > Containerization history purge

Weekly

This is used to delete containerization history from the WHSContainerizationHistory table, based on the number of days of history to keep, as provided in the dialog.

 

A702

A702 Wave batch cleanup

Warehouse management > Periodic tasks > Clean up > Wave batch cleanup

Weekly

This is used to clean up batch job history records related to Wave processing batch group.

A703

A703 Cycle count plan cleanup

Warehouse management > Periodic tasks > Clean up > Cycle count plan cleanup

Weekly

This is used to clean up batch job history records related to Cycle count plan configurations.

A704

A704 Mobile device activity log cleanup

Warehouse management > Periodic tasks > Clean up > Mobile device activity log cleanup

Weekly

This is used to delete mobile device activity log records from the WHSMobileDeviceActivityLog table, based on the number of days of history to keep, as provided in the dialog.

A705

A705 Work user session log cleanup

Warehouse management > Periodic tasks > Clean up > Work user session log cleanup

Weekly

This is used to delete work user session records from the WHSWorkUserSessionLog table, based on the number of hours to keep, as provided in the dialog.

A706

A706 Wave processing history log cleanup

Warehouse management > Periodic tasks > Clean up > Wave processing history log cleanup

Weekly

This is used to clean up history records related to Wave processing batch group.

A707

A707 WMS Replenishment

Warehouse management > Replenishment > Replenishments

Calculates location replenishment for the warehouse locations.

A708

A708 Automatic release of sales orders

Warehouse management > Automatic release of sales orders

Releases sales orders to the warehouse so that the picking can start.

Monitoring Distribution jobs

The Retail IT workspace is specifically created to monitor all distribution jobs sending data to RCSU and POS. If there are failed sessions, they will be seen here. The current download (to RCSU) and upload (from RCSU) sessions are also shown here.


Monitoring Batch jobs

The best place to monitor all current batch jobs is through the system administration workspace. Here all failed, running, waiting and withheld batch jobs are shown. This workspace also has additional system administration features.
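
If you also want to check for failures from code, for example to feed your own alerting, a minimal X++ sketch like this lists batch jobs that ended in error. The class name is mine; BatchJob and BatchStatus are the standard batch framework table and enum.

internal final class KHListFailedBatchJobs
{
    public static void main(Args _args)
    {
        BatchJob batchJob;

        // All batch jobs currently in Error status, newest first
        while select batchJob
            order by batchJob.EndDateTime desc
            where batchJob.Status == BatchStatus::Error
        {
            info(strFmt("'%1' failed, ended %2", batchJob.Caption, batchJob.EndDateTime));
        }
    }
}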



D365 – To exist or not, that is the question!(part 2)

Some years ago I created a free community solution for "Not-Exists Join". Not-exists join means that we can filter and search on data that does not have any related records. This answers questions like:

– Show me all customers that have no sales orders the last X days

– Show me all items with no inventory transaction. Show me items with no movement last 30 days.

– Show me all items that have no price.

Countless community friends have used this for AX 2012. But since Dynamics 365 was released, this solution could not be applied. To do it properly, I have decided to push a request through CDE (Community Driven Engineering), hopefully making it available to all D365 customers as part of the standard solution. All code is ready and checked in, and I'm just waiting for the Microsoft review.

The way the CDE works is that partners and customers that have code or bug fixes can work together with Microsoft on implementing changes. Microsoft has the final decision, and they will also make it part of their IP. But for all you community friends, here is a sneak peek of what I'm working on together with Microsoft.

The advanced filter and query in Dynamics 365 is a very powerful tool. Here you can search and filter on most fields and add join relations to the query.

But there is one area that the advanced query screen is not handling. That is “not-exist-join”. Let’s say I want a list of all the customers that don’t have sales orders. The standard D365 will not help here. The purpose of this document is to show how to implement “not-exists-join” into standard.

Functional Solution

In the joins form, a new section of relations has been added that represents the tables that can be “not-exist-join” added:

In this sample the customers with no sales orders will be in the query result/form. But the feature is generic, and all 1:n relations can also be selected as a "Not exists" relation.
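
For the developers: this is the behaviour you can already express today in X++ through the join mode on a query data source. A minimal sketch of the "customers with no sales orders" example, built on the standard CustTable/SalesTable relation (the class name is mine):

internal final class KHCustomersWithoutSalesOrders
{
    public static void main(Args _args)
    {
        Query                query   = new Query();
        QueryBuildDataSource custDs  = query.addDataSource(tableNum(CustTable));
        QueryBuildDataSource salesDs = custDs.addDataSource(tableNum(SalesTable));
        QueryRun             queryRun;
        CustTable            custTable;

        // NoExistsJoin keeps only the CustTable records with no related SalesTable records
        salesDs.joinMode(JoinMode::NoExistsJoin);
        salesDs.relations(false);
        salesDs.addLink(fieldNum(CustTable, AccountNum), fieldNum(SalesTable, CustAccount));

        queryRun = new QueryRun(query);
        while (queryRun.next())
        {
            custTable = queryRun.get(tableNum(CustTable));
            info(strFmt("Customer %1 has no sales orders", custTable.AccountNum));
        }
    }
}

The new feature essentially makes this join mode available from the advanced filter/query form, so end users can do the same without code.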

When will you have this in standard? Maybe 10.0.10? It depends on Microsoft and the final approval of the code and feature. But hopefully it is not in the far future. So "cheer and share", and maybe we as the community can accelerate this much-requested feature.

D365 community ROCK’s and Happy DAX’ing!!

Microsoft Bookings and Microsoft Graph

One common piece of feedback we get when implementing Dynamics 365 is the need to handle appointments and bookings. There are many very good 3rd party solutions, but did you know that Microsoft has an easy-to-use booking system that works online and is integrated with Outlook? It's called Microsoft Bookings, and it is worth taking a small look at, especially if you need to book your customers for appointments and simple services. Microsoft Bookings provides online and mobile apps that make appointment scheduling simple and efficient for small businesses and their customers. Any small business that provides services on an appointment basis, such as auto repair shops, hair salons, and law firms, can benefit from having their bookings managed so as to free up time for the more important task of growing their business. Microsoft Bookings is available to businesses that have an Office 365 Business Premium subscription.

Here is a small live demo for you my friends: https://outlook.office365.com/owa/calendar/DXCCommerce1@dxccommerce.onmicrosoft.com/bookings/

The first page an online customer arrives at is the following screen, that can be published on Facebook or any social media sites. Here I choose to order my haircut from my favorite hairdresser. (Full manual is available here)

 

When booking I will get a confirmation email, and the booking coordinator will also get an email. The booking is also available on my phone:

 

On the back-office side, Microsoft have created a simplified view of managing and setting up your bookings:

Here you manage the calendar, customers and staff.

Here is the calendar for a specific day showing all appointments and bookings for today. Drag and drop of appointments between staff and dates is of course possible.

You can also manage you staff.

And the services you offer and map them towards your staff.

 

If you are a functional person, then just stop reading here, because here comes the good part: there is a complete API interface for you to integrate towards Bookings. (See also this link.) Connecting this towards Dynamics 365 or commerce apps can be done by a developer, and makes it possible to expose booking services to POS and call center, with tight integration to your Dynamics 365 solution.

Check out Microsoft Booking and Microsoft Bookings API in Microsoft Graph.

Here are some sample pictures on how to access the Booking system using Microsoft Graph. First, here I list all the booking sites listed in my tenant:

Pay attention to the fact that it returns an "id" that identifies my booking business for a specific store. If I now query for bookings on that ID like this:

https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments (You will not get access to this link, but you are welcome to click it )

I get the following, where the service lists all bookings posted into Microsoft Bookings. A consent through the Azure portal must be set up. And the great thing is that it actually is a two-way service: I can also post bookings in.
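
For the developers: here is a minimal X++ sketch of such a call using plain .NET interop. The endpoint is the appointments URL shown above; acquiring the Azure AD access token (app registration and consent) is assumed to be handled elsewhere, and the class and parameter names are mine.

internal final class KHGraphBookingsDemo
{
    public static void listAppointments(str _bearerToken)
    {
        System.Net.WebClient client = new System.Net.WebClient();
        str json;

        // The token must come from an Azure AD app registration with consent to Bookings
        client.Headers.Add('Authorization', 'Bearer ' + _bearerToken);

        json = client.DownloadString('https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments');

        info(json); // Raw JSON with all appointments; parse and map to your own tables as needed
    }
}

Posting a booking in works the same way, just with an HTTP POST and a JSON body instead of a GET.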

BOOM! Take that! We now have a complete interface towards all services that Microsoft Graph can expose and can let us integrate on a completely new level.

If I wanted, I could now connect my bookings to any planning engine that would add more value to the service. Like picking me up in a golden limo-cab when I book my hairdressing hour. The possibilities are endless. Also remember that this is not restricted to bookings, but applies to all services that Microsoft Graph and Azure may provide. We in the Dynamics partner community have just scratched the surface of the possibilities that Microsoft now provides.

Happy DAX’ing friends.

Dynamics 365 Branding and Commerce (Preview) Firsthand experience

PS! Remember to read the last lines in this blogpost

As I hope you have seen in your never-ending twitter/news feed, Microsoft is again adding lots of new apps and features to Dynamics 365. Microsoft is delivering on the communicated vision of Dynamics 365. We now have apps that take a holistic approach to business processes. To solve business requirements, users will be using a combination of apps that work natively together. We see how the entire solution is being connected, and further split up into specific areas. In the old days we had large ERP suites and sold functional modules. We are now implementing connected apps that enable business processes per user. If anyone wonders what the "new" hashtag is, it is easy: "#MSDyn365", and get used to it. We no longer need to put things into additional silos to explain the legacy, and to succeed we must embrace and deliver the right combination of apps that solves the business requirements per business process.

One of the most exciting pieces of news in the current wave 2 release is the delivery of Dynamics 365 Commerce (preview). I have been privileged to validate and try out this solution over the last days. My current experience is: this rocks! Microsoft can finally deliver a complete suite that gives a true omnichannel experience. One interesting finding is that Microsoft will rebrand their "Dynamics 365 for Retail" offering to "Dynamics 365 Commerce". Why? Because what is now being offered extends the boundaries of traditional retail solutions. As seen in the following figure, you can get a complete integrated end-to-end system. And this is not just for retailers; all companies that want to digitalize their processes and offer a true omnichannel experience can benefit from it.

1 : Picture from Microsoft presentations

To try out this new solution, you can request a preview. You ask for a preview here. When/if accepted, you will receive an email from Microsoft containing instructions on how to deploy the preview. This guide is also available here, and it is important that the guide is followed very carefully. To complete the guide, you need some assistance from your Azure AD tenant administrator. Also, the preview is currently only deployable to US Azure datacenters, and this puts some latency into the commerce experience.

One interesting thing with Commerce is that even though this is a tier-1 environment, you get the possibility to deploy RCSU and the e-commerce server. The data set is basically standard Dynamics 365 for Retail, with the configuration key for retail essentials enabled. So we can showcase that Dynamics 365 Commerce can also be delivered as a standalone app, or be extended with the finance and supply chain management apps.

The preview commerce solution is what you expect an e-commerce solution to be:

The back-end editor is easy to use, and it is easy to configure your content.

To get a full understanding of the solution also head over to Microsoft DOC’s to learn more : https://docs.microsoft.com/en-us/dynamics365/commerce/

But I'll do something better for you; you can check out the preview solution yourself right now: https://d365commerceus2ecom-eval.commerce.dynamics.com/DXCCommerce (I expect that the site will be available for only a few days, so hurry)

If you want to buy something use Card number: 4111-1111-1111-1111, Expiration: 10/20, CVV: 737 . Also remember that this is a US-based Azure Datacenter and NOT a production grade scaled system.

Happy DAXing and DXCuLater!

D365 Retail – Buzz Alert !

THIS IS COOL !
Microsoft is launching several new product lines for retailers.

Dynamics 365 Commerce

Empower your business to create exceptional, insightful shopping experiences for every customer with Dynamics 365 Commerce—built on our proven Dynamics 365 Retail solution.

https://dynamics.microsoft.com/en-us/commerce/overview/

Microsoft Connected Store

Empower retailers with real-time observational data to improve in-store performance. From customer movement to the status of products and store devices, Dynamics 365 Connected Store will provide a better understanding of your retail space. (Check out the video)

https://dynamics.microsoft.com/en-us/ai/connected-store/

 

 

D365F&O, Lots of new high value content on DOC’s

The Microsoft Dynamics team has been quite busy after the vacation, producing a lot of valuable content for Dynamics 365. I would like to highlight some of the latest additions that are worth checking out and sharing in the Dynamics 365 ecosystem. Just this year alone, 714 articles have been published, and in just the last 2 months close to 300 articles were made available. With this amount of information, I do get questions about whether there are some hidden gems on docs. Here some of them are:

1. Learning catalog

There are now more tailored learning paths towards customers and partners, with references to free, self-paced online learning path, Tech-talks, and formal instructor-led training. Here you will find articles, videos, and all you need to start learning Dynamics 365.

2. Test recorder and Regression suite automation tool for Retail Cloud POS

Now we can start creating regression tests on the Retail POS. Cool stuff, and in my mind this is where we actually see the true value of regression testing. Retail is detail, and this delivers quality.

3. Master planning setup wizard

Setting up master planning involves taking many decisions and here you can read how this is done in 10.0.5.

4. One Version service updates FAQ

This page answers a lot of questions on the One Version strategy, and what it means for you. At many customers I see that extensive, time-consuming and costly testing processes are being manually executed each time Microsoft releases a new monthly update. Why? I do not see the need to perform full testing on all modules on a monthly basis. Yes, it is a fact that nobody releases flawless code (not even Microsoft), but if you follow the procedures and guidelines from Microsoft, the monthly updates should be safe to deploy. There are several release rings and programs in place ensuring that quality is in place at GA (General Availability). Please align to the release cadence updates, and focus on your essential core processes. If you find painful bugs, report them asap.

5. Environment planning

I have seen several projects where the focus is to save costs on implementation environments. This page explains a lot about Microsoft's take on this. My simple advice is to use a Tier-1/one-box environment for development on a cloud-hosted CSP subscription, and the rest of the environments as Tier-2 or higher (my recommendation is to have 2 additional Tier-2 environments for larger projects). The benefit of using self-service processes is priceless. Also keep in mind that Azure costs are very cheap compared to the consultancy hours spent trying to maintain and manually transfer databases between environments. Also take a look at the great Denis Trunin's blogpost on development VM performance.

6. Business events overview

This is the future and start adopting this feature into your business processes. This is also a key enabler for working closer with the Dynamics Power platform.

7. Regulatory updates

Here you find localized information for your country, and how to comply with specific local requirements. This is updated very often.

8. Unified product experience

Do you want to keep the products from D365F&O synced with D365 Sales? This article explains how to achieve a near real-time bi-directional integration with CDS. Great stuff, also explaining the dual-write capabilities.

9. Adyen payment processing with omnichannel experience

Payment connector is far more versatile than just for retail. Also check out the FAQ.

10. Asset management

Great stuff on the horizon. Keep track of your stuff

11. Franchising

No longer in the official 2019 Wave 2 release. So, we must keep waiting for this in the future.

 

Take care, and

DXC you later

 

 

Analyzing Cloud POS performance in Dynamics 365 for Retail

It is a constant requirement that the systems retailers directly interact with should be Bigger, Better, Faster, Stronger (BBFS). In this blog post, I will dig into how POS performance can be analyzed to better understand the transactional performance of the Dynamics 365 POS. What I'm especially interested in is how perceived performance compares to actual performance. What we think is good performance is relative to the observer. The average human reaction time to a visual stimulus is 250 ms, but newer studies show that we can identify a visual stimulus down to 13 ms. Your screen refreshes roughly every 17 ms. As time is relevant and the expected performance is close to real-time, this can sometimes lead to performance expectations that are actually irrelevant to what needs to be achieved. We as humans cannot go below a 250 ms visual response time, so this is important to keep in mind.

As you can see in the following video, 4 items are scanned and then a quick cash payment is done. The total time taken to complete this example transaction in CPOS is approx. 5 s.

But as you can see on the screen, there is a lot happening while the user interface is being redrawn. I wanted to go deeper to understand exactly what is happening when scanning, more specifically what happens when adding the sales lines in the POS.

As the CPOS is a 100% web-based application, we can use Google Chrome to take a deeper look into exactly what is happening. By pressing F12 (or Ctrl+Shift+I), you open the developer tools.

Then start the recording (CTRL-E), add a line in POS, and stop the recording. Then you will see:

1. CPU load, Activity bars, Network calls
2. The actual animation on the POS display each millisecond
3. Exactly how long calls to the Retail Server is taking.
4. The entire REST-call stack being executed on the CPOS client.

Here you see an example where I added one line to the POS basket, and this resulted in 2 calls to the retail server.

If we look at one of the calls happening:

ScanResults() (*.dynamics.com/Commerce/ScanResults('07100')?$expand=Customer&api-version=7.3) – This scans the product/item barcode and sends it to the retail server. In the Chrome developer tools, we can analyze exactly what is taking place on this call. Here we see that the total time was 559.54 ms, but the actual waiting time for the RCSU to respond is 263.69 ms (Waiting TTFB). The browser is waiting for the first byte of a response; TTFB stands for Time To First Byte, and this timing includes 1 round trip of latency plus the time the server took to prepare the response. I have measured the network latency to this Tier-2 with RCSU system to be 40 ms, so roughly 220 ms of the TTFB is actual server processing.

If I scan the item again, we see that caching (DNS etc.) kicks in and the TTFB lowers to 132.80 ms.


As you can see, you can really go deep and analyze everything that is happening, from client execution to server execution, without any debugging tools, down to the millisecond, and better understand the performance. The profile created can be exported and imported for deeper analysis. We can see that there are many factors that influence performance, from network delays to form refresh. Microsoft could have the pleasure of shaving milliseconds off the animations, server calls and JavaScript, but this is an ongoing investment from an R&D perspective.

My honest opinion is that the cloud-based Dynamics 365 for Retail POS is performing well. Network elements and the speed of light are a fundamental restriction. The use of animations also seems to affect how performance is perceived, but it does not affect the general performance and usability. Legacy systems that are on-prem have the benefit of not having latency, but the cloud solution brings so many other positive elements. If you choose MPOS instead, these tools are not available and you can use Fiddler for analysis. A small tip is to have a CPOS client available when performance testing, as the same factors will also affect MPOS.

Bigger, Better, Faster, Stronger !

Retail statement trickle feed (public preview)

Retail statements are one of the most important (and complex) processes a retailer has. It's where the retail sales and transactions are transformed into physical and financial transactions, so you can see the sales in finance and in inventory. Retail statement calculation and posting have been covered many times in my blog posts, and Microsoft has a large set of articles on docs on the matter. The amount of transactions retail statements calculate and post is to my knowledge THE most complex and intense feature and business process in the entire Dynamics 365 solution. Imagine that every sale, in every store, is being processed. For larger retailers, Dynamics 365 for Retail is processing millions of transactions daily. This area really puts computational pressure on the system and is also one of the areas where Microsoft is investing heavily.

Since the start of D365 there have been hundreds of improvements to retail statement posting, and the next "big thing" is retail statement trickle feed. One of the pains in today's solution is a significant delay between when the retail sale has been conducted and when the inventory transactions have been financially posted; in short, when the inventory transaction gets a financial status like "Sold". Why is this important? Because the inventory transactions define on-hand values, which again define how master planning/replenishment is calculated. We want this to be as accurate and up to date as possible. Any delay in having accurate on-hand influences planned purchase orders. Also, the ability to spread out the processing of transactions through the day will reduce the amount of "spikes" in the Azure SQL load, making the nightly timeslot more open for other high-intensity transaction processing tasks.

Another critical benefit of trickle feed is the decoupling of transactional statements and financial statements. Now you can post transactional statements without even posting a financial statement, and the other way around. Together with the increased posting frequency that produces small bundles of transactional statements, it will address the main reason for the compounding effect that prevents a series of statements from being posted due to a single invalid transaction. Right now, the only validation that impacts financial statements is that all retail transactions for a given shift must be present in HQ in order for a financial statement to be posted. However, the transactions don't need to be successfully posted for a financial statement to be posted.

There is also a new aggregation strategy, where unnamed transactions are always aggregated and named(customers) transactions are never aggregated. There is no more option available to turn aggregation on or off.

Microsoft have made the following improvements to the statement posting process:

  1. Deprecate the “inventory job” that creates temporary reservations.
  2. Create a new job that will, at a predefined schedule, create sales orders, invoice them, and create, post, and apply payments for all the transactions that are synchronized to the HQ at that point of time. In addition, it will also create any ledger journals that need to be created for discounts, gift cards, and so on.
  3. The statement document that gets created at the end of the day will only be used to calculate and post any counting variances.

To enable the new preview (10.0.5) trickle feed solution you have to enable the Retail Statement (trickle feed) – preview configuration key. Also remember to disable the other retail statements configuration keys, and that you don’t have any open statements when doing this.

When it is finally released (GA), I hope that the new feature management is used for enabling this.

When this is done, you will see a set of new menu items. Under the menu \Retail\Retail IT\POS Posting.

The sequence of these batch jobs is designed to financially post most of the transactions, while the financial statement posting will only be used to calculate and post any counting variances. There is no need to run the "Post inventory" job anymore. But in reality there is a decoupling, and the transactional statement and financial statement can be posted independently of whether the other has been posted. The only actual requirement is that the P-job has fetched the retail transactions from the retail channel database.

If we look into the Retail Statement form, we now have the possibility to manually create transaction posting and financial reconciliation (That in the essence is the financial statement).

When creating a "Transactional posting", we see that the form has changed a bit compared to how it was before. There are no lines related to payments.

When posting the transactional statement, the following steps are performed:

When calculating and posting a financial statement, you see the more traditional statement posting screen, where you have the payment lines:

The steps in the posting is the following:

The summary of this is that with trickle feed, Dynamics 365 will support a much faster update frequency to get proper on-hand values, and better scalability. Since the transactional statement will be running more frequently, it also means that there will be less retail statement posting in the evening/night. The transactions will be smaller and therefore also easier to post. But there are a few things to keep in mind. If you trickle feed too often, you will miss out on the transaction aggregation on the unnamed transactions, and will have to process more sales order invoicing per day. This can again slightly increase the load on your system.

This feature will also improve the scaling of the system, as the posting of transactions can be better load balanced among multiple AOS batch servers. I also have a feeling that there will be more features in this area to come, that will further enable close to real-time master planning, inventory services, and close to real-time Power BI reporting.

 

Next on the customers' wish list is a super-duper-fast invoicing service for (retail) sales orders, as this is still the most resource-demanding task in the processing of retail transactions. Also on the roadmap is the ability for the store manager to perform and generate the financial statement when a shift is closed in POS. The financial statement in HQ will in this case post whatever the financial statement generated in POS defines, breaking the requirement of having all transactions uploaded to the HQ db. And beyond this, Microsoft is as always improving general performance by working closely with customers and partners. We see that the data distribution and different usage of retail statements require different indexes, and Microsoft invests heavily in improving how queries are executed.

 

Great work to the Microsoft team working on the retail statement processing.

 

Here is a small joke for all of you that don’t care about retail statement posting

 

 

D365: Search for code with Agent Ransack

When supporting customers we often get small fragments of information on an issue, like a form not performing as expected, or an error message. The procedure is then often to log into LCS and find traces of the issue. Often we end up with a query that is the source of the issue. But to better understand and analyze how to fix the issue, we often need to find exactly where in the source code the query is executed. By being more exact and precise towards Microsoft support, you also get a quicker response.

Searching through the code in Visual Studio can be time consuming, and the built-in cross-reference is not always updated, but there is an alternative I can recommend. Agent Ransack is a free file searching utility that can quickly scan most of the D365 source code (the *.xml files placed in K:\AosService\PackagesLocalDirectory\).

Let's say I see in LCS the query that I need to trace back to where in the code it is executed.

From the query I can then search for the text "Join RetailEODTransactionTable", and I get 25 results, even where the exact table is not explicitly specified.

I can then open the file in explorer and then validate to see if I need to go into Visual Studio for further analysis.

This speeds up the process of finding the source code that you are looking for. It is free; download it from https://www.mythicsoft.com/agentransack/ and install it in your development environment.

 

Take care Daxer’s.

Meetings: Every minute counts, and snooze to 1 minute before meeting starts

As a consultant I’m used to having a lot of “back-to-back” meetings, and when the next meeting is near, I typically get an outlook reminder 15 minutes prior to the meeting.

Then using the "Snooze" button is good. If I snooze until 5 minutes before, I am too early; 0 minutes before, and I am too late. Did you know that in the drop-down, the minimum selection is 5 minutes? That is too much for me. I would like a new reminder when it is 1 minute before the meeting starts. But did you also know that you can type into the field? You can actually write "1 minute", and this will then remind you when it is 1 minute before the meeting starts.

A slightly more advanced way is to set the default reminder to 16 minutes prior to the meeting.

Then, when the reminder pops up, you can select "Snooze" and choose to be reminded in 15 minutes. That is exactly 1 minute before the meeting starts.

Now I have just "earned" 4 more minutes where I can create D365 customer value before the meeting starts.

D365F&O – Address performance tips

Sometimes the smallest thing can make a huge difference. At a customer we experienced a huge load (DTU +70% on average), and LCS showed that a single SQL query was the reason for the load. The data composition here was close to half a million customers in the customer table, and most of them had addresses, email and phone numbers assigned to them, except for the customers used for retail statement processing.

In LCS environment monitoring you can see this as spikes in the overview.

 

The query you typical see looks like this:

(@P1 int,@P2 nvarchar(256),@P3 int,@P4 bigint)SELECT TOP 1 T1.COUNTRYREGIONCODE,T1.DESCRIPTION,T1.ISINSTANTMESSAGE,T1.ISMOBILEPHONE,T1.ISPRIMARY,T1.ISPRIVATE,T1.LOCATION,T1.LOCATOR,T1.LOCATOREXTENSION,T1.PRIVATEFORPARTY,T1.TYPE,T1.ELECTRONICADDRESSROLES,T1.MODIFIEDBY,T1.RECVERSION,T1.PARTITION,T1.RECID FROM LOGISTICSELECTRONICADDRESS T1 WHERE ((T1.PARTITION=5637144576) AND ((T1.TYPE=@P1) AND (T1.LOCATOR<>@P2))) AND EXISTS (SELECT TOP 1 'x' FROM LOGISTICSLOCATION T2 WHERE ((T2.PARTITION=5637144576) AND (T2.RECID=T1.LOCATION)) AND EXISTS (SELECT TOP 1 'x' FROM DIRPARTYLOCATION T3 WHERE ((T3.PARTITION=5637144576) AND (((T3.LOCATION=T2.PARENTLOCATION) AND (T3.ISPOSTALADDRESS=@P3)) AND (T3.PARTY=@P4)))))

By downloading the query plan, we see that there is an index seek on the table LOGISTICSELECTRONICADDRESS.

 

This results in the indexes not getting a good "hit" on LogisticsElectronicAddress.Type.

The solution was surprisingly easy. Add Phone, Email address and URL to the customers.
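
If you want to find out which customers are missing contact information (so you know which ones to fill in), the joins from the SQL query above translate into a small X++ sketch like this (the class name is mine; expect it to take a while on half a million customers):

internal final class KHCustomersWithoutContactInfo
{
    public static void main(Args _args)
    {
        CustTable                  custTable;
        DirPartyLocation           partyLocation;
        LogisticsLocation          location;
        LogisticsElectronicAddress electronicAddress;

        while select custTable
        {
            // Same relations as the SQL above: electronic address -> location -> party location -> party
            select firstonly RecId from electronicAddress
                exists join location
                    where location.RecId == electronicAddress.Location
                exists join partyLocation
                    where partyLocation.Location == location.ParentLocation
                       && partyLocation.Party    == custTable.Party;

            if (!electronicAddress.RecId)
            {
                info(strFmt("Customer %1 has no phone/e-mail/URL registered", custTable.AccountNum));
            }
        }
    }
}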

 

Then the DTU drastically goes down, and normal expected performance was achieved.

 

Conclusion: when you have many customers, remember to fill in contact information.

This just must be shared

D365F&O – Community Driven Engineering

I have previously blogged about the importance of reporting new ideas, issues and bugs to Microsoft, and also why the community will benefit from sharing. I see that experienced engineers have the solution available and are more than willing to give it for free to get the fixed-up code into the standard solution to benefit customers and future projects.

 

But the formalized support path does require time and energy, and remember that not all Microsoft support consultants are engineers that you can discuss X++ topics with. So how can the process of contributing to the D365 community be made easier?

But did you know that Microsoft has a program for Community Driven Engineering with Dynamics 365 F&O? This covers not only bugs, but also new features. Community Driven Engineering (CDE) is a Microsoft effort to make external engineers more efficient at providing recommended bug fixes and minor features to Microsoft, as well as to make Microsoft more efficient in accepting fixes from the community. If the fix is accepted, it will be merged into the main Dynamics 365 F&O branch. I have tried the program and reported in a fix for auto-report as finished; the fix was accepted, and hopefully in the near future the entire community can benefit from it.

How to start?

If you have the right skills and the willingness to share and give away your fixes (or features), you can sign up at https://aka.ms/Communitydrivenengineering. You need to be accepted into the program, and your user must be whitelisted before you can access it. The CDE also has a private Yammer group, which you get access to when accepted. But I must warn you: this program is meant for the most experienced and technical people we have in our community, people who are deep into X++ and Azure DevOps. You must have approval from CxO level in your organization that you can share code with Microsoft. (Lawyer stuff)

Here is the overall flow for the external engineer:

  1. You create a bug or a Feature in CDE Azure DevOps
  2. The bug or Feature is reviewed by the MS team and accepted or rejected
  3. You create a branch for this work and commit in this branch
  4. When done you create a Pull Request
  5. The Pull Request is reviewed by the MS team and feedback is provided
  6. After some iterations the Pull Request will be approved and complete
  7. The MS team will take over the code and include in a future release

Here are the more technical details of how it works.

The following text is copied from the onboarding documentation of the CDE.

It takes approximately one hour to get started with CDE, the majority of which is the initial build time.

  1. Obtain a development VM from LCS with build 8.1.195.20001 (app 8.1, platform update 22) or later. The latest branch I have access to is 10.0.80.19, that basically is 10.0.2 PU 26.
  2. Make sure you have opened Visual Studio at least once on the VM to sign in and pick default settings.
  3. Install Git on the machine from https://git-scm.com/downloads . The default installation options should work fine.
  4. From an administrator command line instance, clone this repo to a location on the machine.
    pushd k:\
    mkdir git
    cd git
    git clone https://dev.azure.com/msdyncde/_git/cde

  5. Define your user name and email in Git
    git config --global user.name "John Doe"
    git config --global user.email johndoe@example.com

  6. Mount the git repo into the F&O deployment
    pushd K:\git\cde
    powershell .\Mount.ps1
  7. Open Visual Studio as administrator and rebuild the following models
    ApplicationSuite
    ApplicationWorkspaces
    FiscalBooks
    GeneralLedger
    Project
    Retail
    Tax

At this point you can start development (in the SYS layer, actually)

How to submit a change?

Changes submitted by the community are committed to the same REL branch matching the version on the dev VM. Once the pull request (PR) is completed, that signals that Microsoft has officially accepted the change and it will show up in a future official release, usually the next monthly release (depending on what day of the month the release closes). The change will only enter the master branch of msdyncde through a future official release. Syncing to the tip of a REL branch will pull in other community changes submitted from that version.

  1. Create a Bug or Feature depending on whether the change is related to incorrect behavior of existing code, or new behavior.
    https://dev.azure.com/msdyncde/cde/_workitems
    New work item > bug
    Fill in the title, assign it to yourself, and set the Area to your best guess as to where the behavior belongs (will help us review appropriately)
    In repro steps and system info, provide information on why this change is necessary
  2. In Git, create a topic branch to work on. Branches are usually named by username/bug number.
    git checkout -b johndoe/482
    git push --set-upstream origin johndoe/482

  3. In Visual Studio make changes to Application Suite SYS code as normal. Changes are actually being made directly in the Git folder.
  4. Push your changes to VSTS.
    git add -A
    git commit -m "Message explaining what is being changed"
    git push

  5. Send a pull request from VSTS
    https://dev.azure.com/msdyncde/_git/cde/pullrequests?_a=mine
    New pull request
    Source branch = johndoe/482, Destination branch = Rel_8.0.30.8022 (or whatever version you have)
    Fill in the title and description, link the work item > Create

Any feedback from Microsoft reviewers (or other community reviewers) will show up in the PR. Changes can be made to the PR by editing in Visual Studio and doing git add / commit / push again. Once Microsoft has signed off, all comments have been resolved, a work item is linked, and all other policies have been met, you can click Complete to complete the pull request. When a PR is completed, that is official acceptance by Microsoft that the change will become part of a future official release, usually the next monthly release.

Behind the scenes

  • The powershell script starts by checking what version of source code exists on the VM by examining the K:\AosService\PackagesLocalDirectory\ApplicationSuite\Descriptor\Foundation.xml file.
  • It then checks out the REL branch associated with that version, which matches the platform and other model versions currently on the machine.
  • The development config files are updated to allow changes to SYS models, which is normally disallowed on dev VM’s.

In addition to having an accelerated approach to get fixes into the main branch, participants also have some more benefits. You will have access to the latest & greatest code changes through all code branches that Microsoft makes available. You can search through the code and see if there are code changes that affect extensions or code that is local to your installations. You can also see how the Microsoft code is evolving and how improvements are made available in the standard application. You will also gradually build a very valuable network towards the best developers in the world, where you will discuss technical topics with the actual people creating the world's best ERP system.

One final joke for those considering going into this program: Git and sex are a lot alike. Both involve a lot of committing, pushing and pulling. Just don't git push --force


D365F&O – Auto-report as finished in a Retail scenario

For many years I have had the opportunity to work on Dynamics 365 topics involving kitting, Value Added Services (VAS) and Bill-of-Materials (BOM). Today I would like to write about the released product parameter "Auto-report as finished" in a retail scenario, and you can read more about report as finished in the Microsoft docs. To explain the business scenario, let's take hot dogs. A hot dog is normally assembled the way the customer wants it, but in this scenario we have a standardized hot dog with 4 ingredients.

As a retailer, I would like to sell the finished product, but keep track of the raw materials. To do this you need to create a BOM, and when the hot dog is sold, Dynamics 365 will automatically report a hot dog as finished and draw the ingredients from the store warehouse. It is possible to use a production order, but for retailers this is overkill. Something much easier is needed. Instead of exact BOMs, average BOMs can also be used, since knowing exactly how much onion or mustard the customer will apply is not an exact science.

Dynamics 365 has a nice feature for this: Auto-report as finished.

What this parameter does is that when the product is physically deducted (or sold), a BOM journal is created and posted. This creates the issue transactions (sold) from your inventory.

Here I have created a BOM for my hot-dog:

When creating a sales order and posting a packing slip, you will see that a BOM journal is automatically created and posted.

The posted BOM journal looks like this, and here we see that a hot dog is added to the warehouse, while the ingredients are subtracted from the warehouse.

For retailers, this means that we can sell goods in the POS, and when the statement posting process creates and posts the sales orders, the auto-report as finished functionality will be triggered. So, no need for any production order or for manually posting Report as finished journals. So, Dynamics 365 has an alternative to retail kits if more standardized BOMs are used. The BOM can then also be used for cost calculations on food and retail-produced items. Comparing the counting and the actual transactions will also help to know how accurate the BOMs are in describing the cost picture of the products. Master planning will also catch this, and you can get replenishment to work on the ingredients.

BUT!!! There are some issues.
As a workaround, and to make this work, you have to specify a default warehouse per site per item in the default order settings. (I know this is an impossible task if you have 500 products and 500 stores, as this would mean you have to create 250,000 default order settings.) I have a support request going with Microsoft to change this, so that it is not needed and the warehouse can be inherited from the parent transaction. So, if you get an error like this, you have done nothing wrong, and hopefully it will be fixed in future releases.

STOP HERE, unless you like X++

Here is something for the "technical" guys: the code that automatically triggers this auto-report as finished is actually the class InventUpd_Physical.updatePhysicalIssue(). For those of us that have worked quite some time with Dynamics, we understand that this class is very central, because all physical inventory transactions are posted through this class. The behavior of auto-posting BOMs will therefore influence all places where a physical transaction is posted.

Microsoft has created a method on the movement classes named canBeAutoRepAsFinished(), which lets them refuse this behavior for certain transaction types.

If you don't want to wait until Microsoft fixes the feature where the warehouse dimension is inherited from the parent BOM, then you do have the option to extend the class BOMReportFinish.initInventDimFromBOMRoute() and set the InventLocationId from the parent there. Here is at least my suggestion for fixing the issue in the standard code (without extension):

Here is the code for validating that the warehouse storage dimension is used on the BOM line, and sending this back to the report as finished class.
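The original post showed this as a screen clipping. As a minimal, hedged sketch of the idea only (the helper class and method below are my own illustration, not the actual standard code), the warehouse-inheritance logic could look something like this in X++:

/// <summary>
/// Illustrative helper only: if the BOM line dimension has no warehouse,
/// inherit site and warehouse from the parent transaction's inventory dimension.
/// The real fix would live in (an extension of) BOMReportFinish.initInventDimFromBOMRoute().
/// </summary>
class BOMReportFinishWarehouseHelper
{
    public static InventDim inheritWarehouseFromParent(InventDim _bomLineDim, InventDim _parentDim)
    {
        InventDim bomLineDim;

        bomLineDim.data(_bomLineDim);

        if (!bomLineDim.InventLocationId && _parentDim.InventLocationId)
        {
            // Copy site and warehouse from the parent (for example the sales line) dimension
            bomLineDim.InventSiteId     = _parentDim.InventSiteId;
            bomLineDim.InventLocationId = _parentDim.InventLocationId;
            bomLineDim = InventDim::findOrCreate(bomLineDim);
        }

        return bomLineDim;
    }
}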

Take care and I’ve got to get back to work. When I stop rowing, the mothership just goes in circles.

Dynamics 365 F&O – Selecting the correct Tier level on your sandboxes

When purchasing Dynamics 365 F&O, you get a set of Microsoft-managed (but self-service) environments that are included with the standard offer (Production, a Tier-2 Standard Acceptance Testing environment and a Tier-1 Develop/Build and Test environment). Microsoft has described this in the environment planning docs. I will not discuss Tier-1 environments here, as these environments are optimized for the development experience. Do not perform performance testing on a Tier-1 environment. Tier-2+ environments are based on the same architecture as a production environment and use the Azure SQL Database service.

When running an implementation project, it is common to purchase additional Tier-2+ environments that are used for different purposes, as shown in the table below (from Microsoft Docs).

Selecting the correct level is important and depends on what the environment is going to be used for. As guidance, Microsoft has the following baseline recommendation:

On the projects where I have been involved, we most often have 3 or 4 Tier-2+ environments, and their purpose changes through the project.

The flow of data between these environments can be included in a sprint cycle. The process starts with defining the general parameters in the golden configuration environment (1). Here all system setup, number sequences, and master data will be uploaded/entered from the legacy systems. The Test/Stage/Migration environment (2) will be created based on the golden environment plus transactional data packages/initial startup data. Then there will be a database refresh from Test (2) to UAT (3), where all test scripts will be run and approved. The results and configuration changes/master data are then fed into the golden environment, ready for the next data movement cycle. The reason why we do this is to ensure that the golden environment and the migration environment are not corrupted through testing. At go-live, when the UAT is approved (after a few iterations), the migration environment will be copied to the production environment. This can only happen once. Subsequent updates to the production environment must be done manually or using data packages.

(1) Tier-2 Golden environment (before PROD has been deployed). This environment is often changed to become a staging environment that contains an exact replica of the production environment. I prefer golden environments as Tier-2, as this simplifies the transfer of data using the LCS self-service database refresh.

(2)- Tier-2 data migration. This environment is used for making transactional data ready for being imported to the production environment at Go-Live.

(3) Tier-2/3 User acceptance. Here the system is really tested: lots of regression testing and running test scripts. The focus is functionality. If there are concerns about performance, a Tier-5 environment can be purchased for a shorter period to validate that the system can handle the full load of a large-scale production environment. For performance testing, it is recommended to also invest in automation of the test scripts. (Unless you ask the entire organization to participate in a manual test.)

The performance of a system is a combination of the raw computing power of the VMs hosting the AOS and the sizing of the Azure SQL database. With Dynamics 365 we don't have any way of influencing this sizing. It is all managed by Microsoft, and they will size the production environment according to the number of users and transactions per hour. But the Azure SQL tiers that Microsoft uses are most often related to the following sizing steps.

I don't exactly understand how Microsoft is mapping Tier-2..5 towards these steps, but I have experienced that a Tier-2 level in some cases is a P1, P2, P4 or P6. More information on the DTU capacity can be found here, and the summary is that we can expect 48 IOPS per DTU. So, a P6 will provide 48,000 IOPS. If you want to check your DTU limit, then open a SQL client against the Azure SQL database and execute the following script:

SELECT *
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;

And then the DTU limit should be shown here. This is from a Tier-2 environment belonging to the initial subscription, and it seems to have 250 DTUs (P2).

But what puzzles me is that if I go into another Tier-2 add-on environment, I have 500 DTU (P4)

And in the third Tier-2 add-on environment I have 1000 DTU (P6)

So there seems to be no consistency between the DTUs provided and the Tier-2 add-on purchased. As far as I know, the production environment is 1000 DTUs (or P6) for some of my customers.

The AOSes on the Tier-2 environments seem to mostly be D12/DS12/DS12_v2 with 4 CPUs, 28 GB RAM and 8x500 GB storage, capable of delivering 12,800 IOPS.

What also puzzles me is the number of Tier-2 AOSes that are deployed. Some environments have one AOS and one BI server,

while other Tier-2 environments have two AOSes and one BI server.

I assume that the differences are related to how the subscription estimator has been filled out, and that this may have an impact on what is deployed as sandbox Tier-2 environments.

Dynamics 365 does have some performance indicators under the System administration menu that give some numbers, but I cannot see a clear correlation between the environments and the performance. Maybe some of you smart guys can explain how to interpret these performance test results? What is good, and what is not?

If we take "LargeBufferReads", how do your environments perform?

Dynamics 365F&O – Enabling new hidden functionality (SYSFlighting)

With Dynamics 365 version 10, the innovation wave from Microsoft continues to accelerate. All customers will use the same base source code of the Dynamics 365 solution, and it will be maintained and updated every month. But for many customers, stability also has its value. New functionality every month is not always what existing customers want to implement. New functionality can mean new training and new testing. I, on the other hand, love new features, because they enable new possibilities and solutions.

Microsoft has a solution for this: not all new functionality is enabled by default. Instead, the new functionality must be manually enabled based on a support request through LCS. Two specific capabilities that are already documented are new functionality in the Data Management framework and Business events. In the documentation pages you can see how to enable this hidden functionality, but the essence is that you have to run a SQL command (only available for non-production environments):

INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID) VALUES ('XXXXX', 1, 12719367)

PS! This is NOT something you can enable by yourself in a production system.
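If you want to inspect from code which flights are present in a (non-production) environment, here is a minimal X++ sketch. It assumes the SysFlighting table exposes fields matching the SQL columns above (FlightName, Enabled); treat it as an illustration, not a supported API.

/// <summary>
/// Illustrative only: reads the SysFlighting table directly to check whether a
/// flight name has a row with Enabled set. Field names are assumed to match the
/// SQL columns shown above.
/// </summary>
class MyFlightInspector
{
    public static boolean isFlightRowEnabled(str _flightName)
    {
        SysFlighting sysFlighting;

        select firstonly Enabled from sysFlighting
            where sysFlighting.FlightName == _flightName;

        return sysFlighting.Enabled != 0;
    }
}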

A small tip: a term to search for on Docs.microsoft.com is "SYSFLIGHTING". Then you will see the articles on documented hidden features.

But there are more, undocumented features in two categories: Application and Platform. These can be seen as two macros in the source code, named ApplicationPlatformFlights and ApplicationFoundationFlights. I have taken a snapshot of them here, and based on the names we get some indication of what they are used for. What they are, and how to use them, I expect will be documented in the future.

PS! I look forward to exploring "AnalyticsRealTimeReporting", "DMFEnableAllCompanyExport", "AnalyticsReportWebEditor", "BusinessEventsMaster" and "ApplicationPlatformPowerAppsPersonalization".

Happy Flighting

Near real-time replenishment in Dynamics 365 F&O

There is a lot of good stuff on the horizon for Dynamics 365. I highly recommend that you check out the following article about some of the new planning services that will come in the April 2019 release.

https://docs.microsoft.com/en-gb/business-applications-release-notes/April19/dynamics365-finance-operations/planning-service

To make this happen, I would expect the planning to go deeper into the SQL stack, and also to maximize the utilization of in-memory processing of the transactions.

For retailers, this will be highly appreciated, since limited space in the stores means that shelf replenishment several times each day is common, especially for perishable products with limited shelf life. Keeping things fresh and presentable is a necessity for the customer to buy. The ability to react more quickly to customer demand ensures that the customers actually find the products in your store. The same goes the other way: when sales are slower, the ability to adjust replenishment down according to activity saves cost and increases profit. In retail, it is the small improvements that in sum create the big results.

For the planning service to work, it needs the transactions to take action on. In Dynamics 365 for Retail we must choose between aggregating the transactions coming from the POS/channel databases or posting the statements more quickly. I'm looking forward to many good discussions in this area.

The future is faster

Retail Enterprise Architecture mapping using ArchiMate and ARDOQ

A warning: this blog post is high level, but the benefits can be mind-blowing.

Enterprise Architecture is about understanding and change. In today's business, change is everywhere and is essential to survival. But change is not easy. Having insight into and understanding of your own organization is essential for change and risk assessment. Understanding how people, processes and technology are connected gives focus to achieving high-value benefits. In my profession we use the Microsoft Dynamics technology stack as a main driver for implementing improvements. But we also acknowledge that Dynamics 365 is not the only system at work. Even though Dynamics 365 is a central component, there will always be many other systems, processes and technologies that are included in the enterprise architecture (EA). We need a way to describe all these connections in a uniform way that allows us to communicate a model for enterprises dynamically.

But why should EA mapping be a central part of your business? Here are 6 business motivators and benefits of having a structured approach to EA mapping:

Increased stability and availability. It is vital that all central systems have near 100% availability. POS and back-end systems must always work, and the supporting processes must be streamlined to ensure that risks related to business improvements and changes are minimized and understood. The EA mapping documents the relationships and shows the consequences of changes.
Guaranteed performance. Having acceptable system response 24/7 that can deal with business peaks must be planned and built around the system. Systems must deal with a variable load, handling that a sudden event changes the transaction volume. Any disruptions quickly result in customers walking away. The EA mapping must document the components central for performance compliance, and the business actors involved.
Scalable capacity. New stores or changes in the business model can quickly change the requirement for transaction and processing capacity. To be cost effective, the capacity must scale dynamically according to the actual need, both up and down. The EA mapping documents the components central for scalability, and the business actors involved.
Strong security. Cyberattacks are increasing, and it is vitally important to secure information and transactions. Being GDPR compliant puts demands on systems and internal processes for how to handle own and customer information. Security, traceability and an audit trail build trust in the system and document compliance. The EA mapping documents governance and role compliance, and the business actors involved.
Right focus. There are always new business opportunities and process improvements. Keeping track of where to focus will lead to better and faster implementation of changes in a secure and stable manner. New ideas must be analyzed and risk assessed, and the implications understood. The EA mapping can assist in focusing on the changes with the highest priorities and benefits.
Cost control. Being a retailer involves large investments in technology like POS, mobile apps, customer portals and enterprise systems. Moreover, there may be large fluctuations in system usage throughout the year. By purchasing these capabilities in subscription form, it is possible to even out the operating costs so that you only pay for what is needed. Good liquidity is achieved by balancing costly investments against the revenue stream and securing an actual return on these investments.

To move forward, a "language" is needed to describe an enterprise architecture model where you can visualize, plan, implement and maintain all relationships that exist today, in transitions and in the final vision.

Architecture Layers using ArchiMate

The overall mapping can be modelled in 5 main layers. Here I would like to focus on the symbolism used for identifying the elements. The notation here is ArchiMate, which is an open and independent enterprise architecture modeling language to support the description, analysis and visualization of architecture within and across business domains in an unambiguous way.

Motivation elements define the overall drivers and goals that the enterprise has. Much of the vision is located here. The Motivation elements can also be seen as a vertical layer, in close relationship to all layers.

The Strategy layer defines the overall course of action and a mapping towards resource and business capabilities.


The Business layer defines the business processes and the services the enterprise is providing, and here the main business processes are defined. To simplify the modeling it is relevant to start with the Business Objects, Business Processes, Business Roles, Business Actors, Business Events, Business Services and Business Rules and Logic.

The Application layer contains application services and capabilities, their interactions and application processes. Here Dynamics 365 and much of the Power platform is located. To simplify the modeling it is relevant to start with Data Objects, Application Functions and Application Components.


The Technology and Physical layer describes the software and hardware (physical or virtual) capabilities that are required to support the deployment of business, data, and application services; this includes IT infrastructure, middleware, networks, communications, processing, standards, etc. The underlying structure of Microsoft Azure would typically be described here. To simplify the modeling it is relevant to start with Artifacts, System Software, Technology Services, Devices and Communication Networks.

Architecture Relationships using ArchiMate

The real beauty comes when the relationships between architecture elements are defined. But to do this, a set of predefined relationships is needed. The most commonly used are the following:

Putting this together in a combined setup, I get the following relationship diagram of what is relevant to document.

(*Credits to Joon for this visualization)

As seen here, the business processes are a realization of the application functions, and this clarifies how a proper enterprise architecture model is documented. With this model, we can see which business actors are assigned to which business roles. This again shows the business process assignment to the role. The business processes are there to realize business services.

Building the Architecture model using Ardoq

The architecture relationships can be challenging to describe using tools like Visio. Often we see that great work is done, but not used to its potential. An alternative is to use a cloud-based mapping tool such as Ardoq, which covers most aspects of documenting relationships between business processes, applications, roles, risks and transitions. This is not a commercial for this tool, but I find it great. So, I decided to try to use Ardoq to model the Contoso demo data.

Here I will focus on the Application Layer, as this is the layer where the application functionality and data are located. First, I create the application components:

Then I create the Application Functions, and I also import the Business Roles that are available in the Contoso demo dataset.

The next job is to build the relationships between the application functions (D365), business processes (vertical processes) and business roles. This will allow me to visualize and trace dependencies across all the EA mappings. Let's take an example looking into the responsibilities of an employee named April Mayer.

Here I can see that she is related to the business roles Accounts payable clerk and manager. If I click on "Accounts payable clerk" I jump into the view of this business role, and I can see that it is related to the business processes of accounts payable, with an association to April Mayer.

Jumping to accounts payable allows me to see the business processes involved.

I can also visualize the entire Enterprise Architecture map with all objects and relations,

and zoom in on specific relations. This graph shows me that April Meyer belongs to the role "Employee", Accounts payable manager and clerk. The Accounts payable clerk is associated with the business process "Accounts payable". The clerk role is associated with the Financial management modules in Dynamics 365.

Here is another visualization that shows how the business objective of "Marketing" can be achieved, and which business roles, business processes, application functions and application components are involved.

Knowing the relations, and being able to communicate them, is a key to happy Enterprise Architecture mapping.

Give it a try, the result can be very powerful.

Additional information

1. A high value blogger on Enterprise Architecture is http://theenterprisingarchitect.blogspot.com/.

2. Homepage of archimate: http://pubs.opengroup.org/architecture/archimate3-doc/toc.html .

3. Homepage of ARDOQ : https://ardoq.com/ Give it a try !

MPOS – Open full (kiosk) screen mode when having dual display

For a retailer, every saved "click" is appreciated, and so is the ability to remove any noise.

When starting MPOS in maximum mode, you will often see that you have a title bar at the top, and the app-bar at the bottom.

In Windows 10 you can also use the "tablet mode" to get MPOS into full screen mode.

BUT! If you have a dual display setup, the tablet mode does not work.

If you want to remove them, there is a smart keyboard shortcut:

Shift-Windows-Enter

This will put MPOS in full screen mode, giving a nicer appearance without the bars.

Then the question is how to make this always happen when starting MPOS. This was actually not an easy task, but a colleague of mine (Espen) made it possible by using a PowerShell script.

The following page contains a small PowerShell script that opens a UWP app in full (kiosk) screen mode:

Add this to a startup folder, and create a new PowerShell script containing:

[Path]\StartUWPAppFullScreen.ps1 -app Shell:Appsfolder\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App

 

Then create a shortcut to this new PowerShell script.

Initial investigations (by Sven Erik) show that the MPOS app ID is Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App, and let's hope this ID stays permanent.

Then MPOS looks nicer for the user, without noise.

Retail assortments and planned orders extensions

Microsoft has created an excellent description of this on the Assortment management docs page. Retail assortments are essential to define which products should be available across retail channels. Assigning a product to an assortment will assign the product to the stores that have the assortment. This makes it possible to sell the product in the store.

But there is something missing, and that is using assortments for replenishment and procurement. Retailers want master planning to only suggest procurement and transfers for products that belong to the stores' assortments. You can control requirement parameters per warehouse, but there is no standard way to use the assortment to control the generation of planned orders.

This blog post shows how to make a very small extension that lets you make sure that only products that belong to a store's assortment will generate planned orders. The solution I will make use of is to look into how the product lifecycle state functionality works, and extend it with an assortment planning parameter. I have called this parameter "Is lifecycle state active for assortment procurement".

What it will do is validate whether a product is in the assortment of the store. If it is, the product will be requirement calculated and will generate planned orders. If the product is not in the assortment of the store, no planned orders will be generated.

To make this happen, I needed to create 4 extensions. The first three add a new field on the product lifecycle form. For an experienced developer this is easy to create, so no need to spend time on it in this blog post.

The real fun is how to manage and control the planned order generation. This happens in an extension to the ReqSetupDim class. Here there is a lookup to the assortment on the product, and a check whether the store has this assortment. If not, master planning will not generate any planned orders. I therefore create an extension class and use the method wrapping/Chain of Command (CoC) feature to add some additional code.

/// <summary>
/// Contains extension methods for the ReqSetupDim class.
/// </summary>
[ExtensionOf(classStr(ReqSetupDim))]
final class ReqSetupDim_extension
{

    /// <summary>
    /// Validates if a product should be assortment planned.
    /// </summary>
    /// <param name = "_inventDimComplete">The complete inventory dimension passed to the ReqSetupDim class.</param>
    /// <returns>false if the product is not assortment planned; otherwise, the default value.</returns>
    public boolean mustReqBeCreated(InventDim _inventDimComplete)
    {
        boolean ret = next mustReqBeCreated(_inventDimComplete);

        if (ret)
        {
            if (_inventDimComplete.InventLocationId)
            {
                InventTable                 inventtable;
                EcoResProductLifecycleState ecoResProductLifecycleState;

                // Fetching fields from inventtable
                select firstonly ProductLifecycleStateId, Product from inventtable
                    where inventtable.ItemId == this.itemId();

                // Validating that the product is active for planning and that assortment planning is also enabled
                select firstonly RecId from ecoResProductLifecycleState
                        where   ecoResProductLifecycleState.IsActiveForAssortmentPlanning == true
                            &&  ecoResProductLifecycleState.IsActiveForPlanning == true
                            &&  ecoResProductLifecycleState.StateId == inventtable.ProductLifecycleStateId;

                if (ecoResProductLifecycleState)
                {
                    RetailStoreTable                    store;
                    EcoResProduct                       product;
                    RetailAssortmentLookup              assortmentLookupInclude;
                    RetailAssortmentLookup              assortmentLookupExclude;

                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;
                    RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupExclude;

                    // Finding OMOperatingUnitID from the InventLocationId
                    while select firstonly OMOperatingUnitID from store
                        where store.inventLocation == _inventDimComplete.InventLocationId
                    {
                        // Check if the product is in the assortment of the store in question
                        select RecId from product
                            where product.RecId == inventtable.Product
                        exists join assortmentLookupInclude
                            where   assortmentLookupInclude.ProductId == product.RecId
                                &&  assortmentLookupInclude.lineType == RetailAssortmentExcludeIncludeType::Include
                        exists join assortmentLookupChannelGroupInclude
                                where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                                    &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId
                        notexists join assortmentLookupExclude
                            where   assortmentLookupExclude.ProductId == product.RecId
                                &&  assortmentLookupExclude.lineType == RetailAssortmentExcludeIncludeType::Exclude
                        exists join assortmentLookupChannelGroupExclude
                            where   assortmentLookupChannelGroupExclude.OMOperatingUnitId == store.OMOperatingUnitID
                                &&  assortmentLookupChannelGroupExclude.AssortmentId == assortmentLookupExclude.AssortmentId;

                        if (!product)
                        {
                            ret = false; // The product does NOT belong to the store's assortment, and should not be planned
                        }
                    }
                }
            }
        }
        return ret;
    }
}

I also have code to restrict creation of manual purchase orders, where similar code can be used (see the sketch below), but let's hope that Microsoft can further extend standard Dynamics 365 with assortment-based procurement planning.
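As a hedged sketch of how the purchase order side could look (the helper class and its method are hypothetical placeholders for the assortment lookup shown above, and the exact hook point is a design choice, not the author's published code), a Chain of Command wrapper on PurchLine.validateWrite() might look like this:

/// <summary>
/// Illustrative sketch: blocks manual purchase order lines for products outside
/// the receiving store's assortment. MyAssortmentPlanningHelper is a hypothetical
/// helper that would contain the same store/assortment lookup as the ReqSetupDim
/// extension above.
/// </summary>
[ExtensionOf(tableStr(PurchLine))]
final class PurchLine_AssortmentCheck_Extension
{
    public boolean validateWrite()
    {
        boolean   ret       = next validateWrite();
        InventDim inventDim = this.inventDim();

        if (ret
            && inventDim.InventLocationId
            && !MyAssortmentPlanningHelper::isItemInStoreAssortment(this.ItemId, inventDim.InventLocationId))
        {
            ret = checkFailed("The product is not in the assortment of the receiving store.");
        }

        return ret;
    }
}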

Copy with pride, and let's hope next year will give us 365 more opportunities.

POS Invoice Pay – #Dyn365F&O

A very nice omnichannel capability made available in Dynamics 365 version 8.1 is the ability for customers to pay their invoices directly in the POS. A scenario is that a customer is allowed to purchase "on account" and then later pay all the invoices. Let's say the customer is at a hotel that allows guests to buy food, drinks and services throughout the stay. At the end of the stay the customer pays for all the services at the reception. Like "pay before you leave".

There is no requirement that the goods have to be sold in a POS. It is fully omnichannel capable, so the orders can be created in the call center, on the web or in stores. I would like to share this with you and show how you can set it up in the Contoso demo data set. If you open the functionality profiles, you will find the possibility to enable paying:

  • Sales order invoice
  • Free text invoice
  • Project invoice (Yes! Even project invoices!)
  • Sales order credit note

The next thing you need to do is to add a "Sales invoice" button to the transaction screen. (I'm using the Houston store, and button grid F2T2.)

This will add a sales invoice button to the POS design, that allows for paying invoices in POS.

The next thing is to create a POS transaction/order. First select a customer (like Karen), and then use the on-account button to sell the goods.

On the payment screen you can say how much you would like to put on account, and you also see that the credit limit and balance are available.

The next step requires that some periodic batch jobs run:

1. Run the “P-job”, to fetch the transactions from the channel database.

2. Run the “Calculate statement” (manually or in batch)

3. Run the “Post statement” (This process will create the sales order and the invoice)

!Make sure the statement is posted and invoiced before continuing!

The option you now have is to continue the process in Dynamics 365 and create an automatic sending of the invoice to the customer through print management, or have the customer come to the "reception" and pay for the goods directly.

To pay the order, select the Karen customer, and use the Sales Invoice button.

If you have done everything right, you should find the invoice in the list now. (If you have enabled aggregation in the parameters, you will have a single invoice per customer.)

I can then select the invoice (or multiple), and pay it using cash, card, loyalty (And even on-account again)

This opens up some very nice omnichannel processes, and I hope that Microsoft invests further in this. It would be nice to actually see the lines on the invoices that are being paid, and even to print out the invoice if the customer requires it. Also, for retailers, I suggest using the modern report possibilities to make the invoice look awesome.

Take care friends, and thanks for all your support and encouragement!

Retail category managers, Simplify your import of released products in #Dyn365FO

It is a category manager’s job to try to maximize profit from selling products within a specific category. This may be looking after a broad category such as ‘confectionery’ or they may focus closely on a more specific category, such as ‘snacking’. A category manager will analyze complex data collected on shopper behavior from a range of different sources, and then translate it into meaningful information. The category manager’s duty is to ensure that their company is providing the market with the products that consumers desire.

Retail category managers love Excel. It is used for almost everything, and they perform much of the analysis, lookup, data collection and decision making in Excel. When implementing Dynamics 365 we are often faced with large sets of Excel spreadsheets that need to be imported. I have seen users import 8 different Excel spreadsheets for importing products. This blog post is about how to simplify the process of keeping retail master data in a single Excel sheet and easily importing and updating products. For this, the Dynamics 365 data management framework is used. One of the problems I often see users struggling with is that the source is a single Excel spreadsheet, but it needs to be imported into several data entities. For a retailer, some of the most common master data entities are:

  • Products V2: Contains Product number, Product name and dimension groups
  • Released products V2: Contains most fields on the released product
  • Item – bar code: Contains the item barcodes used for scanning
  • Default order settings: Contains information like minimum purchase quantity etc.
  • External item descriptions for vendors: Vendors' item numbers and descriptions
  • Product category assignments: The connection to the retail category hierarchy

 

It is possible to create a single Excel spreadsheet that covers all of these entities, and in a single run import or update the retail products.

So how exactly do you do this?

Create an Excel spreadsheet with exactly the following.

I recommend creating two sheets. The first one is a "read me" sheet that explains the "template" sheet.

Use exactly the column names as described here. This will make the mapping between the columns and the data entity automatic. Here I also use color coding to show what entity each column mainly belongs to.

Field | Example value | Comment | Tables
ITEMNUMBER | 1005157 | Product number | Released Products, Products
PRODUCTNUMBER | 1005157 | Product number | Released Products
PRODUCTNAME | Jalla Coffee 500G | Item name | Released Products, Products
PRODUCTSEARCHNAME | 4001392 Jalla Coffee FILTER 500G | Search name | Released Products, Products
SEARCHNAME | Jalla Coffee FILTER 500G | Search name | Released Products
PRODUCTDESCRIPTION | Jalla Coffee Original is a useful coffee that can be enjoyed on most occasions. A carefully selected mix of coffee types, mainly from Brazil, guarantees a round and full-bodied coffee with a long aftertaste | Full item description | Released Products, Products
PRODUCTSUBTYPE | Product | Should always be "Product" | Released Products, Products
PRODUCTTYPE | Item | Item or Service | Released Products, Products
STORAGEDIMENSIONGROUPNAME | SiteWhLoc | Name of the storage dimension group | Released Products, Products
ISPURCHASEPRICEAUTOMATICALLYUPDATED | Yes/No | Should the last purchase price be updated automatically | Released Products
ISUNITCOSTAUTOMATICALLYUPDATED | Yes/No | Should the cost price be updated automatically | Released Products
PRODUCTGROUPID | WHI | WHI (warehouse controlled) or SRV (service) | Released Products
INVENTORYUNITSYMBOL | PCS | Inventory unit | Released Products
PURCHASEUNITSYMBOL | PCS | Purchase unit | Released Products
SALESUNITSYMBOL | PCS | Sales unit | Released Products
PURCHASEPRICE | 0 | Latest purchase price in local currency | Released Products
UNITCOST | 0 | Latest cost price in local currency | Released Products
SALESPRICE | 0 | Default sales price in local currency | Released Products
NETPRODUCTWEIGHT | 0,5 | Weight of the product | Released Products
PRIMARYVENDORACCOUNTNUMBER | 20086 | Primary vendor | Released Products
PURCHASESALESTAXITEMGROUPCODE | Middle | Purchase item tax group | Released Products
SALESSALESTAXITEMGROUPCODE | Middle | Sales item tax group | Released Products
BUYERGROUPID | P108 | Grouping related to buyer group | Released Products
TRACKINGDIMENSIONGROUPNAME | None | Tracking dimension | Released Products, Products
BASESALESPRICESOURCE | PurchPrice | Base sales prices on purchase price? | Released Products
DEFAULTORDERTYPE | Purch | Default value | Released Products
ITEMMODELGROUPID | FIFO | Item model group | Released Products
PRODUCTCOVERAGEGROUPID | Min/Max | Coverage group | Released Products
COUNTGROUPID | PER | Counting group | Released Products
PURCHASEPRICEQUANTITY | 1 | Purchase price quantity | Released Products
UNITCOSTQUANTITY | 1 | Cost price quantity | Released Products
DEFAULTLEDGERDIMENSIONDISPLAYVALUE | -D30-320—P108 | Financial dimensions (="-D30-320—"&B34) | Released Products
Product Dimension | P108 | Just a helping column | Help column for DefaultLedgerDimension
ProductCategoryHierarchyName | Retail category | Retail hierarchy name | Product category assignments
ProductCategoryName | Coffee | Category node | Product category assignments
VendorProductNumber | 4001392 | Vendor's item number | External item descriptions for vendors
VendorProductDescription | Jalla Coffee FILTER 500G | Vendor's item name | External item descriptions for vendors
VendorAccountNumber | 20086 | Vendor number | External item descriptions for vendors
BARCODESETUPID | EAN13 | Barcode type | Item – Bar Code, Released Products
BARCODE | 7041011050007 | Barcode | Item – Bar Code
PRODUCTQUANTITYUNITSYMBOL | PCS | Barcode unit | Item – Bar Code
ISDEFAULTSCANNEDBARCODE | Yes | Scanning yes/no | Item – Bar Code
PRODUCTQUANTITY | 1 | Barcode quantity | Item – Bar Code
PURCHASEUNDERDELIVERYPERCENTAGE | 20 | Purchase under-delivery percentage allowed | Released Products
PURCHASEOVERDELIVERYPERCENTAGE | 20 | Purchase over-delivery percentage allowed | Released Products
MINIMUMPROCUREMENTORDERQUANTITY | x | Minimum purchase quantity | Default Order Settings
MAXIMUMPROCUREMENTORDERQUANTITY | x | Maximum purchase quantity | Default Order Settings
STANDARDPROCUREMENTORDERQUANTITY | x | Standard purchase quantity | Default Order Settings
PROCUREMENTQUANTITYMULTIPLES | x | Multiple purchase quantity | Default Order Settings

 

The template Excel spreadsheet should contain exactly the columns listed above:

Then start building the Excel spreadsheet (this is the time-consuming part). This can also be regarded as the "master file" for products, and mass update and mass import of products is done using this file. Remember that you can add more columns and also include calculated fields. In this case, the default dimension (used for the financial dimension) has a formula like ="-D30-320—"&B34, making sure that cell B34 is merged into the financial dimension.

Create the data management import project.

In the data management workspace, create an import project, use "+ Add file", and select the Excel file by using "Upload and add". Then select all the entities and which sheet in the Excel spreadsheet should be imported.

– Select file
– Select entity name
– Select sheet lookup
– Then repeat by selecting entity name and sheet lookup until all data entities needed are selected

After doing this correctly you should have an import project with the following entities:

You should also click on the "View map" symbol if there is a warning, and just delete the lines where no mapping is generated, like what I have done here for the "Products V2" entity.

The mapping will be done automatically for you, and will only select the fields that are relevant for each data entity.

Your data entity is now ready to be used. I recommend using the data management workspace, selecting the import project and then "Run project".

Then for each data entity I upload exactly the same Excel spreadsheet:

And then click on "Import". If there are any errors, fix them in the Excel sheet or make changes in staging.

What we have then accomplished is to have a single Excel spreadsheet that the category manager can maintain and work with, and it can be uploaded (several times) into the import project. For trade agreement sales and purchase prices I normally recommend creating a separate Excel spreadsheet.

Then the Excel-loving category managers will be happy, and they can import thousands of products in a very short time.

D365F&O Retail: Combining important retail statement batch jobs

The Retail statement functionality in D365F&O is the process that puts everything together and makes sure transactions from the POS flow into D365F&O HQ. Microsoft has made some improvements to the statement functionality that you can read about here: https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/statement-posting-eod. I wanted to show how to combine these 3 processes into a single batch job.

The following drawing is an oversimplification of the process, but the process starts with the opening of a shift in the POS (with a start amount declaration), and then sales begin in the POS. Each time the job P-0001 Upload channel transactions is executed, the transactions are fetched from the channel databases and imported into D365F&O. If you are using shift-based statements, a statement will be calculated when the shift is closed. Using shift-based closing can be tricky, but I highly recommend doing it! After the statement is calculated and there are no issues, the statement will be posted, and an invoiced sales order is created. Then you have all your inventory and financial transactions in place.

 

What I often see is that customers are using 3 separate batch jobs for this. This results in a user experience where the retail statement form contains many calculated statements waiting for statement posting. Some customers say they only want to see statements where there are issues (like cash differences after the shift is closed).

By combining the batch jobs into a sequenced batch job, the calculated statements will be posted right away, instead of waiting until the post statement batch job is executed. Here is how to set this up:

1. Manually create a new “blank” batch job

 

2. Click on “View Tasks”.

3. Add the following 4 classes:

RetailCDXScheduleRunner – Upload channel transaction (also called P-job)

RetailTransactionSalesTransMark_Multi – Post inventory

RetailEodStatementCalculateBatchScheduler – Calculate statement

RetailEodStatementPostBatchScheduler – Post statement

Here I choose to include upload of transactions, post inventory, calculate statement and post statement into a single batch-job.

Also remember to ignore task failures.

And remember to click on "Parameters" to set the parameters on each task, like which organization nodes should be included.

On each batch task I also add conditions, so that the previous step needs to be completed before the batch-job starts on the next.

Then I have 1 single batch job, and when executed it spawns the subsequent tasks nicely.
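If you prefer to script this setup instead of clicking it together, here is a minimal X++ sketch of the same idea (the builder class is just an illustration; pass in instances of the four task classes listed above, constructed the way their menu items normally do it):

/// <summary>
/// Illustrative sketch: builds one batch job where each task only starts after the
/// previous one has finished (or failed), mirroring the manual setup described above.
/// </summary>
class MyRetailStatementBatchBuilder
{
    public static void createSequencedBatchJob(
        str          _caption,
        RunBaseBatch _uploadTask,
        RunBaseBatch _postInventoryTask,
        RunBaseBatch _calculateTask,
        RunBaseBatch _postStatementTask)
    {
        BatchHeader batchHeader = BatchHeader::construct();

        batchHeader.parmCaption(_caption);

        batchHeader.addTask(_uploadTask);
        batchHeader.addTask(_postInventoryTask);
        batchHeader.addTask(_calculateTask);
        batchHeader.addTask(_postStatementTask);

        // Each task waits for the previous one; FinishedOrError mirrors "ignore task failures"
        batchHeader.addDependency(_postInventoryTask, _uploadTask,        BatchDependencyStatus::FinishedOrError);
        batchHeader.addDependency(_calculateTask,     _postInventoryTask, BatchDependencyStatus::FinishedOrError);
        batchHeader.addDependency(_postStatementTask, _calculateTask,     BatchDependencyStatus::FinishedOrError);

        batchHeader.parmRecurrenceData(SysRecurrence::defaultRecurrence());
        batchHeader.save();
    }
}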

The benefit of this is that when you open the statements workspace you mostly see statements where there are cash differences, or where there are issues with master data.

Take care and post your retail statements.

A quick look at download Retail distribution jobs (CDX)

Commerce Data Exchange (CDX) is a system that transfers data between the Dynamics 365 F&O headquarters database and the retail channel databases (RSSU/offline databases). The retail channel databases can be the cloud-based "default" channel database, the RSSU database, and the offline databases on the MPOS devices. With the following figure from Microsoft docs in mind, this blog post explains how to understand this in practice.

What data is sent to the channel/offline databases?

In the retail menus you will find 2 menu items: Scheduler jobs and Scheduler subjobs. Here the different data that can be sent is defined.

When setting up Dynamics 365 for the first time, Microsoft has defined a set of ready-to-use scheduler jobs that get automatically created by the "initialize" menu item, as described here.

A scheduler job is a collection of the tables that should be sent, and the subjobs contain the actual mapping between D365 F&O and channel database fields. As seen in the next picture, the fields on the table CustTable in D365 are mapped to AX.CUSTTABLE in the channel database.

To explore what is or can be transferred, look through the Scheduler jobs and Scheduler subjobs.

Can I see what data is actually sent to the channel/offline databases?

Yes you can! In the retail menu you should be able to find Commerce Data Exchange, and a menu item named "Download sessions".

Here you should see all data that is sent to the channel databases, and there is a menu item named "Download file".

This will download a Zip file that contains CSV files corresponding to the Scheduler jobs and Scheduler subjobs.

You can open this file in Excel to see the actual contents. (I have hidden a few columns and formatted the Excel sheet to look better.) So this means you can see the actual data being sent to the RSSU/offline channel database.

All distribution jobs can be set up as batch jobs with different execution recurrence. If you want to make it simple, just execute download distribution job 9999 every 30 minutes. If you have a more complex setup and need better control of when data is sent, then make separate distribution batch jobs so that you can send new data to the channel databases in periods when there is less load in the retail channels.

Too much data is sent to the channel databases/offline database and the MPOS is slow?

Retail is using change tracking, and this makes sure that only new and updated records are sent, which keeps the amount of data to a minimum. There is an important parameter that controls how often a FULL distribution should be executed. By default it is 2 days. If you have lots of products and customers, we see that this generates very large distribution jobs with millions of records to be distributed. By setting this to zero, this will not happen. Very large distributions can cripple your POSes, and your users will complain that the system is slow, or they get strange database errors. In version 8.1.3 it is expected to be changed to default to zero, meaning that full datasets will not be distributed automatically.

Change tracking seems not to be working?

As you may know, Dynamics 365 has also added the possibility to add change tracking on data entities when using BYOD. I have experienced that adjusting this affects the retail requirement for change tracking. If this happens, please use the Initialize retail scheduler function to set this right again.

Missing upload transactions from your channel databases?

In some rare cases it has been experienced that there are missing transactions in D365, compared to what the POS is showing. The trick to resend all transactions is the following:

Run the script "delete crt.TableReplicationLog" in the RSSU DB, and the next P-job will sync all transactions from the RSSU DB (including missing ones).

 

Using Cloud POS as your retail mobile device

Handheld functionality for retailers is a question I get a lot, typically in the areas of counting, replenishment, receiving and daily POS operations. In version 8.1 Microsoft has taken a small step forward to make it easier to use any handheld device that supports a common browser. Because Cloud POS (CPOS) runs in a browser, the application isn't installed on the device. Instead, the browser accesses the application code from the CPOS server. CPOS can't directly access POS hardware or work in an offline state.

What Microsoft has done is make CPOS adapt to the screen size, so it works more effectively on your device. To keep it simple, I just want to show you how it looks on my iPhone.

Step 1: Direct your browser to the URL where CPOS is located. In LCS you will find the URL here:

Step 2: Activate your POS on the mobile device by selecting store and register, and log in.

Step 3: Log into CPOS and start using it. Here are some sample screens from my iPhone, where I count an item using CPOS.

You can also "simulate" this in your PC browser by just reducing the size of your browser window before you log into CPOS. Here I'm showing the inventory lookup in CPOS.

What I would love to see more of is:

– Barcode scanning support using camera

– The ability to create replenishment/purchase orders in CPOS

– More receive capabilities like ASN/Pallet receive etc.

– Improved browser functionality (like back-forward browsing etc)

To me it seems clear that we will see additional improvements in CPOS, making it the preferred mobile platform for Dynamics 365 for Retail. I hope to see more of this, as Microsoft is definitely investing in this area. In our own customer projects we will be developing more and more functionality using RTS (Real Time Service) calls to add more features to be used together with CPOS.

To take this to the next level, please also evaluate creating a hybrid app that incorporates CPOS in an app-friendly way. Sources say that this will also allow us to build extensions like camera barcode scanning.

The direction is right and my prediction for the future is that: Mobile Retail device = CPOS.

Report your bugs, free-riders!

Microsoft Dynamics 365 is the fastest-innovating and most agile business software in the world. A very feature-rich solution with a packed, fast-moving roadmap. We see new possibilities and features coming monthly in platform updates and fall/spring releases. If you look at the entire platform stack, including Windows, Office and the Power platform (Power* apps), new features are being made available on a daily basis. Being first and fast has changed and challenged the Dynamics 365 ecosystem. Mostly for the good.

But we have to recognize that there are people (and highly productive ones) behind this innovation tsunami. In such an environment there are thousands of elements that must fit together. If you look at the number of combinations of how you can use and set up Dynamics 365, I would assume there are millions of combinations in the core product. And when adding Office and Power* apps, the combinations just increase exponentially.

People are people, and there is a limit to the number of combinations that can be tested, both in manual and automated testing scenarios. This leads to a situation where there is no capacity to test everything before the product is released. It is not possible to test all of the millions of combinations, and I know that even Microsoft does not have unlimited people and resources to cover every test scenario.

This evidently results in issues and bugs that will be found when implementing Dynamics 365, and these need to be reported to Microsoft support so that the fixes become part of the future solution.

Searching, testing and reporting a solution takes time and does cost money! Each time I find a bug, I report it to Microsoft so that all of the community can benefit from a fix. But as some have recognized, reporting issues and bugs requires effort and resources. You find the bug, analyze the issue, report the issue, Microsoft provides hotfixes, and the hotfix needs to be validated, tested and then deployed to the environment. This takes time, but it is necessary!

With this blog post I urge both partners and customers to report your findings to Microsoft, so that all the rest of us can benefit from being an ecosystem together. As I hope most of you know, we are quickly moving towards Dynamics 365 version 10, which is often referred to as the "evergreen" solution. This means that there is ONE version that all customers are using, and that follows the Microsoft roadmap. When one customer reports an issue and it is fixed, then all benefit from this.

Then there is the issue with the "free-riders". These are the people that recognize the issue, find workarounds and DON'T take the investment in time and resources to report the issue. They know and see the issue, but choose to live with it or ignore it. Then in many cases, Microsoft is not even aware of the issue, and the issues just continue to be present in future releases. The best way is to report what you see to Microsoft support or to Microsoft ideas. Then Microsoft can take action on it, because they know of it.

So, I urge my fellow community friends to not be a free-rider, but report your issues. This will ensure that we all share the resource and time burden among us, and we also improve and strengthen Dynamics 365, which we all benefit from.

PS! Dynamics 365 is the BEST business application in the world!

Focus18 – EMEA – London

The User Groups for Dynamics 365, AX, CRM, BC/NAV, and Power BI road-trip named Focus is arriving in Europe and is making a stop in London 5-6 September, 2018, featuring deep dive sessions covering advanced topics on D365 Finance and Operations and Customer Engagement. Additionally, there are topics specific to the retail space, including modern POS, inventory management, sales orders, ecommerce, credit card processing and more. This is great stuff!

It is a privilege for me to participate and present together with great MVPs, Microsoft experts and the Dynamics 365 community. If you want to check them out, I will have the following sessions:

Deep dive into retail pricing and discounts. 

This session is about what product sales price and discount options exist in Dynamics 365 for Retail "out of the box", with actual and real examples of how to implement and maintain your retail prices.

 

Learn, Try, Buy for Retailers.

"Learn, Try and Buy for Retailers" is an accelerated onboarding approach that enables you to evaluate whether a cloud-enabled Dynamics 365 for Retail is the right direction, and to learn as much as possible prior to performing a business and solution analysis. This is suitable for agile and iterative approaches, and this session shows why buying a small Dynamics 365 license is an affordable investment to make before the scope of implementation has been defined. Using VSTS (Visual Studio Team Services) is a central topic in this session.

Power BI and Retail.  How to get the numbers.

This session shows how to publish retail transactions into an Azure SQL database or CDS (Common Data Services), and then analyze the retail sales in Power BI.

Check out https://www.focusemea.com/locations/london as there are many other very interesting sessions.

 

See you in London!

Microsoft Business Applications sessions on-demand and Dynamics 365 version 10

The Microsoft Business Applications sessions are now available on-demand https://www.microsoft.com/en-us/businessapplicationssummit/sessionsondemand

I enjoyed the following sessions:

Client usability and productivity improvements in the October release and beyond for Microsoft Dynamics 365 for Finance and Operations

Monitoring Microsoft Dynamics 365 for Finance and Operations with Lifecycle Services

Microsoft Dynamics 365 for Retail: Reliable data management and payment processing

Microsoft Dynamics 365 for Retail: Delivering cloud driven intelligence and tools to enable enterprise manageability

 

I also want to highlight the following session, where Microsoft is explaining Dynamics 365 version 10 (Thanks Shelly)

Microsoft managed continuous updates and support experience for Microsoft Dynamics 365 Finance and Operations

Vote on Dynamics 365 ideas

Do you know that you can influence the direction of Dynamics 365? You may be unsure whether it really will make a difference, but Microsoft has a site where the community can add ideas and vote on them. Go to https://experience.dynamics.com/ideas/ and create your ideas. If an idea is valid and gets enough votes, Microsoft will act on it and include it in their product backlog. Equally important is the ability to vote on others' ideas.

  • Voting is the most important way to make the community's voice heard on the issues that concern the roadmap for Dynamics 365.
  • Voting gives you an opportunity to be part of the prioritization that affects Dynamics 365.
  • If YOU don't vote, others will make the decisions for YOU!

As we speak, there are 1673 ideas for Microsoft Dynamics 365 for Finance and Operations and 212 ideas for Microsoft Dynamics 365 for Retail. Microsoft employees are among the most active contributors of ideas on the site.

The ideas portal also allows you to follow ideas as they move from being an idea to being part of the product:

An important unofficial note is that for an idea to be moved from "New" to "Under Review", it needs at least 10 votes. Discussions on the ideas are also possible, so you can add additional substance to the requirements.

You can also keep track of your own ideas and votes you have submitted.

If I have a few minutes of spare time, I like to go in and look at the new ideas submitted and read them. When there are ideas I like, I vote on them.

The more we use this channel to give ideas and feedback, the more important it will be. So please go in and vote at https://experience.dynamics.com/ideas/

(And if you find some of mine, please give it a vote )

MSDYN365FO: Automate repetitive tasks – the easy way

The other day I got the task of posting a few thousand Retail kit orders / BOM journals, because they had failed the first time. I started, and managed to manually post 50 journals before my fingers cramped and I started to feel dizzy. I could not multi-select the journals and post them, so I had to click "Post" manually on each journal.

I did of course send an SR to Microsoft explaining that this should be easier in standard, and that SR is in process. But it will probably end up in an "As designed" state, or with a "post it to ideas.microsoft.com" answer.

But there is an easier, low-tech way of solving this. Just install a mouse "ghost" (macro recorder) app, and it will repeat the task for you. I used the app "Mouse Recorder Premium" to post all the 1,300 journals, and it went smoothly. Just record the clicks and then repeat them a thousand times.

To make sure I did not "lock" my PC while this was running, I started the task in a Hyper-V VM, so it could run in the background.

That’s today’s small trick to get rid of repetitive tasks

D365FO – Some nice Excel tricks

When importing master data into Dynamics 365 you will find that the data is spread across different data entities. In a typical retail project you need to import data like released products, item barcodes, external item numbers and prices. It is also common that we receive the master data in many files and in different formats. It is therefore quite beneficial to know a few tricks that make it easier to work with large amounts of data. Here are my tips.

Export all/selected rows (You should know this!)

From any grid in D365FO you can export selected/all rows to Excel by right-clicking on the grid. The tip is therefore to make a personalization of the grid, so that it contains the fields you want to export to Excel.

Then Excel opens with the selected columns. (PS! This export is limited to 10,000 rows)

Use Excel to create a filter

Let's say we have an Excel spreadsheet with item numbers, and want to filter on these items in D365FO. Here is a very valuable tip.

  1. Copy the item number column from Excel and paste it as a row into a new Excel sheet (Transpose).

  2. Copy that row, and paste it into Notepad.

  3. Do a search and replace in Notepad, where you copy the space/tab separator and replace it with a comma (,).

  4. Copy the resulting content and use it in a "match" filter in D365FO (see the example after this list).

  5. You have now created a filter on the selected field. It seems the "match" filter is capable of handling quite a lot of text.
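
For example (with made-up item numbers): if the Notepad content is 10001 10002 10003, the search and replace turns it into 10001,10002,10003, which is the string you paste into the "match" filter field.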

This is nice when someone asks you to "Please fix these 200 items". You can then filter them and quite quickly go through and fix them.

Learn Excel VLOOKUP

VLOOKUP is essential to learn, because it lets you check and look up data across multiple Excel sheets. A typical scenario in the retail world is when a vendor sends a new price list that you want to import. Often this is delivered as an Excel sheet with the vendor item number, the item barcode and the price. Most retailers prefer to have their own item numbers, so you have the issue of mapping the item barcode from the vendor price list to your own product number. Here is how I recommend my customers do it:

  1. Export all D365FO item barcodes to Excel (there is an entity for this, or open the barcodes from the Retail menu).
  2. In the vendor Excel price list, create a VLOOKUP formula to look up the D365FO product number based on the item barcode (see the example formula after this list).

  3. Then you can create an Excel sheet where you have your own product numbers, and you can import the prices using "Open in Excel" or through a data management import job.
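
As an example, assuming the exported D365FO barcodes are in a sheet named Barcodes with the barcode in column A and the product number in column B, and the vendor barcode is in cell B2 of the price list (these sheet and cell references are made up, so adapt them to your own files), the VLOOKUP formula could look like:

=VLOOKUP(B2;Barcodes!A:B;2;FALSE)

Use commas instead of semicolons as argument separators if your regional settings require it; the last argument FALSE forces an exact match on the barcode.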

     

     

Happy weekend friends !

Measure sales per Retail Category in Power BI

Drilling down on sales per category, employee and department is essential for retailers. Doing this gives a more specific view of what is generating sales and what isn't. Having insight into top categories or departments can help you make decisions about purchasing and marketing. A good point of sale comes with reporting and analytics, so you can quickly get the data you need, whenever you need it, without manual calculations.

Power BI is a must-have for all retailers, and this blog post is about creating a retail category hierarchy in Power BI.

If you have worked with retail categories, you know that there is a "parent-child" relationship between the categories, as illustrated by the following data from the Contoso demo data set.

In Power BI it is also possible to create such hierarchies, but it requires some minor changes to reflect this. My inspiration came from Power BI Tutorial: Flatten Parent Child Hierarchy. I will not go through how I build a retail Power BI analysis, but I can share that I use OData entities, and here are the entities I'm using:

More information on the data model is available in Microsoft Docs here.

The "trick" is to create a new column named "Path", and a column named CategoryL[X] for each level in the hierarchy, which for the RetailProductHierarchyCategories entity looks like this:

Here are the column formulas

Path = PATH(RetailProductHierarchyCategories[CategoryName];RetailProductHierarchyCategories[ParentCategoryName])

CategoryL2 = PATHITEM(RetailProductHierarchyCategories[Path];2)

CategoryL3 = PATHITEM(RetailProductHierarchyCategories[Path];3)

CategoryL4 = PATHITEM(RetailProductHierarchyCategories[Path];4)

CategoryL5 = PATHITEM(RetailProductHierarchyCategories[Path];5)

…etc
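
To illustrate with a made-up category branch: if "TV & Video" has the parent "Electronics", which in turn has the parent "All", the Path column for that row becomes the pipe-delimited text All|Electronics|TV & Video, and PATHITEM then returns "All" for level 1, "Electronics" for level 2 and "TV & Video" for level 3.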

Then I create a new hierarchy based on these CategoryL[X] columns, where I specify each level.

And I use the Hierarchy Slicer that is available in the Power BI marketplace.

In Power BI I then get a retail category slicer, and can filter and measure sales per category.

Microsoft is in the process of aligning with the future of Power BI and creating a new version of Retail Channel Performance with the new Common Data Service for Analytics capability coming to Power BI: https://powerbi.microsoft.com/en-us/cds-analytics/

Keep on rocking #MSDYN365FO!

First Aid Kit for Dynamics 365 for Retail; A messy blog post

First, I want to say that Microsoft Dynamics 365 for Retail is the best retail system in the world. What we can do is just amazing! This blog post is going to be a mess without meaningful structure, because the purpose of this post is to quickly give 911 help to retailers, so that they can continue their daily operations. This blog post primarily focuses on MPOS (Modern POS) with an offline database and a local RSSU (Retail Store Scale Unit). Also, this blog post will be changed incrementally and new topics will be added, so please feel welcome to revisit later.

MPOS Hardware

Microsoft does not give recommendations on hardware, but they have tested some. I can also share what is working for a scenario where an offline database should be installed on the MPOS.

HP RP9 G1 AiO Retail System, Model 9018
Microsoft Windows 10 enterprise 64-bit OS – LTSB
HP RP9 Integrated Bar Code Scanner (as a secondary mounted scanner)
128GB M.2 SATA 3D SSD
16 GB RAM
Intel Core i5-6500TE 3.3 6M 2133 4C CPU
HP RP9 Integrated Dual-Head MSR -Right (For log-on card reading)
HP L7014 14-inch Retail Monitor-Europe (for dual display)
HP LAN THERMAL RECEIPT PRINTER-EUROPE – ENGLISH LOCALIZATION (TC_POS_TERMALPRINT_BTO)

A small tip: OPOS devices are slow and unpredictable. Try to avoid them. With this hardware, however, we still had to use OPOS for the receipt printer and the cash drawer.

All drivers related to this machine are available here.

Payment terminals

Building payment connectors is time-consuming, but Microsoft has provided documentation and samples that are available here. Personally, I prefer ISV solutions for this.
◾ Ingenico iPP 350 payment terminal (requires an ISV payment solution)

Additional Scanners

◾ SYMBOL DS9808

◾ Datalogic Magellan 3200VSi

Remember to open the scanner documentation and scan the programming barcodes to enable Carriage Return/Line Feed, adjust beeping, etc.

Generic preparation recommendations when having issues

The following chapter contains some preparation steps that you should be ready to perform.

Install TeamViewer on the MPOS device

To make sure that a professional can quickly analyze the device, we always try to use or install TeamViewer on the RSSU and MPOS devices. This makes it possible to access the machines remotely. Please follow security precautions when using TeamViewer.

Start collecting information

Dynamics 365 for Retail contains a comprehensive set of events that are logged in the system and available to IT resources. Please check out the following page for additional troubleshooting steps.

https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-component-events-diagnostics-troubleshooting

The following section contains issues experienced with manually installing Dynamics 365 MPOS.

If you cannot figure it out quickly, create a Microsoft support request as fast as you can. Normally Microsoft responds fast and can give recommendations quite quickly, but they will often need information on the actual machine to see if there are issues related to software or hardware. MPOS and RSSU log a tremendous amount of information that is relevant for a support case. Take pictures and screenshots, and collect data.

Event logs

Always look into the event logs on the MPOS and the RSSU. Also learn to export the event logs as they can give valuable information on what is wrong. The following event logs are of interest.

•    Windows > Application
•    Windows > Security
•    Windows > System
•    Application and Services Logs > MPOS/Operational

Machine information

Collect Microsoft System Information, such as the devices that are installed in the MPOS and the device drivers that are loaded. To collect this data:

  • Run a Command Prompt as an Administrator
  • Execute MSINFO32.exe
  • Go to Menu File > Save as machine.nfo

Backups of the local database

Take backups of the RSSU and local databases, as these can be handy for analyzing the data composition of the database. Sometimes Microsoft will ask for the exact database version and information like:

  • What version of SQL is this?

    Further, is this Standard, Enterprise, Express, etc.?
    => Run query select @@version and share the resulting string.

  • How large is the SQL DB at this time?
  • Plenty of space available on the hard drive still?
  • What is the current size of the offline database and RetailChannelDatabase log file?

RSSU installation and Checklist

The setup and installation of RSSU is documented in the Microsoft DOCS https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-store-scale-unit-configuration-installation

  • The operating system is Windows 10 Enterprise LTSB with a separate disk for SQL. SSD disks are highly recommended!
  • SQL Server 2016 Standard edition with full-text search installed locally on the server.
    – I would not recommend SQL Express on an RSSU with multiple MPOSes installed.
  • Install .NET 3.5, 4.6 and IIS, and run Windows Update before setup.
  • Make sure that the SSL certificates (RSSU and MPOS) have been installed and set up on the machine. Remember to add them to your Azure account.
  • Verify that you have Azure AD credentials that you can use to sign in to Retail headquarters.
  • Verify that you have administrative or root access to install Retail Modern POS on a device.
  • Verify that you can access the Retail Server from the device (like a ping with https://XXX.YY.ZZ/RetailServer/healthcheck?testname=ping).

  • Verify that the Microsoft Dynamics 365 for Retail, Enterprise edition, environment contains the Retail permission groups and jobs in the Human resources module. These permission groups and jobs should have been installed as part of the demo data.

A small but important note about the RSSU: it is designed to always have some kind of cloud connection. If it loses this connection, strange issues start to occur, especially in relation to RTS calls (Real-time Service calls).

Set Async interval on RSSU

This has been described in a previous blogpost.

Installation of MPOS issues

There are a number of prerequisites that need to be followed, and they are available on Microsoft Docs. Read them very carefully and follow them to the letter. Do not assume anything unless it is stated in the documentation. Also read https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-device-activation. Here are my additional tips:

When having customizations or extensions

If you have made extensions, remember to make sure that the developer who created the deployable package has built the package with "configuration = Release". There are scenarios where the MPOS installation otherwise can give issues like this.

There are also scenarios where you want to make an MPOS build with configuration = Debug for internal use; please take a look at the following Microsoft blog post.

Having the right local SQL express with the right user access on the MPOS

If you are making a retail POS image (with Windows and SQL preinstalled), please make sure to select the right SQL version (currently SQL 2014 SP2). If SQL Express is not already installed, the MPOS installer will automatically download and install it. But the file is 1.6 GB, so it is recommended to install SQL Express manually, or to have it as part of the standard image. SQL Express is available here; select the SQLEXPRADV_x64_ENU.exe.

There are ways of using SQL Express 2017 with MPOS, but I recommend waiting with this until Microsoft officially includes it in their installer. Also remember that SQL Express has some limitations: it can only use 1 GB of RAM, and it has a 10 GB database size limit.

I recommend creating two users on an MPOS machine:

– A PosUser@XXX.YYY, which is a user with very limited rights on the machine; customers often want auto-login to the machine with this user. This user needs administrator elevation when it has to do administrative tasks on the machine.

– A PosInstaller@XXX.YYY, which has administrator rights on the local MPOS machine.

When installing SQL Express, remember to add both the PosUser and the PosInstaller as users in SQL; otherwise the installer struggles to create the offline databases.

Cannot download MPOS package from Dynamics 365

If you try to manually download the installation package, you may find that the download is sometimes blocked.

The reason for this could be a certificate problem with the package. The workaround is to use Chrome when downloading.

Cannot install the MPOS Offline package

When installing the MPOS, the following error may appear. In many cases the user must be elevated to administrator. If you receive the following error, it means that the version you are installing is older than the existing version, and the current version must be uninstalled first. Do not try to install a higher version than what is deployed in your cloud RSSU default database, as this is not supported. Also, if you need to "downgrade" an MPOS, uninstall the MPOS first and then reinstall the older release.

PowerShell scripts for manual uninstalling of MPOS

In 95% of situations, just uninstalling the MPOS app should work. But if you are out of options, Microsoft has created an uninstall PowerShell script:

cd "C:\Program Files (x86)\Microsoft Dynamics 365\70\Retail Modern POS\Tools"

.\Uninstall-RetailModernPOS.ps1

I often experience that we need to run the uninstall in the following sequence:

1. Run it as a local administrator

2. Then an "uninstall" icon appears on the desktop, which we need to click on

3. Run it again as a local administrator

Then the MPOS is gone, and you can reinstall the correct MPOS.

Connectivity issues

Here are some tips on connectivity issues, and how to solve them.

MPOS is slow to log in

When starting the MPOS, it can sometimes take a few seconds before it is available. We typically see this if you have a slow internet connection with high latency. The MPOS does some work against the cloud, and this just takes time.

MPOS cannot go online after being offline

I think this behavior is currently a bug that can happen in certain situations, for example if the RSSU loses internet connectivity. Microsoft is investigating the causes. If it is not possible to go online after the MPOS has been offline, it is possible to reactivate the MPOS to get it online. In the event log you may see issues like this: "UpsertAndValidateShifts".

Rename the file: C:\Users\[POS-User]\AppData\Local\Packages\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt\AC\Microsoft\Internet Explorer\DOMStore\DSSWV5L9\microsoft.dynamics.retail[1].xml

Then reactivate the MPOS with the RSSU address, register and device, and log in with the D365.posinstaller user.

IMPORTANT: Remember to select the hardware station when logging into the MPOS afterwards!

This is not a supported “fix” from Microsoft, and it is expected that Microsoft will find a permanent solution to this issue.

MPOS cannot connect with the payment connector

The following is mainly related to issues that can happen when you have a third-party payment connector using a PIN pad. In most cases this is not relevant for those using the standard or other payment connectors.

1. First check that Hardware station is selected on the MPOS.

2. The next step is to reboot the PC

3. If it is still not working, copy the file MerchantInformation.xml to the folder "C:\ProgramData\Microsoft Dynamics AX\Retail Hardware Station" AND to C:\Users\[POS-User]\AppData\Local\Microsoft Dynamics AX\Retail Hardware Station. This will ensure that the payment also works as expected in offline mode. MerchantInformation.xml is a file that is downloaded from the cloud the first time the POS is started, and it must be refreshed if you change the hardware profile.

4. If it is still not working, open the hardware profile, set the EFT service to "Payment connector" and test the connector. This will download the MerchantInformation.xml again.

Then run the 1090 distribution job. After X minutes, try to restart the MPOS and try to perform a payment. This should also automatically regenerate the MerchantInformation.xml. Microsoft is working on a fix for this, and you can follow the issue here.

PS! Normally a production environment should not need a connection to the Microsoft test connector.

Retail offline database exceeds 10 Gb limit

To ensure that a POS doesn't exceed the SQL Express 10 GB database size restriction, I have created a SQL script that reduces the size of the log file. Please evaluate implementing it on all POSes.

Getting strange errors like “The channel does not exist or was not published”

In some rare situations you could experience errors like this.

Our experience is that this can happen if the database on the RSSU is overloaded and not able to respond to MPOS connections. Log into the RSSU and check whether the CPU, database or disks are unable to keep up. We have experienced this when running SQL Express on the RSSU. Also try not to push too many distribution jobs too frequently. In one situation we uploaded 400,000 customers while running the distribution job 1010 (customers) every 5 minutes. That "killed" the RSSU when running SQL Express.

Getting strange errors like “A database error occurred”

We have also experienced this when the RSSU is overloaded. Remember that the RSSU hardware needs to be scaled according to how many MPOSes are connected and to the data and transaction volume. Get an SQL expert to evaluate the setup of the RSSU prior to go-live, and remember to volume test the setup.

How to fix it? Scale up your RSSU.

Getting strange errors like "We were unable to obtain the card payment accept page URL"

We have also experienced the following issue. The solution was simple: remember to enable the local hardware station on the MPOS.

Getting strange errors like “StaffId”, when returning a transaction

In a situation where there is a connection between the MPOS and the RSSU, but the RSSU does not have a connection to the cloud, AND you perform a "Return transaction", you may get the following error.

"Return transaction" is defined as an operation that requires online RTS (Real-Time Service) calls. The following list defines all POS operations and whether they are available in offline mode.
The solution in this situation is therefore to use the POS operation "Return product" on the MPOS instead.

Keep an eye on your devices.

In the menu item Channel client connection status you can see the last time each device was connected.

Functional issues

By functional issues I mean issues that are related to user errors and other functional problems that can occur.

Dynamics 365 for Retail on version 8

Even though version 8 has been launched for Dynamics 365 for Finance and Operations, I have not yet (10 May 2018) seen that Retail is supported on version 8. So before moving forward on version 8, please check with Microsoft support.

Barcode scanned as tendered currency amount

This is a funny issue that can occur. Some background story is needed here. A customer wanted to pay for a product in another currency, and the cashier selected "Pay currency" on the MPOS, ready to key in the amount the customer was paying. But unfortunately, the cashier scanned the product barcode instead, and the MPOS committed the sale as if the customer had paid 7,622,100,917.80 in the foreign currency and should have 5,707,750,079,417 in return (local currency). Lesson learned: always remember to set the "Overtender maximum amount" parameter and the amount fields.

How to fix it? You actually need to create a Microsoft support request to have Microsoft make some changes in the database. This takes time, and it first has to be performed in an updated staging environment. It can take a lot of time! So make sure you set these parameters correctly before you go live.

Cannot post retail statement, because of a rounding issue.

This is a known issue, and Microsoft has a hotfix for it. Always make sure you periodically update your system with the latest hotfixes. Here is my small tip: try clicking Post 4-5 times, and then it suddenly goes through and gets posted. We do not know why.

Retail statement (Legacy) and Retail Statement

In version 7.3.2, Microsoft released a new set of functionality for calculating and posting retail statements. You can read more about it here. Microsoft recommends that you use the Retail statements configuration key for the improved statement posting feature, unless you have compelling reasons to use the Retail statements (legacy) configuration key instead. Microsoft will continue to invest in the new and improved statement posting feature, and it's important that you switch to it at the earliest opportunity to benefit from it. The legacy statement posting feature will be deprecated in a future release.

Access hidden Retail menu items.

The form "Retail store transactions" contains all retail transactions that are received from the MPOS/RSSUs, and here you will find sales, logins, payments, etc. The first step for any user should be to personalize this form and only show the relevant fields and columns (not done here).

You can dig deeper into the transactions by clicking the "Transactions" menu.

If I then open "Payment transactions", I get a filtered view of the payment transactions related to that receipt.

BUT! In many cases you would like to look at ALL the payment transactions, not only those related to a specific receipt. And there is no menu item that lets you see all payment transactions in one form.

Here is my tip. Right-click on the form, and then you can see the form name. Click on that …

And you should be able to see the menu item name.

Then copy your D365FO URL, replace the mi (menu item) parameter with that name, and open it in another browser tab.
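
For example, with a fictitious environment URL, the payment transactions form can be opened directly with https://yourenvironment.cloudax.dynamics.com/?cmp=USRT&mi=RetailTransactionPaymentTrans (here USRT is the demo data retail company; use your own company id).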

Then you get a nice list of all payment transactions, regardless of which receipt they are connected to.

This procedure can be used in most places in Dynamics 365. For retail this is excellent, because sometimes you need to find specific transactions. If you need to reconcile banked transactions (where you have a bag number), you can use this approach to see all banked bag numbers in a single form. Here is a list of the most common menu items:

Sales transactions(items) &mi=RetailTransactionSalesTrans
Payment transactions &mi=RetailTransactionPaymentTrans
Discount transactions &mi=RetailTransactionDiscountTrans
Income/Expense transactions &mi=RetailTransactionIncomeExpenseTrans
Info code transactions &mi=RetailTransactionInfocodeTrans
Banked declaration transactions &mi=RetailTransactionBankedTenderTrans
Safe tender transactions &mi=RetailTransactionSafeTenderTrans
Loyalty card transactions &mi=RetailTransactionLoyaltyRewardPointTrans
Order/Invoice transactions &mi=RetailTransactionOrderInvoiceTrans

Unit conversion between <unit 1> and <unit 2> does not exist.

If you use retail kitting and have kits with intra-class unit conversions, there is an issue that Microsoft is working on. These are scenarios where the included kit line is stocked in pieces and consumed in centiliters. Luckily Microsoft is working on this, and we expect a fix.

Wrong date format on the POS receipt.

In EN-US the date format is MM/DD/YYYY. In Europe we use DD/MM/YYYY. The date format on the receipt is controlled by the language code defined on the store. We often prefer to have EN-US as the language on stores, but this gives the wrong date format. To get the right date format on the receipt, you therefore have to maintain product names/descriptions in multiple languages (like both EN-US and EN-GB) and specify that the language on the POS store should be EN-GB. We are working on finding a better and more permanent solution to this.

Dual display.

Microsoft writes: "When a secondary display is configured, the number 2 Windows display is used to show basic information. The purpose of the secondary display is to support independent software vendor (ISV) extension, because out of the box, the secondary display isn't configurable and shows limited content." In short: you have to create/develop it yourself in the project. This requires a skilled retail developer who masters the Retail SDK, C# and JavaScript.

Credit Card payment with signature

In certain situations it can happen that the payment terminal is capable of processing the payment, but for some reason this does not close the "waiting for customer payment" dialog. In most cases this is related to the payment terminal being able to perform offline transactions; the terminal will then print a receipt that the customer must sign. For such cases we have created a separate payment method called "pay with signature" that is posted in exactly the same way as a credit card payment method. The cashier is then able to continue the payment processing, register that the payment was OK, and print out the receipt.

If something very wrong was done by the cashier, suspend the transaction

If, for some reason, the cashier is not able to continue the transaction, the cashier has the option of suspending it and then continuing. Later, the POS experts can resume the transaction and find out what went wrong.

Setting up MPOS in tablet mode

The MPOS works very nicely in tablet mode. But if you have a dual display, the PC cannot be put into tablet mode. We have not found a way to fix this, so if you know one, please share.

MPOS resolution and screen layout does not fit the screen

Do not just set the MPOS resolution to the screen resolution. If there is a “title bar”, you need to subtract that title bar height from the screen layout. This is important in scenarios where you have dual displays.

Use lock screen and not log off on the registers.

The log-out/log-in process is more "costly" from a resource perspective than the lock operation.

Keep the MPOS running (but logged out) when not using the device.

Dynamics 365 periodically sends new data to the MPOS offline database, and this happens throughout the day and night. The MPOS is then "fit for fight" when the user logs in.

Run Distribution jobs in batch

My guideline on retail distribution jobs is that all retail batch jobs start with an R prefix, followed by a number: download distribution jobs are R1000-R1999, upload distribution jobs are R2000-R2999, processing batch jobs are R3000-R3999, and retail supply chain processes are named R4000-R4999.
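
For example, with this naming scheme the standard 1010 (customers) download distribution job would be scheduled as a batch job named R1010.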

There are a number of jobs distributing data from Dynamics 365 to the store databases (RSSU) and the offline databases. The jobs and the recurrences I suggest are:

Those are my tips for today. If you have read this all the way to the end, I'm VERY impressed; let me know in the comments.

Failed ERP implementations will change partners into trusted advisors.

A Norwegian customer won a compensation case against an ERP implementation partner after the customer terminated the parties' agreement on the delivery of a new ERP system. The Norwegian district court awarded the customer compensation assessed at 288 MNOK (36.7 MUSD). Originally the contract was worth 120 MNOK. You can read the complete story here: http://www.selmer.no/en/nyhet/felleskjopet-agri-wins-district-court-case. The court decision is expected to be appealed.

Luckily this was NOT a Dynamics 365 implementation, and the customer is actually replacing the failed ERP system with Dynamics 365. The reason I wanted to write about this story is that it has implications for how much risk and responsibility an ERP implementation partner can take. A major part of the ERP partners are smaller companies with fewer than 100 employees that cannot take the risk of getting into such a situation. There are always problems and risks beyond what an ERP partner can control. Partners are not the developers of the standard software; they implement it, and in some cases add extensions. Also, the cloud-based software runs on Azure, which is beyond the control of the partner.

How can this change partners' behavior? Partners are changing towards becoming verticalized trusted advisors, but with limited responsibilities. We can give recommendations based on what we know about the software and how to use it efficiently, but the work is billed more on a T&M (time and material) basis. It will increasingly be the customers themselves who are responsible for the implementation and the timetables.

Some customers will not accept this change, but others do. There are currently resource constraints in the Dynamics 365 partner channel, and we see partners avoiding customers that take a back-seat approach to their implementation projects. The sales focus will shift towards customers that take more of the responsibility themselves, and that understand how to take a more dynamic and agile approach. A 400-page requirement document is not a good start for an ERP project, as the digitalization possibilities are accelerating. We also see that customers don't run a 2-year ERP implementation project before going live. They run a 90-day project to go live with only parts of their requirements. The project then takes on other areas, and they extend their use of Dynamics 365.

At the end, I include some trusted advisor recommendations that I think can inspire anyone who is about to start a project.

D365FO – Speed up Retail RSSU download performance

If you don't know what an RSSU is, I suggest reading this, but in short, the RSSU gives you a database locally in your store that MPOS or CPOS can connect to. It is typically used if you have an unreliable or slow internet connection.

One of the things you can evaluate is implementing Azure ExpressRoute; Microsoft has released a whitepaper on this for Dynamics 365. This can really speed up connectivity performance.

Another thing I find annoying is that the local RSSU only picks up the distribution files every 15 minutes. The cloud channel database is really fast, but this means that when sending new products or prices to the RSSU, it can take up to 15 minutes before this data is available in the MPOS. It is really annoying to wait 15 minutes when testing.

In the Microsoft documentation we are instructed to use the Data Sync interval to speed up the synchronization. But somehow it does not work.

But there is a way around this. On the local RSSU there is a configuration file where you can modify how often the RSSU should request new data to be downloaded.

Then change the following two lines:

Then just restart the Async Client service and reset IIS on the RSSU box. The distribution of data to the RSSU then really speeds up.

But what is the recommended setting from Microsoft?

The recommendation is to make the RSSU request packages at an interval that is a proper fraction of the interval at which the packages are generated. So if you are sending new products every 10 minutes, use a 5-minute download interval. If you are sending new products every 5 minutes, use a 2-minute download interval. The higher the frequency, the more often the RSSU will request data, and some consider this a waste of bandwidth.

Good luck in your retail implementation

D365FOE-Moving to a new tenant

Companies change, merge, sell and purchase each other, and we encounter situations where it is a requirement to move to a new/different Azure AD tenant.

But… that's not a small thing. We asked through a Microsoft support ticket how to do this, hoping it was a small formality and that Microsoft had some magic trick for doing it. But they don't. I can, however, explain the process we are following to achieve this.

  1. Create an Azure subscription on the new tenant.
  2. Buy the required licenses in a new CSP subscription for the D365FO DEV/TEST/PROD instances.
  3. Add an admin user on the new tenant to the new LCS.
  4. Set up a new Azure connector in the existing LCS project with the new subscription.
  5. Deploy new DEV/TEST/PROD environments for the new connector in the new tenant.
  6. Set up a new VSTS in the new tenant.
  7. Copy all checked-in code from the old to the new VSTS.
  8. Import all checked-in code from the new VSTS to the new DEV environment.
  9. Compile and install the code packages into the new stage environment.
  10. Request a DB copy from the "old" PROD to the "old" stage environment.
  11. Export an Azure bacpac from the "old" stage environment.
  12. Import the Azure bacpac into the "new" DEV environment.
  13. Run the AdminUserProvisioning tool with an admin user from the new tenant to swap the tenant.
  14. Repopulate email settings, users and other settings lost in the copy.
  15. Check, Check, Check… Fix, Fix, Fix.
  16. Request DSE to copy the new stage to the new PROD (only possible once).
  17. Check, Check, Check… Fix, Fix, Fix.
  18. Suspend/end the "old" CSP subscription.

In the process you will lose all documents that are attached to records/stored in the old environment. There are also some other expected issues.

Do expect to spend some time on such a process. It is also a good idea to perform the DB copy twice (the first time just for validation and testing). Microsoft is looking into how to improve this process, but this is how we are performing it.

If anyone in the community has better ideas, feel free to share them.

BIG credits to my colleague HAKAM.

Great stuff on the D365 roadmap

What we currently see is that more and more power user functionality is introduced step by step to make Dynamics 365 ready for the next natural technological step: to become a true SaaS solution built on Azure Service Fabric. Check out this video from Microsoft for what I hope is the future architecture direction for Dynamics 365. But before we get there, there has to be a natural transition towards making Dynamics 365 more configurable and less dependent on creating your own customizations and extensions.

Now and then I try to keep an eye on the D365 roadmap for signs of this transition, and today I found these nice features that I think will be highly valuable. I have copied the descriptions from the roadmap, and the release dates are not clear, but I look forward to presenting these great enhancements to my customers.

1. Power users can add custom fields to forms without developer customization

Many application customizations involve adding one or more fields to existing tables and including them in application forms. Most of your customizations may be comprised of adding fields.

Customizations are expensive because they require developer intervention for development, test, and code life cycle management. Customizations also need to be managed and migrated from one environment to another.

We are making it easier to add custom fields to forms in Dynamics 365 for Finance and Operations, Enterprise edition. No longer will developer customization be needed. Instead, a power user will be able to add a custom field to a table and then place that field on the form using personalization. An IT administrator will then be able to share the personalization with others in your organization.

2. Product lifecycle state

The product lifecycle state will be introduced for released products and product variants. You can define any number of product lifecycle states by assigning a state name and description. You can select one lifecycle state as the default state for new released products. Released product variants inherit the product lifecycle state from their released product masters. When changing the lifecycle state on a released product master, you can choose to update all existing variants that have the same original state.

To control and understand the situation of a specific product or product variant in its lifecycle, it is a best practice in Product lifecycle management solutions (PLM) to associate a lifecycle state with a variable state model to products. This capability will be added to the released product model. The main purpose of this extension is to provide a scalable solution that can exclude obsolete products and product variants, including configurations, from master planning and BOM-level calculation.

Impact on master planning – The product lifecycle state has only one control flag: Is active for planning. By default, this is set to Yes for all product lifecycle states. When the field is set to No, the associated released products or product variants are:

  • Excluded from Master planning
  • Excluded from BOM level calculation

For performance reasons, it is highly recommended to associate all obsolete released products or product variants to a product lifecycle state that is deactivated for master planning, especially when you work with non-reusable product configuration variants.

Find obsolete released products and products variants – You can run an analysis to find and update obsolete released products or product variants.

If you run the analysis in a simulation mode, the released products and product variants that are identified as obsolete will be displayed on a specific page for you to view. The analysis searches for transactions and specific master data to find the released products or product variants that have no demand within a specific period. New released products that are created within the specific period can be excluded from the analysis.

When the analysis simulation returns the expected result, you can run the analysis by assigning a new product lifecycle state to all the products that are identified as obsolete.

Default value during migration, import, and export

When migrating from previous releases, the lifecycle state for all released products and product variants will be blank.

When importing released products through a data entity, the default lifecycle state will be applied.

When importing released product variants through a data entity, the product lifecycle state of the released product master will be applied.

Note, the ability to set individual product lifecycle states using the data entities for released products or product variants is not supported.

3. Users can pin PowerApps to forms and share with peers to augment functionality

Have you built a PowerApp that uses or shows data from Dynamics 365 for Finance and Operations, Enterprise edition? Or have you been using a PowerApp built by someone in your organization? Would you like to use PowerApps to build last-mile applications that augment the functionality of Finance and Operations?

Your users can build PowerApps without having to be expert developers to extend ERP functionality. PowerApps developed by yourself, your organization, or the broader ecosystem can now be used to augment ERP functionality by including them within the Finance and Operations client.

Your users will be able to pin PowerApps to pages in Finance and Operations. After they’ve been added, these changes can be shared with peers in your organization as personalizations.

 

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer

Dynamics 365 : Adding check-digits to number-sequences

In Dynamics 365 we use number sequences to automatically create identifiers like product numbers, customer numbers, etc. I'm a fan of having these numbers as "clean" as possible, and I always try to convince my customers to use pure numbers. Why? Take a look at the keyboard:

The num-pad is the fastest way of typing in data. I also see that users normally perform a lookup and see the description of what they are selecting anyway.

But let's take a scenario: we use a number sequence to create product numbers. We will then typically get product numbers like this:

I have often seen that another problem then arises: typing errors from the num-pad actually get a "hit", because when using a number sequence we can almost always find a product with the same number as the one the user mistyped.

If you try using your credit card online, you will see that the number is not accepted if any digit is wrong. The solution there is to build check digits into the number.

I created a very small extension to solve this in Dynamics 365, with just a few lines of code. In the following example the "green" part comes from the number sequence, and the "yellow" part comes from the modulo 10 check digit calculation.

In this way the user can never enter a wrong product number (or any other identifier); it is only accepted if it is 100% correct.

In the screen for number sequences I added an option to add the check digit to my generated numbers.

I wanted to share this with you, because it is so simple:

1. Create an extension of the table "NumberSequenceTable". Then add the extended data type (YesNo) as a field, and name it "AddCheckDigit".

2. Add this field to the "Setup" field group.

Then we have the parameter in place, and it is available on the number sequence as shown earlier.

3. Then create a new class and replace all its code with the following:

Here I'm creating an extension of the NumberSeq class with one method, num, that will add the modulo 10 check digit to my number sequence.

In it I check whether my new "AddCheckDigit" flag is enabled, I skip continuous and manual number sequences, and I require that the number sequence is allowed to be changed to a higher number.
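
For reference, here is a minimal, untested sketch of the idea. The class name is mine, the AddCheckDigit/continuous/manual checks are only indicated as a comment, and a purely numeric number sequence format is assumed, so adapt it to your own extension:

[ExtensionOf(classStr(NumberSeq))]
final class KH_NumberSeqCheckDigit_Extension
{
    public NumberSequenceNum num()
    {
        NumberSequenceNum generatedNum = next num();

        // The real extension only does this when the new AddCheckDigit field on
        // NumberSequenceTable is enabled, and it skips continuous and manual sequences.
        generatedNum = KH_NumberSeqCheckDigit_Extension::addModulo10CheckDigit(generatedNum);

        return generatedNum;
    }

    // Standard modulo 10 (Luhn) calculation on a purely numeric identifier:
    // walk the digits from right to left, doubling every second digit.
    private static str addModulo10CheckDigit(str _num)
    {
        int     sum      = 0;
        boolean doubleIt = true;
        int     i;

        for (i = strLen(_num); i >= 1; i--)
        {
            int digit = str2Int(subStr(_num, i, 1));

            if (doubleIt)
            {
                digit = digit * 2;
                if (digit > 9)
                {
                    digit -= 9;
                }
            }

            sum     += digit;
            doubleIt = !doubleIt;
        }

        return _num + int2Str((10 - (sum mod 10)) mod 10);
    }
}

As a worked example of the calculation: if the number sequence generates 100057, the doubled-digit sum is 11, the check digit becomes 9, and the final identifier is 1000579.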

That’s it

Now you can have check-digits on products, customers, vendors, sales orders, purchase orders etc.

PS! I have not tested this code 100%, but the community is full of brainpower that hopefully can share additional findings on bugs or flaws.

If you like this, vote for the idea at https://ideas.dynamics.com/ideas/dynamics-operations/ID0002954

Agile PODs: Organize for efficiency

Have you ever seen the TV series "House of Lies"? It is a quite funny comedy series that focuses on the extrovert lifestyle of a management consulting team. First of all it is a comedy and not very realistic, but it manages to illustrate the concept of how to create the most efficient organizational form for solving problems: the agile POD.

Agile PODs are small custom agile teams, ranging from four to eight members, responsible for a single task, requirement, or part of the backlog. This organizational system is a step toward realizing the maximum potential of agile teams by involving members of different expertise and specialization, giving complete ownership and freedom, and expecting the best quality output. This blog post is about how to organize a consulting business around actual service products, instead of in silos.

In many consulting companies today, we see increasingly alarming signs that prevent the full utilization of people and resources. Some of the signs are:

– Many non-direct-operative managers. (If you have >5 levels from bottom to top, you have an issue)
– Too many internal meetings. (Why Meetings Kill Productivity)
– Too much time is used to generate budgets, forecasts and Excel spreadsheets. (No actual customer value)
– Organized into silo teams with similar expertise. (Functional, Technical, Support, etc.)
– New project teams for each project. (You spend 2 months getting to know your team members)
– Outdated internal systems and processes.
– Mixed marketing messages and costly (pre-)sales and implementation processes.
– Many partners are currently not ready for the Dynamics 365 cloud-based disruption. (They stick to waterfall, while agile accelerates)

Agile PODs are a different way of organizing a team for efficiency. What does an agile POD look like? In this example we have a small 5-person permanent team. This team is specialized in running some of the tasks/phases in the initial Dynamics 365 implementation: the agile preparation phase.

In this example the POD owner is the Solution Architect. The roles in the POD can be described as:

The solution architect:

He or she runs the POD and has overall responsibility for it. It is the POD owner who recruits the POD members. The solution architect is the "face" of the POD, organizes the work in the POD, and discusses the solutions with the key decision makers at the customer. Very often the solution architect has lots of experience. In agile terms this is also the scrum master, and a very operational role.

The Finance expert:

When implementing Dynamics 365, there is always a need to know how to connect the operational processes to accounting and reporting. This person is highly knowledgeable in financial management reporting, Power BI and Excel, and also knows how to improve reporting from the financial perspective by defining financial dimensions and setting up Tax, Bank, Fixed assets, HR and Budgeting/Forecasting.

The Vertical Domain Expert:

How to implement best-of-breed processes is the vertical domain expert's expertise. In the retail domain this means being an expert on master data, categorization, stores, POS, devices, etc.

The Technical Architect:

In a cloud-based system, there is a need to understand how environments are deployed and set up, and how to make it all ready for efficient application lifecycle management. The architect knows the ITIL framework. When a change is needed, the technical architect will create the necessary documentation/VSTS backlog items for developers to execute on.

The Junior consultant:

The junior consultant is here to learn, offload and support the team. As experience increases, the junior will eventually get more responsibility and hopefully some day move into other positions in the team.

Within the team we are looking for T-shaped people, who have breadth to their expertise and also a few deep expert knowledge domains. A gaming company called Valve (which delivers the Steam gaming store) described what we are looking for with the following picture of the T-shaped model. Take a look at their employee handbook. The same concept and idea is relevant for Dynamics 365 consulting companies.

The agile PODs must therefore specialize their own services. Each POD team must build WBSes (work breakdown structures) that enable deliveries that utilize the entire POD in combination.

The idea is that a POD team is sent out into the field, delivers the predefined services, and returns safely afterwards. Then it is off to the next client to deliver the same service again. As you may understand, it is therefore important that the services delivered are predefined. In this concept there is not one team that delivers a complete implementation; in larger implementations a sequence of agile PODs covers the implementation.

This way of organizing is not new. The concept has been applied for decades by contractors and building companies. When building a house, it is not done with a single team; it is done by a sequence of specialized teams. A POD team will have responsibility for a limited set of tasks that need to be performed in a predefined sequence.

By organizing operational skills into PODs executed in a sequence, we now have a balanced unit. One pain I often see in Dynamics 365 consulting companies is that bottlenecks arise around a few selected roles, typically the solution architects. This imbalance results in high utilization of these roles, while other roles have low utilization, because work is not correctly distributed. We also see that consultants are placed into project teams because they have free time, and not because they have the right knowledge. This increases costs and reduces satisfaction for customers. Ultimately it also reduces profitability for the implementation partner.

Agile PODs do not solve everything, but they make the core operational services lean and efficient. Any consulting company still needs sales, project management and customer support as separate functions.

As seen in the figure above, each vertical focus area will have a management function that focuses on building agile PODs. The idea is not to hire single consultants but to create new PODs. The POD itself must define the services it can deliver. The role of the vertical department management is therefore to focus on recruiting new PODs. As Valve explains it, hiring becomes the most important thing in the universe.

A model for money and revenue must also be established. All departments must be self-financing and make sure that they are balanced according to how the revenue stream is defined. One element that is common in the consulting business is bonuses. I personally don't like the idea of bonuses, but I see that it is very difficult without them (a necessary evil). The model below is an example of how different departments can be rewarded.

Marketing and Sales: The concept of cloud-based systems is that the customer doesn't need to purchase all the software upfront. They rent the software in the cloud and only pay a monthly fee. The Marketing and Sales divisions must therefore be financed by the monthly license revenue, and the bonus would be accumulating. The purpose is to make sure new customers are onboarded and that existing customers are happy with the services. As a new seller in this model there will not be much bonus in the beginning, as you have few customers onboarded. But as more customers get on board, the bonus accumulates, and after 2-3 years there will be a decent bonus and a decent basis for investing more in marketing.

Project and Management consulting: As described earlier, these are the only more "permanent" roles that exist in the project. They will ask agile PODs to come in and solve specific tasks. Their services are based on T&M (time and material), and their bonus is based on the revenue (not margin) of the project.

The agile PODs: These services are charged as a combination of T&M and predefined product services. The predefined product services are the key here: create WBS structures where the price and delivery are clearly defined. The bonus here is a team bonus. Internally in the team it is distributed according to a key, but the POD team can also choose to use the bonus for other purposes, like training or conferences. Remember that an agile POD is a self-contained unit with costs, revenues and margins. If the POD is not profitable, it will be dissolved and the team disbanded/let go.

Platform Services: This department makes sure all services/software around Dynamics 365 are working as expected. This means making sure the Azure tenants are set up correctly, that Office is working, and that services like CDS (Common Data Service) and PowerApps are set up as expected. All their services should be predefined product services, and the bonus would be based on margin. Why? Because we want to become better and better at delivering these predefined services. The faster they are delivered, the more margin is generated. This is a win-win situation for both the customer and the consulting company.

Customer support/After sales: Customer support and after-sales is all about delivering excellent customer service after the project has gone live. Its revenue should be based on support agreements and add-ons. The bonus for the department is based on accumulated revenue, because these should be recurring services that the customer pays for each month. If the customer is happy with the services provided, they will continue to use them. The alternative for the customer is to use Microsoft Premier Support, which can be quite costly and is not that relevant in most cases.

At the end of this blog post I would like to visualize how we envision the agile PODs, where we train on our services and deliver excellent customer service on time and on budget.


And if we don’t, then this is the consequence:


Additional details on agile PODs can be found here:

https://www.globant.com/build/agile-pods

https://www.agileconnection.com/article/using-agile-pods-realize-potential-your-team

Video : https://www.youtube.com/watch?v=IwJKRaocdxI

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

Dynamics 365 Pre go-live checklist

I asked Microsoft if I could share their Pre go-live checklist that is used in the Fast-Track program. And they said yes

So here is a copy for you of what customers must be prepared to answer before Microsoft deploys the production environment.

Pre Go-live Health Check list:

  1. Solution acceptance by users: UAT
    1. Is UAT completed successfully? How many users participated in UAT?
    2. Did UAT test cases cover entire scope of requirements planned for go-live?
    3. How many bugs/issues from UAT are still open?
    4. Any of the open bugs/issues a showstopper for go-live?
    5. Was UAT done using migrated data?
  2. Business signoff:
    1. Business has signed off after UAT that the solution meets business needs?
    2. Solution adheres to any company/industry specific compliance (where necessary)
    3. Training is complete
    4. All features going live are documented, approved and signed off by customer
  3. Performance:
    1. How was the performance in UAT? Is it acceptable for go-live?
    2. If Performance testing was done, then are there any open actions from it?
  4. User & Security setup:
    1. How many security roles are being used. All security roles are setup and tested?
    2. Users that will need access at go-live have been set up with the correct security roles?
  5. Data Migration:
    1. Data migration status – Masters & Open Transactions/Balances
    2. Business has identified owners for data validation?
    3. Review cut-over plan: Business & Partner teams are comfortable with the plan?
    4. Does the data migration performance fit within the cut-over window?
  6. Configuration Management:
    1. Are the configurations updated in Golden Configuration environment based on changes in UAT?
    2. Data stewards/owners identified and process in place for post go-live changes in Master/Configuration data?
    3. All Legal Entities configured for Go-Live?
    4. Are configurations documented?
  7. Integrations:
    1. Review list of integrations and readiness plan for each
    2. Latency requirements and performance criteria are met
    3. Integration support is in place with named contacts/owners
  8. Code Management
    1. Production fixes/maintenance process defined?
    2. Code promotion (between environments) process is in place, documented and the entire team knows and understands the process
    3. Code promotion schedule for production is in place?
    4. Emergency process for code promotion to production is defined?
  9. Monitoring and Microsoft Support
    1. LCS diagnostics setup and knowledge transfer to customer
    2. Issue resolution and Escalation process defined – LCS support is verified?