D365 – My Covid-19 10-day response story

Hi Friends.

I hope you all are hanging in there and can still work and deliver excellent experiences with Dynamics 365.

I wanted to share my Covid-19 10-day response story about how fast a reduced-scope Dynamics 365 implementation can be made available. Some weeks ago, we and Microsoft were contacted by an important player in the health industry that urgently wanted to establish purchasing and supply-chain processes for medications and equipment. The key element here was the urgency, because it was unclear what direction the pandemic would take here. What the customer needed were tools that could process information about supply providers and about what kinds of supplies are needed for readiness stockpiling. Our first step was to set up Dynamics 365 (CRM) to store relations, and this was done in a few days. The next step was to set up and go live with a “minimum viable product” of the Dynamics 365 finance and supply-chain apps. We had a goal of doing this in 10 working days. This is the story I would like to share.

Day 1: Onboarding, tools, and deployment

At the initiation of a project, I always have a document named “Welcome to the [Customer] project”. This is a great document, because it contains all the essential information about onboarding to the project and can be shared with all participants. It is typically a 6-7 page document explaining the onboarding process and the main objectives. It also contains references to LCS, SharePoint/Teams sites, DevOps, and URLs to environments. The most valuable element is a full overview of all the people that will somehow be involved in the project. In this project we decided on a small, efficient 4-person team (POD), and FastTrack support from Microsoft.

Microsoft quickly processed the licenses, and we quickly deployed the LCS project. The first thing we did was to deploy the Tier-2 sandbox, which we named the “UAT” environment; this was to be used as the master data/golden environment in the beginning. We also deployed the Tier-1 sandbox and named it “Test”; it would be used for access to Visual Studio, etc. The initial version we deployed was 10.0.10.

We have ready implementation templates that are imported into DevOps and contain the main structure of requirements and tasks. We scope this down to the actual processes we need.


We also have a ready folder structure for the team’s site where we can store and complete all documentation. By the end of the first day we had established the tools needed for starting the project.

Day 2: Working with the generic tasks in the backlog

We established a 30-minute daily sprint meeting with the main implementation actors, where the plan is presented and where today’s tasks are prioritized. We did not have the time to create large Word documents, so we decided to document the solution in DevOps and organize all the system setup around the entity templates as they can be extracted from D365. I exported the templates to Excel and then imported them into DevOps using the Azure DevOps Office® Integration, and this gave me 419 tasks for setting up as much as possible in standard.

This makes it possible for me to have a step-by-step task list of all the elements I need to build the “Golden environment”. Each task is assigned, and the actual setup is documented with a direct URL to the D365 form and a screen dump of the actual setup.

On the first day we were able to process close to 200 tasks, setting up the most generic parts of the system.

Day 3: Working with the finance task backlog

When working on the finance setup, we imported a standard chart of accounts, and we had to set up financial dimensions. We also set up the accounting structure, created a few inventory posting profiles, and set up tax parameters. Normally this is quite straightforward, and we can reuse much from previous projects.

Day 4-5: Working with products

Now the Excel skills are put to the test. We have an Excel sheet that contains most of the product master data: in total over 33,000 products, where each product has classifications, attributes, properties, and vendor/producer information. We quickly decided to use the same item numbering as was present in the Excel sheet. Each column in the sheet was classified by asking:

  • Is this a field we have in D365?
  • Should the field become a category in a hierarchy?
  • Should the field become an attribute?

Getting the products in required some very advanced copy/paste/merge of data into Excel sheets that we then imported into Dynamics 365. In the end, we realized that all the information we had could be imported, without any information loss. It was hard work, but the end result was promising: a list of all medical supplies, available and classified into the medical ATC structure.



We also imported barcodes, vendors, producers, employees, address information, external item names/descriptions, and attributes.

Day 6: First demo, UAT, and production environment deployment

On day 6 we were ready to show the actual master data and the initial view of the system. The customer was impressed by how fast we were able to build a system and processes that were familiar to their operation.

We decided to update the system to 10.0.11, and in parallel with the setup of the system we had been working closely with the Microsoft FastTrack solution architect to make the environments ready for production deployment. After a few iterations we got the production environment up and running and performed a DB refresh of the production environment with the master data we had in the Tier-2 sandbox. This meant that we now had an environment available to start performing transactional process testing and trimming the system. I know that this is not the normal way of doing this, but thanks to Microsoft’s understanding of the urgency we were allowed to go this “fast-track” route. In DevOps we established the processes we wanted to test and optimize.

Day 7: Test dual-write, business events, and Power Platform

As described earlier, we also implemented some of the “CRM” elements first. Now we could enable dual-write and synchronize vendors, employees, and other information into the CDS. Our first step was just to validate that it was working as expected in UAT, and it worked like a charm. We can now share these master data across the D365 platform.

The next thing was to test how we could use the business event framework to integrate towards a third-party WMS provider. Dynamics 365 has a business event that fires when a purchase order confirmation is performed. We decided to enable purchase order change management to have a strict workflow and ensure that we could rely on the purchase confirmation process.

This allows us to create a solution where the business event is caught by a Power Automate flow that fetches all the lines of the purchase confirmation and then transforms them into the format that the WMS provider needs. We can also enrich the data sent to the WMS provider, so that their system has all the needed master data. The next step is to import receive lines from the third-party WMS provider. This happens by Power Automate creating an arrival journal; a batch job in D365 then posts it and posts a product receipt. It all ends with a new business event being triggered (Purchase order received) that sends a message to the WMS provider that the goods have now been received. What we then achieve is that the on-hand in each system is synchronized, without any major delay caused by processing.
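To make the pattern concrete, here is a minimal X++ sketch of how a custom business event can be built, following Microsoft’s documented BusinessEventsBase pattern. We used standard events in this project, so the class names and the PurchId-only payload below are hypothetical.

    // Contract: the payload that is serialized and sent to the subscriber.
    [DataContract]
    public final class MyPurchReceivedEventContract extends BusinessEventsContract
    {
        private PurchId purchId;

        public static MyPurchReceivedEventContract newFromPurchTable(PurchTable _purchTable)
        {
            MyPurchReceivedEventContract contract = new MyPurchReceivedEventContract();
            contract.purchId = _purchTable.PurchId;
            return contract;
        }

        [DataMember('PurchId')]
        public PurchId parmPurchId(PurchId _purchId = purchId)
        {
            purchId = _purchId;
            return purchId;
        }
    }

    // The event itself; send() hands it over to the business events framework.
    [BusinessEvents(classStr(MyPurchReceivedEventContract),
        'Purchase order received (custom)', 'Raised when the goods have been received',
        ModuleAxapta::PurchaseOrder)]
    public final class MyPurchReceivedEvent extends BusinessEventsBase
    {
        private PurchTable purchTable;

        public static MyPurchReceivedEvent newFromPurchTable(PurchTable _purchTable)
        {
            MyPurchReceivedEvent businessEvent = new MyPurchReceivedEvent();
            businessEvent.purchTable = _purchTable;
            return businessEvent;
        }

        [Wrappable(true), Replaceable(true)]
        public BusinessEventsContract buildContract()
        {
            return MyPurchReceivedEventContract::newFromPurchTable(purchTable);
        }
    }

The event would then be raised from the posting code with MyPurchReceivedEvent::newFromPurchTable(purchTable).send().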

In total we have set up quite a lot of batch jobs that handle everything from cleaning to posting and planning. We used the learnings from the following blog post as a template for the batch jobs.

Day 7: Master planning and Planning Optimization

We do expect that quite a lot of requisitions and requirements will be processed through the system, so using the new Planning Optimization engine from Microsoft suited the project well. Calculating the requirements for all products is extremely fast and done within minutes. This will allow for faster reaction time to new requirements and potentially reduce stockout situations caused by vendor lead time.

On day 7 we also imported all employees and created some approval position hierarchies. This way we can extend the workflow processing for approvals.

Day 8-9: Testing, Testing, Testing in UAT

We started day 8 by refreshing the UAT environment and executing tests according to the central business requirements defined in DevOps. We found 3-4 issues that were reported to Microsoft (index performance, etc.) and quickly fixed within hours by the excellent support architects. We also wanted a visually nicer and more presentable purchase order form letter, and decided to import the modern reports package from Microsoft. This makes it a bit easier to adjust.

We did try out the configurable business documents, but in this case it would take a bit more time (which we did not have) to learn to set them up correctly. Any issues we found were also fixed in the PROD environment.

The main processes we focused on were the procurement processes, with approval steps and manual coordination with vendors.

Day 10: Project closure and training

On day 10, we summarized how far we had come and created a project closure/summary report that also contains next steps and more backlog suggestions. We have suggested additional focus on Azure Data Lake, Power BI, and implementation of a vendor portal. We also planned to perform training and make final changes to enable end-user onboarding. What we see is that making a system ready is not just setting up the system, but implementing the use of the system in the daily operation. This is expected to take more time, and we are ready to respond.

Final words and tips

I really hope this system will show its value and will be regarded as a small but valued contribution to the Covid-19 response. Microsoft has published the following page with resources that can help. Microsoft has also launched a program where you can get a 200-seat Dynamics 365 Customer Service system for free for 6 months for Covid-19 response-related activities. See https://dynamics.microsoft.com/en-us/covid-19-offer/

If you have any similar stories, please share them. The Dynamics 365 community cares and stands united in this Covid-19 fight!

D365 Importing JSON data the hard way!

I recently created a solution where I’m importing products and all related data for the grocery industry, and I wanted to share my experience so that others may follow. This is not a “copy-paste” blog post, but rather shows my approach to a process that can be used when working with more advanced and complex JSON integrations. Many industries have established vertical-specific databases where producers, distributors, and stores cooperate and have established standards for product numbers, product naming, GTIN, Global Location Number (GLN), etc. In Norway we have several, and the most common for the grocery industry here is TradeSolution. Most products are available to the public at VetDuAt.no, but they also have a Swagger API where the JSON data can be fetched and imported into D365.

One of the experiences I had when starting this journey is that D365 is not modelled according to how the data in these industry-specific public databases is structured. Much is different, and the data is often structured differently. We also see that the product databases are quite rich in terms of describing the products, with physical dimensions, attributes, packing structure, allergens, nutrition, etc.

To give you a small picture of the complexity you can often find, here is a subset of the JSON hierarchy:

I needed to decide how I should import this data. Should I just import what I have fields for in D365? Should I extend D365 with lots and lots of new fields? Or should I model according to how the external database presents the data? I decided on the latter and imported the data as it was presented. This would give the best result and the least information loss in the process. I decided on a model where D365 requests a JSON file from the Swagger API and places the JSON structure in a C# class structure, then extracts the data from the C# objects and places it into a new module I named EPD. The next step in the process is to take these data and populate the standard D365 tables.

The benefit I see is that I’m not overextending the standard Microsoft code. The data is available in D365 and can be used in Power BI, etc. I would like to share some of the basic steps when fetching such large data structures from external services.

Fetch the JSON from the service.

To fetch a JSON file, I’m using some .NET references that help handle Azure Active Directory and HTTP connections. The first method shows how to get an access token; this is relevant if the Swagger service requires one. The next method is where the Swagger URL is queried and the JSON file is returned, plus some success/error handling.
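Here is a minimal X++ sketch of the fetch part, assuming the access token has already been acquired (for example with ADAL’s AuthenticationContext and a client id/secret); the URL and error handling are simplified.

    // Queries the Swagger URL and returns the raw JSON string.
    public str fetchJson(str _url, str _bearerToken)
    {
        System.Net.HttpWebRequest  request;
        System.Net.HttpWebResponse response;
        System.IO.StreamReader     reader;
        str                        json;

        request        = System.Net.WebRequest::Create(_url) as System.Net.HttpWebRequest;
        request.Method = 'GET';
        request.Accept = 'application/json';
        request.Headers.Add('Authorization', 'Bearer ' + _bearerToken);

        try
        {
            response = request.GetResponse() as System.Net.HttpWebResponse;
            reader   = new System.IO.StreamReader(response.GetResponseStream());
            json     = reader.ReadToEnd();
            reader.Close();
            response.Close();
        }
        catch (Exception::CLRError)
        {
            error(strFmt('Could not fetch JSON from %1', _url));
        }

        return json;
    }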

So at this point we have the JSON file, and we want to do something meaningful with it. Visual Studio has a wonderful feature where you can paste JSON and convert it into classes (Edit > Paste Special > Paste JSON As Classes). To make this work, you will have to create a C# project.

This will generate the C# classes, and in this example the number of sub-objects and the number of properties is in the hundreds, and the properties can be objects and even arrays of objects.

In addition, I need a method that takes the JSON file and deserializes the content into the class structure.
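A small X++ sketch of such a deserialization method; EPDIntegration.ProductRoot stands in for the hypothetical root class generated by “Paste JSON as Classes” in the referenced C# project.

    public EPDIntegration.ProductRoot deserializeProduct(str _json)
    {
        // The non-generic overload DeserializeObject(string, Type) is easier
        // to call from X++ than the generic DeserializeObject<T>(string).
        return Newtonsoft.Json.JsonConvert::DeserializeObject(
            _json, new EPDIntegration.ProductRoot().GetType()) as EPDIntegration.ProductRoot;
    }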

Store the JSON object data into D365 tables.

So at this point, we have been able to fetch the data. In the following code, I’m getting the access token, getting the JSON, deserializing it into a C# object, and passing it forward for more processing.

Now, let’s start inserting this data into new D365 tables. For simplicity, I have created a D365 table for each data object in the JSON file. This allows me to store the entire hierarchical JSON structure in D365 tables for further processing. As soon as I have the data stored in D365, I can create the code that moves it forward into the more functional tables in D365.

A lesson learned: when creating sub-tables to store hierarchical JSON data, it is sometimes necessary to create relationships between the records in multiple tables. Sometimes uniqueness is also required, and the best way I have found (so far) is to create a GUID field and use this GUID to relate the data in the different tables. This can easily be accomplished with code like the following.
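A minimal sketch, with hypothetical staging tables and field names:

    // EPDProductStaging and EPDPackagingStaging are hypothetical staging
    // tables; a shared GUID relates the parent and child records.
    EPDProductStaging   product;
    EPDPackagingStaging packaging;
    guid                relationGuid = newGuid();

    product.RelationGuid = relationGuid;     // unique key on the parent
    product.EPDNr        = 4507224;
    product.insert();

    packaging.ParentGuid = relationGuid;     // child points back to the parent
    packaging.GTIN       = '07040512815344';
    packaging.insert();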

Create the std D365 data using data entities through code.

At this stage I have ALL the data in D365, and I can start processing it. Here is a subsection of how I create released products by using standard data entities, where a table containing the JSON data is sent in, and I can create the products and all sub-tables related to products.
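Below is a reduced X++ sketch of that pattern, using the standard “Released product creation V2” data entity. The field list is trimmed, the group and unit values are placeholders, and EPDProductStaging is the hypothetical staging table from earlier.

    public void createReleasedProduct(EPDProductStaging _staging)
    {
        EcoResReleasedProductCreationV2Entity entity;

        entity.ProductType                = EcoResProductType::Item;
        entity.ProductSubtype             = EcoResProductSubtype::Product;
        entity.ProductNumber              = _staging.ItemNumber;
        entity.ProductName                = _staging.Name;
        entity.ProductSearchName          = _staging.Name;
        entity.ItemNumber                 = _staging.ItemNumber;
        entity.SearchName                 = _staging.Name;
        entity.StorageDimensionGroupName  = 'SiteWH';   // placeholder groups
        entity.TrackingDimensionGroupName = 'None';
        entity.ItemModelGroupId           = 'FIFO';
        entity.ItemGroupId                = 'Grocery';
        entity.InventoryUnitSymbol        = 'PCS';
        entity.PurchaseUnitSymbol         = 'PCS';
        entity.SalesUnitSymbol            = 'PCS';

        if (entity.validateWrite())
        {
            entity.insert();   // the entity creates the product and its sub tables
        }
        else
        {
            warning(strFmt('Could not create product %1', _staging.ItemNumber));
        }
    }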

This approach has resulted in a solution where it is easy for the end user to fetch data from external systems and import it into D365. Here is a form showing parts of the “staging” information before it is moved into D365 standard tables. (This form is in Norwegian and is showing a milk product.)

I would like to thank the community for all the inspiring information found out there, especially Martin Dráb (@goshoom), who has been very active in promoting “Paste JSON as Classes” in Visual Studio.

D365 : Automatic license disablement and login reminder

When assigning licenses to a Dynamics 365 user, it would be beneficial if the system disabled and removed the license from a user who has not used the system for X days. At X minus 5 days, the system should send out a message to the user like this:

“This is a login REMINDER for Dynamics 365. Kurt Hatlevik has not logged in for at least 25 days. Your last login was 2/20/2020 12:10:00 AM. Login to Dynamics 365 is required at least once within a 30-day window or your account may be deactivated without notice. Please login within the next few days to ensure access is maintained.

Reactivation will require user administrator approval and will be dependent upon license availability.”

This would make the system more secure, and it would also free up licenses from users that are not using the system.
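While waiting for such a feature, a rough approximation could be scripted. The X++ sketch below assumes the standard SysUserLog table holds the logon history (it feeds the named user license count reports); a real implementation would run as a batch job and send an email instead of just logging.

    // Warn about enabled users with no logon during the last 25 days.
    void warnInactiveUsers()
    {
        SysUserLog  userLog;
        UserInfo    userInfo;
        utcdatetime cutoff = DateTimeUtil::addDays(DateTimeUtil::utcNow(), -25);

        while select userInfo
            where userInfo.Enable == NoYes::Yes
        {
            select maxof(LogonDateTime) from userLog
                where userLog.UserId == userInfo.Id;

            if (userLog.LogonDateTime && userLog.LogonDateTime < cutoff)
            {
                info(strFmt('%1 has not logged in since %2 - send a reminder.',
                    userInfo.Id, userLog.LogonDateTime));
            }
        }
    }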

If you also think this could be beneficial, please vote on this idea here: https://experience.dynamics.com/ideas/idea/?ideaid=c12972cf-6a6c-ea11-b698-0003ff68dcfc#

D365 and the supply structures in the grocery retail industry

Today I will write a bit about the supply chain structure we see in the retail grocery industry, the challenges Dynamics 365 may face, and how to address them. The grocery industry has for many years seen that industry collaboration brings benefits and synergies throughout the value chain. We see industry collaborations that offer a range of services to their owners, customers, and partners. In the country where I’m from, the main collaboration initiative is TradeSolution, which is owned by the main grocery chains in Norway. TradeSolution operates and maintains central registers, databases, and various IT, reporting, and analysis services in Norway, but we see much of the same pattern in other countries and other industries as well.

One essential element is a unification of how to identify products and how the products are packed, ordered, and shipped. In Norway we have the term EPD (Electronic Product Database), which makes it easy for the entire Norwegian grocery market to purchase and sell products. Much of the information shown in this blog post originates from TradeSolution’s public pages here.

What is EPD?

In Dynamics 365, one of the most essential SCM elements is products and released products, and the associated master data tables. In the grocery industry it is actually the packaging that is the center of it all; the products are really properties of a packing structure. It would be an oversimplification to say that EPD is products. EPD describes not only the products, but also the packaging of the products. The EPD standard describes the products in up to 4 levels: basis, inner box, outer box, and pallet (with SSCC). Each level is identified with a GTIN. See also my old blog post about SSCC.

So far so good. We can model this in Dynamics 365 by defining a product as a “basis” and using the inner box, outer box, and pallet as unit conversions. In D365 we also have the possibility to create barcodes for each unit of measure (UOM). It would also be quick to assume that the EPD number is an external item description.

Unfortunately, the grocery industry is a bit more complex. Let’s take a quick look at the EPD numbers for Coca-Cola. There are actually 7 packing structures/EPD numbers, and these are shown to the right (7 digits). All of them represent different packaging of the same basis unit, and can have different properties and attributes.

What we also see is that some boxes are marked with an “F”, which means this is a consumer unit. To say it in D365 language, it can be sold to consumers. Some are also marked with a “B”, which means this is the unit in which the EPD number is purchased. So if we take a detailed look at EPD 4507224, we see that it defines which units you can sell and which units you can purchase. Within a single EPD number, there is only one level you can purchase. Here are 2 examples that describe the complexity. The first example is an EPD where the grocer can sell in the basis unit and the inner box unit (EPD 4507224).

The next example is one where the grocer can also sell the basis unit and another inner box unit type (EPD 2142941).

As you can see here, the conversion from inner boxes to pallets results in different quantities.

To add further complexity, we have the definition of a mix. The ordering happens at the inner box level, but the box actually contains separate products that are sold through the stores.

One last element is the concept of unmarked variants, like this package of yogurts.

Summary EPD

  • A product is identified by an EPD number (EPDnr)
  • A unit is identified by a GTIN (Global Trade Item Number)
  • A unit is called «pakning» in EPD
  • A product can have up to 4 levels of units (hierarchy)
  • A product can be a mix of multiple «basis» or «mellom/innerbox» units
  • A “basis” unit can be shared by many products
  • The first level of the units is called «basis» in EPD (often referred to as a customer unit or base unit)
  • The top level of the units is called «topp» in EPD (often referred to as a load carrier unit)
  • The levels between «basis» and «topp» (if any) are called « mellom/innerbox/outerbox » units
  • A basis unit can consist of units without identification called unmarked variants («umerkede varianter»)
  • Within an EPD structure, only one of the packings is used for ordering.
  • Multiple packings can be used for sale.

Some key issues we have faced with Dynamics 365 on how the industry is modelling products is the following:

  1. Cost: As seen, a product can be sold in many different UOMs, and the industry can have different purchase prices depending on which EPD number you choose to order, meaning that a 4-pcs pack has a different cost than a 24-pcs pack. As the product can be purchased in multiple UOMs with different prices, it is difficult to model the cost pricing correctly, because the inventory transaction costing is based on the lowest level, meaning basis. This costing problem is the reason why I suggest FIFO in retail grocery implementations.
  2. On-hand: Keeping track of how many basis units, or other consumer units, are on hand is difficult, because you do not always know when the consumer is breaking up a Coca-Cola inner box. Where should the cost come from when there are multiple purchasing units, as shown in the figure? This makes it difficult to model the revenue per pcs sold 100% correctly in Dynamics 365.
  3. Unit conversion: As shown in the example, the same unit (like a pallet) can contain different numbers of basis products. This means that it is insufficient to unify the UoM per product; the UoM conversion is EPD dependent. Clear relationships between the UoMs must also be modelled. A product may have multiple definitions of an inner box, outer box, and pallet.
  4. External item descriptions: The Dynamics 365 external item description cannot be used, because it only supports one external item description per vendor, and UoM is not taken into consideration.
  5. Attributes: In the grocery industry, there may be different attributes per EPD number, and also different attributes per UoM.

How to model this in Dynamics 365?

To solve the distribution requirements we see in the grocery industry, some front-end remodeling of how products are represented is required. The grocery industry is focused on packaging, while Dynamics 365 is product oriented. The key here is that EPD is object oriented: a product can be represented in several packaging structures.

The entities we have at our disposal in Dynamics 365 are the following:

  1. Products and released products
  2. Unit of measurement and conversion
  3. Barcodes
  4. External item descriptions
  5. BOM’s

But Dynamics 365 is what it is, and a change to the architecture of how products and transactions are handled is not on the near-term roadmap. We must try to model this structure in a way such that the EPD standard and the Dynamics 365 standard work jointly together.

First, let’s try to model the EPD (only a subset) from a grocery supply perspective (not D365!). An EPD can consist of multiple packaging structures, and a package may contain packages. At the bottom of the packing structure there is a reference to a basic package, which describes the product.

When importing EPD-based products, I see the following as a solution:

  1. EPD will be a separate entity/table, modelled as the grocery industry has it (new tables in D365 that feed the std D365 tables)
  2. D365 products will be defined as the “Basic Package”
  3. The EPD package structure populates the barcode table and the product-specific unit of measure table. Because there are several packagings, the traditional naming of the units of measure cannot be used; the unit of measure conversion is actually dependent on the EPD number. In essence, this means having units of measure named:

    PCS – Basic unit for the lowest basis product
    IB-4507224 – Unit for the inner box
    OB-4507224 – Unit for the outer box
    LC-4507224 – Unit for the load carrier

    With this we can create the unit of measurement conversion between the different types.
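To illustrate, here is a minimal X++ sketch of how such an EPD-specific unit and its product-specific conversion could be created in the standard UnitOfMeasure tables. The symbols, the factor, and the item number are examples only, and exactly how Factor/Numerator/Denominator combine should be verified against the standard conversion logic.

    UnitOfMeasure           innerBox;
    UnitOfMeasureConversion conversion;

    // One unit record per EPD packing level, e.g. the inner box of EPD 4507224.
    innerBox.UnitOfMeasureClass = UnitOfMeasureClass::Quantity;
    innerBox.Symbol             = 'IB-4507224';
    innerBox.DecimalPrecision   = 0;
    innerBox.insert();

    // Product-specific conversion: 1 IB-4507224 = 4 PCS for this product only.
    conversion.FromUnitOfMeasure = innerBox.RecId;
    conversion.ToUnitOfMeasure   = UnitOfMeasure::findBySymbol('PCS').RecId;
    conversion.Factor            = 4;
    conversion.Numerator         = 1;
    conversion.Denominator       = 1;
    conversion.Product           = InventTable::find('100221').Product;
    conversion.insert();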

Let’s say we have the following simple product:

This would be modelled in D365 with a released product:

I would here have to define 4 units of measure:

I would then have to define the following unit conversions between the different EPD packing structures.

The more EPD packing structures are present, the more unit conversions need to be defined. (In the Coca-Cola example there will be 6 more conversions.)

We also need to store GTIN per packing unit per EPD:

We also have the Physical dimensions menu item, which now lets us describe the physical dimensions of the product per EPD unit.

In Dynamics 365 we can only select one suggested purchasing unit. So if you have multiple EPDs associated with a product, you will have to choose one, and this is the unit that is suggested.

The purchase order would then look like this, where the unit describes the EPD number.

Keeping track of all unit conversions, GTINs/barcodes, etc. manually would be an impossible job. Since EPD is an industry standard, all of this data is imported through web services.

TradeSolution has web services that offer the possibility to send EPD structures to D365. This way, all packing structures of products can be automatically imported, distributed into std D365, and adjusted when needed.

The suggestion is not 100%, but it would make sure that grocery retailers can procure and sell the products, while also having the concept of packing structures in place.

Let’s conquer the grocery industry as well!

D365 – What has changed (pmfTablehasChanged)

This short post is for you hardcore X++ developers who create magic every day. D365 has the following method, which allows you to validate whether any fields on a record have been changed: if it returns true, something has changed; if false, nothing has changed. There are scenarios where you would like to know if there have been any changes to the record before you update/write to the Db, to save some roundtrips to the Db.
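Here is a small usage sketch; it assumes pmfTablehasChanged() compares the record buffer with its orig() image, so it returns false when the assigned values equal what is already stored.

    SalesLine salesLine;
    Price     newPrice = 100;

    ttsbegin;

    salesLine = SalesLine::findInventTransId('INV-000123', true);
    salesLine.SalesPrice = newPrice;   // may equal the stored value

    if (pmfTablehasChanged(salesLine))
    {
        salesLine.update();            // skip the Db roundtrip when nothing changed
    }

    ttscommit;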

Then this is nice, and 100% std.

Happy coding friends.

Batch jobs: Take control of the executions

Dynamics 365 can be automated quite a lot with the use of batch jobs. With batch jobs, your Dynamics 365 solution becomes “alive”, and we can set up the system to automate many manual processes. Let’s say we have the following “vanilla process” and want to automate as many steps as possible.



This document covers the batch jobs that need to be set up for this process to be as automated as possible. I wanted to put a structured system on all the batch jobs that are typically used in a production system. But all this also generates a lot of data that you don’t normally need. It is therefore common to create both functional batch jobs that process and execute functionality, and cleanup jobs that remove irrelevant data.

Batch job Naming conventions

To make the batch jobs simpler to understand, a simple naming structure has been created. The first character is just “A”, to make sure that the batch jobs sort in the best possible way and can be sorted according to name. Next is a 3-digit number, and last there is a description that explains the batch job.

ID          Description

A001-A099   System administration batch jobs
A100-A199   Data management batch jobs
A200-A299   General ledger batch jobs
A300-A399   Procurement and sourcing batch jobs
A400-A499   Sales and marketing batch jobs
A500-A599   Retail batch jobs
A600-A699   Inventory management batch jobs
A700-A799   Warehouse management batch jobs

Each of these ranges is then set up as a batch group, so you can better control which AOS servers execute which type of batch jobs:


In this blog post, more than 87 batch jobs are specified, keeping the Dynamics 365 system updated and as automated as possible.

Job description
A001 Notification clean-up
A002 Batch job history clean-up
A003 Batch job history clean-up (custom).
A004 Daily Diagnostics rule validation
A005 Weekly Diagnostics rule validation
A006 Monthly Diagnostics rule validation
A007 Named user license count reports processing
A008 Databaselog cleanup
A009 Delete the inactivated addresses
A010 Scan for orphaned document references.
A011 Report data clean up
A012 Cryptography crawler system job that needs to regularly run at off hours.
A013 Data cache refresh batch
A014 Updates system notification states.
A015 Deletes non-active and orphaned system notifications.
A016 Database compression system job that needs to regularly run at off hours.
A017 Database index rebuild system job that needs to regularly run at off hours
A018 Deletes expired email history.
A019 Process automation polling system job
A020 Scan for document files that have been scheduled for physical deletion.
A021 System job to clean up expired batch heartbeat records.
A022 System job to seed batch group associations to batch jobs.
A023 System job to clean up unrecovered user session states.
A024 Change based alerts
A025 Due date alerts
A026 Email distributor batch
A027 Email attachment distributor
A103 Entity Store Deploy measurement
A103 Refresh data entity
A200 Clean up ledger journals
A201 Import currency exchange rates
A202 Purchase budget to ledger
A203 Sales budget to ledger
A204 Update purchase and sales budget
A205 Batch transfer for subledger journals
A206 Source document line processing
A207 Source document line processing queue cleanup
A208 Ledger journal monitor
A300 Purchase update history cleanup
A301 Delete request for quotation
A302 Draft consignment replenishment order journal cleanup
A303 Run Forecast planning
A304 Run Master planning
A305 Post product receipt
A400 Delete sales orders
A401 Delete quotations
A402 Delete return orders
A403 Sales update history cleanup
A404 Order events cleanup
A405 Order packing slip
A406 Order invoice
A407 Calculate sales totals
A500 All retail distribution jobs (9999)
A501 Upload all channel transactions (P-0001)
A502 Process Assortment
A503 Update listing status
A504 Product availability
A505 Generate related products based on customer transactions
A506 Process delivery modes
A507 Synchronize orders job
A508 Update search Product data
A509 Update search Customer data
A510 DOM batch job
A511 DOM fulfillment data deletion job
A512 Default channel database batch job
A513 Recommendation batch job
A514 Retail scheduler history data removal batch job
A515 Create customers from async mode
A516 Retail transaction consistency checker orchestrator
A517 Retail transactional statement calculate batch scheduler
A518 Retail transactional statement post batch scheduler
A519 Retail financial statement calculate batch scheduler
A520 Retail financial statement post batch scheduler
A521 Process loyalty schemes
A522 Post earned points in batches
A523 Process loyalty lines for other activities
A524 Retail time zone information job
A600 Calculation of location load
A601 Inventory journals clean-up
A602 Inventory settlements clean up
A603 Inventory dimensions cleanup
A604 Dimension inconsistency cleanup
A605 On-hand entries cleanup
A606 Warehouse management on-hand entries cleanup
A607 On-hand entries aggregation by financial dimensions
A608 Cost calculation details
A609 CDS – Post integration inventory journals
A700 Work creation history purge
A701 Containerization history purge
A702 Wave batch cleanup
A703 Cycle count plan cleanup
A705 Work user session log cleanup
A706 Wave processing history log cleanup
A707 WMS Replenishment
A708 Automatic release of sales orders

I will not go into detail on all the jobs, but I at least refer to where you can find the menu item or which class is used in the batch job tasks. Also take a look at the blog post by the D365 solution architecture team, which covers a subset of the batch jobs presented in this blog post.

System administration batch jobs

These are general system batch jobs that can perform cleanups and other general executions.

ID

Name, path and recurrence

Description

A001

A001 Notification clean-up

System administration > Periodic tasks > Notification clean up

Daily

This is used to periodically delete records from the EventInbox and EventInboxData tables. A recommendation: if you don’t use the alert functionality, disable the alert batch jobs.

A002

A002 Batch job history clean-up

System administration > Periodic tasks > Batch job history clean-up

Daily

The regular version of batch job history clean-up allows you to quickly clean all history entries older than a specified timeframe (in days). Any entry created before that cutoff will be deleted from the BatchJobHistory table, as well as from linked tables with related records (BatchHistory and BatchConstraintsHistory). This form has improved performance because it doesn’t have to execute any filtering.

A003

A003 Batch job history clean-up (custom)

System administration > Periodic tasks > Batch job history clean-up (custom)

Manually

The custom batch job clean-up form should be used only when specific entries need to be deleted. This form allows you to clean up selected types of batch job history records, based on criteria such as status, job description, company, or user. Other criteria can be added using the Filter button.

A004

A004 Daily Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Daily

Incorrect configuration and setup of a module can adversely affect the availability of features, system performance, and the smooth operation of business processes. The quality of business data (for example, the correctness, completeness, and cleanliness of the data) also affects system performance, and an organization’s decision-making capabilities, productivity, and so on. The Optimization advisor workspace is a tool that lets you identify issues in module configuration and business data. Optimization advisor suggests best practices for module configuration and identifies business data that is obsolete or incorrect.
A005

A005 Weekly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Weekly

Performs a weekly validation and diagnostics.
A006

A006 Monthly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Monthly

Performs a monthly validation and diagnostics based on the rules.
A007

A007 Named user license count reports processing

Class : SysUserLicenseMiner

Daily

Batch job that counts the number of users that have been using the system. The data is used in the Named user license count report. D365 creates this execution automatically, but you have to rename it to fit this structure.
A008

A008 Database log cleanup

System administration > Inquiries > Database > Database Log

Weekly

This job cleans up the database log and makes sure that only (let’s say) 100 days of history remain. In the query criteria I set created date-time less than “d-100”, to ensure that I keep 100 days of database log. This is general housekeeping and dusting, keeping the system nice and tidy.
A009

A009 Delete the inactivated addresses

Organizational administration > Periodic >Delete inactivated addresses

Weekly

Deletes addresses that have been set to inactive.
A010

A010 Scan for orphaned document references.

Class : DocuRefScanOrphansTask

Daily

Batch job that is setup automatically by the system, and scans for document references where the source record is deleted.
A011

A011 Report data clean up

Class: SrsReportRunRdpPreProcessController

Daily

Cleans up any data generated for SSRS reports.
A012

A012 Cryptography crawler system job that needs to regularly run at off hours.

Class: SysCryptographyCrawlerTask

Every 3 days

Auto-created at D365 setup. Not sure what this is yet…
A013

A013 Data cache refresh batch

System administration > Setup > Data cache > Data cache parameters

Every 10 minutes

The data cache framework is used to cache data sets and tiles. Enabling of the data cache framework will redirect certain queries against a cache table instead of executing them against the underlying source tables.
A014

A014 Updates system notification states.

Class : SystemNotificationUpdateBatch

Every minute

Updates notifications.
A015

A015 Deletes non-active and orphaned system notifications.

Class : SystemNotificationScanDeletionsBatch

Daily

Deletes non-active and orphaned system notifications
A016

A016 Database compression system job that needs to regularly run at off hours.

Class: SysDatabaseCompressionTask

Daily

Compresses the database
A017

A017 Database index rebuild system job that needs to regularly run at off hours

Class: SysDatabaseIndexRebuildTask

Daily

Rebuilds indexes to ensure good index performance
A018

A018 Deletes expired email history

Class: SysEmailHistoryCleanupBatch

Daily

Deletes expired email history
A019

A019 Process automation polling system job

Class: ProcessAutomationPollingEngine

Every minute

Using business events, the polling use case can be re-designed to be asynchronous if it is triggered by the business event. Data will be processed only when it is available. The business logic that makes the data available triggers the business event, which can then be used to start the data processing job/logic. This can save thousands of batch executions from running empty cycles and wasting system resources.
A020

A020 Scan for document files that have been scheduled for physical deletion.

Class: DocuDeletedFileScanTask

Hourly

Scan for document files that have been scheduled for physical deletion
A021

A021 System job to clean up expired batch heartbeat records.

Class : SysCleanupBatchHeartbeatTable

Daily

Cleans up the new internal monitoring BatchHeartbeatTable table (only after PU32), which is used for priority-based batch scheduling.
A022

A022 System job to seed batch group associations to batch jobs.

Class:
SysMigrateBatchGroupsForPriorityBasedScheduling

Daily

See priority-based batch scheduling.
A023

A023 System job to clean up unrecovered user session states.

Class:
SysUnrecoveredUserSessionStateCleanup

Daily

Cleans up sessions that are unrecovered.
A024

A024 Change based alerts

System administration > Periodic tasks > Alerts > Change based alerts

Hourly (or faster)

Events that are triggered by change-based events. These events are also referred to as create/delete and update events.

See also Microsoft docs.

A025

A025 Due date alerts

System administration > Periodic tasks > Alerts > Due date alerts

Hourly (or faster)

Events that are triggered by due dates.

See also Microsoft docs.

A026

A026 Email distributor batch

System administration > Periodic tasks > Email processing > Email distributor batch

Send emails. See also Microsoft docs.
A027

A027 Email attachment distributor

Sends emails with attachments. For workflow.

Data management batch jobs

Data management executions can generate a lot of data, and to maintain performance and avoid data growth it is relevant to clean up staging tables and job executions. Also document any of your recurring executions, to make it easy to maintain an overview of your recurring data imports and exports.

ID

Name, path and recurrence

Description

A100

[Cannot be executed in batch]

Data management workspace > “Staging cleanup” tile

Manually

The data management framework makes use of staging tables when running data migration. Once data migration is completed, this data can be deleted using the “Staging cleanup” tile.

A101

A101 Job history cleanup

Data management workspace > Job history cleanup

Daily

The cleanup job will execute for the specified amount of time. If more history remains to be cleaned up after the specified amount of time has elapsed, the remaining history will be cleaned up in the next recurrence of the batch job, or it can be manually scheduled again.

A102

A102 BYOD Data management export

Data management workspace >export in batch

Hourly

If you have a data management export to BYOD, this can be executed in batch. There are other options that can also be evaluated for this purpose. See A102 BYOD Data management export.

A103

A103 Refresh data entity

System administration > Setup > Entity Store

Monthly

Refreshes the entity store (the built-in embedded Power BI). The refresh updates the aggregated measurements and is only relevant if there are updates or changes that affect these.

General ledger batch jobs

ID

Name, path and recurrence

Description

A200

A200 Clean up ledger journals

Periodic tasks > Clean up ledger journals

Weekly

It deletes general ledger, accounts receivable, and accounts payable journals that have been posted. When you delete a posted ledger journal, all information that’s related to the original transaction is removed. You should delete this information only if you’re sure that you won’t have to reverse the ledger journal transactions.

A201

A201 Import currency exchange rates

Currencies > Import currency exchange rates

Daily

Automatically imports exchange rates from the bank.

A202

A202 Purchase budget to ledger

Inventory management > Periodic tasks > Forecast updates > Purchase budget to ledger

Monthly

Posts the purchase budget to ledger

A203

A203 Sales budget to ledger

Inventory management > Periodic tasks > Forecast updates > Sales budget to ledger

Monthly

Posts sales budget to ledger

A204

A204 Update purchase and sales budget

Inventory management > Periodic tasks > Forecast updates > Update purchase and sales budget

Monthly

Updates the purchase and sales budget.

A205

A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.

General Ledger > Periodic tasks > Batch transfer for subledger journals

Daily

Batch transfer for subledger journals

A206

A206 Source document line processing

Class: SourceDocumentLineProcessingController

Every 10 minutes

Used for accounting distribution. See Microsoft docs.

A207

A207 Source document line processing queue cleanup

Class: SourceDocumentLineProcessingQueueCleanupController

Weekly

Used for cleaning up accounting distribution. See Microsoft docs.

A208

A208 Ledger journal monitor

Class: LedgerJournalTableMonitorController

Every 6 hours

Monitors if ledger journals should be blocked or opened.

Procurement and sourcing batch jobs

ID

Name, path and recurrence

Description

A300

A300 Purchase update history cleanup

Periodic tasks > Clean up > Purchase update history cleanup

Weekly

All updates of confirmations, picking lists, product receipts, and invoices generate update history transactions; this job deletes the old ones.

A301

A301 Delete request for quotation

Periodic tasks > Clean up > Delete requests for quotations

Manually

It is used to delete requests for quotation (RFQs) and RFQ replies. The corresponding RFQ journals are not deleted, but remain in the system.

A302

A302 Draft consignment replenishment order journal cleanup

Periodic tasks > Clean up > Draft consignment replenishment order journal cleanup

Weekly

It is used to clean up draft consignment replenishment order journals.

A303

A303 Run Forecast planning

Master planning > Forecasting > Forecast planning

Weekly

Demand forecasting is used to predict independent demand from sales orders and dependent demand at any decoupling point for customer orders. See also Microsoft docs on using additional Azure services to perform the calculation.

A304

A304 Run Master planning

Master planning > Master planning > Run > Master planning

Daily

Master planning is used to generate planned (purchase) orders based on the coverage settings. We expect this service to be enhanced with a more real-time oriented planning engine. Also check out the Microsoft docs on this (large) subject.

A305

A305 Post product receipt

Procurement and Sourcing > Purchase orders > Receiving products > Post product receipt

Automatically posts the product receipt when all lines have been registered.

Sales and marketing batch jobs

ID

Name, path and recurrence

Description

A400

A400 Delete sales orders

Periodic tasks > Clean up > Delete sales orders

Manually

It deletes selected sales orders.

A401

A401 Delete quotations

Periodic tasks > Clean up > Delete quotations

Manually

It deletes selected quotations.

A402

A402 Delete return orders

Periodic tasks > Clean up > Delete return orders

Manually

It deletes selected return orders.

A403

A403 Sales update history cleanup

Periodic tasks > Clean up > Sales update history cleanup

Weekly

It deletes old update history transactions. All updates of confirmations, picking lists, packing slips, and invoices generate update history transactions. These transactions can be viewed in the History on update form.

A404

A404 Order events cleanup

Periodic tasks > Clean up > Order events cleanup

Weekly

Cleanup job for order events. The next step is to remove the unneeded order event checkboxes in the Order event setup form.

A405

A405 Order packing slip

Sales order > Ordershipping > Post Packingslip

Hourly

Sets up automatic packing slip posting when the sales order is completely picked (if this is the process). This means that as soon as the WMS has picked the order, the packing slip is posted.

A406

A406 Order invoice

Accounts payable > Invoices > Batch invoicing > Invoice

Hourly

Sets up automatic invoice posting when the sales order is completely packing slip updated (if this is the process).

A407

A407 Calculate sales totals

Periodic tasks > Calculate sales totals

Recalculates the totals for the sales order. This is typically used in scenarios where the sales order is part of a “Prospect to cash” flow. See docs.

Retail batch jobs

ID

Name, path and recurrence

Description

A500

A500 All retail distribution jobs (9999)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

This batch job sends all distribution jobs to the retail channel database: data like products, prices, customers, stores, registers, etc. The distribution job is a “delta” distribution, meaning that only new and changed records are sent. There is a lot more to discuss on how to optimize the 9999 distribution job, and for really large retail installations some deep thinking is required. For smaller installations it should be OK to just use the setup that is automatically generated when initializing D365 Retail/Commerce.
A501

A501 Upload all channel transactions (P-0001)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

The P-0001 job sends the retail transactions back from the POS to D365 HQ, where the retail transactions can be posted and financially updated.
A502

A502 Process Assortment

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process Assortment

Hourly

This job processes the assortments based on the assortment categories set on an item and, based on the assortment setup, puts the items in the relevant stores’ assortments. When defining an assortment, you have in D365 the possibility to connect organization hierarchies to retail category hierarchies. Process assortment performs the granulation of this, so that D365 has a detailed list of each product present in each store. Assortments are set up under Retail and Commerce > Catalogs and assortments > Assortments, and more details are available on Microsoft docs.
A503

A503 Update listing status

Retail and Commerce > Retail and Commerce > Products and Inventory > Update listings

Daily

The listing status is related to publishing a retail catalog to an online store. The Microsoft documentation is not the best in this area, and the closest explanation I have is that it is related to the listing status on the catalog.
A504

A504 Product availability

Retail and Commerce > Retail and Commerce > Products and Inventory > Product availability

Daily

The product availability batch job calculates whether a product is available in the online store. Check out this blog post for further details. Sitecore eCommerce integrations can benefit from this; in essence it populates the data needed for distribution job 1130, which maintains the corresponding tables in the channel database.
A505

A505 Generate related products based on customer transactions

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Generate related products

Daily

This job will automatically populate related products based on sales transaction purchase history. The two relationship types created are “customers who bought this item also bought” and “frequently bought together”. This data can then be used further in eCommerce scenarios. For deep details, take a look at the class “RetailRelatedProductsJob”.
A506

A506 Process delivery modes

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process delivery modes

Daily

This job sets up delivery modes on a new store when it is added to the organization hierarchy “retail store by department”. On the modes of delivery you can assign an organizational hierarchy, and this batch job assigns the specific modes of delivery to each store. The modes of delivery are used in omnichannel scenarios where the customer can have their products sent home, etc.
A507

A507 Synchronize orders job

Retail and Commerce > Retail and Commerce IT > Synchronize orders

Hourly

If you have set up your channels to create sales orders asynchronously, this job will create the sales orders and post payments. Also take a look at the following Microsoft docs on how sales orders and payments are synchronized from an online store.
A508

A508 Update search Product data

Sales and marketing > Setup > Search> Search criteria

Daily

Creates an indexed search of products, which makes it faster and easier to search for products in the call center.
A509

A509 Update search Customer data

Sales and marketing > Setup > Search> Search criteria

Daily

Creates an indexed search of customers, which makes it faster and easier to search for customers in the call center.
A510

A510 DOM batch job

Workspace > Distributed Order Management > Dom processor job setup

Hourly

Runs distributed order management on retail sales orders to determine which warehouse should deliver the sales order.
A511

A511 DOM fulfillment data deletion job

Workspace > Distributed Order Management > DOM fulfillment data deletion job setup

Daily

Cleans up DOM data that is no longer part of the valid calculation.
A512

A512 Default channel database batch job

Class : RetailCdxChannelDbDirectAccess

Every 3 minutes

This job’s main duty is to check all download sessions and upload sessions with status “Available”; it then applies the data to the respective target DBs (AX or channel DB). See also this blog.
A513

A513 Recommendation batch job

Class FormRunConfigurationRecommendationBatch

Weekly

See Microsoft docs.
A514

A514 Retail scheduler history data removal batch job

Retail and Commerce > Headquarters setup > Parameters > Retail scheduler parameters

Class: RetailCdxPurgeHistory

Daily

Deletes CDX history, typically only keeping 30 days of CDX history.
A515

A515 Create customers from async mode

Retail and Commerce > Retail and Commerce IT > Customer > Create customers from async mode

Hourly

If customers should be created asynchronously (a parameter), this job creates the customers.
A516

A516 Retail transaction consistency checker orchestrator

Retail and Commerce > Retail and Commerce IT > POS posting > Validate store transactions

Hourly

Performs validation on the unposted POS transactions. See Microsoft docs.
A517

A517 Retail transactional statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate transactional statement in batch

Hourly (or faster)

Retail statement trickle feed, transactional calculation. Creates the transactional statement. See the following blog post.
A518

A518 Retail transactional statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post transactional statement in batch

Hourly (or faster)

Retail statement trickle feed, transactional posting. Creates and posts sales orders. See the following blog post.
A519

A519 Retail financial statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate financial statement in batch

Daily

Retail statement trickle feed, financial statement calculation. Creates the financial statement. See the following blog post.
A520

A520 Retail financial statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post financial statement in batch

Daily

Retail statement trickle feed, financial posting. Posts the shift declaration. See the following blog post.
A521

A521 Process loyalty schemes

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty schemes

Processes loyalty schemes. See Microsoft docs.
A522

A522 Post earned points in batches

Retail and Commerce > Retail and Commerce IT > Loyalty > Post earned points in batches

Loyalty points should be posted in batch. See Microsoft docs.
A523

A523 Process loyalty lines for other activities

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty lines for other activities

Other Loyalty points in batch. See Microsoft docs.
A524

A524 Retail time zone information job

Monthly

Generates time zone information up until 2054. Ensures that the time zone used in the store does not cause inconsistent dates.

Inventory management batch jobs

ID

Name, path and recurrence

Description

A600

A600 Calculation of location load

Inventory management > Periodic tasks > Clean up > Calculation of location load

Daily

The WMSLocationLoad table is used for tracking the weight and volume of items and pallets. The summation of load adjustments job can be run to reduce the number of records in the WMSLocationLoad table and improve performance.

A601

A601 Inventory journals clean-up

Inventory management > Periodic tasks > Clean up > Inventory journals cleanup

Weekly

It is used to delete posted inventory journals.

A602

A602 Inventory settlements clean up

Inventory management > Periodic tasks > Clean up > Inventory settlements cleanup

Manually/Yearly

It is used to group closed inventory transactions or delete canceled inventory settlements. Cleaning up closed or deleted inventory settlements can help free system resources.

Do not group or delete inventory settlements too close to the current date or fiscal year, because part of the transaction information for the settlements is lost.

Closed inventory transactions cannot be changed after they have been grouped, because the transaction information for the settlements is lost.

Canceled inventory settlements cannot be reconciled with finance transactions if canceled inventory settlements are deleted.

A603

A603 Inventory dimensions cleanup

Inventory management > Periodic tasks > Clean up > Inventory dimensions cleanup

Manually/Yearly

This is used to maintain the InventDim table. To maintain the table, delete unused inventory dimension combination records that are not referenced by any transaction or master data. The records are deleted regardless of whether the transaction is open or closed.

An inventory dimension combination record that is still referenced cannot be deleted, because once an InventDim record is deleted, related transactions cannot be reopened.

A604

A604 Dimension inconsistency cleanup

Inventory management > Periodic tasks > Clean up > Dimension inconsistency cleanup

Manually/Yearly

This is used to resolve dimension inconsistencies on inventory transactions that have been financially updated and closed. Inconsistencies might be introduced when the multisite functionality was activated during or before the upgrade process. Use this batch job only to clean up the transactions that were closed before the multisite functionality was activated. Do not use this batch job periodically.

A605

A605 On-hand entries cleanup

Inventory management > Periodic tasks > Clean up > On-hand entries cleanup

Monthly

This is used to delete closed and unused entries for on-hand inventory that is assigned to one or more tracking dimensions. Closed transactions contain the value of zero for all quantities and cost values, and are marked as closed. Deleting these transactions can improve the performance of queries for on-hand inventory. Transactions will not be deleted for on-hand inventory that is not assigned to tracking dimensions.

A606

A606 Warehouse management on-hand entries cleanup

Inventory management > Periodic tasks > Clean up > Warehouse management on-hand entries cleanup

Weekly

Deletes records in the InventSum and WHSInventReserve tables. These tables store on-hand information for items that are enabled for warehouse management processing (WHS items). Cleaning up these records can lead to significant performance improvements in the on-hand calculations.

A607

A607 On-hand entries aggregation by financial dimensions

Inventory management > Periodic tasks > Clean up > On-hand entries aggregation by financial dimensions

Weekly

Tool to aggregate InventSum rows with zero quantities.

This essentially extends the previously mentioned cleanup tool by also cleaning up records that have the Closed field set to True.

The reason this is needed is that in certain scenarios you may have no remaining quantities in InventSum for a given combination of inventory dimensions, while a value still remains. In some cases these values disappear on their own, but the current design does allow values to remain from time to time.

If, for example, you use batch numbers, each batch number (combined with site, warehouse, etc.) creates a new record in InventSum. When the batch number is sold, the quantity fields are set to 0. In most cases the financial/physical value fields are also set to 0, but in standard cost revaluation and some other scenarios the value field may still show an amount. This is valid and is the way Dynamics 365 for Finance and Operations handles costs at the financial inventory level, e.g. site level.

Inventory value in Dynamics 365 for Finance and Operations is determined by the records in InventSum, and in some cases by inventory transactions (InventTrans) when reporting inventory values in the past. In the scenario above, this means that when you run inventory value reports, Dynamics 365 for Finance and Operations initially looks at InventSum, aggregates all records to site level, and reports the value for the item per site. The data from the individual records at batch number level is never used.

The tool therefore goes through all InventSum records and finds the ones with no remaining quantity (the No open quantities field is True). There is no reason to keep these records, so Dynamics 365 for Finance and Operations finds the InventSum record for the same item and site, copies the values from the batch number level to the site level, and deletes the detail record. When you now run inventory value reports, Dynamics 365 for Finance and Operations still finds the same correct values. This reduces the number of InventSum records significantly in some cases and can have a positive impact on the performance of any function that queries this table. A simplified code sketch of this aggregation idea follows after this table.

A608

A608 Cost calculation details

Inventory management > Periodic tasks > Clean up > Cost calculation details

Monthly

Used to clean up cost calculation details.

A609

A609 CDS – Post integration inventory journals

Inventory management > Periodic tasks > CDS integration > Post integration inventory journals

Fetches journals from CDS (Common Data Service) and posts them. This applies only if CDS is in use.
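
To make the A607 aggregation more concrete, here is a minimal, hypothetical X++ sketch of the idea described above. It is not Microsoft's actual implementation: the class name is illustrative, only the physical value is moved, and the real job handles all value fields and error situations.

internal final class MyInventSumAggregationSketch
{
    public static void main(Args _args)
    {
        InventSum detail;     // e.g. a batch-number-level record with zero quantity
        InventSum siteLevel;  // the record at the financial inventory level (site)
        InventDim detailDim;
        InventDim siteDim;

        ttsbegin;
        // Find records with no open quantities where a value still remains
        while select forupdate detail
            where detail.ClosedQty     == NoYes::Yes
               && detail.PhysicalValue != 0
        {
            detailDim = InventDim::find(detail.InventDimId);

            // Build a dimension combination holding only the site (financial level)
            siteDim.clear();
            siteDim.InventSiteId = detailDim.InventSiteId;
            siteDim = InventDim::findOrCreate(siteDim);

            select forupdate firstonly siteLevel
                where siteLevel.ItemId      == detail.ItemId
                   && siteLevel.InventDimId == siteDim.InventDimId;

            if (siteLevel && siteLevel.RecId != detail.RecId)
            {
                // Copy the remaining value up to site level and delete the detail
                siteLevel.PhysicalValue += detail.PhysicalValue;
                siteLevel.update();
                detail.delete();
            }
        }
        ttscommit;
    }
}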

Warehouse management batch jobs

ID

Name, path and recurrence

Description

A700

A700 Work creation history purge

Warehouse management > Periodic tasks > Clean up > Work creation history purge

Weekly

This is used to delete work creation history records from the WHSWorkCreateHistory table, based on the number of days of history to keep specified in the dialog. The history purge jobs in this table all follow the same pattern; see the sketch after the table.

A701

A701 Containerization history purge

Warehouse management > Periodic tasks > Clean up > Containerization history purge

Weekly

This is used to delete containerization history from the WHSContainerizationHistory table, based on the number of days of history to keep specified in the dialog.

 

A702

A702 Wave batch cleanup

Warehouse management > Periodic tasks > Clean up > Wave batch cleanup

Weekly

This is used to clean up batch job history records related to the wave processing batch group.

A703

A703 Cycle count plan cleanup

Warehouse management > Periodic tasks > Clean up > Cycle count plan cleanup

Weekly

This is used to clean up batch job history records related to Cycle count plan configurations.

A704

A704 Mobile device activity log cleanup

Warehouse management > Periodic tasks > Clean up > Mobile device activity log cleanup

Weekly

This is used to delete mobile device activity log records from the WHSMobileDeviceActivityLog table, based on the number of days of history to keep specified in the dialog.

A705

A705 Work user session log cleanup

Warehouse management > Periodic tasks > Clean up > Work user session log cleanup

Weekly

This is used to delete work user session records from the WHSWorkUserSessionLog table, based on the number of hours to keep specified in the dialog.

A706

A706 Wave processing history log cleanup

Warehouse management > Periodic tasks > Clean up > Wave processing history log cleanup

Weekly

This is used to clean up history records related to the wave processing batch group.

A707

A707 WMS Replenishment

Warehouse management > Replenishment > Replenishments

Calculates location replenishment for the warehouse locations.

A708

A708 Automatic release of sales orders

Warehouse management > Automatic release of sales orders

Releases sales orders to the warehouse so that picking can start.
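
The history purge jobs above (A700–A706) all follow the same pattern: compute a cutoff from the "days to keep" value in the dialog and delete older records. Here is a minimal, hypothetical X++ sketch of that pattern, using the WHSWorkCreateHistory table from A700. The class name and the hard-coded value are illustrative, and it assumes the CreatedDateTime field is enabled on the table.

internal final class MyWorkHistoryPurgeSketch
{
    public static void main(Args _args)
    {
        WHSWorkCreateHistory history;
        int  daysToKeep = 30;  // would normally come from the batch dialog
        date cutoff     = systemDateGet() - daysToKeep;

        ttsbegin;
        // Set-based delete of everything older than the cutoff date
        delete_from history
            where history.CreatedDateTime < DateTimeUtil::newDateTime(cutoff, 0);
        ttscommit;

        info(strFmt("Purged work creation history older than %1.", cutoff));
    }
}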

Monitoring Distribution jobs

The Retail IT workspace is specifically created to monitor all distribution jobs that send data to the RCSU (Retail Cloud Scale Unit) and POS. Failed sessions are shown here, along with the current downloads (to RCSU) and uploads (from RCSU).


Monitoring Batch jobs

The best place to monitor all current batch jobs is the System administration workspace. Here, all failed, running, waiting, and withheld batch jobs are shown. The workspace also offers additional system administration features.
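
If you also want to watch batch status from code, for example to build a custom alert, a minimal X++ sketch like this (the class name is illustrative) can count failed jobs directly from the BatchJob table:

internal final class MyFailedBatchJobAlertSketch
{
    public static void main(Args _args)
    {
        BatchJob batchJob;

        // Count all batch jobs currently in the Error state
        select count(RecId) from batchJob
            where batchJob.Status == BatchStatus::Error;

        if (batchJob.RecId > 0)
        {
            warning(strFmt("%1 batch job(s) have failed.", batchJob.RecId));
        }
    }
}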



D365 – To exist or not, that is the question! (part 2)

Some years ago I created a free community solution for “Not-Exists Join”. A not-exists join means that we can filter and search on data that has no related records. This answers questions like:

– Show me all customers that have no sales orders the last X days

– Show me all items with no inventory transactions. Show me items with no movement in the last 30 days.

– Show me all items that have no price.

Countless community friends have used this for AX 2012, but since Dynamics 365 was released this solution could not be applied. To do it properly, I have decided to push a request through the CDE (Community Driven Engineering), hopefully making it available to all D365 customers as part of the standard solution. All the code is ready and checked in, and I'm just waiting for the Microsoft review.

The way the CDE works is that partners and customers that have code or bug fixes can work together with Microsoft on implementing changes. Microsoft has the final decision and will also make the contribution part of its IP. But for all you community friends, here is a sneak peek of what I'm working on together with Microsoft.

The advanced filter and query functionality in Dynamics 365 is a very powerful tool. Here you can search and filter on most fields and add join relations to the query.

But there is one thing the advanced query screen does not handle: the “not-exists join”. Let's say I want a list of all the customers that don't have sales orders. Standard D365 will not help here. The purpose of this document is to show how the “not-exists join” can be implemented in standard.

Functional Solution

In the joins form, a new section of relations has been added that represents the tables that can be added as a “not-exists join”:

In this sample, the customers with no sales orders will appear in the query result/form. The feature is generic, and any 1:n relation can be selected as a “Not exists” relation.
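
While waiting for the feature, the same question can already be answered in X++ code with the notexists join keyword, which is what the new UI option expresses. A minimal sketch (the class name is illustrative):

internal final class MyNotExistsJoinSketch
{
    public static void main(Args _args)
    {
        CustTable  custTable;
        SalesTable salesTable;

        // All customers that have no sales orders at all
        while select AccountNum from custTable
            notexists join salesTable
                where salesTable.CustAccount == custTable.AccountNum
        {
            info(strFmt("Customer without sales orders: %1", custTable.AccountNum));
        }
    }
}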

When will you have this in standard? Maybe 10.0.10?? It depends on Microsoft and the final approval of the code and feature, but hopefully it is not far in the future. So “cheer and share”, and maybe we as a community can accelerate this much-requested feature.

D365 community ROCKs and Happy DAX'ing!!