Microsoft Business Applications sessions on-demand and Dynamics 365 version 10

The Microsoft Business Applications sessions are now available on-demand https://www.microsoft.com/en-us/businessapplicationssummit/sessionsondemand

I enjoyed the following sessions:

Client usability and productivity improvements in the October release and beyond for Microsoft Dynamics 365 for Finance and Operations

Monitoring Microsoft Dynamics 365 for Finance and Operations with Lifecycle Services

Microsoft Dynamics 365 for Retail: Reliable data management and payment processing

Microsoft Dynamics 365 for Retail: Delivering cloud driven intelligence and tools to enable enterprise manageability

 

I also want to highlight the following session, where Microsoft explains Dynamics 365 version 10 (thanks, Shelly):

Microsoft managed continuous updates and support experience for Microsoft Dynamics 365 Finance and Operations

D365FO – Some nice Excel tricks

When importing master data into Dynamics 365, you will find that it is spread across different data entities. In a typical retail project you need to import data like released products, item barcodes, external item numbers, and prices. It is also common to receive the master data in many files and in different formats. It is therefore quite beneficial to know a few tricks that make it easier to work with large amounts of data. Here are my tips.

Export all/selected rows (You should know this!)

From any grid in D365FO you can export selected/all rows to Excel by right-clicking on the grid. The tip is therefore to create a personalization of the grid so that it contains the fields you want to export to Excel.

Then Excel opens with the selected columns. (PS! This export is limited to 10,000 rows.)

Use excel to create a filter

Let’s say we have an Excel spreadsheet with item numbers, and we want to filter on these items in D365FO. Here is a very valuable tip.

  1. Copy the item column from Excel and paste it as a row in a new Excel sheet (Transpose).

  2. Copy the row and paste it into Notepad.

  3. Do a search and replace in Notepad, where you copy the space/tab and replace it with a comma (,).

  4. Copy the resulting content and use it in a “match” filter in D365FO.

  5. You have now created a filter on the selected field. It seems the “match” filter is capable of handling quite a lot of text.

This is nice when someone asks you to “Please fix these 200 items”. You can then filter them and quite quickly go through and fix them.
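The transpose/Notepad steps above can also be sketched in code. This is a hedged illustration, not part of the blog's workflow: a small helper that joins a column of item numbers into the comma-separated string the D365FO “match” filter expects. The sample item numbers are made up.

```python
# Sketch: build a comma-separated value list for the D365FO "match" filter
# from a column of item numbers, replacing the Excel/Notepad steps above.
# The sample item numbers are assumptions for illustration.

def build_matches_filter(item_numbers):
    """Join item numbers into the comma-separated string D365FO expects."""
    return ",".join(str(i).strip() for i in item_numbers if str(i).strip())

items = ["1000", "1001", "1002"]  # e.g. pasted from the Excel column
print(build_matches_filter(items))  # 1000,1001,1002
```

Paste the resulting string straight into the filter field instead of doing the manual search-and-replace.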

Learn Excel VLOOKUP

VLOOKUP is essential to learn, because it lets you check and look up data across multiple Excel sheets. A typical scenario in the retail world is when a vendor sends a new price list and you want to import it. Often this is delivered as an Excel sheet with the vendor item number, the item barcode, and the price. Most retailers prefer to have their own item numbers, but then you have the issue of mapping the item barcode from the vendor price list to your own product number. Here is how I recommend my customers do it:

  1. Export all D365FO item barcodes to excel (There is an entity for this, or open the barcodes from the retail menu)
  2. In the vendor excel price list, create a VLOOKUP field to lookup the D365FO product number based on the item barcode.

  3. Then you can create an Excel sheet with your own product numbers, and you can import them using “Open in Excel” or through a data management import job.
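The VLOOKUP mapping above can be sketched in code as well. This is an illustrative sketch only; the field names and sample data are assumptions, not the actual entity schema:

```python
# Sketch of the VLOOKUP mapping in code: translate vendor price-list rows
# to your own product numbers via the exported D365FO barcode list.
# Field names and sample data are assumptions for illustration.

def map_pricelist(d365_barcodes, vendor_rows):
    """d365_barcodes: {barcode: own product number} from the barcode entity export.
    vendor_rows: [(vendor item no, barcode, price), ...] from the vendor sheet.
    Returns (mapped rows ready for import, vendor items with no barcode match)."""
    mapped, unmatched = [], []
    for vendor_item, barcode, price in vendor_rows:
        product = d365_barcodes.get(barcode)
        if product is None:
            unmatched.append(vendor_item)  # the code equivalent of VLOOKUP's #N/A
        else:
            mapped.append((product, price))
    return mapped, unmatched

barcodes = {"5701234567890": "ITEM-001"}
rows = [("V-100", "5701234567890", 99.0), ("V-101", "0000000000000", 49.0)]
print(map_pricelist(barcodes, rows))
```

Just as in Excel, checking the unmatched list before importing catches barcodes the vendor sent that you have not registered yet.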

     

     

Happy weekend, friends!

First Aid Kit for Dynamics 365 for Retail; A messy blog post

First, I want to say that Microsoft Dynamics 365 for Retail is the best retail system in the world. What we can do is just amazing! This blog post is going to be a mess without meaningful structure, because its purpose is to quickly give 911 help to retailers so that they can continue their daily operations. This blog post focuses primarily on MPOS (Modern POS) with an offline database and a local RSSU (Retail Store Scale Unit). It will also be changed incrementally, and new topics will be added, so please feel welcome to revisit later.

MPOS Hardware

Microsoft does not give recommendations on hardware, but they have tested some hardware. I can also share what works in a scenario where an offline database is installed on the MPOS.

HP RP9 G1 AiO Retail System, Model 9018
Microsoft Windows 10 Enterprise 64-bit OS – LTSB
HP RP9 Integrated Bar Code Scanner (as a secondary mounted scanner)
128 GB M.2 SATA 3D SSD
16 GB RAM
Intel Core i5-6500TE 3.3 6M 2133 4C CPU
HP RP9 Integrated Dual-Head MSR – Right (for log-on card reading)
HP L7014 14-inch Retail Monitor – Europe (for dual display)
HP LAN Thermal Receipt Printer – Europe – English localization (TC_POS_TERMALPRINT_BTO)

A small tip: OPOS devices are slow and unpredictable. Try to avoid them. With this hardware, though, we still had to use OPOS for the receipt printer and the cash drawer.

All drivers related to this machine are available here.

Payment terminals

Building payment connectors is time consuming, but Microsoft has provided documentation and samples, available here. Personally, I prefer ISV solutions for this.

◾ Ingenico iPP 350 payment terminal (requires an ISV payment solution)

Additional Scanners

◾ Symbol DS9808

◾ Datalogic Magellan 3200VSi

Remember to open the scanner documentation and scan the programming barcodes to make sure you enable Carriage Return/Line Feed, adjust beeping, etc.

Generic preparation recommendations when having issues

The following sections describe some preparation steps you should be ready to perform.

Install TeamViewer on the MPOS device

To make sure that a professional can quickly analyze the device, we always try to use or install TeamViewer on the RSSU and MPOS devices. This makes it possible to access the machines remotely. Please follow security precautions when using TeamViewer.

Start collecting information

Dynamics 365 for Retail contains a comprehensive set of events that are logged in the system and available to IT resources. Please check out the following page for additional troubleshooting steps.

https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-component-events-diagnostics-troubleshooting

The following section contains issues experienced with manually installing Dynamics 365 MPOS.

If you cannot figure it out quickly, create a Microsoft support request as fast as you can. Normally Microsoft responds quickly and can give recommendations, but often they will need information from the actual machine to see if there are issues related to software or hardware. MPOS and RSSU log a tremendous amount of information that is relevant to a support case. Take pictures and screen dumps, and collect data.

Event logs

Always look into the event logs on the MPOS and the RSSU. Also learn to export the event logs, as they can give valuable information about what is wrong. The following event logs are of interest:

•    Windows > Application
•    Windows > Security
•    Windows > System
•    Application and Services Logs > MPOS/Operational

Machine information

Collect Microsoft System Information, which shows the devices installed in the MPOS, the device drivers loaded, and other associated system topics. To collect this data:

  • Run a Command Prompt as an Administrator
  • Execute MSINFO32.exe
  • Go to Menu File > Save as machine.nfo

Backups of the local database

Take backups of the RSSU and the local database, as they can be handy for analyzing the data composition of the database. Sometimes Microsoft will ask for the exact database version and information like:

  • What version of SQL is this?

    Further, is this Standard, Enterprise, Express, etc.?
    => Run query select @@version and share the resulting string.

  • How large is the SQL DB at this time?
  • Is there still plenty of space available on the hard drive?
  • What is the current size of the offline database and RetailChannelDatabase log file?

RSSU installation and Checklist

The setup and installation of RSSU is documented in the Microsoft DOCS https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-store-scale-unit-configuration-installation

  • Operating system is Windows 10 Enterprise LTSB with a separate disk for SQL. SSD disks are highly recommended!
  • SQL Server 2016 Standard edition with full-text search installed locally on the server.
    – I would not recommend SQL Express on an RSSU with multiple MPOSes installed.
  • Install .NET 3.5 and 4.6 and IIS, and run Windows Update before setup.
  • Make sure that the SSL certificates (RSSU and MPOS) have been installed and set up on the machine. Remember to add them to your Azure account.
  • Verify that you have Azure AD credentials that you can use to sign in to Retail headquarters.
  • Verify that you have administrative or root access to install Retail Modern POS on a device.
  • Verify that you can access the Retail Server from the device. (like ping with https://XXX.YY.ZZ/RetailServer/healthcheck?testname=ping)

  • Verify that the Microsoft Dynamics 365 for Retail, Enterprise edition, environment contains the Retail permission groups and jobs in the Human resources module. These permission groups and jobs should have been installed as part of the demo data.

A small but important note about the RSSU: it is designed to always have some kind of cloud connection. If it loses this connection, strange issues start to occur, especially in relation to RTS calls (Real-time Service calls).

Set Async interval on RSSU

This has been described in a previous blog post.

Installation of MPOS issues

There are a number of prerequisites that need to be followed; they are available on Microsoft Docs. Read them very carefully and follow them to the letter. Do not assume anything unless it is stated in the documentation. Also read https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-device-activation. Here are my additional tips:

When having customizations or extensions

If you have made extensions, make sure that the developer who made the deployable package built it with “configuration = Release”. There are scenarios where the MPOS installation otherwise gives issues like this.

There are also scenarios where you may want to make an MPOS build with configuration = Debug for internal use; please take a look at the following Microsoft blog post.

Having the right local SQL express with the right user access on the MPOS

If you are making a retail POS image (with Windows and SQL preinstalled), please make sure to select the right SQL version (currently SQL 2014 SP2). If SQL Express is not already installed, the MPOS installer will automatically download and install it. But the file is 1.6 GB, so it is recommended to install SQL Express manually or have it as part of the standard image. SQL Express is available here; select the SQLEXPRADV_x64_ENU.exe.


There are ways of using SQL Express 2017 with MPOS, but I recommend waiting until Microsoft officially includes this in their installer. Also remember that SQL Express has some limitations: it can only use 1 GB of RAM and has a 10 GB database size limit.

I recommend creating two users on an MPOS machine:

– A PosUser@XXX.YYY, a user with very limited rights on the machine. Customers often want auto-login to the machine with this user, but it needs administrator elevation whenever it has to do administrative tasks on the machine.

– A PosInstaller@XXX.YYY, which has administrator rights on the local MPOS machine.

When installing SQL Express, remember to add both the PosUser and the PosInstaller as users in SQL; otherwise the installer struggles to create the offline databases.

Cannot download MPOS package from Dynamics 365

If you try to manually download the installation package, Windows Explorer is sometimes set up to deny this.

The reason could be a certificate problem with the package. The workaround is to use Chrome for the download.

Cannot install the MPOS Offline package

When installing the MPOS, the following error may appear. In many cases the user must be elevated to administrator. If you receive the following error, it means that the version you are installing is older than the existing version, and the current version must be uninstalled first. Do not try to install a higher version than is deployed in your cloud RSSU default database, as this is not supported. Also, if you need to downgrade an MPOS, uninstall the MPOS first and then reinstall the older release.

PowerShell scripts for manual uninstalling of MPOS

In 95% of situations, just uninstalling the MPOS app should work. But if you are out of options, Microsoft has created an uninstall PowerShell script.

Cd “C:\Program Files (x86)\Microsoft Dynamics 365\70\Retail Modern POS\Tools”

Uninstall-RetailModernPOS.ps1

I often experience that we need to run the uninstall in the following sequence:

1. Run it as a local administrator

2. Then an “uninstall” icon appears on the desktop, which we need to click

3. Run it again as a local administrator

Then the MPOS is gone, and you can reinstall the correct MPOS.

Connectivity issues

Here are some tips on connectivity issues, and how to solve them.

MPOS is slow to log in

When starting the MPOS, it can sometimes take a few seconds before it is available. We see this if you have a slow internet connection with high latency. The MPOS does some work against the cloud, and this just takes time.

MPOS cannot go online after being offline

I think this behavior is currently a bug that can happen in certain situations, for instance if the RSSU loses internet connectivity. Microsoft is investigating the causes. If the MPOS cannot go online after having been offline, it is possible to reactivate the MPOS to get it online. In the event log you may see issues like this: “UpsertAndValidateShifts”.

Rename the file: C:\Users\[POS-User]\AppData\Local\Packages\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt\AC\Microsoft\Internet Explorer\DOMStore\DSSWV5L9\microsoft.dynamics.retail[1].xml

Then reactivate the MPOS with the RSSU address, register, and device, and log in with the D365.posinstaller.

IMPORTANT: Remember to select a hardware station when logging into the MPOS afterwards!

This is not a supported “fix” from Microsoft, and it is expected that Microsoft will find a permanent solution to this issue.

MPOS cannot connect with the payment connector

The following is mainly related to issues that can happen with a third-party payment connector using a PIN pad. In most cases this is not relevant for those using the standard or other payment connectors.

1. First check that Hardware station is selected on the MPOS.

2. The next step is to reboot the PC

3. If it is still not working, copy the file MerchantInformation.xml to the folder “C:\ProgramData\Microsoft Dynamics AX\Retail Hardware Station” AND to C:\Users\[POS-User]\AppData\Local\Microsoft Dynamics AX\Retail Hardware Station. This ensures that payment also works as expected in offline mode. MerchantInformation.xml is a file that is downloaded from the cloud the first time the POS is started. If you change the hardware profile, it must be downloaded again, as described in the next step.

4. If it is still not working, open the hardware profile and set the EFT service to “Payment connector and test connector”. This will download the MerchantInformation.xml again.

Then run the 1090 distribution job. After X minutes, try to restart the MPOS and perform a payment. This should also automatically regenerate the MerchantInformation.xml. Microsoft is working on a fix for this, and you can follow the issue here.

PS! Normally a production environment should not need a connection to the Microsoft test connector.

Retail offline database exceeds 10 Gb limit

To ensure that a POS doesn’t exceed the SQL Express 10 GB size restriction, I have created a SQL script that reduces the size of the log file. Please consider implementing it on all POSes.
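The author's script itself is not included here, so as a hedged illustration only: the kind of T-SQL statement such a script typically issues is SQL Server's documented DBCC SHRINKFILE command. The helper below just builds that statement; the logical log-file name and target size are assumptions you would replace with your own.

```python
# Illustrative sketch only: build the kind of T-SQL statement a log-shrinking
# script typically runs (DBCC SHRINKFILE). The logical log-file name and the
# target size are assumptions, not values from the actual script.
def shrink_log_sql(logical_log_name, target_mb=256):
    """Return a T-SQL statement that shrinks the given log file to target_mb."""
    return f"DBCC SHRINKFILE (N'{logical_log_name}', {target_mb});"

print(shrink_log_sql("RetailOfflineDatabase_log"))
# DBCC SHRINKFILE (N'RetailOfflineDatabase_log', 256);
```

Run the generated statement against the offline database, for example via a scheduled sqlcmd task on each POS.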

Getting strange errors like “The channel does not exist or was not published”

In some rare situations you could experience errors like this.

Our experience is that this can happen if the database on the RSSU is overloaded and not able to respond to MPOS connections. Log into the RSSU and check whether the CPU, database, or disks are unable to respond. We have experienced this with SQL Express on the RSSU. Also try not to push too many distribution jobs too frequently. In one situation we uploaded 400,000 customers while running the distribution job 1010 (customers) every 5 minutes. That “killed” the RSSU when running SQL Express.

Getting strange errors like “A database error occurred”

We have also experienced this when the RSSU is overloaded. Remember that the Microsoft-recommended RSSU hardware needs to be scaled according to how many MPOSes are connected and to the data and transaction volume. Get a SQL expert to evaluate the RSSU setup prior to go-live, and remember to volume test the setup.

How to fix it? Scale up your RSSU.

Getting strange errors like “We were unable to obtain the card payment accept page URL”

We have also experienced the following issue. The solution was simple: remember to enable the local hardware station on the MPOS.

Getting strange errors like “StaffId”, when returning a transaction

In a situation where there is a connection between the MPOS and the RSSU, but the RSSU doesn’t have a connection to the cloud, and you perform a “return transaction”, you may get the following error.

“Return transaction” is defined as an operation that requires online RTS (Real-Time Service) calls. The following list defines all POS operations and whether they are available in offline mode.
The solution in this situation is therefore to use the POS operation “Return product” instead on the MPOS.

Keep an eye on your devices.

In the menu item Channel client connection status, you can see the last time each device was connected.

Functional issues

By functional issues I mean issues related to user errors and other functional problems that can occur.

Dynamics 365 for Retail on version 8

Even though version 8 has been launched for Dynamics 365 for Finance and Operations, I have not yet (10 May 2018) seen that Retail is supported on version 8. So before going forward on version 8, please check with Microsoft support.

Barcode scanned as tendered currency amount

This is a funny issue that can occur. Some background story is in order here. A customer wanted to pay for a product in another currency, and the cashier selected “pay currency” on the MPOS, ready to key in the amount the customer was paying. Unfortunately, the cashier scanned the product barcode instead, and the MPOS committed the sale as if the customer had paid 7,622,100,917.80 in that currency and should have 5,707,750,079,417 in return (local currency). Lesson learned: always remember to set the “Overtender maximum amount” parameter and the Amounts fields.

How to fix it? You actually need to create a Microsoft support request to have Microsoft make changes in the database. This takes time, and it first has to be performed in an updated staging environment. It can take a lot of time! So make sure you set these parameters correctly before you go live.
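To illustrate what the “Overtender maximum amount” parameter guards against, here is a minimal sketch (not the actual POS code) of the kind of check it enables, rejecting tendered amounts that exceed the amount due by more than the configured maximum:

```python
# Sketch of the check the "Overtender maximum amount" parameter enables:
# reject tendered amounts far above the amount due, so a scanned barcode
# cannot be committed as a payment. The limit value here is an assumption.
def validate_tender(amount_due, tendered, overtender_max):
    """Return True if the tendered amount is within the overtender limit."""
    return tendered - amount_due <= overtender_max

print(validate_tender(100.0, 150.0, overtender_max=100.0))             # True
print(validate_tender(100.0, 7_622_100_917.80, overtender_max=100.0))  # False
```

With a sensible limit configured, the barcode-as-amount mistake above is rejected at tender time instead of ending up in a support request.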

Cannot post retail statement, because of a rounding issue.

This is a known issue, and Microsoft has a hotfix for it. Always make sure you periodically update your system with the latest hotfixes. Here is my small tip: try 4–5 times to click Post, and then it suddenly goes through and gets posted. We do not know why!

Retail statement (Legacy) and Retail Statement

In version 7.3.2, Microsoft released a new set of functionality for calculating and posting retail statements. You can read more about it here. Microsoft recommends that you use the Retail statements configuration key for the improved statement posting feature, unless you have compelling reasons to use the Retail statements (legacy) configuration key instead. Microsoft will continue to invest in the new and improved statement posting feature, and it is important that you switch to it at the earliest opportunity. The legacy statement posting feature will be deprecated in a future release.

Access hidden Retail menu items.

The form “Retail store transactions” contains all retail transactions received from the MPOS/RSSUs; here you will find sales, logins, payments, etc. The first step for any user should be to personalize this form and show only the relevant fields and columns (not done here).

You can dig deeper into the transactions by clicking the “Transactions” menu.

If I open “Payment transactions” here, I get a filtered view of the payment transactions related to that receipt.

BUT! In many cases you would like to look at ALL the payment transactions, not only those related to a specific receipt. However, there is no menu item that lets you see all payment transactions in one form.

Here is my tip. Right-click on the form, and you can see the form name. Click on that…

And you should be able to see the menu item name.

Then copy your D365FO URL, and replace the menu item name, and open it in another browser tab.

Then you get a nice list of all payment transactions, regardless of which receipt they are connected to.

This procedure can be used in most places in Dynamics 365. For Retail this is excellent, because sometimes you need to find specific transactions. If you need to reconcile banked transactions (where you have a bag number), you can use this approach to see all banked bag numbers in a single form. Here is a list of the most common menu items:

Sales transactions(items) &mi=RetailTransactionSalesTrans
Payment transactions &mi=RetailTransactionPaymentTrans
Discount transactions &mi=RetailTransactionDiscountTrans
Income/Expense transactions &mi=RetailTransactionIncomeExpenseTrans
Info code transactions &mi=RetailTransactionInfocodeTrans
Banked declaration transactions &mi=RetailTransactionBankedTenderTrans
Safe tender transactions &mi=RetailTransactionSafeTenderTrans
Loyalty card transactions &mi=RetailTransactionLoyaltyRewardPointTrans
Order/Invoice transactions &mi=RetailTransactionOrderInvoiceTrans
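The copy-URL-and-replace step above can be sketched as a small helper. The base environment URL below is a placeholder; D365FO deep links carry the company in the `cmp` parameter and the menu item in the `mi` parameter:

```python
# Sketch: build the deep link described above by setting the &mi= menu-item
# parameter on your D365FO URL. The base URL and company are placeholders.
from urllib.parse import urlencode

def menu_item_url(base_url, company, menu_item):
    """Compose a D365FO URL that opens the given menu item directly."""
    query = urlencode({"cmp": company, "mi": menu_item})
    return f"{base_url}/?{query}"

print(menu_item_url("https://myenv.cloudax.dynamics.com", "USRT",
                    "RetailTransactionPaymentTrans"))
# https://myenv.cloudax.dynamics.com/?cmp=USRT&mi=RetailTransactionPaymentTrans
```

Swap in any of the menu item names from the list above to open the corresponding form in a new browser tab.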

Unit conversion between <unit 1> and <unit 2> does not exist.

If you use retail kitting and have kits with intraclass unit conversions, there is an issue that Microsoft is working on. This covers scenarios where the included kit line is stocked in pieces but consumed in centiliters. Luckily, a fix is expected.

Wrong date format on the POS receipt.

In en-US the date format is MM/DD/YYYY. In Europe we use DD/MM/YYYY. The date format on the receipt is controlled by the language code defined on the store. We often prefer to have en-US as the language on stores, but this gives the wrong date format. To get the right date format on the receipt, you have to maintain product names/descriptions in multiple languages (like both en-US and en-GB) and specify that the language on the POS store should be en-GB. We are working on finding a better and more permanent solution to this.

Dual display.

Microsoft writes: “When a secondary display is configured, the number 2 Windows display is used to show basic information. The purpose of the secondary display is to support independent software vendor (ISV) extension, because out of the box, the secondary display isn’t configurable and shows limited content.” In short, you have to create/develop it yourself in the project. This requires a skilled Retail developer who masters the Retail SDK, C#, and JavaScript.

Credit Card payment with signature

In certain situations the payment terminal is capable of processing the payment, but for some reason the “waiting for customer payment” dialog does not close. In most cases this is related to the payment terminal performing an offline transaction, in which case the terminal prints a receipt that the customer must sign. For such cases we have created a separate payment method called “pay with signature” that is posted in exactly the same way as a credit card payment. The cashier can then continue the payment processing, register that the payment was OK, and print the receipt.

If the cashier has done something very wrong, suspend the transaction

If, for some reason, the cashier is not able to continue the transaction, the cashier has the option of suspending it and moving on. Later, the POS experts can resume the transaction and find out what went wrong.

Setting up MPOS in tablet mode

The MPOS works very nicely in tablet mode. But if you have a dual display, the PC cannot be put into tablet mode. We have not found a way to fix this; if you know one, please share.

MPOS resolution and screen layout does not fit the screen

Do not just set the MPOS resolution to the screen resolution. If there is a title bar, you need to subtract the title-bar height from the screen layout. This is important in scenarios where you have dual displays.

Use lock screen and not log off on the registers.

The log-out/log-in process is more “costly” from a resource perspective than the lock operation.

Keep the MPOS running (but logged out) when not using the device.

Dynamics 365 periodically sends new data to the MPOS offline database throughout the day and night. The MPOS is then “fit for fight” when the user logs in.

Run Distribution jobs in batch

My guidelines on retail distribution jobs are that all Retail jobs start with an R prefix, followed by a number. Download distribution jobs are R1000–R1999. Upload distribution jobs are R2000–R2999. Processing batch jobs are R3000–R3999. Retail supply chain processes are R4000–R4999.

There are a number of jobs distributing data from Dynamics 365 to the store databases (RSSU) and the offline databases. Here are the jobs and the recurrence I suggest.
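The R-number convention above can be expressed as a small classifier. The naming scheme is the author's; the function itself is just an illustrative sketch assuming names of the form "R" followed by four digits:

```python
# Sketch: classify a batch job by the R-number convention above.
# Assumes job names of the form "R" followed by four digits, e.g. "R1010".
def job_category(job_name):
    """Map an R-prefixed job name to its category in the naming scheme."""
    number = int(job_name[1:5])
    if 1000 <= number <= 1999:
        return "Download distribution"
    if 2000 <= number <= 2999:
        return "Upload distribution"
    if 3000 <= number <= 3999:
        return "Processing batch"
    if 4000 <= number <= 4999:
        return "Retail supply chain"
    return "Unknown"

print(job_category("R1010"))  # Download distribution
```

A consistent scheme like this makes it easy to scan the batch job list and spot, say, all upload jobs at a glance.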

Those are my tips for today. If you have read this all the way to the end, I’m VERY impressed; let me know in the comments.

Failed ERP implementation will change partners to become trusted advisors.

A Norwegian customer won a compensation case against an ERP implementation partner after the customer terminated the parties’ agreement on the supply of a new ERP system. The Norwegian district court awarded the customer compensation assessed at NOK 288 million (USD 36.7 million). Originally the contract was worth NOK 120 million. You can read the complete story here: http://www.selmer.no/en/nyhet/felleskjopet-agri-wins-district-court-case. The court decision is expected to be appealed.

Luckily this was NOT a Dynamics 365 implementation, and the customer is actually replacing the failed ERP system with Dynamics 365. The reason I wanted to write about this story is that it has implications for how much risk and responsibility an ERP implementation partner can take. A major part of the ERP partner channel consists of smaller companies with fewer than 100 employees that cannot take the risk of getting into such a situation. There are always problems and risks beyond what an ERP partner can control. Partners are not the developers of the standard software; they implement it, and in some cases add extensions. Also, the cloud-based software runs on Azure, which is beyond the partner’s control.

How can this change partner behavior? Partners are moving toward becoming verticalized trusted advisors, but with limited responsibilities. We can give recommendations based on what we know about the software and how to use it efficiently, but the costs are more on a T&M (time and material) basis. It will increasingly be the customers themselves who are responsible for the implementation and timetables.

Some customers will not accept this change, but others do. There are currently resource constraints in the Dynamics 365 partner channel, and we see partners avoiding customers that take a back-seat approach to their implementation projects. The sales focus will shift toward customers that take more of the responsibility themselves and understand how to take a more dynamic and agile approach. A 400-page requirements document is not a good start for an ERP project, as the digitalization possibilities are accelerating. We also see that customers no longer run a two-year ERP implementation project before going live. They run a 90-day project to go live with only part of their requirements. The project then takes on other areas, and they extend their use of Dynamics 365.

At the end, I include some trusted-advisor recommendations that I think can inspire anyone about to start a project.

D365FO – Speed up Retail RSSU download performance

If you don’t know what an RSSU is, I suggest reading this. In short, the RSSU is about having a database locally in your store that MPOS or CPOS can connect to. It is typically used if you have an unreliable or slow internet connection.

One of the things you can evaluate is implementing Azure ExpressRoute, and Microsoft has released a whitepaper for Dynamics 365. This can really speed up connectivity performance.

Another thing I find annoying is that the local RSSU only picks up the distribution files every 15 minutes. The cloud channel database is really fast, but this means that when you send new products or prices to the RSSU, it can take up to 15 minutes before the data is available in the MPOS. That is a really annoying wait when testing.

In the Microsoft documentation we are instructed to use the Data Sync interval to speed up the synchronization. But somehow it does not work.

But there is a way around this. On the local RSSU there is a configuration file where you can modify how often the RSSU should request new data to be downloaded.

Then change the following two lines:

Then just restart the AsyncClient service and reset IIS on the RSSU box. The distribution of data to the RSSU then really speeds up.

But what is the recommended setting from Microsoft?

The recommendation is to make the RSSU request packages at an interval that is a proper fraction of the interval at which the packages are generated. So if you are sending new products every 10 minutes, use a 5-minute download interval. If you are sending new products every 5 minutes, use a 2-minute download interval. The higher the frequency, the more often the RSSU will request data, and some consider this a waste of bandwidth.
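The rule of thumb above (10 minutes → 5, 5 minutes → 2) can be sketched as a tiny helper. Halving and rounding down, with a one-minute floor, reproduces both of the examples; treat it as an illustration of the guideline, not an official formula:

```python
# Sketch of the rule of thumb above: pick a download interval that is a
# proper fraction (here, roughly half) of the package generation interval.
def suggested_download_interval(generation_minutes):
    """Halve the generation interval, rounding down, with a 1-minute floor."""
    return max(1, generation_minutes // 2)

print(suggested_download_interval(10))  # 5
print(suggested_download_interval(5))   # 2
```

Use the result as the interval in the AsyncClient configuration discussed above.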

Good luck in your retail implementation

D365FOE-Moving to a new tenant

Companies change, merge, sell, and purchase each other, and we encounter situations where it is a requirement to move to a new/other Azure AD tenant.

But that’s not a small thing. We asked Microsoft through a support ticket how to do this, hoping it was a small formality and that Microsoft had some magic tricks for it. They don’t. But I can explain the process we are following to achieve this.

  1. Create Azure subscription on new tenant.
  2. Buy the required licenses in a new CSP subscription for the D365FO DEV/TEST/PROD instances.
  3. Add admin user on new tenant to the new LCS.
  4. Setup new azure connector in existing LCS project with the new subscription.
  5. Deploy new DEV/TEST/PROD environments for the new connector in the new tenant
  6. Setup new VSTS in the new tenant.
  7. Copy all checked-in code from old to new VSTS.
  8. Import all checked-in code from new VSTS to new DEV environment.
  9. Compile and install the code packages into the new stage environment.
  10. Request DB copy from “old” PROD to the “old” stage environment.
  11. Export an Azure bacpac from the “old” stage environment.
  12. Import the Azure bacpac into the “new” DEV environment.
  13. Run AdminUserProvisioning tool with admin user from new tenant to swap tenant.
  14. Repopulate email settings, users and other settings lost by the copy.
  15. Check, Check, Check….Fix, Fix, Fix.
  16. Request DSE to copy new stage to new PROD (only possible once).
  17. Check, Check, Check….Fix, Fix, Fix.
  18. Suspend/end the “old” CSP subscription.

In the process you will lose all documents that are attached to records/stored in the old environment. There are also some other expected issues.

Do expect to spend some time on such a process. And it’s a good thing to perform the DB copy two times (first time just for validation and test). Microsoft is looking into how to improve this process, but this is how we are performing it.

If anyone in the community has better ideas, feel free to share them.

BIG credits to my colleague HAKAM.

Great stuff on the D365 roadmap

What we currently see is that more and more power-user functionality is introduced step by step to make Dynamics 365 ready for the next natural technological step: becoming a true SaaS solution built on Azure Service Fabric. Check out this video from Microsoft for what I hope is the future architecture direction for Dynamics 365. But before we get there, there has to be a natural transition of making Dynamics 365 more configurable and less dependent on creating your own customizations and extensions.

Now and then I try to keep an eye on the D365 roadmap for signs of this transition, and today I found these nice features that I think will be highly valuable. I have copied the descriptions from the roadmap; the release date is not clear, but I look forward to presenting these great enhancements to my customers.

1. Power users can add custom fields to forms without developer customization

Many application customizations involve adding one or more fields to existing tables and including them in application forms. Most of your customizations may be comprised of adding fields.

Customizations are expensive because they require developer intervention for development, test, and code life cycle management. Customizations also need to be managed and migrated from one environment to another.

We are making it easier to add custom fields to forms in Dynamics 365 for Finance and Operations, Enterprise edition. No longer will developer customization be needed. Instead, a power user will be able to add a custom field to a table and then place that field on the form using personalization. An IT administrator will then be able to share the personalization with others in your organization.

2. Product lifecycle state

The product lifecycle state will be introduced for released products and product variants. You can define any number of product lifecycle states by assigning a state name and description. You can select one lifecycle state as the default state for new released products. Released product variants inherit the product lifecycle state from their released product masters. When changing the lifecycle state on a released product master, you can choose to update all existing variants that have the same original state.

To control and understand the situation of a specific product or product variant in its lifecycle, it is a best practice in Product lifecycle management solutions (PLM) to associate a lifecycle state with a variable state model to products. This capability will be added to the released product model. The main purpose of this extension is to provide a scalable solution that can exclude obsolete products and product variants, including configurations, from master planning and BOM-level calculation.

Impact on master planning – The product lifecycle state has only one control flag: Is active for planning. By default, this is set to Yes for all product lifecycle states. When the field is set to No, the associated released products or product variants are:

  • Excluded from Master planning
  • Excluded from BOM level calculation

For performance reasons, it is highly recommended to associate all obsolete released products or product variants to a product lifecycle state that is deactivated for master planning, especially when you work with non-reusable product configuration variants.
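To make the planning behavior concrete, here is a minimal Python sketch of the exclusion rule described above. The classes and field names are hypothetical and only mirror the roadmap description, not the actual product model:

```python
from dataclasses import dataclass

@dataclass
class LifecycleState:
    name: str
    is_active_for_planning: bool = True  # the single control flag described above

@dataclass
class ReleasedProduct:
    product_number: str
    lifecycle_state: LifecycleState

def products_for_master_planning(products):
    """Master planning and BOM-level calculation only consider products
    whose lifecycle state is still active for planning."""
    return [p for p in products if p.lifecycle_state.is_active_for_planning]

active = LifecycleState("Active")
obsolete = LifecycleState("Obsolete", is_active_for_planning=False)
items = [ReleasedProduct("1000", active), ReleasedProduct("1001", obsolete)]

# Only product 1000 is included in planning
print([p.product_number for p in products_for_master_planning(items)])
```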

Find obsolete released products and products variants – You can run an analysis to find and update obsolete released products or product variants.

If you run the analysis in a simulation mode, the released products and product variants that are identified as obsolete will be displayed on a specific page for you to view. The analysis searches for transactions and specific master data to find the released products or product variants that have no demand within a specific period. New released products that are created within the specific period can be excluded from the analysis.

When the analysis simulation returns the expected result, you can run the analysis by assigning a new product lifecycle state to all the products that are identified as obsolete.

Default value during migration, import, and export

When migrating from previous releases, the lifecycle state for all released products and product variants will be blank.

When importing released products through a data entity, the default lifecycle state will be applied.

When importing released product variants through a data entity, the product lifecycle state of the released product master will be applied.

Note, the ability to set individual product lifecycle states using the data entities for released products or product variants is not supported.

3. Users can pin PowerApps to forms and share with peers to augment functionality

Have you built a PowerApp that uses or shows data from Dynamics 365 for Finance and Operations, Enterprise edition? Or have you been using a PowerApp built by someone in your organization? Would you like to use PowerApps to build last-mile applications that augment the functionality of Finance and Operations?

Your users can build PowerApps without having to be expert developers to extend ERP functionality. PowerApps developed by yourself, your organization, or the broader ecosystem can now be used to augment ERP functionality by including them within the Finance and Operations client.

Your users will be able to pin PowerApps to pages in Finance and Operations. After they’ve been added, these changes can be shared with peers in your organization as personalizations.

 

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft or my employer.

Dynamics 365 : Adding check-digits to number-sequences

In Dynamics 365 we use number sequences to automatically create identifiers like product numbers, customer numbers etc. I'm a fan of having these numbers as “clean” as possible, and I always try to convince my customers to use pure numbers. Why? Take a look at the keyboard:

The num-pad is the fastest way of typing in data. I also see that users normally perform a lookup and see the description of what they are selecting anyway.

But let's take a scenario: we use a number sequence to create product numbers. We will then typically get product numbers like this:

Then I have often seen another problem arise: typing errors from the num-pad actually get a “hit”, because with a number sequence we can almost always find a product that has the same number as the one the user mistyped.

If you try using your credit card online, you will see that the number is not accepted if any digit is wrong. The solution there is to build check digits into the number.

I created a very small extension to solve this in Dynamics 365, with just a few lines of code. In the following example, the “green” part is from the number sequence, and the yellow part is from the modulo-10 check digit calculation.

In this way the user can never enter a product number (or any other identifier) with a typo that still hits a valid record; the number is only accepted when it is 100% correct.

In the screen for number sequences I added an option to add the check digit to my generated numbers.

I wanted to share this with you, because it is so simple:

1. Create an extension on the table “NumberSequenceTable”. Then add the extended datatype (YesNo) as a field, and name it “AddCheckDigit”.

2. Add this field to the “Setup field group”

Then we have the parameter in place, and it is available on the number sequence as shown earlier.

3. Then create a new class and replace all code with the following :

Here I'm creating an extension to the NumberSeq class with one method, num, that adds the modulo-10 digit to my number sequence.

In it I check whether my new “AddCheckDigit” flag is enabled, and I skip the logic for continuous and manual number sequences; the number sequence must also be allowed to change to a higher number.
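Since the screenshots with the X++ code are not reproduced here, the following Python sketch shows the modulo-10 (Luhn) calculation that the extension performs. Treat it as an illustration of the algorithm only, not the actual X++ code:

```python
def mod10_check_digit(number: str) -> str:
    """Compute a modulo-10 (Luhn) check digit for a numeric string.

    Digits are weighted 2, 1, 2, 1, ... starting from the rightmost
    digit; any weighted digit above 9 has 9 subtracted before summing.
    """
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch) * (2 if i % 2 == 0 else 1)
        if d > 9:
            d -= 9
        total += d
    return str((10 - total % 10) % 10)

def with_check_digit(number: str) -> str:
    """Append the check digit to a number produced by the sequence."""
    return number + mod10_check_digit(number)

# The classic Luhn example: check digit for 7992739871 is 3
print(with_check_digit("7992739871"))  # 79927398713
```

A mistyped digit anywhere in the result changes the expected check digit, so the lookup simply gets no hit instead of silently hitting the wrong product.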

That's it!

Now you can have check-digits on products, customers, vendors, sales orders, purchase orders etc.

PS! I have not tested this code 100%, but the community is full of brainpower that hopefully can share additional findings on bugs or flaws.

If you like this, vote for the idea at https://ideas.dynamics.com/ideas/dynamics-operations/ID0002954

Agile PODs: Organize for efficiency

Have you ever seen the TV series “House of Lies”? It is a quite funny comedy that focuses on the extrovert lifestyle of a management consulting team. First of all it is a comedy and not very realistic, but it manages to illustrate the concept of how to create the most efficient organizational form for solving problems: the Agile POD.

Agile pods are small custom agile teams, ranging from four to eight members, responsible for a single task, requirement, or part of the backlog. This organizational system is a step toward realizing the maximum potential of agile teams by involving members of different expertise and specialization, giving complete ownership and freedom, and expecting the best quality output. This blogpost is about how to organize a consulting business that is non-silo organized around an actual service product.

In many consulting companies today, we see increasingly alarming signs that prevent the full utilization of people and resources. Some of the signs are:

– Many non-direct-operative managers. (If you have >5 levels from bottom to top you have an issue)
– Too many internal meetings. (Why Meetings Kill Productivity)
– Too much time spent generating budgets, forecasts and Excel spreadsheets. (No actual customer value)
– Organized into silo teams with similar expertise. (Functional, Technical, Support etc.)
– New project teams for each project. (Spending 2 months getting to know your team members)
– Outdated internal systems and processes.
– Mixed marketing messages and costly (pre-)sales and implementation processes.
– Many partners are currently not ready for the Dynamics 365 cloud-based disruption. (Sticking to waterfall, while agile accelerates)

Agile PODs are a different way of organizing a team for efficiency. What does an agile POD look like? In this example we have a small 5-person permanent team. This team is specialized in running some tasks/phases in the initial Dynamics 365 implementation: the Agile preparation phase.

In this example the POD owner is the Solution Architect. The roles in the POD can be described as:

The solution architect:

The solution architect runs the POD and has overall responsibility for it. It is the POD owner who recruits the POD members. The solution architect is the “face” of the POD, organizes the work in the POD, and discusses the solutions with the key decision makers at the customer. Very often the solution architect has lots of experience. In agile terms this is also the SCRUM master, and a very operational role.

The Finance expert:

When implementing Dynamics 365, there is always a need to know how to connect the operational processes to accounting and reporting. This person is highly knowledgeable in financial management reporting, Power BI and Excel. He also knows how to improve reporting from the financial perspective by defining financial dimensions, and by setting up Tax, Bank, Fixed assets, HR and Budgeting/Forecasting.

The Vertical Domain Expert:

How to implement best-of-breed processes is the vertical domain expert's expertise. In the retail domain this means being an expert on master data, categorization, stores, POS, devices etc.

The Technical Architect:

In a cloud-based system, there is a need to understand how environments are deployed and set up, and how to make it all ready for efficient application lifecycle management. The technical architect knows the ITIL framework. When a change is needed, the technical architect creates the necessary documentation/VSTS backlogs for developers to execute on.

The Junior consultant:

The junior consultant is here to learn, offload and support the team. As experience increases, the junior will eventually get more responsibility and hopefully some day move into other positions in the team.

Within the team we are looking for T-shaped people, who have breadth to their expertise and also a few deep expert knowledge domains. Valve (the gaming company that delivers the Steam store) described what we are looking for with the following picture of the T-shaped model. Take a look at their employee handbook. The same concept and idea are relevant for Dynamics 365 consulting companies.

Agile PODs must therefore specialize their own services. Each POD team must build WBSs (work breakdown structures) that enable deliveries to utilize the entire POD.

The idea is that a POD team is sent out into the field, delivers the predefined services, and returns safely afterwards. Then it is off to the next client to deliver the same service again. As you may understand, it is therefore important that the services delivered are predefined. In this concept there is not one team that delivers a complete implementation; in larger implementations, a sequence of Agile PODs covers the implementation.

This way of organizing is not new. The concept has been applied for decades by contractors and construction companies. A house is not built by a single team; it is built by a sequence of specialized teams. A POD team has responsibility for a limited set of tasks that need to be performed in a predefined sequence.

By organizing operational skills into PODs executed in sequence, we get a balanced unit. One pain I often see in Dynamics 365 consulting companies is that bottlenecks arise around a few selected roles, typically the solution architects. This imbalance results in high utilization of these roles while other roles have low utilization, because work is not correctly distributed. We also see consultants being placed into project teams because they have free time, not because they have the right knowledge. This increases costs and reduces customer satisfaction. Ultimately it also reduces profitability for the implementation partner.

Agile PODs do not solve everything, but they make the core operational services lean and efficient. Any consulting company still needs sales, project management and customer support as separate functions.

As seen in the figure above, each vertical focus area has a management function that focuses on building Agile PODs. The idea is not to hire single consultants but to create new PODs. The POD itself must define the services it can deliver. The role of vertical department management is therefore to focus on recruiting new PODs. As Valve explains it, hiring becomes the most important thing in the universe.

A model for money and revenue must also be established. All departments must be self-financing and balanced according to how the revenue stream is defined. One element that is common in the consulting business is bonuses. I personally don't like the idea of bonuses, but I see that it is very difficult without them (a necessary evil). The model below is an example of how different departments can be rewarded.

Marketing and Sales: The concept of cloud-based systems is that the customer doesn't need to purchase all the software upfront. They rent the software in the cloud and only pay a monthly fee. The Marketing and Sales divisions must therefore be financed by the monthly license revenue, and the bonus would be accumulating. The purpose is to make sure new customers are onboarded and that existing customers are happy with the services. As a new seller in this model there will not be much bonus at the start, as you have few customers onboarded. But as more customers get on board, the bonus accumulates, and after 2-3 years there will be a decent bonus and decent grounds for investing more in marketing.

Project and management consulting: As described earlier, these are the only more “permanent” roles in the project. They will ask Agile PODs to come in and solve specific tasks. Their services are based on T&M (time and material), and their bonus is based on the revenue (not margin) of the project.

The Agile PODs: These services are charged as a combination of T&M and predefined product services. The predefined product services are the key here: create WBS structures where the price and delivery are clearly defined. The bonus here is a team bonus, distributed internally according to a key, but the POD team can also choose to use the bonus for other purposes like training or conferences. Remember that an Agile POD is a self-contained unit with costs, revenues and margins. If the POD is not profitable, it will be dissolved and the team unattached/let go.

Platform Services: This department makes sure all services/software around Dynamics 365 work as expected. This means making sure Azure tenants are set up correctly, that Office is working, and that services like CDS (Common Data Service) and PowerApps are set up as expected. All their services should be predefined product services, and the bonus should be based on margin. Why? Because we want to become better and better at delivering these predefined services. The faster they are delivered, the more margin is generated. This is a win-win situation for both the customer and the consulting company.

Customer support/after-sales: Customer support and after-sales is all about delivering excellent customer service after the project has gone live. Its revenue should be based on support agreements and add-ons. The bonus for the department is based on accumulated revenue, because these should be recurring services that the customer pays for each month. If the customer is happy with the services provided, they will continue to use them. The alternative for the customer is Microsoft Premier Support, which can be quite costly and is not that relevant in most cases.

At the end of this blogpost I would like to visualize how we envision the Agile PODs: training on our services and delivering excellent customer service on time and on budget.


And if we don't, the consequences will be equally visible.


Additional details on Agile PODs can be found here:

https://www.globant.com/build/agile-pods

https://www.agileconnection.com/article/using-agile-pods-realize-potential-your-team

Video : https://www.youtube.com/watch?v=IwJKRaocdxI

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

Dynamics 365 Pre go-live checklist

I asked Microsoft if I could share their pre-go-live checklist that is used in the FastTrack program. And they said yes!

So here is a copy for you, with what customers must be prepared to answer before Microsoft deploys the production environment.

Pre Go-live Health Check list:

  1. Solution acceptance by users: UAT
    1. Is UAT completed successfully? How many users participated in UAT?
    2. Did UAT test cases cover entire scope of requirements planned for go-live?
    3. How many bugs/issues from UAT are still open?
    4. Any of the open bugs/issues a showstopper for go-live?
    5. Was UAT done using migrated data?
  2. Business signoff:
    1. Has the business signed off after UAT that the solution meets business needs?
    2. Solution adheres to any company/industry specific compliance (where necessary)
    3. Training is complete
    4. All features going live are documented, approved and signed off by customer
  3. Performance:
    1. How was the performance in UAT? Is it acceptable for go-live?
    2. If Performance testing was done, then are there any open actions from it?
  4. User & Security setup:
    1. How many security roles are being used? Are all security roles set up and tested?
    2. Have users that need access at go-live been set up with the correct security roles?
  5. Data Migration:
    1. Data migration status – Masters & Open Transactions/Balances
    2. Business has identified owners for data validation?
    3. Review cut-over plan: Business & Partner teams are comfortable with the plan?
    4. Does the data migration performance fit within the cut-over window?
  6. Configuration Management:
    1. Are the configurations updated in Golden Configuration environment based on changes in UAT?
    2. Data stewards/owners identified and process in place for post go-live changes in Master/Configuration data?
    3. All Legal Entities configured for Go-Live?
    4. Are configurations documented?
  7. Integrations:
    1. Review list of integrations and readiness plan for each
    2. Latency requirements and performance criteria are met
    3. Integration support is in place with named contacts/owners
  8. Code Management
    1. Production fixes/maintenance process defined?
    2. Code promotion (between environments) process is in place, documented and the entire team knows and understands the process
    3. Code promotion schedule for production is in place?
    4. Emergency process for code promotion to production is defined?
  9. Monitoring and Microsoft Support
    1. LCS diagnostics setup and knowledge transfer to customer
    2. Issue resolution and Escalation process defined – LCS support is verified?

A Practical Guide for Dynamics 365 Iterative Implementation

With the introduction of Dynamics 365 and cloud-enabled tools like Office and VSTS (Visual Studio Team Services), we have accelerators towards iterative ways of performing an implementation.

Digitalization also enables the move from a document-and-template approach to a committed, task-driven implementation with sprint-based sub-deliveries, where all parties are involved. This also increases visibility, removes lead times and results in faster deliveries. Adopting digitalization and going iterative in a project is not only about using new tools and processes like VSTS; it also covers the practices, principles, values and mindsets of the project participants.

The iterative preparation

As described in earlier blogposts, it is vital to have a clear concept of process modeling where processes are broken down into sub-processes and requirements. A WBS (work breakdown structure) is the tool to plan and execute on deliverables. The traditional solution analysis is transformed into an iterative preparation phase that lets us define clear work packages that can be solved in sprint executions.

The iterative preparation should have a formalized set of workshops, and the main purpose is to generate an approved solution backlog. It is normally recommended to complete the preparation phase before going into the execution phase. But in larger projects the preparation phase can run in parallel with the execution phase, where customer-approved solution backlogs can be planned into sprints and started before the phase ends.

Please remember that iterative implementation models do not give a detailed picture of scope or costs! The actual deliveries are defined by the customer approved solution backlog.

The following flow chart shows the main activities in the preparation phase.

The granularity and level of detail needed in the deliverable documents is agreed in the project. A middle, practical way is to create the deliverable documents with a minimum set of information and a final conclusion, and then URL-link the content in the documents to a VSTS site for further information and process.

The preparation phase is highly customer intensive and requires a detailed plan, invitations, workshops and time to document the findings. Before participating in preparation workshops, it is recommended that the participants have completed a “Learn, Try, Buy” phase. An example project plan for the preparation phase can look like this for a retail customer.

As seen in the example plan, the preparation can have dedicated tracks for the functional areas, and these will vary based on the vertical models being used. The granularity of the sub-topics should follow the first and second levels in the process models.

Use process models to define scope and topics.

The contents of the preparation workshops should be organized based on the process models. This makes sure that best practices are discussed and taken into account for the execution phase. The value chain shown here is divided into three main epic tracks: management processes, operational processes and support processes. There are different models for each vertical, as seen in the following figure, which I typically use to illustrate the EG retail value chain model.

(Figure: Process models)

Each of the “boxes” in the model represents a topic where business processes are discussed and defined. The model will provide:

  • Workshop agenda templates
  • UAT test script templates and recommended process recordings
  • Stack/technology recommendations
  • Process flows (Visio or BPM in LCS)
  • Solution backlog templates
  • KPI assessment recommendations (APQC)

From Model to solution backlog

Based on the findings from the preparation phase, a solution backlog is created. The most efficient tool for this is VSTS (Visual Studio Team Services), set up using the CMMI definitions. Here all backlogs are organized in a hierarchy of Epics, Features, Backlogs, Tasks and Impediments.

The general consensus on these definitions is:

  • Epic – Something that transcends projects/releases/versions.
  • Feature – Something that cannot be delivered in a single sprint, but can be delivered in a single release.
  • Requirement (CMMI) / Product backlog item (SCRUM) – Something that can be delivered in a sprint, and has an estimate.
  • Bug – Something that is not working, can be solved in a sprint, and has an estimate.
  • Task – Assigned work elements with remaining effort.
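As an illustration (a simple containment model of my own, not the VSTS API), the relationship between these levels can be sketched like this:

```python
# Illustrative containment model of the CMMI work-item hierarchy in VSTS.
hierarchy = {
    "Epic": {"contains": "Feature", "scope": "transcends releases"},
    "Feature": {"contains": "Requirement", "scope": "one release"},
    "Requirement": {"contains": "Task", "scope": "one sprint, estimated"},
    "Task": {"contains": None, "scope": "assigned work with remaining effort"},
}

def path_to_root(level):
    """Walk upwards: which levels contain the given one, nearest first?"""
    parents = {v["contains"]: k for k, v in hierarchy.items() if v["contains"]}
    chain = []
    while level in parents:
        level = parents[level]
        chain.append(level)
    return chain

print(path_to_root("Task"))  # ['Requirement', 'Feature', 'Epic']
```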

To relate the structures to CMMI, the following guideline can also be followed.

More details on how to create a backlog in VSTS can be found here. Best practice is that the VSTS site is located on the customer's tenant, and that external project participants are invited. The VSTS backlog can also be regarded as a WBS (work breakdown structure). In the following example you can see how the backlog is structured according to a business process model.

VSTS also provides dashboards where a complete status can be seen and monitored. Setting up these dashboards is based on defined queries against tasks and backlogs, and they are easy to tailor to the needs.

How to fill in a backlog item

A product backlog item (and the other elements) contains only a small set of required fields.

What should be regarded as a minimum set of information defined on a backlog is:

  • Name
  • Description
  • Acceptance Criteria
  • Effort estimated.

If additional fields are needed, like APQC ID, planning dates, additional names etc., they can very easily be added to the form, and it is just as easy to extend with new statuses. See https://www.visualstudio.com/en-us/docs/work/customize/customize-work for more information.

In the preparation phase perform these activities:

  • Right-size backlog items by splitting larger items into smaller items. No backlog item should be larger than can be completed in a single sprint.
  • Identify and fill gaps in the product backlog. Capture new ideas and stories, architecture and design requirements, and other spikes.
  • Reorder the backlog to represent today’s priorities and business value focus.
  • Ensure well-defined acceptance criteria have been added to each item.
  • Revisit estimates on backlog items and adjust them upwards or downwards based on the most recent understanding of scope and acceptance criteria.
  • Review all potential backlog items for the upcoming sprint to make sure they are well understood, and that any additional work required to support their development is understood by both the product owner and the team.

Mapping a VSTS product backlog to the functional requirement documentation

Most often it is mandatory to also deliver a functional requirement document (FRD). This document is delivered at the end of the preparation phase. The reason this document is important is that it explicitly defines all requirements and is a commercial document. But instead of writing a document of hundreds of pages, try to URL-link the requirements in the document to VSTS. Then the FRD only contains the vital information that regulates responsibilities and commercial conditions.

The preparation phase ends when the deliverables from the phase are approved and signed by the customer. After the phase is approved, all information on the VSTS site can be copied into Excel backup sheets that represent a snapshot of the status at the end of the prep phase.

Roles involved in an iterative preparation phase

The roles running an iterative preparation phase depend on project size and complexity. As a minimum, it is recommended that the following roles are present in this phase:

  • Project manager (Planning and facilitating)
  • Solution architect (Overall approval of solution)
  • Technical lead (Integrations and migration)
  • Functional consultants (covering training and the functional areas)
  • Junior business consultants (assisting in writing and maintaining the solution backlog)

Customer project participants need to match these roles.

Iterative Execution phase

As the solution backlog is filled, sprints can be filled with approved backlog items. The overall goal of a sprint is to deliver a set of backlog items that have been broken down into specific tasks. The duration of a sprint is determined by the scrum master, the team's facilitator. Once the team reaches a consensus on how many days a sprint should last, all future sprints should be the same. Normally, a sprint lasts between 2 and 4 weeks. During the sprint, the team holds daily stand-up meetings to discuss progress and brainstorm solutions to challenges. The customer may not make requests for changes during a sprint, and only the scrum master or project manager has the power to interrupt or stop the sprint. At the end of the sprint, the team presents its completed work to the customer, and the customer uses the criteria established at the sprint planning meeting to either accept or reject the work.

The following diagram shows the activities involved, and the expected deliverables from a sprint.

Define the Sprint log

Solving a backlog item may require several resources. When defining the sprint log, each backlog item is split into tasks that define the sequence, the remaining work and the assignee. This means having tasks for analysis and design of the backlog item, for creating test scripts, for developing, and for performing the test of the backlog item. As seen in the following figure, a backlog item is divided into tasks, and each task must have a clear description and a “remaining work” estimate. If specific resources are needed to solve the task, the task should also be assigned to that person.

When a task has been assigned to a person, that person commits to the task and agrees to deliver it within the defined sprint.
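As a sketch of the idea, a backlog item's tasks with their "remaining work" estimates can be modelled like this (the roles and hour figures below are made-up examples, not values from the document):

```python
# Hypothetical sketch: a backlog item split into tasks, each with a
# description, an assignee (the person committing to it) and a
# "remaining work" estimate that drives the sprint burndown.
from dataclasses import dataclass


@dataclass
class Task:
    description: str
    assigned_to: str       # the person committing to deliver the task
    remaining_work: float  # estimated hours left, updated daily


def remaining_work(tasks):
    """Total hours left on a backlog item."""
    return sum(t.remaining_work for t in tasks)


backlog_item_tasks = [
    Task("Analysis and design", "Solution architect", 8),
    Task("Create test scripts", "Functional consultant", 4),
    Task("Develop the change", "Developer", 16),
    Task("Perform the test", "Customer super user", 4),
]

print(remaining_work(backlog_item_tasks))  # 32
```

Summing the estimates per backlog item is what makes capacity planning in the sprint overview possible.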

Conducting a sprint planning meeting

The customer, project manager and the scrum master start a sprint by selecting the backlog items that should be solved in the current sprint. This is done in VSTS, where the backlog items are dragged and dropped onto the selected sprint, or marked with a specific iteration.

When planning a sprint, also identify what resources are needed in the sprint. In the sprint overview, define the capacity and the resources required. This makes planning easier, and resource/capacity constraints can be identified by the project manager/scrum master.

The daily sprint meeting

This meeting is the most important meeting each day. It should only last 15 minutes, start at the same time every day and be held in the same place every day. The scrum master is responsible for making the meeting as efficient as possible. It is a team meeting, where each team member explains what they are working on and whether there are any issues. Do NOT use the sprint meeting to try to solve issues. Just identify and share. Use other meetings to solve and go deeper into each topic. Any important notes identified according to the CMMI process can be described in the discussion field on the task/backlog item.

Also use "Add Tag" to mark tasks and backlog items that need special attention and follow-up.

Reporting status and completion

All backlog items have a state. The meaning of these states can be seen in the following flow charts:

Teams can use the Kanban board to update the status of backlog items, and the sprint task board to update the status of tasks. Dragging items to a new state column updates both the State and Reason fields. If additional intermediate steps and stages are needed, this can be customized in the VSTS settings.
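For the curious, a board drag-and-drop roughly corresponds to a call against the VSTS work item REST API, where field changes are expressed as a JSON Patch document. The sketch below only builds the request body; the endpoint URL, authentication and exact API version are left out and should be checked against the official REST API reference:

```python
# Hedged sketch: the JSON Patch body that a State/Reason update sends to
# PATCH .../_apis/wit/workitems/{id}. Field reference names such as
# System.State are part of the work item tracking API.
import json


def state_update_patch(new_state, reason):
    """Build a JSON Patch document updating the State and Reason fields."""
    return [
        {"op": "add", "path": "/fields/System.State", "value": new_state},
        {"op": "add", "path": "/fields/System.Reason", "value": reason},
    ]


body = json.dumps(state_update_patch("Resolved", "Code complete"))
print(body)
```

The same patch format is used whether the change comes from the board, Excel or a custom tool.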

Documentation

One of the disadvantages of an iterative implementation is that there are no clear and sharp end-phases, and this often does not fit well with commercial contracts. It is therefore important to make sure the deliverable documents are created/updated according to the progress. But remember to be active in documenting as much as possible in VSTS, and define the creation of deliverable documents as backlog items in the sprint. Expect to spend at least 10% of your time in VSTS to give visibility to others.

Conduct Solution testing

Quality is a vital aspect of any implementation, and everybody in the team owns quality – including developers, managers, product owners, user experience advocates, and customer project members. It is vital that solution testing is a customer responsibility and that the testing is structured and planned accordingly.

VSTS provides rich and powerful tools everyone in the team can use to drive quality and collaboration throughout the implementation process. The easy-to-use, browser-based test management solution provides all the capabilities required for planned manual testing, user acceptance testing, exploratory testing, and gathering feedback from stakeholders.

Creating test plans and performing tests are among the most vital elements of an iterative implementation. Please see the following URL on UAT testing: https://www.visualstudio.com/en-us/docs/test/manual-exploratory-testing/getting-started/user-acceptance-testing . Building a test plan is therefore a mandatory step, and it ensures that the defined acceptance criteria have been met.

The following documents are the input to the solution testing:

Flow Test Script
UAT Test Script by Function
UAT Test Script by Role
UAT Test Script Details

Test & Feedback

Visual Studio Team Services Marketplace contains a ton of jewels, and one add-in that can accelerate testing and feedback is the Test & Feedback extension to VSTS.

When installing it, you get a small icon in Chrome, where test and feedback can be given.

When setting it up, you just point to the VSTS site. Then you are ready to start giving feedback; to collect backlog items, bugs or tests, just click on the play button.

While navigating and taking screenshots, notes and video, it all gets recorded, with URL, time etc.

When done with the recording, create a bug, a task or a test case:

After saving the bug, I see that it has been created in VSTS:

I now have a complete bug report in VSTS that the consultants can start to process, to identify whether this is a bug or an "as designed" feature.

Microsoft Tools available for a Dynamics 365 project.

When working in a project, it is important to know that Microsoft tools and services are tightly connected, and that each tool can simplify and enrich the user experience and efficiency. The most common tools can be seen in the following figure. Also note that there are powerful integrations between these tools, and this section provides some small tips on how to make them work together.

Having a clear understanding of the tools available can speed up implementations, and also give better visibility to all stakeholders. In the following topics, some of these benefits are discussed.

Microsoft VSTS: Visual Studio Team Services

Create a VSTS site at http://VisualStudio.com. For internal projects, create it using your domain account. For customer projects, it is recommended to create the site on a customer-controlled domain, and then add the domain users as guest users. Other elements relating to VSTS have been covered earlier in this document.

Who uses it? All implementation project members, both from EG, the customer and 3rd-party vendors.
When to use it? Every day and in all SCRUM meetings.
Pricing 5 users free, stakeholders free. Paid users are $6/month.
Members with Visual Studio subscriptions don’t need licenses. https://www.visualstudio.com/team-services/pricing/

Microsoft Excel: Upload and Maintain

Microsoft Excel can be used to import and publish structures into VSTS when Visual Studio Community edition is installed locally on your PC. This makes it possible to extract all fields and values by using a VSTS-defined query.

Then a process model may be imported, and a best-practice product backlog is ready to be processed. For step-by-step instructions on how to use Excel with VSTS, take a look at https://www.visualstudio.com/en-us/docs/work/office/bulk-add-modify-work-items-excel
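As a rough illustration of the bulk-add idea (the column names and rows below are hypothetical examples, not the actual Excel template):

```python
# Hypothetical sketch: each spreadsheet row becomes a work item with a
# type, a title and (optionally) a parent, ready to be published to VSTS.
rows = [
    ("Epic",    "Category management",  ""),
    ("Feature", "Define categories",    "Category management"),
    ("Feature", "Maintain assortments", "Category management"),
]

work_items = [
    {"Work Item Type": wit, "Title": title, "Parent": parent or None}
    for wit, title, parent in rows
]

print(len(work_items))  # 3
```

The point of the Excel integration is exactly this kind of row-to-work-item mapping, done in bulk instead of item by item.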

Who uses it? Solution Architects and vertical responsible.
When to use it? At the start, when uploading process models and WBSs, and when mass-updating backlog items and tasks.
Pricing Office 365 prices. https://products.office.com/en-us/compare-all-microsoft-office-products
Visual Studio Community edition is free.

Microsoft Project: Plan and Breakdown

Microsoft Project is a vital tool for streamlining quotations, WBS and resource planning. Built-in vertical templates, familiar scheduling tools, and access across devices help project managers and teams stay productive and on target. Microsoft Project is also directly integrated with VSTS, and created backlog items and tasks/activities can be exported to Dynamics 365 for Operations, creating a complete end-to-end process covering "from quote to cash".

"Plan-the-work" and "work-the-plan" are essential activities where all stakeholders can participate and cooperate, ensuring that we deliver what is planned and that the invoice the customer receives corresponds to the agreement and contract. Having predefined WBS structures in Microsoft Project simplifies project planning, and VSTS is auto-updated according to how the planning is performed.

 

Who uses it? Presales, Sales and Project management.
When to use it? Microsoft Project is excellent for handling WBS structures when planning and quoting a project. Microsoft Project is also used for planning resources and for reaching project deadlines. For more information on how to connect VSTS and Microsoft Project, take a look at https://www.youtube.com/watch?v=GjYu5WmcQXo
Pricing 30$/user/month for Project Online Professional
https://products.office.com/en-us/project/compare-microsoft-project-management-software?tab=tabs-1

Microsoft Outlook: Inform and Alert

Some stakeholders do not want to go deep into VSTS, or to extract information from Excel/Project. When tasks are assigned they want to be informed, and when issues are resolved, they want to be notified. Setting up notifications in VSTS solves this requirement and will keep project participants informed of any changes. The email also contains a URL directly to the task/backlog item.

Setting up notifications is done in VSTS, and individual filtering can be defined.

Who uses it? All project participants receive assigned notifications. Project managers and the solution architect receive all notifications.
When to use it? When Outlook is used to keep participants informed.
Pricing Outlook included with Office 365 prices. No additional costs.

Microsoft Teams: Discuss and Involve

Informal communication is vital for any project. Tools like Skype for Business take care of meetings and screen sharing, but Microsoft Teams gives flexible communication on all platforms and keeps everyone in the loop. Users can see content and chat history anytime, including team chats with Skype that are visible to the whole team. Private group chats are available for smaller group conversations. Microsoft Teams can also function as the central hub, with direct tab pages for VSTS, Dynamics 365, LCS, SharePoint, etc. Since this September, Microsoft Teams supports guest users, and since these sites are normally on the customers' tenants, we consultants log in with our company email addresses.

The VSTS Kanban board is easily accessible from Microsoft Teams.

Who uses it? Project participants involved in a project who need informal communication and the ability to work asynchronously with a discussion history.
When to use it? When more direct communication is needed, and especially for developers.
Pricing Teams normally included with Office 365 prices. No additional costs.

Microsoft SharePoint online: Documents and Archive

Even in a highly interactive and iterative environment, there is a need for documents, especially deliverable documents. For this, SharePoint Online is used to store, track and develop the documentation. The internal folder structure is optimized for the sales process and contains commercially binding documents. The SharePoint Online site mentioned here is the customer's property. The following document structure can be recommended.

After the project is delivered, the SharePoint site will remain as the documentation together with the VSTS site.

Who uses it? Project participants involved in a project who need to create or use formal documentation and deliverables.
When to use it? When producing specific deliverables.
Pricing SharePoint is included with recommended Office 365 E3 prices.

Microsoft Flow and PowerApps: Workflow and Apps

Microsoft Flow and PowerApps are quite new technologies in the Microsoft Office family. The idea of bringing these tools into scope is to enable process and workflow automation in implementations. PowerApps is also a great tool for data collection in testing and for getting feedback.

Some examples of Microsoft Flow:

Streamline approvals by sending files with approval requests

  • I'm sick button → inform colleagues and block calendar.

Some examples of PowerApps:

Who uses it? Superusers and Architects
When to use it? Used for automating tasks and to create fast simple prototype apps that can assist in the implementation
Pricing Flow and PowerApps are included in a Dynamics 365 Plan 2 license.

I hope this blog post gives an insight into the digitalization process partners are now using in Dynamics 365 implementations. The Microsoft sites contain tons more information, and I recommend exploring more of the Microsoft technology stack available for Dynamics implementations.

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

 

Common Data Service (CDS); Talk the talk, but not walk the walk yet

With every new release and platform update we see a clear Microsoft commitment to support deeper integration across the D365 portfolio. It's still in the early stages, but we see the direction. New developments such as parts of Dynamics 365 for Talent are using the CDS. I think that in the future we will see more business apps utilizing the CDS as the data storage. The benefit of the common data model is that applications can work against data without needing to explicitly know where that data is coming from. To see what the Microsoft business platform is, take a look at https://businessplatform.microsoft.com

The CDS also has an important role in both process and data integration between the Sales (CRM) and Operations (ERP) apps, and the current status after the July release and Update 9 is that we have 6 templates we can use to test some scenarios with D365. The Business platform admin center is where the CDS data integrations are set up. You can reach it from https://admin.businessplatform.microsoft.com or from https://admin.powerapps.com

The first thing that needs to be set up is connection sets. Connection sets are a collection of two or more connections, organization mapping information, and integration keys that can be reused among projects.

In this case I have set up an integration where data can go from D365 for Finance and Operations → CDS → D365 Sales:

Then the organization IDs are mapped across the 3 services.

And finally the integration keys.

After the connection sets have been set up, the CDS knows how to connect to the different systems. We can then create an integration project.

We can then select between the current 6 templates.

I have only been able to test the Accounts and Products. The Sales Quotes did not work for me (but they are also in preview currently).

After the integration project has been created, it is possible to add more integration tasks and make changes to the mapping. If there are issues with the mapping, or you need to map additional fields, it will show in the "Issues" column.

In the mapping there are transformation functions, like this one where the item type is changed when transferred from CDS to Dynamics 365 for Sales.

The integration can also be scheduled to run at specific times:

Some unofficial benchmarking: I managed to transfer 200 products to CRM in 40 seconds.
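For context, that benchmark works out to the following throughput (simple arithmetic on the numbers above; the 10,000-product extrapolation is my own illustration):

```python
# Throughput of the unofficial benchmark: 200 products in 40 seconds.
products, seconds = 200, 40
rate = products / seconds  # products per second
print(rate)                # 5.0

# At that rate, a hypothetical 10,000-product catalogue would take about:
minutes = 10_000 / rate / 60
print(round(minutes, 1), "minutes")  # 33.3 minutes
```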

Summary:

We are definitely going in the right direction, but we are not yet ready to "walk the walk". Microsoft currently only supports synchronization in one direction, but bi-directional synchronization is on the roadmap. This is important for supporting "multi-master" scenarios where the same fields, like customer name, can be authored on either side. It seems Microsoft is focusing on providing business scenarios like "Prospect-to-Cash", and what we mainly have now is a preview of how this will look in the future. The mapping from the templates is not complete and needs more work.

My final comment is that in the future we will have a very easy-to-use system for connecting the unified business operations apps. I had hoped that this business platform would be more ready with the July release, but it needs more releases to be useful in actual implementations. This feature needs to grow beyond the "minimum viable product". I hope and expect that we will see this in place through the fall, and that the next release will have more mature integration templates. So far, good work Microsoft, but please hurry! To learn more, take a look at https://docs.microsoft.com/en-us/common-data-service/entity-reference/dynamics-365-integration.

 

 

 

 

Dynamics 365 CSP; What happens when a customer is not paying their monthly bill?

Disclaimer: In this blog post I share my understanding of what happens when customers no longer pay their bill for Dynamics 365. Please consult your partner or Microsoft for the actual and official interpretation.

First some definitions: Most mid-size customers will buy Dynamics 365 through a partner that is a CSP (Cloud Solution Provider). Larger corporations will have the opportunity to buy Dynamics 365 directly from Microsoft through an EA (Enterprise Agreement). The information here relates to the CSP way of purchasing licenses.

When buying Dynamics 365, most customers will receive a monthly bill from their CSP partner. But the great thing about the CSP model is that you may adjust the number of users for the next period. Dynamics 365 has a lower limit of 20 licenses, but above this the customer may make changes.

But keep in mind that even though you receive a bill for the upcoming month, there is still a commitment for the base subscription period. For Dynamics 365, the subscription period is normally 12 months. I think I finally understood why the name is Dynamics 365; the reason may be that you have to buy it for at least 365 days.

As stated earlier the customer normally receives a bill each month. But what happens when the customer stops paying the bills?

1. First, following the normal procedure, the customer is notified by their CSP that payments are missing.

2. The next step is that the CSP partner will suspend the subscription. This is done by changing the status of the subscription to "Suspended".

3. When a subscription status is changed to “Suspended”, this puts the subscription into a “data retention” mode. This means that end-users will not have access to any services, but administrators will still have access to the data associated with this subscription.

4. 60 days after a subscription is suspended, the subscription is moved to a "de-provisioned" state. At this time, all data is removed.
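The retention timeline above can be expressed as a small helper (assuming exactly 60 calendar days from the suspension date, as stated; the example date is made up):

```python
# Sketch of the retention timeline: a suspended subscription is
# de-provisioned (all data removed) 60 days after suspension.
from datetime import date, timedelta


def deprovision_date(suspended_on: date) -> date:
    """Last day of the data-retention window for a suspended subscription."""
    return suspended_on + timedelta(days=60)


print(deprovision_date(date(2017, 1, 1)))  # 2017-03-02
```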

The conclusion is therefore: pay your bill or lose your data.

When I think of it… it's just like paying your electric bill: no pay, no power.

 

 

Dynamics 365; New VM type cuts your Azure bill

When deploying your Azure-based VMs, the most common VM size is D13, with 8 cores and 56 GB RAM. This VM costs approx. 924 USD per month according to the pricing calculator.

Microsoft has made some new sizes available:

https://azure.microsoft.com/en-us/blog/price-reductions-on-l-series-and-announcing-next-generation-hyper-threaded-virtual-machines/

The new size is named D13 v2 Promo and will cost 749 USD/month. If you have MSDN, the cost is further reduced to 429 USD/month.
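Put as numbers, the quoted prices translate into these savings (simple arithmetic on the figures above):

```python
# Monthly savings from switching a D13 VM to the D13 v2 Promo size,
# using the price points quoted in the post.
d13 = 924            # USD/month, standard D13
d13_v2_promo = 749   # USD/month, D13 v2 Promo
msdn = 429           # USD/month with MSDN pricing

savings = (d13 - d13_v2_promo) / d13
msdn_savings = (d13 - msdn) / d13
print(f"{savings:.0%}")       # 19%
print(f"{msdn_savings:.0%}")  # 54%
```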

You cannot select this size in LCS, so you must log into the Azure portal after deploying and change the size there.

Nice

Dynamics 365 ideas

Microsoft has released a new site for posting and voting on ideas for Dynamics 365: https://ideas.dynamics.com

It is assumed that this replaces all other forums and sites, like Connect, Yammer, etc., for suggesting new and exciting functionality. It covers the entire Dynamics 365 stack, and the concept is that each person can suggest and vote on up to 50 suggestions per forum. Microsoft has also created statuses on each suggestion, and additional comments can be added by registered participants.

A small suggestion to Microsoft on the site; Allow us to use our ADFS login, and not just our Live-ID login. (I guess I have to create a suggestion for this)

 

D365FO Channels, Insiders Preview and Update Policy

As announced with Update 4, Microsoft will release monthly updates so that new and existing environments can stay up to date with the latest innovations with the click of a button. Hopefully this makes it easier to stay on the newest platform. We also assume and hope that this approach will in the future be extended to cover business functionality (e.g. ApplicationSuite). A faster update cycle also results in more versions being in use at customers; as seen here, these are all the official releases Microsoft has made available for Dynamics 365 for Operations, and with a monthly update cycle the list will grow quickly. Keeping track of versions does not give customers any actual value. But in a SaaS approach, delivering faster and simpler updates to business functionality will require a better and more visible release policy, not one based on build numbers.

We need to make the upgrade and update experience easier to understand and easier to follow. The work the Microsoft Office and Windows teams have done is a great example of something I think we should also have for Dynamics 365: the introduction of release channels.

| Update channel | Primary purpose | Feature updates | Platform updates | Security updates |
|---|---|---|---|---|
| Preview Channel | Lets users evaluate the newest features of Dynamics 365 as soon as possible. Only available through an insider program, and not deployable into a production environment. | Weekly | Weekly | Weekly |
| First Channel | Provides users with the newest features of Dynamics 365. | Monthly | Monthly | Monthly |
| Current Channel | Provides users with the released and stable features of Dynamics 365. | Every 3 months | Monthly | Monthly |
| Deferred Channel | Provides users with new features of Dynamics 365 only a few times a year. | Every 6 months | Monthly | Monthly |

 

Which channel you would like should be a setting in LCS, and customers could switch between the channels as wanted. Visually, the release and update schedule could then look something like this.

With the introduction of Microsoft AppSource, this would mean that ISVs could commit to a specific channel, and "front-runners" like myself would have early access to the newest and hottest features. Some customers are deadly serious about stability, and new features could potentially cripple business processes. This way, customers can decide for themselves their pace for adopting new functionality.

Dear Microsoft; Can we have something like this also for Dynamics 365?

 

 

 

 

 

 

 

D365FO – Test & Feedback

Visual Studio Team Services Marketplace contains a ton of jewels, and one add-in I like a lot is the Test & Feedback extension to VSTS:

When installing it, you get a small icon in Chrome, where test and feedback can be given.

When setting it up, you just point to the VSTS site:

Then you are ready to start giving feedback and to collect backlog items, bugs or tests.

Let's say I want the implementation team to know that some changes are necessary on the "All customers" form. I then click on the feedback button and click "start recording".

While navigating and taking screenshots, notes and video, it all gets recorded, with URL, time etc.

When done with my recording, I want to create a bug, a task or a test case:

In this case, I create a bug, and while I'm typing I can even see that there is one similar bug already reported.

After saving the bug, I see that it has been created in VSTS:

I now have a complete bug report in VSTS that the consultants can start to process, to identify whether this is a bug or an "as designed" feature.

This means that in a Dynamics 365 project where VSTS is the central element, the feedback and testing just got simpler.

The Test & Feedback extension contains a lot more, but the rest you have to explore for yourself. Try out the video recording; that is really cool.

 

 

 

Warehouse Performance Power BI pack

Just a small reminder to my digital brain that Microsoft has released a Power BI pack aimed at the WMS industry. Here are some samples.

Inbound – Measure vendor delivery precision. Measure average put-away times for products and vendors, and measure how fast your workers are processing put-away work.

Outbound – Measure how many of the shipments are sent in full and on time. The pack provides the ability to measure early, late and on-time shipments in order to monitor outbound performance and ensure high customer service levels.


Inventory accuracy (the warehouse itself) – Every warehouse needs high inventory accuracy on locations in order to process shipments correctly. Measure inventory accuracy for locations and items based on inventory counting, with full visibility into discrepancies in quantity and percentage. The pack provides an easy way to monitor counting performance and inventory accuracy for items on locations.

 

Where can you find this package?

In the LCS shared asset library:

Thanks, Yi Zhang!

The new Warehousing App

Microsoft has released the new warehousing app, and the setup instructions can be found here: https://ax.help.dynamics.com/en/wiki/install-and-configure-dynamics-365-for-operations-warehousing/

You can download the Windows App here: https://www.microsoft.com/en-us/store/p/dynamics-365-for-operations-warehousing/9p1bffd5tstm

The setup instructions are very good, and in 10 minutes you should be able to get the app working as expected on your Windows 10 machine. The app also has a demo mode that lets you try it out without having to connect it to an environment.

Here are some pictures for your pleasure.

Thank you, Markus :)

Retail process modeling; Divide and conquer

I normally don't share much that is considered employer-specific IP/tools, but today I will make an exception. At EG we have for years been focusing on how to address the business processes for the retail industry, and how to name and classify these processes, by combining the APQC business process mapping and classification structure with an essential understanding of how to improve and implement retail business processes. This means we have a predefined approach for scoping and planning retail implementations. The key to this model is to ensure good scoping and planning of phased retail implementations based on the customer's actual processes.

At the top level in the EG Retail model, we group all epic processes into "Management processes", "Operating processes" and "Support processes", as seen in the following picture. We have then broken each process down into sub-processes (levels), pretty much according to APQC.

 

Level 1 – Operating processes

The operating processes are the day-to-day processes taking place at a retailer. We have divided the level 2 processes into 5 specific areas, as seen in the figure below.

1. Category management is all about grouping products into hierarchies with similar properties and attributes. This makes it possible to assign responsibilities and parameters at group level instead of at SKU level.

2. Sourcing and procurement is about making sure that the products are available for sale in the stores/channels. This means working with vendors and producers, and having clear planning strategies.

3. Retail logistics covers the processes that typically happen at the central warehouse, ensuring that when replenishment to stores is needed, it is sent at the right time.

4. Omni-channel is about being available to customers on multiple platforms throughout the customer's purchase experience. It stretches from brand awareness, store, web, mobile and loyalty to after-sales processes.

5. Store operations is what is happening at the physical store.

Each of these level 1 retail processes has been split into the following level 2 processes. In column 1 we have the parent process, and below we have the sub-processes in the horizontal boxes.

We can look deeper into the category management processes and see the following level 3 sub-processes. The red boxes from level 1 have been moved to the first column in level 2, and the sub-processes are shown in the horizontal columns.

For each and every retail process, we break the process down to level 3 or level 4, and we then decide how we are solving each of these sub-processes. This is done by color-coding the processes. As you can see in the following picture, most is solved in standard Dynamics 365, but also with some 3rd-party products. There are also processes that are not covered by the currently available solution stack.

At level 3 we have mapped each of these processes into APQC and into the LCS business process modeler. When we take the level 3 process called "Define categories", we have a relevant APQC process named 2.1.1, and this means that we (or APQC members) can extract some KPIs to define how this process is performing.

Together with APQC we can use these KPIs to measure how well this process is performing, and also compare the process with similar retailers using the same KPIs. This tells us if the process needs to be improved to achieve more.

Microsoft released a new APQC library in November 2016, available in LCS, in which Microsoft has defined 3774 business processes and 617 flow charts for Microsoft Dynamics 365 for Operations. This gives us a further ability to map the processes directly into Dynamics 365. Here I have searched for "category" to see which APQC and Dynamics 365 processes are supported.

Using the process mapping to create an implementation plan

When working with our customers, we can quickly build a scope plan and define which processes we want to start with and which to postpone to future projects. We can be clear about how quickly the ROI can be achieved, and we can start with the business processes that are low-performing today. In the sample scoping below, I show how we can start with the stores, then in project 2 enable the HQ supply chain/replenishment, and then finally start a project where logistics to the stores is in scope.

This means we can run phased retail implementation projects within the retailer's budget. Each of the "boxes" also contains process descriptions, VSTS backlog and task lists, UAT test scripts and workshop agendas. This means that when running a retail project, we don't have to start with a blank whiteboard.

In addition, the model has been mapped into Visual Studio Team Services. This means that the retail model is also an implementation model that can be used by project managers, consultants, developers, customer super users and stakeholders.

 

I hope this gives you some ideas on how we are approaching the retail market from a business process standpoint, delivering our implementations as predefined, repeatable services where Azure, Office 365, LCS, VSTS, ODM and all the other good Microsoft services are used to the full extent.

Retail is detail, and the future is bright :)

Check out KB 3206881 – New image preview control

Microsoft has released a hotfix for the document preview control, and it is actually quite cool. In the following picture you see that the preview pane has been updated with some new buttons.

Now we have ordinary functions like move, zoom and print, but we also have highlighting, blocking and some text functions.

This means we can make direct changes to the attached images, and this is interesting when we have scanned copies of invoices or any other document.

In the following picture, I have just highlighted some parts, and blocked some texts. I have also added a text of my own.

Why is this interesting? Because good developers are experts at "copy-with-pride" solutions, and we now have a new web-enabled control that allows us to create extended solutions for handling scanned documents.

I expect that we will very soon see small invoice approval apps using this feature, available at a fraction of the price we have seen before.

Try it out :) It's cool.

 

Warning; Generate Demodata; Financial period close

Dynamics 365 for Operations has a nice feature for generating demo data.

Here is my 1000 $ tip!
DON’T USE IT IN ANY IMPLEMENTATION PROJECTS!

This feature is only meant for generating demo data in the Contoso dataset, and it will corrupt and delete any real data you may have created. If you take a look at the class LedgerPeriodCloseWorkspaceDataCreation, which generates month-end financial closing data, you see that it only works against specific Contoso companies and personas defined in the Contoso company.

There is also a method executed at the beginning that simply deletes data, making sure any data you may have created is just gone.

Why Microsoft has decided to include this “demo data” package in the implementation deployment, I don’t understand…

…and if you wonder: yes, I made this mistake myself.

Try Dynamics 365 now

Microsoft is currently holding an online virtual Dynamics 365 launch party and I’m happy to see that Microsoft is delivering as promised. Take a look at it here:

Microsoft has also made available a trial experience of Dynamics 365 for Operations, available here. What you get is access to a 30-day multi-tenant trial, where basic testing can take place. In my case I got company number 037, and I cannot access the companies of other trial users 🙂

In this first release you will have limited access to 3 basic processes as defined in the trial experience task recordings. You can try to navigate around, but you have very limited access to create customers/vendors/products etc. Important: this is NOT a full-feature trial, and you NEED to follow the task guides as your guide to Dynamics 365.

Remember that Microsoft is constantly refining and improving the trial experience; if you want a full-blown trial, you need to contact a Dynamics 365 partner that can help you set this up. Later, Microsoft will release additional trial experiences and also support localized ones. Other industry-based trials, like Retail/POS experiences, are on the way.

Check it out!

My Dynamics 365 FastTrack experiences

If you have not heard about the Microsoft FastTrack program for Dynamics 365 onboarding, then this is the post for you. To put it simply: the FastTrack program is Microsoft’s involvement after the licenses have been purchased, to get you up and running quickly on the cloud platform.

It starts when the licenses have been purchased through the CSP portal (or through an EA agreement), and lasts until the live production system has been deployed.

When a Dynamics 365 deployment starts, we get a checklist of tasks that need to be completed as we move from one stage to another. The LCS implementation project looks a bit different from ordinary LCS projects.

As you can see here, there are a lot of checks that need to be confirmed before going live. In the process, some guidance is needed, and Microsoft provides this as a service included in the license. As the implementation moves forward, Microsoft conducts bi-weekly workshops, where each meeting has a predefined agenda with information and some room for discussion and guidance. The touchpoints are divided between actual workshops using Skype for Business and Tech Talks, which are a kind of webinar session.

The FastTrack program also defines roles and responsibilities, explaining what is expected from the parties involved in a Dynamics 365 rollout.

I have been lucky and have been involved in a complete cycle, and I have to say that I’m impressed with how this FastTrack program works. As Dynamics 365 is quite new, and the entire Dynamics ecosystem is trying to absorb the information made available, it is easy to get lost and to think that implementations are conducted the same way as earlier. If you expect that some hardcore system administrator/developer can jump into the sandbox/production environments, then you are wrong. Now things have to happen in a sequence and follow predefined quality steps to ensure that we get a rock-solid production environment.

Our FastTrack contact has always been available and has given us the “light touch” on the shoulder to guide the implementation and expectations. Remember that FastTrack is not about business processes, master data and project management; that is still handled outside of this program.

A small and important reminder: remember that you have to purchase your implementation licenses, and that you can start small and ramp up your license count as needed.

 

Testing Microsoft Flow for CRM –> AX integration

A few days ago Microsoft made the Flow connector available for preview, and you can read more about it here. What I wanted to see was whether I could make a very simplified flow, where a customer is created in CRM and then transferred to Dynamics AX.

The flow therefore consists of the following steps: when a record is created in CRM, a customer is created in AX. After that, I wanted an email to be sent to me.

To test this flow, I created a Customer in CRM online.

Then I waited a few seconds, and the customer was visible in AX. I was very impressed.

I also received an email telling me that a new customer had been created in AX from CRM, and that made me even happier.

When I went in and analyzed what happened, I could trace the entire integration in Flow, and also see how much time was spent processing each step. In this case, I see that AX used 10 seconds to process the JSON/OData message, and 3 seconds to send me an email that the record was created.

 

Here are the steps I used to create this flow. First I select the Flow trigger “Dynamics CRM Online – When a record is created”.

Then I specify the organization and the entity name: Accounts

Next I add the action Dynamics AX Online – Create a record

Then I select the instance and the user to log in with. I also select the entity name Customers, and choose to transfer only the account number and account name into the AX entity. Some of the other fields I chose to hardcode for simplicity.

The last step is to send an email to myself

A short summary:

Using Dynamics AX with Flow will certainly be the way forward for how we integrate AX with CRM and all kinds of other third-party systems. It is still in preview, and the next thing we are waiting for is for Dynamics AX to become reactive, so that a record created or modified inside AX can trigger a flow; Microsoft has promised that this is on its way. Also remember that this tool has its current restrictions, and we need to be patient and let Microsoft further develop and improve its capabilities. But for easy and simple integrations I would call this a unique opportunity to get rid of complex and time-consuming integrations. As long as you keep it simple, it works as intended.

Thanks Microsoft, keep it coming!

Dynamics AX Retail Scale Unit

In the AX Licensing Guide from February 2016, Microsoft announced a new add-on to Retail called the Retail Scale Unit.

Retail Scale Unit

As part of our future offering, we are considering offering a scale unit (Retail Scale Unit) that will enable businesses to run in distributed environment across datacenters to support proximity to physical locations
as well as allow distributed storage and help scale out needs of retail and commerce operations. This offering will allow the ability to add one or more identical scale units that can meet the transactional compute needs of retail and commerce channels. Additional details coming soon.

Even though the details of what it is and how it works have not yet been disclosed, it is now available on sites where you buy Microsoft licenses. Even prices are available, and it is a service priced per month.

Try a Google search for “E3307B7FD0C149AE9B95E4707C9D1AD7” and you will see distributors that have this SKU/product in their assortments. Stay tuned; more will be explained as Microsoft makes additional information available, but this is great stuff for all retailers!

Dynamics 365; Hello CDM

The Common Data Model was today, as promised, made available in preview through PowerApps, giving us insight into how it works. You should take a look at the following blog posts. Your entry point for starting to explore CDM is http://powerapps.microsoft.com

Let’s jump past all the introductions and take a small look at the sample demo data made available when the CDM database is created. After the sample CDM database is created, you will have access to the entities here.

Then find the entity named Product, and click on “Open in Excel”.

After logging in, I start to see some similarities to what we have in the new Dynamics AX. It’s the same Excel app on the right side.

It is even the Contoso data, and as highlighted here I’m showing item 1000 – the good old Surface Pro 128 GB 🙂

 

Now start your journey into the CDM. It will be the backbone and foundation of our entire Dynamics 365 stack.

Dynamics 365, PowerApps, Flow and Common Data Model

This summer at WPC, Microsoft unveiled its Dynamics cloud strategy by explaining the new initiative named Dynamics 365. Let me put it very briefly: IT ROCKS!

A good Q&A explaining it is this blog post from James Crowter. The essence is that Dynamics 365 will be available in 2 editions: Business (cloud-based NAV edition) and Enterprise (the new Dynamics AX, aka AX ‘7’). In addition, Microsoft has also launched AppSource, which can help find the right business apps available from ISVs/VARs. This is a great offer to customers, where third-party apps and extensions can be previewed.

As the new name ‘Dynamics 365’ implies, there will be a tight connection to the Office 365 package. If there is one thing Microsoft is good at, it is cross-selling and building strong dependencies through the entire stack of Microsoft technology. This will further strengthen the offering. One concern is that the total offering could be regarded as an increase in costs. Very often we see customers comparing offers based on the wrong assumptions, where on-premises offers are compared with cloud and SaaS offerings. This gives the wrong perspective, because on-premises offers often don’t include all the costs related to implementing and running the systems. What looks cheap today may in the longer run actually result in higher costs and the build-up of technological debt. When making the classic trade-off decisions in technology, make sure you understand the implications.

Dynamics 365 is more than just a rebranding; the introduction of the new Common Data Model (CDM) is the glue (database) that will stick all the pieces/entities together. We can expect that in the future, all the components will work together across the product lines as we know them today. Customers will download an app, and won’t care whether they have a Business or Enterprise edition of Dynamics.

CDM will over time make sure that Microsoft PowerApps enables users to create applications for Windows, iOS and Android mobile devices. Using these apps, you can create connections to common SaaS services, including Twitter, Office 365, Dynamics 365, Dropbox and Excel. Making all kinds of apps will be easier, and in many cases not even involve any coding.

My Dynamics friends, please try out Microsoft PowerApps, because it is a central element in the future of Dynamics 365. Also check out Microsoft Flow, to understand how CDM will in the future enable the flow of data and processes between all components in the Dynamics 365 and Office 365 landscape.

Again we have a lot of learning to do, and I’m amazed how fast the transition to a cloud-first, mobile-first business environment is going. This change will also create ripple effects across the entire ecosystem. New technologies require new organizational approaches and new workforce skills and knowledge. I assume that we will again see consolidations and mergers among the traditional ERP vendors, where traditional web and .NET consultancy is consolidated under the Dynamics 365 umbrella. We can also assume that smaller ERP vendors are simply too small to master all these new technologies, and will slowly fade away. Soon, most of our business processes will be handled on our mobile phones, backed by the cloud.

And remember, your best bet is to learn!

How I saved thousands of dollars on my Azure environments!

I just love such headlines, because they instantly attract attention.

But in this case it is actually true, and Microsoft even wants us to do this. I want to show how to automatically shut down and start up environments in Azure, so that you are not spending more than needed. This post is for newbies, and experts will surely bombard the comment section with suggestions on how to make it even better.

In this example I have 4 environments running in Azure, and the machine type I prefer is the D13_v2. They will cost me 3,696 USD per month if I just let them stay on for all 744 hours of the month.

But I only plan to use them 07:00 to 17:00, Monday to Friday. This is about 200 hours per month, and then it will only cost 993 USD 🙂 A lot of fun can be had with these extra credits.
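
The arithmetic above can be sanity-checked in a few lines. The figures are the ones from this post; the per-hour rate is derived from them, and the rounding is mine:

```python
# Back-of-the-envelope check of the savings claimed above.
# Assumed figures from the post: 4 x D13_v2 VMs cost 3,696 USD for a
# full 744-hour month; weekdays 07:00-17:00 is roughly 200 hours/month.
FULL_MONTH_HOURS = 744
BUSINESS_HOURS = 200          # ~10 h/day x ~20 weekdays
FULL_MONTH_COST = 3696.0      # USD for all four VMs together

hourly_rate = FULL_MONTH_COST / FULL_MONTH_HOURS      # rate for all 4 VMs
business_hours_cost = hourly_rate * BUSINESS_HOURS    # cost if only on 200 h
savings = FULL_MONTH_COST - business_hours_cost

print(f"hourly rate: {hourly_rate:.2f} USD")
print(f"business-hours cost: {business_hours_cost:.0f} USD")
print(f"monthly savings: {savings:.0f} USD")
```

That lands on roughly the 993 USD figure quoted above, and shows the savings are around 2,700 USD per month.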

So what is the next step? The trick is to use an Azure PowerShell runbook. Here is the step-by-step instruction on how to set it up:

1. Log into Azure, and open Azure Automation.

2. Add an Automation Account.
    Create a name, like “TurnOffOnVM”.
    Select the subscription, and whether a resource group should be created, as well as whether you want an Azure Run As account. (I didn’t bother with that, since I have no important stuff in these environments.)

3. Then create a credential asset named “automation” that will run the shutdown/startup scripts. The credentials you use must have the rights to run scripts and to start/stop VMs.

4. Create 2 runbooks that hold the scripts and schedules for the start and stop scripts.

5. Use the “PowerShell Workflow” type.

 

6. Let’s put in the “Start script”. It’s done here

 

I have removed my VM-names in this example.

If you wonder what your VM name is, it is the computer name, that can be seen here:

Here is a copy-paste version of the Start-up script:

workflow StartVM
{
    # Credentials from the "automation" asset created in step 3
    $cred = Get-AutomationPSCredential -Name "automation"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "Microsoft Azure Enterprise"

    $VMs = Get-AzureVM

    foreach ($VM in $VMs)
    {
        if ($VM.Name -in "VMName1", "VMName2", "VMName3", "VMName4")
        {
            if ($VM.PowerState -ne "Started")
            {
                Start-AzureVM -Name $VM.Name -ServiceName $VM.ServiceName -ErrorAction Continue
            }
        }
    }
}

7. Let’s put in the “stop script”. It is basically the same procedure as for the start script, so I’ll just add the copy-paste version of the script.

workflow StopVM
{
    # Credentials from the "automation" asset created in step 3
    $cred = Get-AutomationPSCredential -Name "automation"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "Microsoft Azure Enterprise"

    $VMs = Get-AzureVM

    foreach ($VM in $VMs)
    {
        if ($VM.Name -in "VMName1", "VMName2", "VMName3", "VMName4")
        {
            # ReadyRole means the VM is up and running
            if ($VM.Status -eq 'ReadyRole')
            {
                Stop-AzureVM -Name $VM.Name -ServiceName $VM.ServiceName -Force
            }
        }
    }
}

 

Remember to press the “Publish” button on the scripts 🙂

8. Let’s create schedules (one for the start runbook, and one for the stop runbook).


9. You can now monitor the start/stop scripts:

 

10. Go party with all the credits you have saved! And if you see me and you use this script, buy me a beer 🙂

 

Happy DAX’ing 🙂

The most important AX.HELP page

It is important to always keep an eye on what’s happening. AX.HELP is growing and becoming the number one center for understanding the new AX. I want to give you the most important page: https://ax.help.dynamics.com/en/wiki/help-get-started/ Read it!

What I did was set up an RSS feed to get all the news and new articles; the address is https://ax.help.dynamics.com/en/revisions/

Setting this up in Outlook is easy. Right-click on the RSS Subscription, and add https://ax.help.dynamics.com/en/revisions/

You will then get an RSS message for each new post and article. In 5 minutes every day you will get an overview of what has been published and updated. No more slow searching, and you will quickly become the “go-to” expert that knows it all.

Happy DAX’ing

New Dynamics AX – Pimp up your form-letters with document branding

The new Dynamics AX never ceases to surprise me; every day I find new possibilities and solutions. Microsoft has made available a new set of form letters, like the purchase order, sales confirmation, invoice etc., and installing them is optional. What is new is that they are much nicer and more modern, but they are missing country-specific requirements. Microsoft is calling them “modern reports”, and you can read about them here.

But the main topic of this blog post is how to pimp up your form letters with document branding. The following report is the purchase order in the modern design, where I have added a logo and some colors.

The menu item for controlling this is found under Organization → Setup → Document branding

We have the following menu items;

Document brands is just the identifier, like company ID etc.

Document images is a container for logos etc.

Brand details is where we can override per form letter and design, select colors, and override addresses and contact information.


 

I expect Microsoft has much more coming, so stay tuned 🙂

AX RTW My ODATA and JSON journey – Part III

Now the fun begins; let’s develop! The following post is only meant for hardcore Dynamics AX technical consultants 🙂

In previous posts I wrote about how to access Dynamics AX data and metadata through OData using only Internet Explorer. In those scenarios we were only fetching data from Dynamics AX. This time, we will use Visual Studio to publish data into Dynamics AX by accessing the OData services. I must let you know that I consider myself a newbie at writing C# code, and the following post is only meant as a directional guide for you to start exploring on your own.

What you need to make this happen is:

  1. Visual Studio 2015
  2. Administrator access to Azure AD
  3. A deployed New AX on Azure

What I wanted to achieve here is to be able to add vendors from a C# program. A non-Dynamics AX developer may have no idea of the inner structure of AX, but they can be given access to the metadata, and based on this metadata it should be possible to create CRUD integrations. One issue with Visual Studio is that it is not possible to consume the OData services directly, so we need to generate a proxy library. The MSDN OData v4 Client Code Generator is the best way of doing this, because it generates wrapper classes for the data entities. To speed things up a bit I found AX-LAB12, where Microsoft shows how to import a BOM; here I found the framework that we can use. AX-LAB12 contains a Word document that is good for understanding how to set this up. I’m “stealing” the first 4 classes from the lab.

The AuthenticationUtility is the class that makes sure we are authenticated with Azure AD and logged in with the right user. In this class you can hardcode the user/password, the tenant and the ActiveDirectoryClientAppId.
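
For readers curious what that authentication step amounts to underneath the C# utility class, here is a rough sketch of the Azure AD password-credential token request it performs. This is my own illustration in Python rather than the lab’s C#, and the tenant, app ID, user and AX URL below are placeholders, not values from the lab:

```python
from urllib.parse import urlencode

# Hypothetical values -- substitute your own tenant, app ID, user and AX URL.
TENANT = "contoso.onmicrosoft.com"
CLIENT_APP_ID = "00000000-0000-0000-0000-000000000000"
AX_RESOURCE = "https://yourax.cloudax.dynamics.com"

# Azure AD token endpoint for the tenant
token_endpoint = f"https://login.windows.net/{TENANT}/oauth2/token"

# Form body for the password-credential grant that a hardcoded
# user/password utility like AuthenticationUtility relies on.
body = urlencode({
    "grant_type": "password",
    "client_id": CLIENT_APP_ID,
    "resource": AX_RESOURCE,
    "username": "admin@contoso.onmicrosoft.com",
    "password": "<secret>",
})
print(token_endpoint)
```

POSTing that body to the token endpoint returns a bearer token, which is then sent along on every OData call.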

The next step is to generate the OData proxy. This is done in the Microsoft.Dynamics.DataEntities project, and basically means creating a bunch of classes that reflect all the metadata. It gives us classes so that we can assign values to OData fields, execute methods etc. But first we must specify where all the metadata can be downloaded from; in the picture below, you see that this is just a hardcoded string in the OdataProxyGenerator.tt file.

Then right-click as shown below, and select “Run Custom Tool”.

This will download all the metadata from the published data entities in Dynamics AX, and create one class per data entity. It takes a few minutes, and it creates thousands of classes.

Since we want to create vendors, it is interesting to see how the Vendor data entity looks in AX, and how the generated C# proxy class looks like:

As you see, we are consuming the OData data entities in Visual Studio, which lets us access fields and methods as we are used to in X++, and all this by only generating proxy classes from the OData metadata.

Then I can start developing against the OData proxy classes, and the field and method lookup we are used to in X++ works. As seen in the following picture, I’m declaring vendVendorEntity of the type Vendor, which has the same structure as defined in the data entity.

My complete code for creating a vendor using OData is therefore:

I build and run:

I then check AX to see if the vendor is created:

It works 🙂

Let’s see what happens if I change the code and select a vendor group that does not exist:

It correctly refuses to create the vendor 🙂

The conclusion:

The ability to do CRUD operations using OData changes the game. External non-Dynamics developers can create apps and integrations through the OData services, and it is regulated through security and validation. They don’t need to know the internal structure of Dynamics, because this is exposed through the metadata service. Dynamics AX is truly a game changer.
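
As a footnote to the conclusion: the same scenario can also be exercised without the generated C# proxy, by posting JSON straight to the data entity endpoint. The sketch below is my own illustration; the instance URL and the vendor field names are hypothetical, and a real Azure AD bearer token is required for the call to actually succeed:

```python
import json
from urllib.request import Request

BASE_URL = "https://yourax.cloudax.dynamics.com"   # hypothetical instance

# Illustrative field names -- check the /data/$metadata document for the
# real property names exposed by the Vendors entity on your instance.
new_vendor = {
    "VendorAccountNumber": "V-90001",
    "VendorName": "Fabrikam Supplies",
    "VendorGroupId": "10",
}

# POST to the entity set creates a new record; AX runs the same
# validations as it would for the C# proxy (e.g. vendor group must exist).
req = Request(
    url=f"{BASE_URL}/data/Vendors",
    data=json.dumps(new_vendor).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token from Azure AD>",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would then return the created record (or an error if validation fails).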

Happy DAX’ing 🙂

New Dynamics AX and the Excel Add-on

When using the “Open in Excel” (Dynamics Office Add-in) feature in the new Dynamics AX RTW, you may have some trouble opening it in Excel.

Especially if you have a corporate login, like me. It then seams that the login failed.

 

Microsoft has upgraded the Dynamics Office Add-in, but on existing demo data (Contoso) the add-in may also need to be changed.

Then the connector seems to work (at least for me).

Also take a look at https://ax.help.dynamics.com/en/wiki/office-integration-troubleshooting/

Happy DAX’ing

New Dynamics AX On premise = Azure Stack

As we know, the new Dynamics AX will basically come in 3 different deployment flavors. I wanted to explain a bit what this means and what I have found. The information here should be double-checked with your partners and with Microsoft. Also remember that this is all very fresh technology, and things may change quickly, as they must in early releases and previews.

 

  1. AX Public cloud – Black-box, maintained by Microsoft in Azure and it just works.
    The public cloud “edition” was the first platform that the new Dynamics AX was released on. In the public cloud, Microsoft personnel deploy and monitor the instances; customers and partners have no technical access to the production environments. Data and code (like customizations) are created as packages and uploaded to LCS, and Microsoft deploys them to the production environment according to maintenance windows. Customers pay a monthly fee per user that includes licenses, a production environment with high availability and disaster recovery, and some sandbox environments (for testing and dev). The customer doesn’t have to consider how to scale or what kind of virtual machines are needed; this is taken care of by Microsoft. Customers must expect to pay at least 110,000 USD per year for this. In my view this is actually a very good offer, because it includes many of the services and licenses that we don’t normally consider when evaluating the costs of operating an ERP system. I think smaller customers (50-250 users) would benefit from this scenario.
  2. AX Private cloud – Maintained and deployed by customer/partner, but still on Azure.
    Private cloud is 100% running in Azure; “private” just means that Microsoft is not deploying and monitoring the instances. In this scenario you purchase AX licenses, and you purchase Azure services and deployments. Basically 2-3 invoices 🙂. You scale the VMs according to your needs, and it is your own responsibility. Typically a partner can help out, and you will probably have to purchase service agreements to monitor and maintain your Azure-deployed instances. Will this be cheaper than the public cloud offer? If you compare apples with apples, I don’t think so. There are many hidden costs, and if you sum them up, at least my internal calculations show that this offer can quickly be 20% more expensive than the public cloud offer. The private cloud offers flexibility, but it demands a very knowledgeable technical department/partner. You can decide more yourself within the boundaries of Azure. I expect that larger customers (250+ users) would go for this scenario.
  3. AX On-Premise and Azure Stack – For those that have a datacenter to spare

    Azure Stack is the new hybrid cloud platform product that enables organizations to deliver Azure services from their own datacenters. You get cloud services, yet maintain control: you decide where to keep your data and applications, in your own datacenter or in Azure. You will still pay for the AX licenses, but you will also have to pay for your own hardware. There is one problem: it is not released yet. We are waiting for Windows Server 2016 with Azure Stack, and SQL Server 2016; these are still in technical preview. But for those (like me) that like to try things out, you can actually download it from https://azure.microsoft.com/en-us/overview/azure-stack/ . If you wonder what kind of machinery is needed, take a look here (basically 16 cores, >128 GB RAM and a few TB of disk). It will be a bit difficult to run Azure Stack on my portable PC 🙂. Also remember that there will still be lots of services that have to stay in the cloud. I assume this option will be selected by large enterprises (1000+ users) and by hosting providers/ASPs.

And remember that what I write here is not fact, just my interpretation of how it can be.

Happy DAX’ing 🙂

Mobile Access for Visual Studio

In the new AX, the tool we use for work, development, test and build is Visual Studio Online (VSO). Now mobile access to VSO is available in the Visual Studio Marketplace. It enables you to browse, monitor and engage in projects via your phone. It’s still in preview, but it looks very interesting.

Take a look at it here; https://marketplace.visualstudio.com/items?itemName=sprints-for-vsts.sprints-for-visualstudio

AX RTW Hack to enable unsupported countries

We have learned that the RTW is officially released today, but mainly for the “tier-1” countries. I’m a bit jealous of Denmark and Iceland, which are in the first support release wave, while Norway has to wait until H2 2016 to get country-specific support. But when I dig into the AX RTW I find much of the country-specific functionality already in place; it has been included in the transfer from AX 2012.

There is only one small issue: Microsoft has hardcoded that the unsupported features cannot be used. I guess (and hope) that this is for a reason. If you try to create an unsupported company for Finland, you get:

Many of the localized fields needed to run Finland are then hidden or disabled.

But there is a way to “hack” this: comment out your country in SysCountryRegionCode.onCountryRegionSupportedCheck():

Then compile and deploy, and the fields related to Finland etc. will open up.

I know I’m moving into uncharted terrain, and this is disabled for a reason. But we are already promoting and selling AX 7, and we expect Microsoft to stick to the release schedule and make Dynamics AX ready for all countries as planned. We also have several customers that don’t need the localized company-specific functionality, and they don’t want to be constantly reminded 🙂

Disclaimer: if you do this in a production environment, you are on your own!

Hacking DAX’ing 🙂

AX RTW – My ODATA and JSON journey – Part I

Learn the word OData. We will hear a lot about OData in the future, because it will change the way we integrate and how we exchange information between AX and other systems. A good starting point is the AX help wiki article that Kuntal Mehta created. I have decided to explore what OData can do, and wanted to write a bit about my journey. Instead of trying to explain all the technical details of data entities and the architecture, let us rather just test something 🙂

What you need to test what I’m doing is:

  1. An AX RTW environment deployed from LCS
  2. Internet Explorer
  3. Good old Notepad

Step 1: What services are available?

To get all entities available to you, use your site address and add “/data” at the end.

Then save the file you receive, and open it in Notepad (I have associated *.json with Notepad). The file you get looks like this:

Each line here represents a data entity service we can use. The format is JSON, but that is not important now.
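
If you want to work with that service document programmatically instead of in Notepad, it parses with any JSON library. The snippet below uses a trimmed, hand-made stand-in for the real /data response (the instance URL is hypothetical):

```python
import json

# A trimmed stand-in for the service document returned by /data.
service_doc = json.loads("""
{
  "@odata.context": "https://yourax.cloudax.dynamics.com/data/$metadata",
  "value": [
    {"name": "Customers", "url": "Customers"},
    {"name": "Vendors",   "url": "Vendors"}
  ]
}
""")

# Each entry in "value" is one data entity service we can call,
# matching the lines you see in Notepad.
entity_names = [e["name"] for e in service_doc["value"]]
print(entity_names)   # ['Customers', 'Vendors']
```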

Step 2: Show me the customers

In the file you may find that there is an entity/schema named “Customers”. I can therefore just add “/data/Customers” to my URL.

And then I get a JSON file of all the customers;

But this is a bit “cloudy”, and I can filter down further to what I want. Let’s say I just want to see all customer names; I can then add “/data/Customers/?$select=Name” to my URL.

Now it returns a JSON file with only the Name.

If I wanted to add one more column, like the payment terms, the syntax would look like “/data/Customers/?$select=Name,PaymentTerms”, but this would not work because the comma cannot be used directly in the URL. I therefore need to replace the comma with %2C, which is the URL encoding of the comma. For multiple columns I therefore add “/data/Customers/?$select=Name%2CPaymentTerms”.
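
If you prefer not to remember %2C yourself, the percent-encoding can be produced programmatically; a small sketch (the instance URL is hypothetical):

```python
from urllib.parse import quote

base = "https://yourax.cloudax.dynamics.com/data/Customers/"
columns = ["Name", "PaymentTerms"]

# quote() percent-encodes the comma, turning it into %2C as described above.
select = quote(",".join(columns))
url = f"{base}?$select={select}"
print(url)
```

This prints a URL ending in `$select=Name%2CPaymentTerms`, ready to paste into the browser.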

You will also see some strange “@odata.etag” values, and here is an explanation: they are used for caching.
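
One illustration of how such an etag can be used: sending it back in an If-Match header makes an update conditional, so it only succeeds if the record has not changed since it was read. The URL, key and etag value below are made up for the sketch:

```python
from urllib.request import Request

# Hypothetical etag as it could appear in an "@odata.etag" annotation.
etag = 'W/"JzEsNTYzNzE0NDU3Nyc="'

# PATCH updates a single record; If-Match makes it an optimistic-
# concurrency check against the etag we read earlier.
req = Request(
    url="https://yourax.cloudax.dynamics.com/data/Customers('US-001')",
    data=b'{"Name": "New name"}',
    headers={"Content-Type": "application/json", "If-Match": etag},
    method="PATCH",
)
print(req.get_method())
```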

Step 3: Can I read this in Excel?

Yes. Excel can import OData and format it as we would like.

Then fill in the /data URL, select schema, and then select fields.

And then you can read directly into Excel all entities made available in AX RTW, even without the AX connector.

Step 4: Show me everything!

Sure. Try adding “/data/$metadata”, and AX returns all schemas, fields and relations. It takes a long time, but it is nice to explore.

Step 5: Can we use DIXF to import directly from OData feeds?

This is what I would love to see. But I have not found it yet.

Happy DAX’ing 😉

AX RTW – Hack to enable configuration mode

When downloading the AX RTW local VM from Connect, you often want to disable some configuration keys (like catch weight etc.). But now you will see the following warning:

“This form is read-only unless the system is in the maintenance mode. Maintenance mode can be enabled in this environment by running maintenance job from LCS, or using Deployment.Setup tool locally”

This warning prevents configuration keys from being enabled/disabled unless the system is set in maintenance mode. There are 2 ways of dealing with this.

  1. The proper way: use the Microsoft.Dynamics.AX.Deployment.Setup.exe command (credits to Joris de Gruyter).
    Run this from the command line:

    \bin\Microsoft.Dynamics.AX.Deployment.Setup.exe --metadatadir <metadata directory> --bindir <bin directory> --sqlserver . --sqldatabase axdbrain --sqluser <user> --sqlpwd <password> --setupmode maintenancemode --isinmaintenancemode true

    Here is an actual example: J:\AosService\WebRoot\bin\Microsoft.Dynamics.AX.Deployment.Setup.exe --metadatadir J:\AosService\PackagesLocalDirectory\ --bindir J:\AosService\WebRoot\bin --sqlserver . --sqldatabase AxDB --sqluser axdbadmin --sqlpwd ******* --setupmode maintenancemode --isinmaintenancemode true

    Then run the command with “false” at the end to turn it back off.

  2. The hack way: use Microsoft SQL Server Management Studio, and edit the CONFIGURATIONMODE record in the table dbo.SQLSystemVariables.

After that, you can change configurations. But make sure you never EVER do this in a production environment!

Hacking DAX’ing 🙂

Master data concepts

In Dynamics AX, we have been blessed with having most of the data in one system and in one single database. But I sense a shift, where systems are breaking up into more loosely coupled best-of-breed components. We see the introduction of omni-channel, SaaS, RESTful web services and OData as accelerators in this area.

The generic topic of master data management (MDM) is much less about technology and much more about understanding how business processes are supposed to work. The principle of MDM applies whenever two or more business processes must view or share (master) data. This means that all companies need the discipline of MDM, and that it must be driven by the business and a business case, and supported/enabled by IT. This includes governance and data quality; MDM cannot be established without them.

In my profession, when working with our internal EG-Retail model, we cover the Master Data Management processes from life-cycle data management to data distribution to POS systems. As you see, not much of this is directly related to the functionality of Dynamics AX; it is more about how to create work processes that maintain and secure a company's master data.

Master data lifecycle

The most common area where life-cycle processes are used is products. Products are introduced, created, maintained, discontinued and finally archived or deleted. A lifecycle also involves different roles and departments, and the responsibility and master data ownership change through the lifecycle. It could be visualized like this, where the effort in the processes is shown.

In terms of responsibility, I also see the benefit of separating the roles of master data owner and requester, and of introducing clear formal processes.
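As an illustration, the lifecycle stages described above can be sketched as a small state machine; the stage names and allowed transitions below are my own simplification for illustration, not a Dynamics AX feature:

```python
# Sketch of a product master data lifecycle as a state machine.
# Stage names and transitions are illustrative only.
TRANSITIONS = {
    "Introduced":   ["Created"],
    "Created":      ["Maintained"],
    "Maintained":   ["Maintained", "Discontinued"],
    "Discontinued": ["Archived", "Deleted"],
}

def advance(current: str, target: str) -> str:
    """Move a master data record to the next lifecycle stage, if allowed."""
    if target not in TRANSITIONS.get(current, []):
        raise ValueError(f"Illegal lifecycle transition {current} -> {target}")
    return target

state = "Introduced"
for step in ["Created", "Maintained", "Discontinued", "Archived"]:
    state = advance(state, step)
# state is now "Archived"
```

A formal transition table like this makes it explicit which role (owner, requester, MDM administrator) is allowed to trigger each step.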

 

Create master data

Master data are the critical nouns of a business and fall into four groupings: people, things, places, and concepts. Further categorizations within those groupings are called subject areas, domain areas, or entity types. For example, within people, there are customer, employee, and salesperson. Within things, there are product, part, store, and asset. Within concepts, there are things like contract, warranty, and licenses. Finally, within places, there are office locations and geographic divisions.

The process of identifying additional Master Data elements should be a formalized process and an ownership process must be in place.

Creating master data is the process of collecting the accurate and persistent data related to each master data element. It is important not only to enter the data, but also to identify the source of the master data. In a data maintenance scenario, the process and master data owner may return to the source to collect more details. When creating master data, the completeness must be defined: define which master data elements are mandatory and optional, and what related master data must be in place to correctly create the master data.

As described above, all Master Data will have a life cycle. When creating, often only a minimum set of mandatory data elements are required. The life cycle and status should reflect the completeness of the master data.

Master data may originate from many different sources, roles and process participants. Timing is also relevant, because different data elements may only be needed at a later stage. An example is that a certain set of data elements is needed when purchasing a product, while another set of data needs to be in place before the product can be sold.

A more formal process of handling master data will ensure that quality and relations are taken into consideration. The process should include both a requester and the master data owner, to ensure formality.

When the master data is complete enough to be used for selected processes, it must be released and made available in the systems where it is used. An example is the procurement process, where only a small subset of information is needed to initiate purchasing. But in order to receive the product at the warehouse, the product must be released to the warehouse management system, which requires additional information and timing. In integrations, the release process of master data must be controlled and tracked. Before release of master data, the required completeness and data quality must be decided and confirmed.

Maintain master data

As requirements expand and change, the master data will also need to change. Maintaining completeness and quality of master data becomes a central part of life-cycle master data management. Typically, new markets and countries will trigger the need for new prices, VAT/tax compliance and translations. Establishing maintenance processes is essential not only for growth and expansion, but also to support day-to-day processes. The key principle is to centralize master data updates around specific roles and processes. The generic maintenance process reflects this.

Discontinuation of master data

Deleting master data is generally not recommended. The reason is that the master data has often been used in and related to transactions, and often in other systems like WMS, eCommerce etc. A better approach is the discontinuation process. If master data still should be deleted, it is important to analyze the consequences and to initiate clean-up processes.

The discontinuation phase begins when the maintenance phase ends. Master data discontinuation is often a planned process, where the date of the discontinuation is set in advance. The discontinuation can also be related to specific processes, like a product is discontinued for procurement, but not for sale.

It is recommended to implement a structured life cycle process on master data, to control the situations where a master data record is discontinued or replaced with a new master data record. In this future process, we expect that the participants are the master data owners and the MDM administrator. The main purpose of the process is to support the actual discontinuation or depletion of master data records in the AX system.

The input to the process is an online request or need for discontinuation or depletion of a Master Data Record. The output of the process is an updated Master Data Record in AX with the specified data from the request.

The process diagram below outlines an example of the future process for the discontinuation or depletion of existing master data records in AX. In this process, we distinguish between discontinuation and depletion. In relation to products, discontinuation only applies for specific SKUs, whereas depletion applies for all. (Like flushing out master data from an integrated system.)

If it is expected that the growth of the data will accelerate as more partners and stores are connected to the installation, then archiving and purging will be important to keep the Microsoft Dynamics AX installation performing optimally. Before a master data record can be archived and deleted, other data (such as purchasing documents) that refer to the master data must themselves be archived. Both the purge and archive operations depend on a carefully determined hierarchical relationship of related tables, based on master data, settings and transactions. The archiving and purging process is too complex to be handled manually, because of the many relationships that exist to master data. It is therefore recommended to look into tools like the Microsoft Dynamics AX Intelligent Data Management Framework (IDMF), which can be used for this purpose.
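The hierarchical ordering that archiving depends on can be illustrated with a topological sort; the table names and dependencies below are hypothetical examples for illustration, not the actual IDMF configuration:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependencies: each table lists the tables that must be
# archived before it (rows referring to a record go before the record).
must_archive_first = {
    "SalesTable":  ["SalesLine"],
    "PurchTable":  ["PurchLine"],
    "InventTable": ["SalesLine", "PurchLine"],
    "CustTable":   ["SalesTable"],
}

# static_order() yields lines first, then headers, then master data.
archive_order = list(TopologicalSorter(must_archive_first).static_order())
print(archive_order)
```

With real AX table relations the graph is far larger, which is exactly why a tool like IDMF is preferable to doing this by hand.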

IDMF has the following process.

Taking it easy on master data processes can result in complete failure of your system and processes. When implementing Dynamics AX, make sure that enough time is invested to create a clear strategy and good processes.

Happy DAX'ing 🙂

AX7 – Cloud estimation sheet

When working with new clients, I just love to show Lifecycle Services, and all the new tools and gadgets now available to us. Prices on licenses and Azure services are beginning to come to light, but there is one "unknown" factor: how much would it cost, in time, to set up and implement all the tools and services?

I will not give you the estimates, but I can give you the sequenced task list of what to do.

| Area | Sub Area | Task |
| --- | --- | --- |
| Life Cycle Services | Installation | Create LCS site |
| Life Cycle Services | Installation | Add users to LCS |
| Office 365 | Installation | Create Office 365 Azure AD |
| Office 365 | Installation | Create SharePoint site |
| Visual Studio Online | Installation | Create VSO site |
| Visual Studio Online | Installation | Add users to VSO |
| Customer source | Installation | Add users to Customer source |
| Azure | Installation | Create Azure account |
| Azure | Installation | Set up administrators in Azure |
| Azure | Installation | Connect Azure subscription to CSP account |
| LCS | Setup | Create LCS projects |
| LCS | Setup | Connect Azure, VSO and SharePoint to LCS |
| LCS | Setup | Invite project users into LCS |
| LCS | Setup | Select and set up methodology in LCS |
| LCS | Setup | Infrastructure estimation tool in LCS |
| LCS | Setup | License sizing estimation tool in LCS |
| LCS | Deploy | Deploy Demo/CRP based on Contoso |
| LCS | Deploy | Deploy development environments (per developer) |
| LCS | Deploy | Deploy VSO build/staging environment (one VM) |
| LCS | Deploy | Deploy acceptance test environments (2-3 VMs) |
| LCS | Deploy | Deploy production environments (3-10 VMs) |
| LCS | Monitor | Set up LCS System diagnostics |
| Visual Studio Online | Setup | Set up VSO projects and user rights |
| Visual Studio Online | Setup | Set up process for nightly builds and automated tests |
| Visual Studio Online | Setup | Define product backlog |
| Visual Studio Online | Setup | Define sprints |
| Office 365 | Setup | Set up Microsoft Power BI |
| LCS | Setup | Set up Configuration Manager (data transfer) |
| Office 365/LCS | Setup | Add ODM templates to be used in the project |
| AX 7 | Setup | Set up data entities for OData/integrations |
| AX 7 | Setup | Basic parameters and generic setup |

…Ok, the number is 42… but what was the question?

 

Access Dynamics AX (aka ‘7’) performance counters with “&debug=develop”.

The new Dynamics AX has some very powerful capabilities to show exactly where time is spent. If you would like to explore more about what is happening behind the scenes, try adding "&debug=develop" to the URL. This will bring up a small timer that shows how much time was spent on opening and showing the current form.

If you click on it, it will give more performance details. For example, if I open the All customers form in the Contoso demo data, my system gives me the following.

I see here that loading the customers took 366 ms, where 131 ms was spent in the AOS loading the data. You also see a color coding (green/blue) that indicates in what sequence the time was spent.

There is also a section to show what the server is doing, and it also shows you the exact SQL call that was the longest running SQL statement.

At the bottom of the performance screen we can also see something interesting;

And when I click on this Session ID, I get;

I have no clue what that is 🙂, but could this be a tool for external monitoring?


New AX (aka ’7’), limited navigation and the WHSWorkExecute

When opening the new Dynamics AX client, you see that we have a nice set of navigation menus and options at the top of the screen.


But if you would like to limit the navigation options for the user, try to add the “&limitednav=true” to the URL.


As you see, the menu-bar is then changed, and the user cannot see the menu buttons, search and settings any more. We have a limited navigation.

Let's further extend this ability by also adding "&mi=action:WHSWorkExecute". For those that have not heard of WHSWorkExecute, it is the form used for simulating a warehouse mobile device. The form existed in AX 2012, and it also exists in AX 7.

In the AX ‘7’ preview demo environment try the URL:
https://usnconeboxax1aos.cloud.onebox.dynamics.com/?cmp=usmf&mi=action:WHSWorkExecute&limitednav=true

You then have quite a nice RF device without the navigation options. Simple can often be the best 🙂
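The URL above is just the demo host plus three query parameters, so it can be assembled with standard tooling; a small sketch using the host name from the example:

```python
from urllib.parse import urlencode, urlunsplit

# Build the AX '7' URL that opens the WHSWorkExecute form with limited
# navigation. Host and company are the demo values from the example.
params = {"cmp": "usmf",
          "mi": "action:WHSWorkExecute",
          "limitednav": "true"}
query = urlencode(params, safe=":")  # keep the ':' in 'action:...'
url = urlunsplit(("https", "usnconeboxax1aos.cloud.onebox.dynamics.com",
                  "/", query, ""))
print(url)
```

The same pattern works for the "&debug=develop" flag described earlier: just add it as another query parameter.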

Happy DAX’ing

New Microsoft Dynamics AX – A guide for using retail sales prices and discounts

This is a guide I have been looking forward to publishing, but due to NDA restrictions I needed to wait until the new Dynamics AX was made available in public preview, and today it is 🙂 This blog post is not about AX license prices or implementation costs; if you were looking for that, you have to google again. It's about what product sales price and discount options exist in MSDAX "out-of-the-box". It's not about product sales price tactics and strategies, but about how to apply them in AX, and most of the information presented here also applies to earlier releases of Dynamics AX.

Summary

Pricing and discounts are a science, and the number of variations and combinations is amazing. When I talk to my customers, I like to show the following overview of the most commonly used pricing strategies available in AX. Here I have tried to use "retail names", and to exemplify what they mean.

The Microsoft documentation on TechNet visualizes quite clearly the relationship between prices, discounts, channels and programs with the following many-to-many relationship diagram;

It basically means that you can combine and mix pricing elements to achieve the strategy you are aiming for. The combination of these gives us the functionality we are looking for, but we have also realized that not all combinations are practical or possible. Remember that much of the pricing functionality became available with the introduction of the Retail, Call-center and, in some cases, the lean modules. Some pricing options will therefore not work in combination with each other, and also not across the Omni-channel.

Product prices and RRP (Recommended Retail price)

RRP, or Recommended Retail Price, is a very common and well-known concept. It is very often used when the sales channels are more complex and include producers, distributors, partners and resellers. Often the ownership structure of the sales channel is fragmented and differentiated. The recommended retail price is therefore often specified based on geography, currency and channel. The RRP is very often just the baseline on which prices are built and, as the name suggests, just a recommended price. In Dynamics AX, either the standard (single currency) sales price on the released product or an "All/Group" sales trade agreement (multiple currencies, dates, quantity) is sufficient to make this work. One restriction is that the product must be released in order to do this. If a more "global recommended retail price" is wanted, I suggest reading my blog post on this subject, where prices are made global and distributed to all legal entities.

Price/discount matrices (Trade agreements)

The price/discount matrix has been available to AX users since the beginning, and handles most requirements in a B2B scenario. The setup is quite easy, and involves setting prices or discounts on groups, or in relation to specific entities like products or customer accounts.

In the following example we have different sales prices on a single item, differing per customer price/discount group.

In AX ‘7’, you will just create a trade agreement journal, and create the necessary lines to reflect the different prices.

 

Discount matrix

Maintaining sales prices on the item level per customer group, or per specific customer, can quickly become a maintenance nightmare. The different combinations quickly end up with so many price points that oversight is lost. A customer of mine had 4000 products, 6 currencies and 12 different account selection options. This would mean that they would have to maintain 288.000 price points. That was impossible! The option was to use discounts instead, and just maintain recommended retail prices per currency. This resulted in 24.012 price points to maintain. With the use of smart rounding and a generic currency, we reduced the number further down to 4.012 price points. A discount matrix would look like this, where the combination of item price group and customer price group results in a discount.
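The arithmetic behind that reduction can be checked directly; the grouping below is my reading of the example (a full price matrix, versus prices per currency plus one discount per account selection, versus a single generic-currency price list plus the same discounts):

```python
products, currencies, account_selections = 4000, 6, 12

# Option 1: a price per product, currency and account selection.
full_matrix = products * currencies * account_selections          # 288000

# Option 2: an RRP per product and currency, plus one discount
# per account selection.
rrp_plus_discounts = products * currencies + account_selections   # 24012

# Option 3: a generic-currency price list (one price per product)
# with smart rounding, plus the same 12 discounts.
generic_currency = products + account_selections                  # 4012

print(full_matrix, rrp_plus_discounts, generic_currency)
```

Turning a multiplication into an addition is the whole point of moving from price points to discounts.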

In AX ‘7’, you will just create a trade agreement journal, just like for prices, and create the necessary lines to reflect the different discounts.

If you want to test a sales price with all the different combinations, you can use the "Find prices" feature. Here I have a specific customer belonging to the price group "Retailer", the item belongs to "Apparel", and the combination of these gives me a net amount of 127,5.

Generic currency and smart rounding

As seen in the example above, I have specified prices for different account selections, but only in USD. If multiple currencies are used, and you don't always want to maintain currency-based price lists, then the generic currency option is a nice feature. The first step is to specify which currency is the generic currency, which exchange rates should be used, and whether smart rounding should be applied after currency conversion. In the setup of smart rounding you also need to set up the member currencies that belong to the smart rounding.

Enabling this will open the "Include generic currency" option on the trade agreements, and when creating sales orders in a different currency, the price is converted from the generic currency. Here we have a sales order, converted from USD to €, and then the automatic smart rounding is applied.
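A sketch of the convert-then-round step; the exchange rate and the ".95 ending" rule below are assumptions for illustration, since the actual smart rounding rules are configured per currency in AX:

```python
import math

def smart_round(amount: float, ending: float = 0.95) -> float:
    """Round a converted price up to the nearest psychological ending.
    The '.95' ending is an assumption; AX configures this per currency."""
    candidate = math.floor(amount) + ending
    if candidate < amount:
        candidate += 1
    return round(candidate, 2)

usd_price, usd_to_eur = 150.00, 0.85      # illustrative exchange rate
converted = usd_price * usd_to_eur        # 127.50
print(smart_round(converted))             # 127.95
```

The generic currency keeps one price list; the rounding step keeps the converted prices looking like deliberate retail prices.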

Trade agreements and Retail/Call center in combination

As shown here, the trade agreement matrix can solve quite a few price requirements, but in a retail scenario it would not solve all the requirements we see. We are missing elements like retail channel, categories and more advanced features. Going deeper into the retail functionality shows that many additional possibilities open up. I have met quite a few companies that have advanced price and discount requirements, but have never thought about enabling the retail module. Many think retail is just for POS, but the retail module is for Omni-channel. This means that we can also use it in traditional sales orders (the call center module enabled retail functionality to be used in traditional sales).

But there are some elements in AX that you should be aware of. I often see good old AX consultants getting confused when they start looking at retail discounts. Let's say we have the following scenario: we have item 0001 with a recommended price of 150 USD.

And then we have a 15% discount on the customer;

You would then expect that when using the retail module, it will just use these values. But it doesn't! It actually calculates the discount amount, and not the percentage. Keep this in mind, because it will confuse you later.

When using the retail discounts, they will work together with the traditional trade agreements, but not exactly as you would expect. To better understand the actual code executing, take a look at the class \RetailOrderCalculator.saveSalesOrder(). The CRT (Commerce Runtime) engine returns quantity, price and total discount amount. Based on this, the unit discount amount is calculated. I'll also show the actual source code for this, because when we are talking about discounts, we are most likely going to think in percentages, and so are customers and users.
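The amount-versus-percentage behavior can be illustrated with the arithmetic involved (this is only the calculation, not the actual X++ from RetailOrderCalculator.saveSalesOrder):

```python
# The CRT returns quantity, price and a total discount amount per line;
# the unit discount amount is then derived from that, not a percentage.
def unit_discount(qty: float, total_discount_amount: float) -> float:
    return total_discount_amount / qty if qty else 0.0

# Illustration: a 15% customer discount on two units at 150.00 comes
# back from the CRT as a 45.00 total discount amount for the line.
qty, price, total_discount = 2, 150.00, 45.00
per_unit = unit_discount(qty, total_discount)   # 22.50 per unit
effective_pct = per_unit / price * 100          # still 15% in effect
print(per_unit)
```

So the percentage never travels with the order line; only the amounts do, which is why the setup screens and the resulting order lines look different.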

And one more thing. The CRT will not apply both the trade agreement discounts and the retail discount. It will select the best of them.

The use of Retail Discounts in Sales orders

Let's take some scenarios that are common in the retail industry, and I'll try to link them to what you find in the standard Contoso demo data set. In AX '7', the retail module mainly has 4 types of discounts, and a price adjustment;

  1. Just an ordinary product discount

Let’s say you just want to have a discount on a single item.

First step is to use the retail discount rule.

 

Periodic discounts on categories

“50 % off on accessories this week” can be a strong trigger to make the customer open the wallet.

To create such discounts, we can use the same screen.

PS! The retail discounts have a "discount code" field, but I have never managed to use it in the call-center sales order screen. But in POS, it works 🙂

 

Happy hour 🙂

Happy hours are an efficient way of attracting customers and breaking the customer's shopping routine. It's also fun, and can give retailers a lot of attention. This feature is also excellent for Black Fridays.


The validation period has an advanced setting that opens up the field "Discount period number", where you select a discount period. I have here created a happy hour between 12 AM and 1 AM for early birds.


We can further specify the valid periods here, so that more period-based discounts can be given. Just remember that the valid period is the current time +/- the offset time defined on the retail channel. In the AX '7' CTP 7 version, the call-center channel details screen is not showing the time zone like it does for retail stores, and this could have some implications on using "happy hour" in an installation that works across time zones. A small service request to Microsoft has been created, and I'm sure they will fix this 🙂
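The valid-period check, including the channel offset caveat, can be sketched like this; the offset handling is my simplification of the behavior described above, not the actual CRT logic:

```python
from datetime import datetime, time, timedelta

def in_happy_hour(local_now: datetime, start: time, end: time,
                  channel_offset: timedelta = timedelta(0)) -> bool:
    """Check whether 'now', shifted by the retail channel's time offset,
    falls inside the discount period (offset handling is illustrative)."""
    shifted = (local_now + channel_offset).time()
    return start <= shifted < end

noon = datetime(2016, 3, 1, 12, 30)
print(in_happy_hour(noon, time(12, 0), time(13, 0)))                      # True
print(in_happy_hour(noon, time(12, 0), time(13, 0), timedelta(hours=1)))  # False
```

The second call shows the cross-time-zone trap: the same wall-clock moment falls outside the happy hour once the channel offset is applied.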

Coupon discounts (Call center)

Coupon discounts seem to be very popular in the US, and we also see an increasing use of them in Europe, especially in relation to mobile coupons.

In Dynamics AX 2012 there are two ways of working with coupons: one solution for POS, and one solution for call-center. And these two don't work together. It is a bit sad, because it means we have a GAP in the Omni-channel offering. But with enough push on Microsoft, I'm certain they will merge in future versions.

Call-center coupons can be defined as an amount or a percentage, and may have a valid period. The coupon can be unique and one-time use, and it can have item rules attached that specify which products the coupon can be used with or excluded from.

To solve coupons for Retail/POS, you can see that all discounts have the option to apply a "discount code". I have not found a way to make this discount code behave as a unique, one-time code. Even though AX sales orders now can use the CRT as the price engine, there is no place where the discount code can be applied to a sales order. It therefore only works in POS/eCommerce scenarios.


Dynamics AX for Retail has a GAP in relation to "one-time" coupons, and in future versions I hope to see more features, like store coupons, manufacturer coupons, mobile/cell codes and promotion codes.

Trade discounts

Trade discounts are the most common way of creating discounts. 3 very common discounts are:

  • Employee discounts

    This can be solved by using affiliations in Dynamics AX, and works for both POS and AX sales orders. In order to leverage the affiliation functionality at the retail POS, the following must be set up:

    Price group (Retail)

    Affiliations, and link the affiliation to the employee price group.

    Discounts, and link them to the price group "employee".

    Each employee must be a customer that is linked to the affiliation, and you mark the customer with the relevant affiliation. When this customer creates a sales order in POS or in AX, the discount will apply.

    Affiliations can also be used for senior, military or any other groups.

  • Discount category

    With discount category I mean that we can have a discount on an entire category of products. Let's say a "Digital SLR camera" sale, as exemplified in the Contoso demo data. We then don't have to assign a discount to a specific product, but just refer to the retail category. When adding products from the category, the discounts will automatically apply (after running some periodic jobs and distributing the prices to the channels).
  • Loyalty discounts

    To use the loyalty features, a lot needs to be set up. But in relation to prices, it just means that you need to assign a price group to the discount. If we take the example of the "SLR" discount, we can associate it with a price group called "LP-FabGold", meaning that all loyalty customers in the "Gold" tier will get the discount.

Mix and Match discounts

  • Mix/Match

    A mix and match discount gives customers a discount when they purchase a specific combination of products. In the Contoso demo data, take a look at the “30% off 3” discount. Here a 30% discount is given, if you select 3 of the specified products. Here we use the “Mix and Match discount” form.
  • BOGO

    A BOGO price is basically "get 2 for the price of one". We also use the "Mix and match discount" form for this, but we use the "Least expensive" price option, give a discount of 100%, and specify that 2 units are required for the price to "kick in". In this case, the lucky customer gets 2 cameras for the price of one 🙂

    Other options also exist, like a 50% discount when selecting 2 of an item, or using the line spec and product category features in the Mix and match discount.

  • Buy 3 pay for 2 promo
    This can be solved just like BOGO, but where the number of products are increased to 3.
  • Free item A if item B
    For this, we can take a look at the "Water Bottle Promo" in the Contoso demo data. We use the Mix and match discount, but use the "line spec" discount type. Then we create a mix and match line group, to specify that we need to have at least one item in the selected category/product for the price to be active.
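The "least expensive" mix-and-match variants above all reduce to one rule: for every complete group of N units, discount the cheapest unit in the group by a percentage. A minimal sketch of that rule (my simplification of the Mix and match form, not the actual CRT code):

```python
def least_expensive_discount(prices, group_size=2, discount_pct=100.0):
    """Mix-and-match 'least expensive' rule: sort descending and, for each
    complete group, discount the cheapest unit in that group."""
    ordered = sorted(prices, reverse=True)
    total = 0.0
    for i in range(0, len(ordered) - group_size + 1, group_size):
        total += min(ordered[i:i + group_size]) * discount_pct / 100
    return total

# BOGO: two cameras, the cheaper one is free.
print(least_expensive_discount([499.00, 449.00]))         # 449.0
# "Buy 3 pay for 2": the same rule with a group size of 3.
print(least_expensive_discount([100.0, 80.0, 60.0], 3))   # 60.0
```

An incomplete group (a single unit left over) earns no discount, which matches the "2 units are required for the price to kick in" behavior.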

Quantity discounts

A quantity discount is a discount that is given to customers when they purchase a particular quantity of a product. For example, you can set up a 5 percent discount for the purchase of two products of a particular category or brand.

  • Buy 2 items get 5%, Buy 3 items get 10%, Buy 10 items get 40 %

Threshold discounts

A threshold discount is a discount that is given to customers when the total for a transaction reaches one or more specified amounts. For example, you can create a discount that gives a 5 percent discount for purchases over 100.00 or you can specify a fixed discount amount.

Buy for 100 $ get 5%, Buy for 200 $ get 10%, Buy for 900 $ get 40%
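The threshold tiers above are a simple lookup of the highest tier reached; a sketch using the amounts from the example (quantity discounts follow the same pattern, just keyed on units instead of amount):

```python
# Threshold tiers from the example: (minimum transaction total, discount %).
TIERS = [(100.0, 5), (200.0, 10), (900.0, 40)]

def threshold_discount_pct(total: float) -> int:
    """Return the discount percentage for the highest tier reached."""
    pct = 0
    for minimum, tier_pct in TIERS:
        if total >= minimum:
            pct = tier_pct
    return pct

print(threshold_discount_pct(250.00))   # 10
print(threshold_discount_pct(99.00))    # 0
```

In AX the tiers themselves are configured on the discount form; the lookup logic is all there is to it.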

Price adjustments per channel

  • Online gives 5% discount, In Store gives 2% discount
    Channel-based discounts are solved by assigning channel-specific price groups to a channel, and then assigning the different price adjustments to each price group.

Loyalty cards

The loyalty module in Dynamics AX is a large module with a lot to offer.

But there are some small GAPs you should be aware of, and that is using loyalty points for payment in the call-center sales order. I could be wrong, but I have not been able to use the Dynamics AX sales order screen efficiently with the loyalty module. A good blog for loyalty is available here.

Smart Rounding

Smart rounding is a feature that has no direct effect on retail pricing. But it can adjust trade agreements, which indirectly works with retail. Also remember that it is not "automatic smart rounding"; the smart rounding is applied to a price/discount journal. A good blog post on smart rounding is this one.

Recurrence and discounts

The ability to create recurring sales is not directly related to Retail/POS. In AX there is a feature called continuity programs, where delivery schedules can be set up.


I don't think this module works very nicely with POS and eCommerce; it is mainly a tool for call-centers and for selling recurring items. More information is available on TechNet.

B2B Discount agreement

Price agreements do not work with retail POS, and are not very well supported through the CRT. They are mainly used for B2B orders, and features exist both for sales and for purchase.

  • Yearly agreement to buy 1200 pcs, and get them for 450 $ per pcs.
    In this example I have an agreement to buy 1200 units, and I get the price of 460, until 12/1/2016. In this case, it is a product quantity commitment.

    Other types of commitment also exist, like:

    This makes it possible to create several different types and combinations. The feature also contains a form letter that confirms the agreements. Take a look at the following blog for additional information.


Synchronization problems in AX 2012 R3; Try disabling track changes.

If you are installing an ISV solution into a Contoso environment, and you suddenly see that you cannot synchronize, and you cannot see why:

Here is a tip that works for me;

Disable the track changes in the specified tables;

Then the synchronization is able to complete. After synchronization, you can enable it again.

If you want to switch track changes on/off in SQL, then this is the script for switching off;

 

To switch them on again;

 

Happy DAX'ing!

How to evaluate Dynamics AX ISV solutions

Dynamics AX 2012 R3 has truly evolved into an enterprise solution, and the functional width solves most requirements that any customer "really needs". But the 80% / 20% rule always applies, and there are requirements that are not solved by standard Dynamics AX. We often see extended domain-specific requirements in relation to finance, sales, procurement, logistics and production. To solve this, customers have the option to create customizations or to try to find VAR/ISV solutions that solve this. (VAR = Value Added Reseller; ISV = Independent Software Vendor.) Creating your own large customizations can result in high risks, and the option to buy a "ready to use" ISV solution then becomes interesting.

What I wanted to give to the Dynamics community is my list of how I evaluate ISV solutions as a VAR. I basically just use a Word document with a set of chapters, and the topics to evaluate here are from a consulting perspective. The idea is to have a formalized way of making evaluations, and also to use the ISV representatives to fill in the information for you.

1. Executive summary
{Yup. There is always someone to report to. }

2. Product Introduction
{Then I write a brief introduction to the product, describing the overall area that it solves.}

3. Background for evaluation
{Here I write why this product was evaluated in the first place.}

4. Vendor/Distributor information – general
{Then I record some general information about the company that offers the solution, just to be sure that the company behind the solution will exist in the future. The following table can be used}

| | Comments |
| --- | --- |
| Company name | |
| CEO | |
| Sales manager | |
| Turnover 2014 | |
| Results 2014 | |
| Number of employees | |
| Number of customers | |
| Number of references | |


5. Vendor/Distributor information – product

{The ISV-“mothership” may be large, but it is interesting also to check out the team that is organized around the product. I therefore have a secondary evaluation around the team developing and maintaining the ISV solution}

| | Comments |
| --- | --- |
| Product manager | |
| Product turnover | |
| Number of customers | |
| Number of developers (100% dedicated) | |
| Number of developers (part time dedicated) | |
| Number of consultants (100% dedicated) | |
| Number of consultants (part time dedicated) | |
| Support resources/routines | |


6. Pricing

{Describing product pricing, enhancement and also flexibility. Also include a case study with implementation. }

7. Market/Cost savings potential
{Describing the current market potential, also reflected against the existing customer base. I also use a calculation sheet to show investment, potential, margins, number of sales each year, training costs, start-up issues etc}

8. Known Competitors
{Describing any known competitors to the ISV solution, and the main advantages this product has.}

9. Versions and language
{Describing the supported versions and languages of Dynamics AX. I also want to record how support is handled as new versions and upgrades come along}

10. Access to software
{Describing how the product is accessible from the ISV and how it is distributed}

11. Technical Installation guide

{Describing the quality of the installation guide, and I also try the installation to check how easy it is}

12. Application setup guide
{Describe the quality of the application guide; I also try the application to check that it actually works}

13. Scenario testing
{In this chapter, we test the different functional aspects of the product. The processes that are tested are the most common scenarios that we expect and have experienced in the field. Normally divided into a section per feature.}

14. Solution footprint
{The solution footprint is very important to evaluate, because it tells us how costly upgrades and applying hotfixes will be. We want to see a low footprint on SYS elements in Dynamics AX. A way to easily evaluate this is to count the number of SYS overlayerings that exist and how many new elements have been introduced. I therefore use a table like this}

{If there is a high number of SYS objects customized, I know that cumulative updates from Microsoft will be more costly.}
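The counting itself is trivial once you have an element list; a sketch with hypothetical element names and layers (not a real model export):

```python
# Hypothetical model element list: (element name, layer, overlayered?).
elements = [
    ("SalesTable",      "SYS", True),    # SYS object customized by the ISV
    ("SalesFormLetter", "SYS", True),
    ("CustTable",       "SYS", False),
    ("ISVPriceEngine",  "ISV", False),   # new element introduced by the ISV
]

# Overlayered SYS objects drive the cost of applying cumulative updates;
# new elements are cheap to carry across upgrades.
overlayered_sys = sum(1 for _name, layer, over in elements
                      if layer == "SYS" and over)
new_elements = sum(1 for _name, layer, _over in elements
                   if layer != "SYS")
print(overlayered_sys, new_elements)
```

The ratio between the two numbers is the quickest footprint indicator: many new elements and few overlayered SYS objects is the pattern you want to see.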

15. Best practice deviations
{To evaluate the coding quality, evaluate all best practice deviations that can be found. Dynamics AX has built-in tools for this, and if there is a high number of BP errors and warnings, it means that the code quality is not where it should be.}

Also get any documentation on whether the product has a CfMD certification (Certified for Microsoft Dynamics).

16. Support and training
{Describe how the product is supported by the vendor. There will always be questions and issues, so we need to know how this is handled. Often ISV solutions are purchased through VARs, and then it is good to know how this is handled financially. Also note if any training is included in the offering.}

17. Internal requirements
{Describe the competence and training needed internally to implement, deliver and use this ISV solution.}

18. Road map
{Show what plans there are for the product, what their priorities are, what their policies around upgrades are, etc}

19. Marketing
{If you are a VAR that wants to include an ISV solution offering, it is important to understand how the ISV will support you in promoting the solution. Show what marketing material the ISV has and how you can use it. Do they have marketing insight in the local market? Also find out how the ISV will support and drive market campaigns, and participate in events, demos, etc. with people and/or marketing funds (money)}

ISV solutions play a big role in unlocking all of the possibilities that Dynamics harnesses. When analyzing your Dynamics ERP investment, it’s imperative that you also analyze your ISVs so that you can make the best decision for your business, both in the short and long-term. These topics will lead you toward making better evaluations while looking for an ISV solution. Feel free to use them, and extend them when needed.