Retail assortments and planned orders extensions

Microsoft has created an excellent description of this in the Assortment management docs page. Retail assortments are essential for defining which products should be available across retail channels. Assigning a product to an assortment assigns the product to the stores that carry that assortment, which makes it possible to sell the product in those stores.

But there is something missing, and that is using assortments for replenishment and procurement. Retailers want master planning to only suggest procurement and transfers for products that belong to the stores' assortments. You can control requirement parameters per warehouse, but there is no standard way to use the assortment to control the generation of planned orders.

This blog post shows how to make a very small extension that ensures only products belonging to a store's assortment will generate planned orders. The solution I make use of is to look at how the product lifecycle state functionality works, and extend it with an assortment planning parameter. I have called this parameter “Is lifecycle state active for assortment procurement”.

What it does is validate whether a product is in the assortment of the store. If it is, the product is requirement calculated and will generate planned orders. If the product is not in the store's assortment, no planned orders will be generated.

To make this happen, I needed to create 4 extensions. The first three add the new field to the product lifecycle state form. For an experienced developer this is easy to create, and there is no need to spend time on it in this blog post.

The real fun is how to manage and control the planned order generation. This happens in an extension to the ReqSetupDim class. Here there is a lookup to the assortment of the product, and a check whether the store has this assortment. If not, master planning will not generate any planned orders. I therefore create an extension class and use the method wrapping/Chain of Command (CoC) feature to add some additional code.

/// <summary>
/// Contains extension methods for the ReqSetupDim class.
/// </summary>
[ExtensionOf(classStr(ReqSetupDim))]
final class ReqSetupDim_extension
{
    /// <summary>
    /// Validates whether a product should be assortment planned.
    /// </summary>
    /// <param name = "_inventDimComplete">The complete inventory dimensions of the requirement.</param>
    /// <returns>false if the product is not in the store's assortment; otherwise, the default return value.</returns>
    public boolean mustReqBeCreated(InventDim _inventDimComplete)
    {
        boolean ret = next mustReqBeCreated(_inventDimComplete);

        if (ret && _inventDimComplete.InventLocationId)
        {
            InventTable                 inventTable;
            EcoResProductLifecycleState ecoResProductLifecycleState;

            // Fetch the lifecycle state and product reference for the item
            select firstonly ProductLifecycleStateId, Product from inventTable
                where inventTable.ItemId == this.itemId();

            // Validate that the product is active for planning and that assortment planning is enabled
            select firstonly RecId from ecoResProductLifecycleState
                where   ecoResProductLifecycleState.IsActiveForAssortmentPlanning == true
                    &&  ecoResProductLifecycleState.IsActiveForPlanning == true
                    &&  ecoResProductLifecycleState.StateId == inventTable.ProductLifecycleStateId;

            if (ecoResProductLifecycleState)
            {
                RetailStoreTable                    store;
                EcoResProduct                       product;
                RetailAssortmentLookup              assortmentLookupInclude;
                RetailAssortmentLookup              assortmentLookupExclude;

                RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupInclude;
                RetailAssortmentLookupChannelGroup  assortmentLookupChannelGroupExclude;

                // Find the OMOperatingUnitID of the store from the InventLocationId
                while select OMOperatingUnitID from store
                    where store.inventLocation == _inventDimComplete.InventLocationId
                {
                    // Check whether the product is included in the assortment of the store in question,
                    // and not excluded again by an exclusion line
                    select RecId from product
                        where product.RecId == inventTable.Product
                    exists join assortmentLookupInclude
                        where   assortmentLookupInclude.ProductId == product.RecId
                            &&  assortmentLookupInclude.LineType == RetailAssortmentExcludeIncludeType::Include
                    exists join assortmentLookupChannelGroupInclude
                        where   assortmentLookupChannelGroupInclude.OMOperatingUnitId == store.OMOperatingUnitID
                            &&  assortmentLookupChannelGroupInclude.AssortmentId == assortmentLookupInclude.AssortmentId
                    notexists join assortmentLookupExclude
                        where   assortmentLookupExclude.ProductId == product.RecId
                            &&  assortmentLookupExclude.LineType == RetailAssortmentExcludeIncludeType::Exclude
                    exists join assortmentLookupChannelGroupExclude
                        where   assortmentLookupChannelGroupExclude.OMOperatingUnitId == store.OMOperatingUnitID
                            &&  assortmentLookupChannelGroupExclude.AssortmentId == assortmentLookupExclude.AssortmentId;

                    if (!product)
                    {
                        // The product does NOT belong to the store's assortment and should not be planned
                        ret = false;
                    }
                }
            }
        }
        return ret;
    }
}
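To make the include/exclude rule easier to follow, here is a small Python sketch of the same decision, with plain data structures standing in for the D365 tables (the names and shapes below are hypothetical, for illustration only): a product is planned for a store only if at least one of the store's assortments includes it and none of them excludes it.

```python
# Sketch of the assortment gating rule, with sets standing in for
# RetailAssortmentLookup lines and the store's channel group assignments.
def must_req_be_created(product_id, store_assortments, include_lines, exclude_lines):
    """Return True if planned orders should be generated for the product.

    store_assortments: assortment ids assigned to the store's operating unit.
    include_lines / exclude_lines: sets of (assortment_id, product_id) pairs,
    mirroring the Include/Exclude line types.
    """
    included = any((a, product_id) in include_lines for a in store_assortments)
    excluded = any((a, product_id) in exclude_lines for a in store_assortments)
    return included and not excluded

store = {"A-COFFEE", "A-BASE"}            # assortments of this store
includes = {("A-COFFEE", "1005157")}      # product 1005157 is in the coffee assortment
excludes = {("A-BASE", "9990001")}        # product 9990001 is excluded in the base assortment

print(must_req_be_created("1005157", store, includes, excludes))  # True
print(must_req_be_created("9990001", store, includes, excludes))  # False
```

The notexists join in the X++ code plays the role of the `not excluded` part here: an inclusion line alone is not enough if an exclusion line for the same store also matches.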

I also have code to restrict the creation of manual purchase orders, where similar code can be used, but let's hope that Microsoft can further extend standard Dynamics 365 with assortment-based procurement planning.

Copy with pride, and let's hope next year will give us 365 more opportunities.

POS Invoice Pay – #Dyn365F&O

A very nice omnichannel capability made available in Dynamics 365 version 8.1 is the ability for customers to pay their invoices directly in the POS. One scenario is that a customer is allowed to purchase “on-account” and later pay all the invoices. Let's say a hotel allows its customers to buy food, drinks and services throughout the stay. At the end of the stay the customer pays for all the services at the reception, like “pay-before-you-leave”.

There is no requirement that the goods have to be sold in a POS; it is fully omnichannel capable, so the orders can be created in the call center, on the web or in stores. I would like to share how you can set this up in the Contoso demo data set. If you open the functionality profiles, you will find the options to enable paying:

  • Sales order invoice
  • Free text invoice
  • Project invoice (Yes! Even project invoices!)
  • Sales order credit note

The next thing you need to do is to add a “Sales invoice” – button to the transaction screen. (I’m using Houston store, and button grid F2T2)

This will add a sales invoice button to the POS design, that allows for paying invoices in POS.

The next thing is to create a POS transaction/order. First select a customer (like Karen), and then use the on-account button to sell the goods.

On the payment screen you can specify how much you would like to put on account, and you can also see that the credit limit and balance are available.

The next step requires some periodic batch jobs to run:

1. Run the “P-job”, to fetch the transactions from the channel database.

2. Run the “Calculate statement” (manually or in batch)

3. Run the “Post statement” (This process will create the sales order and the invoice)

!Make sure the statement is posted and invoiced before continuing!

You now have the option to continue the process in Dynamics 365 and send the invoice to the customer automatically through print management, or have the customer come to the “reception” and pay for the goods directly.

To pay the order, select the Karen customer, and use the Sales Invoice button.

If you have done everything right, you should now find the invoice in the list. (If you have enabled aggregation in the parameters, you will have a single invoice per customer.)

I can then select the invoice (or several), and pay it using cash, card, loyalty points (and even on-account again).

This opens up some very nice omnichannel processes, and I hope that Microsoft invests further in this. It would be nice to see the actual lines on the invoices being paid, and even to print out the invoice if the customer requires it. For retailers, I also suggest using the modern report capabilities to make the invoice look awesome.

Take care friends, and thanks for all your support and encouragement!

Retail category managers, simplify your import of released products in #Dyn365FO

It is a category manager’s job to try to maximize profit from selling products within a specific category. This may be looking after a broad category such as ‘confectionery’ or they may focus closely on a more specific category, such as ‘snacking’. A category manager will analyze complex data collected on shopper behavior from a range of different sources, and then translate it into meaningful information. The category manager’s duty is to ensure that their company is providing the market with the products that consumers desire.

Retail category managers love Excel. It is used for almost everything, and they perform much of their analysis, lookups, data collection and decision making in Excel. When implementing Dynamics 365 we are often faced with large sets of Excel spreadsheets that need to be imported. I have seen users import 8 different Excel spreadsheets just to import products. This blog post is about how to simplify the process of keeping retail master data in a single Excel sheet and easily importing and updating products. For this, the Dynamics 365 data management framework is used. One of the problems I often see users struggling with is that the source is a single Excel spreadsheet, but it needs to be imported into several data entities. For a retailer, some of the most common master data entities are:

Data entity | Description of data entity
Products V2 | Contains product number, product name and dimension groups
Released products V2 | Contains most fields on the released product
Item – bar code | Contains the item barcodes used for scanning
Default order settings | Contains information like minimum purchase quantity etc.
External item descriptions for vendors | Vendors' item numbers and descriptions
Product category assignments | The connection to the retail category hierarchy

It is possible to create a single Excel spreadsheet that covers all of these entities, and in a single run import or update the retail products.

So how to exactly do this?

Create an Excel spreadsheet with exactly the following.

I recommend creating two sheets. The first is a “read me” sheet that explains the “template” sheet.

Use exactly the column names as described here. This will make the mapping between the columns and the data entity automatic. Here I also use color coding to show what entity each column mainly belongs to.

Field | Example value | Comment | Tables
ITEMNUMBER | 1005157 | Product number | Released Products, Products
PRODUCTNUMBER | 1005157 | Product number | Released Products
PRODUCTNAME | Jalla Coffee 500G | Item name | Released Products, Products
PRODUCTSEARCHNAME | 4001392 Jalla Coffee FILTER 500G | Search name | Released Products, Products
SEARCHNAME | Jalla Coffee FILTER 500G | Search name | Released Products
PRODUCTDESCRIPTION | Jalla Coffee Original is a useful coffee that can be enjoyed on most occasions. A carefully selected mix of coffee types, mainly from Brazil, guarantees a round and full-bodied coffee with a long aftertaste | Full item description | Released Products, Products
PRODUCTSUBTYPE | Product | Should always be “Product” | Released Products, Products
PRODUCTTYPE | Item | Item or Service | Released Products, Products
STORAGEDIMENSIONGROUPNAME | SiteWhLoc | Name of the storage dimension group | Released Products, Products
ISPURCHASEPRICEAUTOMATICALLYUPDATED | Yes/No | Should the last purchase price be updated automatically | Released Products
ISUNITCOSTAUTOMATICALLYUPDATED | Yes/No | Should the cost price be updated automatically | Released Products
PRODUCTGROUPID | WHI | WHI (warehouse controlled) or SRV (service) | Released Products
INVENTORYUNITSYMBOL | PCS | Inventory unit | Released Products
PURCHASEUNITSYMBOL | PCS | Purchase unit | Released Products
SALESUNITSYMBOL | PCS | Sales unit | Released Products
PURCHASEPRICE | 0 | Latest purchase price in local currency | Released Products
UNITCOST | 0 | Latest cost price in local currency | Released Products
SALESPRICE | 0 | Default sales price in local currency | Released Products
NETPRODUCTWEIGHT | 0,5 | Weight of the product | Released Products
PRIMARYVENDORACCOUNTNUMBER | 20086 | Primary vendor | Released Products
PURCHASESALESTAXITEMGROUPCODE | Middle | Purchase item tax group | Released Products
SALESSALESTAXITEMGROUPCODE | Middle | Sales item tax group | Released Products
BUYERGROUPID | P108 | Buyer group | Released Products
TRACKINGDIMENSIONGROUPNAME | None | Tracking dimension group | Released Products, Products
BASESALESPRICESOURCE | PurchPrice | Base sales price on purchase price? | Released Products
DEFAULTORDERTYPE | Purch | Default order type | Released Products
ITEMMODELGROUPID | FIFO | Item model group | Released Products
PRODUCTCOVERAGEGROUPID | Min/Max | Coverage group | Released Products
COUNTGROUPID | PER | Count group | Released Products
PURCHASEPRICEQUANTITY | 1 | Purchase price quantity | Released Products
UNITCOSTQUANTITY | 1 | Cost price quantity | Released Products
DEFAULTLEDGERDIMENSIONDISPLAYVALUE | -D30-320—P108 | Financial dimensions (=”-D30-320—“&B34) | Released Products
Product Dimension | P108 | Just a helper column | Help column for DefaultLedgerDimension
ProductCategoryHierarchyName | Retail category | Retail hierarchy name | Product category assignments
ProductCategoryName | Coffee | Category node | Product category assignments
VendorProductNumber | 4001392 | Vendor's item number | External item descriptions for vendors
VendorProductDescription | Jalla Coffee FILTER 500G | Vendor's item name | External item descriptions for vendors
VendorAccountNumber | 20086 | Vendor number | External item descriptions for vendors
BARCODESETUPID | EAN13 | Barcode type | Item – Bar Code, Released Products
BARCODE | 7041011050007 | Barcode | Item – Bar Code
PRODUCTQUANTITYUNITSYMBOL | PCS | Barcode unit | Item – Bar Code
ISDEFAULTSCANNEDBARCODE | Yes | Default scanned barcode, yes/no | Item – Bar Code
PRODUCTQUANTITY | 1 | Barcode quantity | Item – Bar Code
PURCHASEUNDERDELIVERYPERCENTAGE | 20 | Purchase under-delivery percentage allowed | Released Products
PURCHASEOVERDELIVERYPERCENTAGE | 20 | Purchase over-delivery percentage allowed | Released Products
MINIMUMPROCUREMENTORDERQUANTITY | x | Minimum purchase quantity | Default Order Settings
MAXIMUMPROCUREMENTORDERQUANTITY | x | Maximum purchase quantity | Default Order Settings
STANDARDPROCUREMENTORDERQUANTITY | x | Standard purchase quantity | Default Order Settings
PROCUREMENTQUANTITYMULTIPLES | x | Multiples of purchase quantity | Default Order Settings

The template Excel spreadsheet should contain exactly the columns listed above.

Then start building the spreadsheet (this is the time-consuming part). It can be regarded as the “master file” for products, and mass updates and mass imports of products are done using this file. Remember that you can add more columns and also include calculated fields. In this case, the default dimension (used for the financial dimension) has a formula like =”-D30-320—“&B34, making sure that cell B34 is merged into the financial dimension.
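Because the automatic mapping depends on the column names matching the data entity fields exactly, it can be worth validating the header row before uploading. Here is a small Python sketch of that idea; it does not use the actual data management API, and the per-entity column sets are only an illustrative subset of the table above.

```python
# Sketch: check that a master-spreadsheet header row covers the columns each
# data entity needs, so the automatic mapping will succeed.
# The column sets below are a small illustrative subset, not the full entity schemas.
ENTITY_COLUMNS = {
    "Products V2": {"PRODUCTNUMBER", "PRODUCTNAME", "PRODUCTSEARCHNAME", "PRODUCTTYPE"},
    "Released products V2": {"ITEMNUMBER", "PRODUCTNAME", "SEARCHNAME", "ITEMMODELGROUPID"},
    "Item - bar code": {"ITEMNUMBER", "BARCODESETUPID", "BARCODE", "PRODUCTQUANTITY"},
}

def missing_columns(header_row):
    """Return {entity: missing column names} for entities not fully covered."""
    present = {c.strip().upper() for c in header_row}
    return {
        entity: sorted(needed - present)
        for entity, needed in ENTITY_COLUMNS.items()
        if needed - present
    }

header = ["ItemNumber", "ProductNumber", "ProductName", "ProductSearchName",
          "ProductType", "SearchName", "ItemModelGroupId", "BarcodeSetupId"]
print(missing_columns(header))
# → {'Item - bar code': ['BARCODE', 'PRODUCTQUANTITY']}
```

A check like this catches a renamed or forgotten column before the import run, rather than after the staging step fails.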

Create the data management import project.

In the data management workspace, create an import project, use “+ Add file”, and select the Excel file by using “Upload and add”. Then select all the entities and which sheet in the Excel spreadsheet should be imported:

– Select file
– Select entity name
– Select sheet lookup
– Repeat selecting entity name and sheet lookup until all the data entities needed are selected

After doing this correctly, you should have an import project with the following entities:

You should also click on the “view map” symbol if there is a warning, and simply delete the lines where no mapping was generated, as I have done here for the “Products V2” entity.

The mapping will be done automatically for you, and will only select the fields that are relevant for each data entity.

Your data entities are now ready to be used. I recommend using the data management workspace, selecting the import project and then “run project”.

Then for each data entity I upload exactly the same Excel spreadsheet:

And then click on “import”. If there are any errors, fix them in the Excel sheet or make changes in the staging tables.

What we have then accomplished is a single Excel spreadsheet that the category manager can maintain and work with, and it can be uploaded (several times) into the import project. For trade agreement sales and purchase prices I normally recommend creating a separate Excel spreadsheet.

Then the Excel-loving category managers will be happy, and they can import thousands of products in a very short time.


D365F&O Retail: Combining important retail statement batch jobs

The retail statement functionality in D365F&O is the process that puts everything together and makes sure transactions from the POS flow into D365F&O HQ. Microsoft has made some improvements to the statement functionality, which you can read about here: https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/statement-posting-eod. I wanted to show how to combine these 3 processes into a single batch job.

The following drawing is an oversimplification of the process, but here the process starts with the opening of a shift in the POS (with a start amount declaration), after which selling in the POS begins. Each time the job P-0001 Upload channel transactions is executed, the transactions are fetched from the channel databases and imported into D365F&O. If you are using shift-based statements, a statement will be calculated when the shift is closed. Using shift-based closing can be tricky, but I highly recommend doing it! After the statement is calculated and there are no issues, the statement is posted and an invoiced sales order is created. Then you have all your inventory and financial transactions in place.

 

What I often see is that customers are using 3 separate batch jobs for this. This results in a user experience where the retail statement form contains many calculated statements waiting for statement posting. Some customers say they only want to see statements where there are issues (like cash differences after a shift is closed).

By combining the batch jobs into a sequenced batch job, the calculated statements will be posted right away, instead of waiting until the post statement batch job is executed. Here is how to set this up:

1. Manually create a new “blank” batch job

 

2. Click on “View Tasks”.

3. Add the following 4 classes:

RetailCDXScheduleRunner – Upload channel transactions (also called the P-job)

RetailTransactionSalesTransMark_Multi – Post inventory

RetailEodStatementCalculateBatchScheduler – Calculate statement

RetailEodStatementPostBatchScheduler – Post statement

Here I choose to include upload of transactions, post inventory, calculate statement and post statement into a single batch-job.

Also remember to ignore task failures.

And remember to click on “parameters” to set the parameters on each task, like which organization nodes should be included.

On each batch task I also add conditions, so that the previous step must be completed before the batch job starts on the next.

Then I have a single batch job, and when executed it spawns the subsequent tasks nicely.

The benefit of this is that when you open the statements workspace, you mostly see statements where there are cash differences or issues with master data.

Take care and post your retail statements.


A quick look at download Retail distribution jobs (CDX)

Commerce Data Exchange (CDX) is a system that transfers data between the Dynamics 365 F&O headquarters database and the retail channel databases (RSSU/offline databases). The retail channel databases can be the cloud-based “default” channel database, the RSSU database and the offline databases on the MPOS devices. Using the following figure from Microsoft docs as a starting point, this blog post explains how to understand this in practice.

What data is sent to the channel/offline databases?

In the retail menus you will find 2 menu items: Scheduler jobs and Scheduler subjobs. Here the different data that can be sent is defined.

When setting up Dynamics 365 the first time, Microsoft has defined a set of ready-to-use scheduler jobs that get automatically created by the “initialize” menu item, as described here.

A scheduler job is a collection of the tables that should be sent, and the subjobs contain the actual mapping between D365 F&O fields and channel database fields. As seen in the next picture, the fields on the CustTable table in D365 are mapped to AX.CUSTTABLE in the channel database.

To explore what is or can be transferred, look at the scheduler jobs and scheduler subjobs.

Can I see what data is actually sent to the channel/offline databases?

Yes you can! In the retail menu you should find Commerce Data Exchange, and a menu item named “Download sessions”.

Here you should see all data that is sent to the channel databases, and there is a menu item named “Download file”.

This will download a zip file containing CSV files that correspond to the scheduler jobs and scheduler subjobs.

You can open this file in Excel to see the actual contents. (I have hidden a few columns and formatted the Excel sheet to look better.) So this means you can see the actual data being sent to the RSSU/offline channel database.

All distribution jobs can be set up as batch jobs with different execution recurrences. If you want to keep it simple, set up download distribution job 9999 to run every 30 minutes. If you have a more complex setup and need better control over when data is sent, create separate distribution batch jobs so that you can send new data to the channel databases in periods when there is less load on the retail channels.

Too much data is sent to the channel databases/offline database and the MPOS is slow?

Retail uses change tracking, which makes sure that only new and updated records are sent, minimizing the amount of data. There is an important parameter that controls how often a FULL distribution should be executed. By default it is 2 days. If you have lots of products and customers, we see that this generates very large distribution jobs with millions of records to be distributed. Setting this to zero prevents that. Very large distributions can cripple your POSes, and your users will complain that the system is slow, or they get strange database errors. In version 8.1.3 the default is expected to be changed to zero, meaning that full datasets will not be distributed automatically.

Change tracking seems not to be working?

As you may know, Dynamics 365 has also added the possibility to enable change tracking on data entities when using BYOD. I have experienced that adjusting this affects the retail requirements for change tracking. If this happens, use Initialize retail scheduler to set it right again.

Missing upload transactions from your channel databases?

In some rare cases it has been observed that there are transactions missing in D365 compared to what the POS is showing. The trick to resend all transactions is the following:

Run the script “delete crt.TableReplicationLog” in the RSSU DB. The next P-job will then sync all transactions from the RSSU DB (including the missing ones).

 

Using Cloud POS as your retail mobile device

Handheld functionality for retailers is a question I get a lot, typically in the areas of counting, replenishment, receiving and daily POS operations. In version 8.1 Microsoft has taken a small step forward to make it easier to use any handheld device that supports a common browser. Because Cloud POS (CPOS) runs in a browser, the application isn't installed on the device. Instead, the browser accesses the application code from the CPOS server. CPOS can't directly access POS hardware or work in an offline state.

What Microsoft has done is make CPOS adapt to the screen size, so it works more effectively on your device. To keep it simple, I just want to show you how it looks on my iPhone.

Step 1: Direct your browser towards the URL of where the CPOS is located. In LCS you will find the URL here:

Step 2: Activate the POS on your mobile device by selecting store and register, and log in.

Step 3: Log into CPOS and start using it. Here are some sample screens from my iPhone, where I count an item using CPOS.

You can also “simulate” this in your PC browser by just reducing the size of your browser window before you log into CPOS. Here I'm showing the inventory lookup in CPOS.

What I would love to see more of is:

– Barcode scanning support using camera

– The ability to create replenishment/purchase orders in CPOS

– More receive capabilities like ASN/Pallet receive etc.

– Improved browser functionality (like back-forward browsing etc)

To me it seems clear that we will see additional improvements in CPOS, making it the preferred mobile platform for Dynamics 365 for Retail, as Microsoft is definitely investing in this area. In our own customer projects we will be developing more and more functionality using RTS (Real-time Service) calls to add features to be used together with CPOS.

To take this to the next level, please also evaluate creating a hybrid app that incorporates CPOS in an app-friendly way. Sources say that this will also allow us to build extensions like camera barcode scanning.

The direction is right and my prediction for the future is that: Mobile Retail device = CPOS.

Report your bugs, free-riders!

Microsoft Dynamics 365 is the fastest-innovating and most agile business software in the world: a very feature-rich solution with a packed, fast-moving roadmap. We see new possibilities and features coming monthly in platform updates and the fall/spring releases. If you look at the entire platform stack, including Windows, Office and the platform (Power* apps), new features are being made available on a daily basis. Being first and fast has changed and challenged the Dynamics 365 ecosystem, mostly for the good.

But we have to recognize that there are (highly productive) people behind this innovation tsunami. In such an environment there are thousands of elements that must fit together. If you look at the number of ways you can use and set up Dynamics 365, I would assume there are millions of combinations in the core product. And when adding Office and Power* apps, the combinations increase exponentially.

People are people, and there is a limit to the number of combinations that can be tested, in both manual and automated testing scenarios. This means there is no capacity to test everything before the product is released. It is not possible to test all of the millions of combinations, and I know that even Microsoft does not have unlimited people and resources to cover every test scenario.

This inevitably results in issues and bugs being found when implementing Dynamics 365, and these need to be reported to Microsoft support so that the fixes become part of the future solution.

Searching, testing and reporting a solution takes time and does cost money! Each time I find a bug, I report it to Microsoft so that the whole community can benefit from a fix. But as some have recognized, reporting issues/bugs requires effort and resources: you find the bug, analyze the issue, report it, Microsoft provides a hotfix, and the hotfix needs to be validated, tested and then deployed to the environment. This takes time, but it is necessary!

With this blog post I urge both partners and customers to report your findings to Microsoft, so that the rest of us can benefit from being an ecosystem together. As I hope most of you know, we are quickly moving towards Dynamics 365 version 10, often referred to as the “evergreen” solution. This means that there is ONE version that all customers are using and that follows the Microsoft roadmap. When one customer reports an issue and it is fixed, everyone benefits.

Then there is the issue of the “free-riders”. These are the people who recognize the issue, find workarounds and DON'T invest the time and resources to report it. They know and see the issue, but choose to live with it or ignore it. In many cases Microsoft is not even aware of the issue, and it just continues to be present in future releases. The best approach is to report what you see to Microsoft support or to Microsoft Ideas. Then Microsoft can take action on it, because they know of it.

So, I urge my fellow community friends to not be a free-rider, but to report your issues. This way we all share the resource/time burden, and we also improve and strengthen Dynamics 365, which we all benefit from.

PS! Dynamics 365 is the BEST business application in the world!

Focus18 – EMEA – London

The User Groups for Dynamics 365, AX, CRM, BC/NAV, and Power BI road trip named Focus is arriving in Europe and is making a stop in London on 5-6 September 2018, featuring deep-dive sessions covering advanced topics on D365 Finance and Operations and Customer Engagement, plus topics specific to the Retail space including modern POS, inventory management, sales orders, e-commerce, credit card processing and more. This is great stuff!

It is a privilege for me to participate and present together with great MVPs, Microsoft experts and the Dynamics 365 community. If you want to check out my sessions, I will present the following:

Deep dive into retail pricing and discounts. 

This session is about which product sales price and discount options exist in Dynamics 365 for Retail “out-of-the-box”, with actual, real examples of how to implement and maintain your retail prices.

 

Learn, Try, Buy for Retailers.

“Learn, Try and Buy for Retailers” is an accelerated onboarding approach that enables you to evaluate whether a cloud-enabled Dynamics 365 for Retail is the right direction, and to learn as much as possible prior to performing a business and solution analysis. It suits agile and iterative approaches, and this session shows why buying a small Dynamics 365 license is an affordable investment before the scope of implementation has been defined. Using VSTS (Visual Studio Team Services) is a central topic in this session.

Power BI and Retail.  How to get the numbers.

This session shows how to publish retail transactions into an Azure SQL database or CDS (Common Data Service), and then analyze the retail sales in Power BI.

Check out https://www.focusemea.com/locations/london as there are many other very interesting sessions.


See you in London!


Microsoft Business Applications sessions on-demand and Dynamics 365 version 10

The Microsoft Business Applications sessions are now available on-demand https://www.microsoft.com/en-us/businessapplicationssummit/sessionsondemand

I enjoyed the following sessions:

Client usability and productivity improvements in the October release and beyond for Microsoft Dynamics 365 for Finance and Operations

Monitoring Microsoft Dynamics 365 for Finance and Operations with Lifecycle Services

Microsoft Dynamics 365 for Retail: Reliable data management and payment processing

Microsoft Dynamics 365 for Retail: Delivering cloud driven intelligence and tools to enable enterprise manageability


I also want to highlight the following session, where Microsoft is explaining Dynamics 365 version 10 (Thanks Shelly)

Microsoft managed continuous updates and support experience for Microsoft Dynamics 365 Finance and Operations

Vote on Dynamics 365 ideas

Did you know that you can influence the direction of Dynamics 365? You may be unsure whether it really makes a difference. Microsoft has a site where the community can add ideas and vote on them. Go to https://experience.dynamics.com/ideas/ and create your ideas. If an idea is valid and gets enough votes, Microsoft will act and include it in their product backlog. But equally important is the ability to vote on others’ ideas.

  • Voting is the most important way to make the community voice heard on the issues that concern the roadmap for Dynamics 365.
  • Voting gives you an opportunity to be part of the prioritization that shapes Dynamics 365.
  • If YOU don’t vote, others will make the decisions for YOU!

As we speak, there are 1673 ideas for Microsoft Dynamics 365 for Finance and Operations and 212 ideas for Microsoft Dynamics 365 for Retail. Microsoft employees are among the most active at adding ideas to the site.

The ideas portal allows you to follow an idea as it moves from being a new suggestion to being part of the product:

An important unofficial note: for an idea to be moved from “New” to “Under Review”, it requires at least 10 votes. Discussion is also possible on the ideas, to add additional substance to the requirements.

You can also keep track of your own ideas and votes you have submitted.

If I have a few minutes of spare time, I like to go in and look at the new ideas submitted and read them. When there are ideas I like, I vote on them.

The more we use this channel to give ideas and feedback, the more important it will be. So please go in and vote at https://experience.dynamics.com/ideas/

(And if you find some of mine, please give it a vote )

MSDYN365FO: Automate repetitive tasks – the easy way

The other day, I got the task of posting a few thousand Retail Kit orders / BOM journals because they had failed the first time. I started, and managed to manually post 50 journals before my fingers cramped and I started to feel dizzy. I could not multi-select the journals and post them, so I had to manually click “Post” on each journal.

I did send an SR to Microsoft explaining that this should be easier in standard, and that SR is in process. But it will probably end up in an “As-Designed” state, or “post it to ideas.microsoft.com”.

But there is an easier, low-tech way of solving this: just install a mouse-ghost app, and it will repeat the task for you. I used the app “Mouse Recorder Premium” to post all 1,300 journals, and it went smoothly. Just record the clicks and then repeat them a thousand times.

To make sure I did not “lock” my PC while this was running, I started the task in a Hyper-V VM, so it could run in the background.

That’s today’s small trick for getting rid of repetitive tasks.

D365FO – Some nice excel tricks

When working with importing master data into Dynamics 365, you will find that the data is spread across different data entities. In a typical retail project you would need to import data like released products, item barcodes, external item numbers, and prices. It is also common to get the master data in many files and in different formats. It is therefore quite beneficial to know a few tricks that make it easier to work with large amounts of data. Here are my tips.

Export all/selected rows (You should know this!)

From any grid in D365FO you can export selected/all rows to excel by right clicking on the grid. The tip is therefore to make a personalization to the grid, so that it contains the fields you want to export to excel.

Excel then opens with the selected columns. (PS! This export is limited to 10,000 rows.)

Use excel to create a filter

Let’s say we have an Excel spreadsheet with item numbers, and we want to filter on these items in D365FO. Here is a very valuable tip.

  1. Copy the items column from Excel and paste it as a row in a new Excel sheet (Transpose).

  2. Copy the row, and paste it into Notepad.

  3. Do a search-and-replace in Notepad, where you copy the space/tab character and replace it with a comma (,).

  4. Copy the resulting content and use it in a “match” filter in D365FO.

  5. You have now created a filter on the selected field. It seems the “match” filter is capable of handling quite a lot of text.

This is nice when someone asks you to “Please fix these 200 items”. You can then filter them and quite quickly go through and fix them.
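If you repeat this trick often, the Notepad steps can be scripted. Here is a minimal Python sketch of the same transformation; the item numbers and the function name are illustrative, not part of the original tip:

```python
# Build a comma-separated value list for a D365FO "match" filter
# from item numbers copied out of an Excel column (one per line).
def build_match_filter(lines):
    # Drop blanks and surrounding whitespace, then join with commas
    items = [line.strip() for line in lines if line.strip()]
    return ",".join(items)

# Example: item numbers as they might look pasted from Excel
pasted = ["1000", "1001", " 1002 ", ""]
print(build_match_filter(pasted))  # -> 1000,1001,1002
```

Paste the resulting string straight into the “match” filter field.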

Learn Excel VLOOKUP

VLOOKUP is essential to learn, because it lets you check and look up data across multiple Excel sheets. A typical scenario in the retail world is when a vendor sends a new price list and you want to import it. Often this is delivered as an Excel sheet with the vendor item number, the item barcode, and the price. Most retailers prefer to use their own item numbers, but then you have the issue of mapping the item barcode from the vendor price list to find your own product number. Here is how I recommend my customers do it:

  1. Export all D365FO item barcodes to Excel (there is an entity for this, or open the barcodes from the Retail menu).
  2. In the vendor Excel price list, create a VLOOKUP field to look up the D365FO product number based on the item barcode.

  3. Then create an Excel sheet with your own product numbers, and import it using “Open in Excel” or through a data management import job.
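The mapping that the VLOOKUP performs can also be expressed in a few lines of code, which may make the lookup logic clearer. A small Python sketch; all barcodes, item numbers, and prices below are made up for illustration:

```python
# Map vendor price-list rows to your own product numbers via the item barcode,
# mimicking a VLOOKUP on the barcode column (all data is illustrative).
d365_barcodes = {
    "5701234567890": "P-0001",  # item barcode -> own product number
    "5709876543210": "P-0002",
}

vendor_pricelist = [
    {"vendor_item": "V-100", "barcode": "5701234567890", "price": 49.90},
    {"vendor_item": "V-200", "barcode": "5709876543210", "price": 129.00},
    {"vendor_item": "V-300", "barcode": "1111111111111", "price": 10.00},
]

for row in vendor_pricelist:
    # None corresponds to #N/A in Excel: the barcode is unknown in D365FO
    row["product"] = d365_barcodes.get(row["barcode"])
```

Rows where the product ends up as None are the ones you would have to investigate manually before importing.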


Happy weekend friends !

Measure sales per Retail Category in Power BI

Drill-down on sales per category, employee, and department is a key essential for retailers. It gives a more specific view of what is generating sales and what isn’t. Insight into top categories or departments can help drive decisions about purchasing and marketing. A good point of sale comes with reporting and analytics, so you can quickly get the data you need, whenever you need it, without manual calculations.

Power BI is a must-have for all retailers, and this blog post is about creating a retail category hierarchy in Power BI.

If you have worked with retail categories, you know that there is a parent-child relationship between the categories, as illustrated by the following data from the Contoso demo data set.

In Power BI it is possible to create such hierarchies too, but it requires some minor changes. My inspiration came from Power BI Tutorial: Flatten Parent Child Hierarchy. I will not go through how I build a retail Power BI analysis, but I can share that I use OData entities, and here are the entities I’m using:

More information on the data model is available in DOCS here.

The “trick” is to create a new column named “Path”, and a column named CategoryL[X] for each level in the hierarchy, which for the RetailProductHierarchyCategories looks like this:

Here are the column formulas

Path = PATH(RetailProductHierarchyCategories[CategoryName];RetailProductHierarchyCategories[ParentCategoryName])

CategoryL2 = PATHITEM(RetailProductHierarchyCategories[Path];2)

CategoryL3 = PATHITEM(RetailProductHierarchyCategories[Path];3)

CategoryL4 = PATHITEM(RetailProductHierarchyCategories[Path];4)

CategoryL5 = PATHITEM(RetailProductHierarchyCategories[Path];5)

…etc
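To make the mechanics clearer, the same flattening can be sketched outside DAX. A small Python illustration of what PATH and PATHITEM compute; the category names below are made up, not from the Contoso data:

```python
# Mimic DAX PATH()/PATHITEM() for a parent-child category table.
# Each category maps to its parent; None marks the root.
parents = {
    "Audio": None,
    "Headphones": "Audio",
    "In-Ear": "Headphones",
}

def path(category):
    # PATH(): pipe-separated chain from the root down to the category
    chain = []
    while category is not None:
        chain.append(category)
        category = parents[category]
    return "|".join(reversed(chain))

def path_item(p, level):
    # PATHITEM(): the item at the 1-based level, or None when the level is missing
    parts = p.split("|")
    return parts[level - 1] if level <= len(parts) else None

p = path("In-Ear")             # "Audio|Headphones|In-Ear"
category_l2 = path_item(p, 2)  # "Headphones"
```

This is exactly why leaf categories get a value in every CategoryL[X] column, while shallow categories get blanks on the deeper levels.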

Then I create a new hierarchy from these columns, where I specify the levels:

And I use the Hierarchy Slicer that is available in the Power BI marketplace.

In Power BI I then get a retail category slicer, and can filter and measure sales per category.

Microsoft is in the process of aligning with the future of Power BI and creating a new version of Retail Channel Performance with the new Common Data Service for Analytics capability coming to Power BI: https://powerbi.microsoft.com/en-us/cds-analytics/

Keep on rocking #MSDYN365FO!

First Aid Kit for Dynamics 365 for Retail; A messy blog post

First, I want to say that Microsoft Dynamics 365 for Retail is the best retail system in the world. What we can do is just amazing! This blog post is going to be a mess without meaningful structure, because its purpose is to quickly give 911-help to retailers so that they can continue their daily operations. The post primarily focuses on the MPOS (Modern POS) with an offline database and a local RSSU (Retail Store Scale Unit). It will also be incrementally changed and new topics will be added, so please feel welcome to revisit later.

MPOS Hardware

Microsoft do not give recommendations on hardware, but they have tested some. I can also share what works in a scenario where an offline database should be installed on the MPOS.

HP RP9 G1 AiO Retail System, Model 9018
Microsoft Windows 10 Enterprise 64-bit OS – LTSB
HP RP9 Integrated Bar Code Scanner (as a secondary mounted scanner)
128 GB M.2 SATA 3D SSD
16 GB RAM
Intel Core i5-6500TE 3.3 6M 2133 4C CPU
HP RP9 Integrated Dual-Head MSR – Right (for log-on card reading)
HP L7014 14-inch Retail Monitor – Europe (for dual display)
HP LAN THERMAL RECEIPT PRINTER – EUROPE – ENGLISH LOCALIZATION (TC_POS_TERMALPRINT_BTO)

A small tip: OPOS devices are slow and unpredictable, so try to avoid them. With this hardware, however, we still had to use OPOS for the receipt printer and the cash drawer.

All drivers related to this machine are available here.

Payment terminals

Building payment connectors is time consuming, but Microsoft have provided documentation and samples that are available here. Personally, I prefer ISV solutions for this.
◾ Ingenico iPP 350 payment terminal (requires an ISV payment solution)

Additional Scanners

◾ SYMBOL DS9808

◾ Datalogic – Magellan 3200Vsi

Remember to open the scanner documentation and scan the programming barcodes to enable Carriage Return/Line Feed, adjust beeping, etc.

Generic preparation recommendations when having issues

The following sections contain some preparation steps that you should be ready to perform.

Install TeamViewer on the MPOS device

To make sure a professional can quickly analyze the device, we always try to use or install TeamViewer on the RSSU and MPOS devices. This makes it possible to access the machines remotely. Please follow security precautions when using TeamViewer.

Start collecting information

Dynamics 365 for Retail logs a comprehensive set of events that is available to IT resources. Please check out the following page for additional troubleshooting steps.

https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-component-events-diagnostics-troubleshooting

The following sections cover issues experienced when manually installing Dynamics 365 MPOS.

If you cannot figure it out quickly, create a Microsoft support request as fast as you can. Microsoft normally responds fast and can give recommendations quite quickly, but they will often need information from the actual machine to see if there are issues related to software or hardware. MPOS and RSSU log a tremendous amount of information that is relevant for a support case. Take pictures and screenshots, and collect data.

Event logs

Always look into the event logs on the MPOS and the RSSU, and learn to export them, as they can give valuable information on what is wrong. The following event logs are of interest:

•    Windows > Application
•    Windows > Security
•    Windows > System
•    Application and Services Logs > MPOS/Operational

Machine information

Collect Microsoft System Information, which shows the devices installed in the MPOS, the device drivers loaded, and so on. To collect this data:

  • Run a Command Prompt as an Administrator
  • Execute MSINFO32.exe
  • Go to Menu File > Save as machine.nfo
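As a side note, msinfo32 also has a command-line switch that saves the report directly, so the Save As step can be skipped. The output path below is just an example:

```
msinfo32 /nfo C:\temp\machine.nfo
```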

Backups of the local database

Take backups of the RSSU and local databases, as they can be handy for analyzing the data composition. Sometimes Microsoft will ask for the exact database version and information like:

  • What version of SQL is this?

    Further, is this Standard, Enterprise, Express, etc.?
    => Run query select @@version and share the resulting string.

  • How large is the SQL DB at this time?
  • Plenty of space available on the hard drive still?
  • What is the current size of the offline database and RetailChannelDatabase log file?

RSSU installation and Checklist

The setup and installation of RSSU is documented in the Microsoft DOCS https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-store-scale-unit-configuration-installation

  • Operating system is Windows 10 Enterprise LTSB with a separate disk for SQL. SSD disks are highly recommended!
  • SQL Server 2016 Standard edition with full-text search installed locally on the server.
    – I would not recommend SQL Express on an RSSU with multiple MPOSes installed.
  • Install .NET 3.5 and 4.6 and IIS, and run Windows Update before setup.
  • Make sure that the SSL certificates (RSSU and MPOS) have been installed and set up on the machine. Remember to add them to your Azure account.
  • Verify that you have Azure AD credentials that you can use to sign in to Retail headquarters.
  • Verify that you have administrative or root access to install Retail Modern POS on a device.
  • Verify that you can access the Retail Server from the device (like a ping with https://XXX.YY.ZZ/RetailServer/healthcheck?testname=ping).

  • Verify that the Microsoft Dynamics 365 for Retail, Enterprise edition, environment contains the Retail permission groups and jobs in the Human resources module. These permission groups and jobs should have been installed as part of the demo data.

A small but important note about the RSSU: it is designed to always have some kind of cloud connection. If it loses this connection, strange issues start to occur, especially in relation to RTS calls (Real-time Service calls).

Set Async interval on RSSU

This has been described in a previous blogpost.

Installation of MPOS issues

There are a number of prerequisites that need to be followed, available on Microsoft DOCS. Read them very carefully and follow them to the letter. Do not assume anything unless it is stated in the documentation. Also read https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/dev-itpro/retail-device-activation. Here are my additional tips:

When having customizations or extensions

If you have made extensions, make sure that the developer who made the deployable package built it with “configuration = Release”. Otherwise, the MPOS installation can give issues like this.

There are scenarios where you may want to make an MPOS build with configuration = debug for internal use; please take a look at the following Microsoft blog post.

Having the right local SQL express with the right user access on the MPOS

If you are making a retail POS image (with Windows and SQL preinstalled), make sure to select the right SQL version (currently SQL 2014 SP2). If SQL Express is not already installed, the MPOS installer will automatically download and install it. But the file is 1.6 GB, so it is recommended to install SQL Express manually, or to have it as part of the standard image. SQL Express is available here; select SQLEXPRADV_x64_ENU.exe.


There are ways of using SQL Express 2017 with MPOS, but I recommend waiting until Microsoft officially includes it in their installer. Also remember that SQL Express has some limitations: it can only use 1 GB of RAM, and it has a 10 GB database size limit.

I recommend creating two users on a MPOS machine:

– A PosUser@XXX.YYY, a user with very limited rights on the machine; customers often want auto-login to the machine using this user. This user needs administrator elevation whenever it must perform administrative tasks on the machine.

– A PosInstaller@XXX.YYY, which has administrator rights on the local MPOS machine.

When installing SQL Express, remember to add both the PosUser and the PosInstaller as users in SQL; otherwise the installer struggles to create the offline databases.

Cannot download MPOS package from Dynamics 365

If you try to manually download the installation package, Internet Explorer may sometimes be set up to deny this.

The reason could be a certificate problem with the package. The workaround is to use Chrome for the download.

Cannot install the MPOS Offline package

When installing the MPOS, the following errors may appear. In many cases the user must be elevated to administrator. If you receive the error below, the version you are installing is older than the existing version, and the current version must be uninstalled first. Do not try to install a higher version than is deployed in your cloud RSSU default database, as this is not supported. Also, if you need to “downgrade” an MPOS, uninstall the MPOS first, and then reinstall the older release.

PowerShell scripts for manual uninstalling of MPOS

In 95% of situations, just uninstalling the MPOS app should work. But if you are out of options, Microsoft have created an uninstall PowerShell script:

Cd “C:\Program Files (x86)\Microsoft Dynamics 365\70\Retail Modern POS\Tools”

Uninstall-RetailModernPOS.ps1

I often experience that we need to run the uninstall in the following sequence:

1. Run it as a local administrator

2. An “uninstall” icon then appears on the desktop; click on it

3. Run it again as a local administrator

Then the MPOS is gone, and you can reinstall the correct MPOS.

Connectivity issues

Here are some tips on connectivity issues, and how to solve them.

MPOS is slow to log in

When starting the MPOS, it can sometimes take a few seconds before it is available. We see this if you have a slow internet connection with high latency. The MPOS does some work towards the cloud, and this just takes time.

MPOS cannot go online after being offline

I think this behavior is currently a bug that can happen in certain situations, for example if the RSSU loses internet connectivity. Microsoft is investigating the causes. If it is not possible to go online after the MPOS has been offline, you can reactivate the MPOS to get online. In the event log you may see issues like “UpsertAndValidateShifts”.

Rename the file: C:\Users\[POS-User]\AppData\Local\Packages\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt\AC\Microsoft\Internet Explorer\DOMStore\DSSWV5L9\microsoft.dynamics.retail[1].xml

Then reactivate the MPOS with the RSSU address, register, and device, and log in with the D365.posinstaller.

IMPORTANT: Remember to select the hardware station when logging into the MPOS afterwards!

This is not a supported “fix” from Microsoft, and it is expected that Microsoft will find a permanent solution to this issue.

MPOS cannot connect with the payment connector

The following is mainly related to issues that can happen with a third-party payment connector using a PIN pad. In most cases this is not relevant for those using the standard or other payment connectors.

1. First check that Hardware station is selected on the MPOS.

2. The next step is to reboot the PC

3. If it is still not working, copy the file MerchantInformation.xml to the folder “C:\ProgramData\Microsoft Dynamics AX\Retail Hardware Station” AND to C:\Users\[POS-User]\AppData\Local\Microsoft Dynamics AX\Retail Hardware Station. This ensures that the payment also works as expected in offline mode. MerchantInformation.xml is a file that is downloaded from the cloud the first time the POS is started; if the hardware profile changes, it must be downloaded again.

4. If it is still not working, open the hardware profile, set the EFT Service to Payment connector, and test the connector. This will download the MerchantInformation.xml again.

Then run the 1090 distribution job. After X minutes, restart the MPOS and try to perform a payment. This should also automatically regenerate the MerchantInformation.xml. Microsoft is working on a fix for this, and you can follow the issue here.

PS! Normally a production environment should not need a connection to the Microsoft test connector.

Retail offline database exceeds 10 Gb limit

To ensure that a POS doesn’t exceed the SQL Express 10 GB restriction, I have created a SQL script that reduces the size of the log file. Please evaluate implementing it on all POSes.
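My script is not included in this post, but the core idea can be sketched in a few lines of T-SQL. The database and logical log-file names below are assumptions; check yours with sp_helpfile before running anything:

```sql
-- Hypothetical sketch: shrink the offline database log file so the
-- SQL Express size limits are not threatened. Names are examples only;
-- verify the logical file name with: EXEC sp_helpfile;
USE RetailOfflineDatabase;
GO
-- Simple recovery keeps the log from growing unbounded
ALTER DATABASE RetailOfflineDatabase SET RECOVERY SIMPLE;
GO
-- Shrink the log file to roughly 100 MB
DBCC SHRINKFILE (RetailOfflineDatabase_log, 100);
GO
```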

Getting strange errors like “The channel does not exist or was not published”

In some rare situations you could experience errors like this. Our experience is that this can happen if the database on the RSSU is overloaded and not able to respond to MPOS connections. Log into the RSSU and check whether the CPU, database, or disks are unable to keep up. We have experienced this when running SQL Express on the RSSU. Also try not to push too many distribution jobs too frequently. In one situation we uploaded 400,000 customers while running distribution job 1010 (customers) every 5 minutes. That “killed” the RSSU when it had SQL Express.

Getting strange errors like “A database error occurred”

We have also experienced this when the RSSU is overloaded. Remember that the RSSU hardware needs to be scaled according to how many MPOSes are connected and to the data and transaction volume. Get a SQL expert to evaluate the setup of the RSSU prior to go-live, and remember to volume test the setup.

How to fix it? Scale up your RSSU.

Getting strange errors like “We were unable to obtain the card payment accept page URL”

We have also experienced the following issue. The solution was simple; Remember to enable the local hardware station on the MPOS.

Getting strange errors like “StaffId”, when returning a transaction

In a situation where there is a connection between the MPOS and the RSSU, but the RSSU does not have a connection to the cloud, AND you perform a “return transaction”, you may get the following error.

“Return transaction” is defined as an operation that requires online RTS (Real-Time Service) calls. The following list defines all POS operations and whether they are available in offline mode.
The solution in this situation is therefore to use the POS operation “Return Product” on the MPOS instead.

Keep an eye on your devices.

In the menu item Channel client connection status you can see the last time each device was connected.

Functional issues

By functional issues I mean issues related to user errors and other functional problems that can occur.

Dynamics 365 for Retail on version 8

Even though version 8 has been launched for Dynamics 365 for Finance and Operations, I have not yet seen (as of 10 May 2018) that Retail is supported on version 8. So before moving forward on version 8, please check with Microsoft support.

Barcode scanned as tendered currency amount

This is a funny issue that can occur. Some background story is in order here: a customer wanted to pay for a product in another currency, and the cashier selected “pay currency” on the MPOS, ready to key in the amount the customer was paying. Unfortunately, the cashier scanned the product barcode instead, and the MPOS committed the sale as if the customer had paid 7.622.100.917,80 in that currency and should have 5.707.750.079.417 in return (local currency). Lesson learned: always remember to set the “Overtender maximum amount” parameter and the amounts fields.

How to fix it? You actually need to create a Microsoft support request to have Microsoft perform changes in the database. This takes time, and it first has to be performed in an updated staging environment. It can take a lot of time! So make sure you set these parameters correctly before you go live.

Cannot post retail statement, because of a rounding issue.

This is a known issue, and Microsoft has a hotfix for it. Always make sure you periodically update your system with the latest hotfixes. Here is my small tip: try clicking Post 4-5 times, and then it suddenly goes through and gets posted. We do not know why!

Retail statement (Legacy) and Retail Statement

In version 7.3.2, Microsoft released a new set of functionality for calculating and posting retail statements. You can read more about it here. Microsoft recommends that you use the Retail statements configuration key for the improved statement posting feature, unless you have compelling reasons to use the Retail statements (legacy) configuration key instead. Microsoft will continue to invest in the new and improved statement posting feature, and it’s important that you switch to it at the earliest opportunity to benefit from it. The legacy statement posting feature will be deprecated in a future release.

Access hidden Retail menu items.

The form “Retail Store transactions” contains all retail transactions received from the MPOS/RSSUs; here you will find sales, logins, payments, etc. The first step for any user should be to personalize this form and show only the relevant fields and columns (not done here).

You can dig deeper into the transactions by clicking the “Transactions” menu.

If I open “Payment transactions” here, I get a filtered view of the payment transactions related to that receipt.

BUT! In many cases you would like to look at ALL the payment transactions, not only those related to a specific receipt. And there is no menu item that lets you see all payment transactions in one form.

Here is my tip: right-click on the form, and you can see the form name. Click on that…

And you should be able to see the menu item name.

Then copy your D365FO URL, replace the menu item name, and open it in another browser tab.

You then get a nice list of all payment transactions, regardless of which receipt they are connected to.

This procedure can be used in most places in Dynamics 365. For retail this is excellent, because sometimes you need to find specific transactions. If you need to reconcile banked transactions (where you have a bag number), you can use this approach to see all banked bag numbers in a single form. Here is a list of the most common ones:

Sales transactions(items) &mi=RetailTransactionSalesTrans
Payment transactions &mi=RetailTransactionPaymentTrans
Discount transactions &mi=RetailTransactionDiscountTrans
Income/Expense transactions &mi=RetailTransactionIncomeExpenseTrans
Info code transactions &mi=RetailTransactionInfocodeTrans
Banked declaration transactions &mi=RetailTransactionBankedTenderTrans
Safe tender transactions &mi=RetailTransactionSafeTenderTrans
Loyalty card transactions &mi=RetailTransactionLoyaltyRewardPointTrans
Order/Invoice transactions &mi=RetailTransactionOrderInvoiceTrans
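Putting it together, the resulting URL looks something like this (the environment host name and the company are made up for illustration; mi is the menu item name found above):

```
https://myenvironment.cloudax.dynamics.com/?cmp=USMF&mi=RetailTransactionPaymentTrans
```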

Unit conversion between <unit 1> and <unit 2> does not exist.

If you use Retail Kitting and have kits with intraclass unit conversions, there is an issue that Microsoft is working on. This concerns scenarios where the included kit line is stocked in pieces and consumed in centiliters. Luckily a fix is expected.

Wrong date format on the POS receipt.

In EN-US the date format is MM/DD/YYYY; in Europe we use DD/MM/YYYY. The date format on the receipt is controlled by the language code defined on the store. We often prefer EN-US as the language on stores, but this gives the wrong date format. To get the right date format on the receipt, you therefore have to maintain product names/descriptions in multiple languages (like both EN-US and EN-GB) and specify that the language on the POS store should be EN-GB. We are working on finding a better and more permanent solution to this.

Dual display.

Microsoft writes: “When a secondary display is configured, the number 2 Windows display is used to show basic information. The purpose of the secondary display is to support independent software vendor (ISV) extension, because out of the box, the secondary display isn’t configurable and shows limited content.” In short: you have to create/develop it yourself in the project. This requires a skilled retail developer who masters the RetailSDK, C#, and JavaScript.

Credit Card payment with signature

In certain situations the payment terminal is capable of processing the payment, but for some reason this does not close the “waiting for customer payment” dialog. In most cases this is related to the payment terminal being able to perform offline transactions; the terminal then prints a receipt that the customer must sign. In such cases we have created a separate payment method called “pay with signature”, which is posted in exactly the same way as a credit card payment method. The cashier is then able to continue the payment processing, register that the payment was OK, and print the receipt.

If the cashier did something very wrong, suspend the transaction

If for some reason the cashier is not able to continue the transaction, the cashier has the option of suspending it and moving on. Later, the POS experts can resume the transaction and find out what went wrong.

Setting up MPOS in tablet mode

The MPOS works very nicely in tablet mode. But if you have a dual display, the PC cannot be put into tablet mode. We have not found a way to fix this; if you know how, please share.

MPOS resolution and screen layout does not fit the screen

Do not just set the MPOS resolution to the screen resolution. If there is a “title bar”, you need to subtract the title bar height from the screen layout. This is especially important in scenarios where you have dual displays.

Use lock screen and not log off on the registers.

The log-out/log-in process is more “costly” from a resource perspective than the lock operation.

Keep the MPOS running (but logged out) when not using the device.

Dynamics 365 periodically sends new data to the MPOS offline database throughout the day and night. The MPOS is then “fit for fight” when the user logs in.

Run Distribution jobs in batch

My guideline on retail distribution jobs is that all retail jobs start with an R prefix followed by a number: download distribution jobs are R1000-R1999, upload distribution jobs are R2000-R2999, processing batch jobs are R3000-R3999, and retail supply chain processes are R4000-R4999.

There are a number of jobs distributing data from Dynamics 365 to the store databases (RSSU) and the offline databases. Here are the jobs and the recurrence I suggest:

Those are my tips for today. If you have read all the way to the end, I’m VERY impressed; let me know in the comments.

Failed ERP implementation will change partners to become trusted advisors.

A Norwegian customer won a compensation case against an ERP implementation partner after the customer terminated the parties’ agreement on the supply of a new ERP. The compensation was assessed by the Norwegian district court at 288 MNOK (36.7 MUSD); originally the contract was worth 120 MNOK. You can read the complete story here: http://www.selmer.no/en/nyhet/felleskjopet-agri-wins-district-court-case. The court decision is expected to be appealed.

Luckily this was NOT a Dynamics 365 implementation, and the customer is actually replacing the failed ERP system with Dynamics 365. The reason I wanted to write about this story is that it has implications for how much risk and responsibility an ERP implementation partner can take. A major part of ERP partners are smaller companies with fewer than 100 employees, which cannot take the risk of getting into such a situation. There are always problems and risks beyond what an ERP partner can control. Partners are not the developers of the standard software; they implement it and in some cases add extensions. Also, the cloud-based software runs on Azure, which is beyond the control of the partner.

How can this change partner behavior? Partners are moving towards becoming verticalized trusted advisors, but with limited responsibilities. We can give recommendations based on what we know about the software and how to use it efficiently, but the work is billed more on a T&M (time and material) basis. It will increasingly be the customers themselves who are responsible for the implementation and timetables.

Some customers will not accept this change, but others do. There are currently resource constraints in the Dynamics 365 partner channel, and we partners avoid customers that take a back-seat approach to their implementation projects. The sales focus will shift towards customers that take more of the responsibility themselves, and that understand how to take a more dynamic and agile approach. A 400-page requirement document is not a good start for an ERP project, as the digitalization possibilities are accelerating. We also see that customers don’t run a 2-year ERP implementation project before going live. They run a 90-day project to go live with only part of their requirements. The project then takes on other areas, and they extend their use of Dynamics 365.

At the end, I include some trusted advisor recommendations that I think can inspire anyone that is about to start a project.

D365FO – Speed up Retail RSSU download performance

If you don’t know what RSSU is, I suggest reading this, but in short the RSSU is about having a database locally in your store that MPOS or CPOS can connect to. It is typically used if you have an unreliable or slow internet connection.

One of the things you can evaluate is implementing Azure ExpressRoute; Microsoft has released a whitepaper on this for Dynamics 365. It can really speed up connectivity performance.

Another thing I find annoying is that the local RSSU only picks up the distribution files every 15 minutes. The cloud channel database itself is really fast, but when sending new products or prices to the RSSU it can take up to 15 minutes before the data is available in the MPOS. Waiting 15 minutes when testing is really annoying.

In the Microsoft documentation we are instructed to use the Data Sync interval to speed up the synchronization. But somehow it does not work.

But there is a way around this. On the local RSSU there is a configuration file where you can modify how often the RSSU should request new data to be downloaded.

Then change the following two lines:
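The screenshot with the exact lines is not reproduced here, but the idea is to shorten the polling interval in the Async Client configuration file. The key names below are purely illustrative; check the actual configuration file on your RSSU box for the real names and units:

```xml
<!-- Hypothetical example only: the real key names/units depend on the AsyncClient version -->
<appSettings>
  <!-- how often the Async Client asks headquarters for new download packages -->
  <add key="DownloadIntervalInSeconds" value="60" />
  <!-- how often it uploads transactions back to headquarters -->
  <add key="UploadIntervalInSeconds" value="60" />
</appSettings>
```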

Then just restart the Async Client service and reset IIS on the RSSU box. The distribution of data to the RSSU then really speeds up.

But what is the recommended setting from Microsoft?

The recommendation is to have the RSSU request packages at an interval that is a proper fraction of the interval at which packages are generated. If you are sending new products every 10 minutes, use a 5-minute download interval; if every 5 minutes, use a 2-minute interval. The higher the frequency, the more often the RSSU will request data, which some consider a waste of bandwidth.
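That rule of thumb is simple arithmetic; a tiny sketch (my own helper, not anything in the product):

```python
def download_interval(generation_minutes: int) -> int:
    """Poll at roughly half the rate at which packages are generated."""
    return max(1, generation_minutes // 2)
```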

Good luck in your retail implementation

D365FOE-Moving to a new tenant

Companies change, merge, sell, and purchase each other, and we encounter situations where it is a requirement to move to a new/other Azure AD tenant.

But… that’s not a small thing. We asked through a Microsoft support ticket how to do this, hoping it was a small formality and that Microsoft had some magic trick for it. They don’t. But I can explain the process we are following to achieve this.

  1. Create Azure subscription on new tenant.
  2. Buy the required licenses in a new CSP subscription for the D365FO DEV/TEST/PROD instances.
  3. Add admin user on new tenant to the new LCS.
  4. Setup new azure connector in existing LCS project with the new subscription.
  5. Deploy new DEV/TEST/PROD environments for the new connector in the new tenant
  6. Setup new VSTS in the new tenant.
  7. Copy all checked-in code from old to new VSTS.
  8. Import all checked-in code from new VSTS to new DEV environment.
  9. Compile and install the code packages into the new stage environment.
  10. Request DB copy from “old” PROD to the “old” stage environment.
  11. Export an Azure bacpac from the “old” stage environment.
  12. Import the Azure bacpac into the “new” Dev environment.
  13. Run AdminUserProvisioning tool with admin user from new tenant to swap tenant.
  14. Repopulate email settings, users and other settings lost by the copy.
  15. Check, Check, Check….Fix, Fix, Fix.
  16. Request DSE to copy new stage to new PROD (only possible once).
  17. Check, Check, Check….Fix, Fix, Fix.
  18. Suspend/end the “old” CSP subscription.

In the process you will lose all documents that are attached to records or stored in the old environment. There are also some other expected issues.

Do expect to spend some time on such a process. And it’s a good idea to perform the DB copy twice (the first time just for validation and test). Microsoft is looking into how to improve this process, but this is how we are performing it.

If anyone in the community has better ideas, feel free to share them.

BIG credits to my colleague HAKAM.

Great stuff on the D365 roadmap

What we currently see is that more and more power user functionality is introduced step by step to make Dynamics 365 ready for the next natural technological step: becoming a true SaaS solution built on Azure Service Fabric. Check out this video from Microsoft for what I hope is the future architecture direction for Dynamics 365. But before we get there, there has to be a natural transition of making Dynamics 365 more configurable and less dependent on creating your own customizations and extensions.

Now and then I try to keep an eye on the D365 roadmap for signs of this transition, and today I found these nice features that I think will be highly valuable. I have copied the descriptions from the roadmap; the release dates are not clear, but I look forward to presenting these great enhancements to my customers.

1. Power users can add custom fields to forms without developer customization

Many application customizations involve adding one or more fields to existing tables and including them in application forms. Most of your customizations may be comprised of adding fields.

Customizations are expensive because they require developer intervention for development, test, and code life cycle management. Customizations also need to be managed and migrated from one environment to another.

We are making it easier to add custom fields to forms in Dynamics 365 for Finance and Operations, Enterprise edition. No longer will developer customization be needed. Instead, a power user will be able to add a custom field to a table and then place that field on the form using personalization. An IT administrator will then be able to share the personalization with others in your organization.

2. Product lifecycle state

The product lifecycle state will be introduced for released products and product variants. You can define any number of product lifecycle states by assigning a state name and description. You can select one lifecycle state as the default state for new released products. Released product variants inherit the product lifecycle state from their released product masters. When changing the lifecycle state on a released product master, you can choose to update all existing variants that have the same original state.

To control and understand the situation of a specific product or product variant in its lifecycle, it is a best practice in Product lifecycle management solutions (PLM) to associate a lifecycle state with a variable state model to products. This capability will be added to the released product model. The main purpose of this extension is to provide a scalable solution that can exclude obsolete products and product variants, including configurations, from master planning and BOM-level calculation.

Impact on master planning – The product lifecycle state has only one control flag: Is active for planning. By default, this is set to Yes for all product lifecycle states. When the field is set to No, the associated released products or product variants are:

  • Excluded from Master planning
  • Excluded from BOM level calculation

For performance reasons, it is highly recommended to associate all obsolete released products or product variants with a product lifecycle state that is deactivated for master planning, especially when you work with non-reusable product configuration variants.
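The planning control described above boils down to a simple filter. Here is a conceptual sketch of that rule; the type and field names are my own, not the actual X++ implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LifecycleState:
    name: str
    is_active_for_planning: bool = True  # per the roadmap text, defaults to Yes

@dataclass
class ReleasedProduct:
    item_id: str
    lifecycle_state: Optional[LifecycleState]  # blank after migration

def include_in_master_planning(product: ReleasedProduct) -> bool:
    # Only an explicitly deactivated lifecycle state excludes a product;
    # products with no state (e.g. after migration) are still planned.
    state = product.lifecycle_state
    return state is None or state.is_active_for_planning
```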

Find obsolete released products and products variants – You can run an analysis to find and update obsolete released products or product variants.

If you run the analysis in a simulation mode, the released products and product variants that are identified as obsolete will be displayed on a specific page for you to view. The analysis searches for transactions and specific master data to find the released products or product variants that have no demand within a specific period. New released products that are created within the specific period can be excluded from the analysis.

When the analysis simulation returns the expected result, you can run the analysis by assigning a new product lifecycle state to all the products that are identified as obsolete.

Default value during migration, import, and export

When migrating from previous releases, the lifecycle state for all released products and product variants will be blank.

When importing released products through a data entity, the default lifecycle state will be applied.

When importing released product variants through a data entity, the product lifecycle state of the released product master will be applied.

Note, the ability to set individual product lifecycle states using the data entities for released products or product variants is not supported.

3. Users can pin PowerApps to forms and share with peers to augment functionality

Have you built a PowerApp that uses or shows data from Dynamics 365 for Finance and Operations, Enterprise edition? Or have you been using a PowerApp built by someone in your organization? Would you like to use PowerApps to build last-mile applications that augment the functionality of Finance and Operations?

Your users can build PowerApps without having to be expert developers to extend ERP functionality. PowerApps developed by yourself, your organization, or the broader ecosystem can now be used to augment ERP functionality by including them within the Finance and Operations client.

Your users will be able to pin PowerApps to pages in Finance and Operations. After they’ve been added, these changes can be shared with peers in your organization as personalizations.

 

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer

Dynamics 365 : Adding check-digits to number-sequences

In Dynamics 365 we use number sequences to automatically create identifiers like product numbers, customer numbers, etc. I’m a fan of having these numbers as “clean” as possible, and I always try to convince my customers to use pure numbers. Why? Take a look at the keyboard:

The num-pad is the fastest way of typing in data. I also see that users normally perform a lookup and see the description of what they are selecting anyway.

But let’s take a scenario: we will use a number sequence to create product numbers. We will then typically get product numbers like this:

Then I have often seen another problem arise: typing errors from the num-pad actually get a “hit”, because when using a number sequence we can almost always find a product that has the same number as the one the user wrongly typed.

If you try using your credit card online, you will see that the number is not accepted if any digit is wrong. The solution there is to build check digits into the number.

I created a very small extension to solve this in Dynamics 365, with just a few lines of code. In the following example, the “green” part is from the number sequence, and the “yellow” part is from the modulo-10 check digit calculation.

In this way a mistyped product number (or any other identifier) will be rejected instead of silently hitting another record; only a correctly typed number passes validation.

In the screen for number sequences I added an option to add the check digit to my generated numbers.

I wanted to share this with you, because it is so simple:

1. Create an extension of the table “NumberSequenceTable”. Then add a field based on the extended data type (YesNo), and name it “AddCheckDigit”.

2. Add this field to the “Setup” field group.

Then we have the parameter in place, and it is available on the number sequence as shown earlier.

3. Then create a new class and replace all code with the following :

Here I’m creating an extension to the NumberSeq class, with one method, num, that appends the modulo-10 digit to my number sequence.

In it I check that my new “AddCheckDigit” parameter is enabled, that the number sequence is not continuous or manual, and that the sequence allows changing to a higher number.
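The X++ class itself was shown as a screenshot, but the check-digit arithmetic is the standard modulo-10 (Luhn) algorithm, which can be sketched like this (Python used for illustration only; function names are my own):

```python
def mod10_check_digit(payload: str) -> str:
    """Compute the modulo-10 (Luhn) check digit for a numeric string."""
    total = 0
    # Walk right to left; double every second digit starting with the rightmost.
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9  # same as summing the two digits of d
        total += d
    return str((10 - total % 10) % 10)

def with_check_digit(number: str) -> str:
    """Append the check digit to a number produced by the sequence."""
    return number + mod10_check_digit(number)

def is_valid(number: str) -> bool:
    """Any single mistyped digit (and most transpositions) fails this check."""
    return mod10_check_digit(number[:-1]) == number[-1]
```

For example, `with_check_digit("7992739871")` yields `"79927398713"`, and changing any single digit of that result makes `is_valid` return False.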

That’s it

Now you can have check-digits on products, customers, vendors, sales orders, purchase orders etc.

PS! I have not tested this code 100%, but the community is full of brainpower that can hopefully share additional findings on bugs or flaws.

If you like this, vote for the idea at https://ideas.dynamics.com/ideas/dynamics-operations/ID0002954

Agile POD’s: Organize for efficiency

Have you ever seen the TV series “House of Lies”? It is a quite funny comedy series that focuses on the extrovert lifestyle of a management consulting team. First of all it is a comedy and not very realistic, but it manages to illustrate the most efficient organizational form for solving problems: the agile POD.

Agile pods are small custom agile teams, ranging from four to eight members, responsible for a single task, requirement, or part of the backlog. This organizational system is a step toward realizing the maximum potential of agile teams by involving members of different expertise and specialization, giving complete ownership and freedom, and expecting the best quality output. This blogpost is about how to organize a consulting business that is non-silo organized around an actual service product.

In many consulting companies today, we see increasingly alarming signs that prevent the full utilization of people and resources. Some of the signs are:

– Many non-direct-operative managers (if you have >5 levels from bottom to top, you have an issue)
– Too many internal meetings (Why Meetings Kill Productivity)
– Too much time spent generating budgets, forecasts and Excel spreadsheets (no actual customer value)
– Organization into silo-teams with similar expertise (functional, technical, support, etc.)
– New project teams for each project (spending 2 months getting to know your team members)
– Outdated internal systems and processes
– Mixed marketing messages and costly (pre-)sales and implementation processes
– Many partners are currently not ready for the Dynamics 365 cloud-based disruption (sticking to waterfall while agile accelerates)

Agile PODs are a different way of organizing a team for efficiency. What does an agile POD look like? In this example we have a small, permanent 5-person team. This team is specialized in running some tasks/phases in the initial Dynamics 365 implementation: the agile preparation phase.

In this example the POD owner is the Solution Architect. The roles in the POD can be described as:

The solution architect:

He runs the POD and has full responsibility for it. It is the POD owner that recruits the POD members. The solution architect is the “face” of the POD, organizes the work in the POD, and discusses the solutions with the key decision makers at the customer. Very often the solution architect has lots of experience. In agile terms this is also the SCRUM master, and a very operational role.

The Finance expert:

When implementing Dynamics 365, there is always a need to know how to connect the operational processes to accounting and reporting. This person is highly knowledgeable in financial management reporting, Power BI, and Excel. He also knows how to improve reporting from the financial perspective by defining financial dimensions and setting up Tax, Bank, Fixed assets, HR and Budgeting/Forecasting.

The Vertical Domain Expert:

How to implement best-of-breed processes is the vertical domain expert’s expertise. In the retail domain this means being an expert on master data, categorization, stores, POS, devices, etc.

The Technical Architect:

In a cloud-based system, there is a need to understand how environments are deployed and set up, and how to make it all ready for efficient application lifecycle management. The architect knows the ITIL framework. When a change is needed, the technical architect creates the necessary documentation/VSTS backlog items for developers to execute on.

The Junior consultant:

The junior consultant is here to learn, offload and support the team. As experience grows, the junior will eventually get more responsibility and hopefully someday move into other positions in the team.

Within the team we are looking for T-shaped people, who have breadth to their expertise as well as a few domains of deep expert knowledge. The gaming company Valve (which delivers the Steam gaming store) described what we are looking for with the following picture of the T-shaped model. Take a look at their employee handbook. The same concept and idea is relevant for Dynamics 365 consulting companies.

The agile PODs must therefore specialize their own services. Each POD team must build WBSs (work breakdown structures) that enable deliveries utilizing the entire POD in combination.

The idea is that a POD team is sent out into the field, delivers the predefined services, and returns safely afterwards. Then it is off to the next client to deliver the same service again. As you may understand, it is therefore important that the services delivered are predefined. In this concept there is not one team that delivers a complete implementation; in larger implementations a sequence of agile PODs covers the implementation.

This way of organizing is not new. The concept has been applied for decades by contractors and construction companies. A house is not built by a single team; it is built by a sequence of specialized teams. A POD team has responsibility for a limited set of tasks that need to be performed in a predefined sequence.

By organizing operational skills into PODs executed in a sequence, we now have a balanced unit. One pain I often see in Dynamics 365 consulting companies is that bottlenecks arise around a few selected roles, typically the solution architects. This imbalance results in high utilization of these roles while other roles have low utilization, because work is not correctly distributed. We also see consultants being placed into project teams because they have free time, not because they have the right knowledge. This increases costs and reduces satisfaction for customers. Ultimately it also reduces profitability for the implementation partner.

Agile PODs do not solve everything, but they make the core operational services lean and efficient. Any consulting company still needs sales, project management and customer support as separate functions.

As seen in the figure above, each vertical focus area will have a management function that focuses on building agile PODs. The idea is not to hire single consultants but to create new PODs. The POD itself must define the services it can deliver. The role of vertical department management is therefore to focus on recruiting new PODs. As Valve explains it, hiring becomes the most important thing in the universe.

A model for money and revenue must also be established. All departments must be self-financing and balanced according to how the revenue stream is defined. One element that is common in the consulting business is bonuses. I personally don’t like the idea of bonuses, but I see that it is very difficult without them (a necessary evil). Below is an example of how different departments can be rewarded.

Marketing and sales: The concept of cloud-based systems is that the customer doesn’t need to purchase all the software upfront. They rent the software in the cloud and only pay a monthly fee. The marketing and sales divisions must therefore be financed by the monthly license revenue, and the bonus would be accumulating. The purpose is to make sure new customers are onboarded and that existing customers are happy with the services. For a new seller in this way of organizing, there will not be much bonus at the start, with few customers onboarded. But as more customers come on board, the bonus accumulates, and after 2-3 years there will be a decent bonus and a decent basis for investing more in marketing.

Project and management consulting: As described earlier, these are the only more “permanent” roles that exist in the project. They will ask agile PODs to come in and solve specific tasks. Their services are based on T&M (time and material), and their bonus is based on the revenue (not margin) of the project.

The agile PODs: These services are charged in a combination of T&M and predefined product services. The predefined product services are the key here: create WBS structures where the price and delivery are clearly defined. The bonus here is a team bonus. Internally in the team it is distributed according to a key, but the POD team can also choose to use the bonus for other purposes, like training or conferences. Remember that an agile POD is a self-contained unit with costs, revenues and margins. If the POD is not profitable, it will be dissolved and the team detached or let go.

Platform services: This department makes sure all services/software around Dynamics 365 work as expected. This means making sure the Azure tenants are set up correctly, that Office is working, and that services like CDS (Common Data Services) and PowerApps are set up as expected. All their services should be predefined product services, and the bonus should be based on margin. Why? Because we want to become better and better at delivering these predefined services. The faster they are delivered, the more margin is generated. This is a win-win for both the customer and the consulting company.

Customer support/after-sales: Customer support and after-sales are all about delivering excellent customer service after the project has gone live. Revenue should be based on support agreements and add-ons. The bonus for the department is based on accumulated revenue, because these should be recurring services that the customer pays for each month. If the customer is happy with the services provided, they will continue to use them. The alternative for the customer is Microsoft Premier Support, which can be quite costly and is not that relevant in most cases.

At the end of this blog post I would like to visualize how we envision the agile PODs: trained on our services, and delivering excellent customer service on time and on budget.

[GIF: giphy-downsized2]

And if we don’t, then this is the consequence:

[GIF: formula-1-fire-gif-1632727]

Additional details on Agile POD’s can be found here:

https://www.globant.com/build/agile-pods

https://www.agileconnection.com/article/using-agile-pods-realize-potential-your-team

Video : https://www.youtube.com/watch?v=IwJKRaocdxI

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

Dynamics 365 Pre go-live checklist

I asked Microsoft if I could share their pre-go-live checklist that is used in the Fast-Track program. And they said yes!

So here is a copy for you, with what customers must be prepared to answer before Microsoft deploys the production environment.

Pre Go-live Health Check list:

  1. Solution acceptance by users: UAT
    1. Is UAT completed successfully? How many users participated in UAT?
    2. Did UAT test cases cover entire scope of requirements planned for go-live?
    3. How many bugs/issues from UAT are still open?
    4. Any of the open bugs/issues a showstopper for go-live?
    5. Was UAT done using migrated data?
  2. Business signoff:
    1. Business has signed off after UAT that the solution meets business needs?
    2. Solution adheres to any company/industry specific compliance (where necessary)
    3. Training is complete
    4. All features going live are documented, approved and signed off by customer
  3. Performance:
    1. How was the performance in UAT? Is it acceptable for go-live?
    2. If Performance testing was done, then are there any open actions from it?
  4. User & Security setup:
    1. How many security roles are being used? Are all security roles set up and tested?
    2. Users that will need access at go-live have been setup with correct security role?
  5. Data Migration:
    1. Data migration status – Masters & Open Transactions/Balances
    2. Business has identified owners for data validation?
    3. Review cut-over plan: Business & Partner teams are comfortable with the plan?
    4. Does the data migration performance fit within the cut-over window?
  6. Configuration Management:
    1. Are the configurations updated in Golden Configuration environment based on changes in UAT?
    2. Data stewards/owners identified and process in place for post go-live changes in Master/Configuration data?
    3. All Legal Entities configured for Go-Live?
    4. Are configurations documented?
  7. Integrations:
    1. Review list of integrations and readiness plan for each
    2. Latency requirements and performance criteria are met
    3. Integration support is in place with named contacts/owners
  8. Code Management
    1. Production fixes/maintenance process defined?
    2. Code promotion (between environments) process is in place, documented and the entire team knows and understands the process
    3. Code promotion schedule for production is in place?
    4. Emergency process for code promotion to production is defined?
  9. Monitoring and Microsoft Support
    1. LCS diagnostics setup and knowledge transfer to customer
    2. Issue resolution and Escalation process defined – LCS support is verified?

A Practical Guide for Dynamics 365 Iterative Implementation

With the introduction of Dynamics 365 and cloud-enabled tools like Office and VSTS (Visual Studio Team Services), we have accelerators towards iterative ways of performing an implementation.

Digitalization also enables the move from a document-and-template approach to a committed, task-driven implementation with sprint-based sub-deliveries where all parties are involved. This increases visibility, removes lead times and results in faster deliveries. Adopting digitalization and going iterative in a project is not only about using new tools and processes like VSTS, but also about the practices, principles, values and mindsets of the project participants.

The iterative preparation

As described in earlier blog posts, it is vital to have a clear concept of process modeling where processes are broken down into sub-processes and requirements. A WBS (work breakdown structure) is the tool to plan and execute on deliverables. The traditional solution analysis is transforming into an iterative preparation phase that lets us define clear work packages that can be solved in sprint executions.

The iterative preparation should have a formalized set of workshops, and the main purpose is to generate an approved solution backlog. It is normally recommended to complete the preparation phase before going into the execution phase. But in larger projects the preparation phase can run in parallel with the execution phase, where customer-approved solution backlogs can be planned into sprints and started before the preparation phase has ended.

Please remember that iterative implementation models do not give a detailed picture of scope or costs! The actual deliveries are defined by the customer approved solution backlog.

The following flow chart shows the main activities in the preparation phase.

The granularity and level of detail needed in the deliverable documents is agreed on in the project. A practical middle way is to create the deliverable documents with a minimum set of information and a final conclusion, and then link the content in the documents to a VSTS site for further information and processing.

The preparation phase is highly customer-intensive and requires a detailed plan, invitations, workshops and time to document the findings. Before participating in preparation workshops, it is recommended that the participants have completed a “Learn, Try, Buy” phase. An example project plan for the preparation phase can look like this for a retail customer.

As seen in the example plan, the preparation can have dedicated tracks for the functional areas, and these will vary based on the vertical models that are being used. The level of granularity of the sub-topics is recommended to follow the first and second levels of the process models.

Use process models to define scope and topics.

The contents of the preparation workshops should be organized based on the process models. This makes sure that best practices are discussed and taken into account for the execution phase. The value chain shown here is divided into 3 main epic tracks: management processes, operational processes and support processes. There are different models for each vertical. The following figure is one I typically use to illustrate the EG retail value chain model.

[Figure: ProcessModels – the EG retail value chain model]

Each of the “boxes” in the model represents a topic where business processes are discussed and defined. The model will provide:

  • Workshop agenda templates
  • UAT test script templates and recommended process recordings
  • Stack/technology recommendations
  • Process flows (Visio or BPM in LCS)
  • Solution backlog templates
  • KPI assessment recommendations (APQC)

From Model to solution backlog

Based on the findings from the preparation phase, a solution backlog is created. The most efficient tool for this is VSTS (Visual Studio Team Services), set up using the CMMI definitions. Here all backlogs are organized in a hierarchy of epics, features, backlog items, tasks and impediments.

The general consensus on these definitions is:

Epic: Something that transcends projects/releases/versions.
Feature: Something that cannot be delivered in a single sprint, but can be delivered in a single release.
Requirement (CMMI) / Product Backlog item (SCRUM): Something that can be delivered in a sprint and has an estimate.
Bug: Something that is not working, can be solved in a sprint, and has an estimate.
Task: An assigned work element with remaining effort.
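A hierarchy like the one above can be illustrated with a small effort roll-up. This is a minimal in-memory sketch, not the VSTS API; the class and field names are my own:

```python
# Minimal sketch of a CMMI-style work item hierarchy with effort roll-up.
# Illustrative only -- this does not use the real VSTS object model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkItem:
    title: str
    kind: str                       # "Epic", "Feature", "Requirement", "Task", "Bug"
    effort: float = 0.0             # estimate on leaf items
    children: List["WorkItem"] = field(default_factory=list)

    def total_effort(self) -> float:
        """Roll up effort from the leaves to the top of the hierarchy."""
        if not self.children:
            return self.effort
        return sum(child.total_effort() for child in self.children)

epic = WorkItem("Retail value chain", "Epic", children=[
    WorkItem("Category management", "Feature", children=[
        WorkItem("Define categories", "Requirement", effort=8),
        WorkItem("Assign category roles", "Requirement", effort=5),
    ]),
])
print(epic.total_effort())  # 13
```

The same roll-up is what VSTS shows when summing remaining work over a backlog tree.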

To relate the structures to CMMI, the following guideline can also be followed.

More details on how to create a backlog in VSTS can be found here. Best practice is that the VSTS site is located on the customer's tenant, and that external project participants are invited. The VSTS backlog can also be regarded as a WBS (Work Breakdown Structure). In the following example you can see how the backlog is structured according to a business process model.

VSTS also provides dashboards where a complete status can be seen and monitored. These dashboards are based on defined queries against tasks and backlogs, and they are easy to tailor to your needs.
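Conceptually, such a dashboard tile is just a count over a query result. A hedged sketch, working on a hypothetical flat extract of work items rather than the live VSTS query API:

```python
from collections import Counter

# Hypothetical flat extract of backlog items, e.g. exported from a VSTS query.
work_items = [
    {"id": 1, "state": "Active",   "type": "Requirement"},
    {"id": 2, "state": "Resolved", "type": "Requirement"},
    {"id": 3, "state": "Active",   "type": "Bug"},
    {"id": 4, "state": "Closed",   "type": "Requirement"},
]

def state_counts(items, item_type=None):
    """Count items per state, optionally filtered by work item type."""
    filtered = [i for i in items if item_type is None or i["type"] == item_type]
    return Counter(i["state"] for i in filtered)

print(state_counts(work_items))                   # Counter({'Active': 2, ...})
print(state_counts(work_items, item_type="Bug"))  # Counter({'Active': 1})
```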

How to fill in a backlog item

The product backlog (and other elements) contains a small set of fields needed.

What should be regarded as a minimum set of information defined on a backlog is:

  • Name
  • Description
  • Acceptance Criteria
  • Estimated effort.

If additional fields are needed, like an APQC ID, planning dates or additional names, they can easily be added to the form, and it is just as easy to extend the form with new statuses. See https://www.visualstudio.com/en-us/docs/work/customize/customize-work for more information.

In the preparation phase perform these activities:

  • Right-size backlog items by splitting larger items into smaller items. No backlog item should be larger than what can be completed in a single sprint.
  • Identify and fill in gaps in the product backlog. Capture new ideas and stories, architecture and design requirements, and other spikes.
  • Reorder the backlog to represent today’s priorities and business value focus.
  • Ensure well-defined acceptance criteria have been added to each item.
  • Revisit estimates made to backlog items and adjust upwards or downwards based on recent understanding about scope and acceptance criteria.
  • Review all potential backlog items to consider for the upcoming sprint to make sure they are well understood and that any additional work required to support their development is well understood by both product owner and the team.

Mapping a VSTS product backlog to the functional requirement documentation

Most often it is mandatory to also deliver a Functional Requirement Document. This document is delivered at the end of the preparation phase. The reason this document is important is that it explicitly defines all requirements, and it is a commercial document. But instead of writing a document of hundreds of pages, try to link the requirements into the document using URL links. Then the FRD contains only the vital information that regulates responsibilities and commercial conditions.

The preparation phase ends when the deliverables from the phase are approved and signed by the customer. After the phase is approved, all information on the VSTS site can be copied into Excel backup sheets that represent a snapshot of the status at the end of the prep phase.

Roles involved in an iterative preparation phase

The roles running an iterative preparation phase depends on project size and complexity. As a minimum, it is recommended that the following defined roles are present in this phase:

  • Project manager (Planning and facilitating)
  • Solution architect (Overall approval of solution)
  • Technical lead (Integrations and migration)
  • Functional consultants (Covering training and functional areas)
  • Junior Business consultants (Assisting writing and maintaining the solution backlog)

Customer project participants need to match these roles.

Iterative Execution phase

As the solution backlog is filled, sprints may be filled with approved backlog items. The overall purpose of the sprint is to deliver a set of backlog items that have been broken down into specific tasks. The duration of a sprint is determined by the scrum master, the team’s facilitator. Once the team reaches a consensus on how many days a sprint should last, all future sprints should be the same. Normally, a sprint lasts between 2 and 4 weeks. During the sprint, the team holds daily stand-up meetings to discuss progress and brainstorm solutions to challenges. The customer may not make requests for changes during a sprint, and only the scrum master or project manager has the power to interrupt or stop the sprint. At the end of the sprint, the team presents its completed work to the customer, and the customer uses the criteria established at the sprint planning meeting to either accept or reject the work.
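Sprint progress is typically tracked against an ideal burndown line. A minimal sketch under my own assumptions (a fixed-length sprint with a linear ideal line; none of this comes from VSTS itself):

```python
def ideal_burndown(total_work: float, sprint_days: int):
    """Ideal remaining-work line: linear from total_work down to zero."""
    return [total_work * (1 - day / sprint_days) for day in range(sprint_days + 1)]

# Example: a 10-day sprint with 80 hours of committed task work.
ideal = ideal_burndown(80, 10)
print(ideal[0], ideal[5], ideal[10])  # 80.0 40.0 0.0
```

Comparing the team's actual remaining work against this line each day shows whether the sprint is on track.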

The following diagram shows the activities involved, and the expected deliverables from a sprint.

Define the Sprint log

Solving a backlog item may require several resources to be involved. When defining the sprint log, each backlog item is split into tasks that define the sequence, the remaining work and the assignee. This means having tasks for analysis and design of the backlog item, creating scripts for testing, tasks for developing, and tasks for performing the test of the backlog item. As seen in the following figure, a backlog item is divided into tasks, and each task must have a clear description and a “remaining work” estimate. If essential resources are needed to solve the task, the task should also be assigned to that person.

When a task has been assigned to a person, that person commits to the task and agrees to deliver it within the defined sprint.

Conducting a sprint planning meeting

The customer, project manager and the Scrum Master start a sprint by selecting the backlog items that should be solved in the current sprint. This is done in VSTS, where the backlog items are dragged and dropped to the selected sprint, or marked for a specific iteration.

When planning a sprint, also identify what resources are needed in the sprint. In the sprint overview, define the capacity and the resources required in the sprint. This makes planning easier, and resource/capacity constraints can be identified by the project manager/scrum master.

The daily sprint meeting

This meeting is the most important meeting each day. It should only last 15 minutes, start at the same time every day, and be held in the same place every day. The SCRUM master is responsible for making the meeting as efficient as possible. It is a team meeting where each team member explains what they are working on and whether there are any issues. Do NOT use the sprint meeting to try to solve issues. Just identify and share. Use other meetings to solve issues and go deeper into each topic. Any important notes that are identified can be captured in the discussion field on the task/backlog item.

Also use the “Add Tag” to mark tasks and backlogs that need special attention and follow-up.

Reporting status and completion

All backlog items have a state. The meaning of these states can be seen in the following flow charts:

Teams can use the Kanban board to update the status of backlogs, and the sprint task board to update the status of tasks. Dragging items to a new state column updates both the State and Reason fields. If additional intermediate steps and stages are needed, this can be customized in the settings of the VSTS.
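Dragging an item between columns is effectively a state transition. As a sketch of a simplified state flow (the real states and transitions are configurable in VSTS; this map is illustrative only):

```python
# Hypothetical simplified state flow for a backlog item.
# The actual flow is configurable in the VSTS settings.
ALLOWED = {
    "New":      {"Active", "Removed"},
    "Active":   {"Resolved", "New"},
    "Resolved": {"Closed", "Active"},
    "Closed":   {"Active"},
}

def can_move(from_state: str, to_state: str) -> bool:
    """Check whether dragging an item between Kanban columns is a valid move."""
    return to_state in ALLOWED.get(from_state, set())

print(can_move("Active", "Resolved"))  # True
print(can_move("New", "Closed"))       # False
```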

Documentation

One of the disadvantages of an iterative implementation is that there are no clear and sharp end-phases, and this often does not fit well with commercial contracts. It is therefore important to make sure the deliverable documents are created/updated according to the progress. But remember to be active in documenting as much as possible in VSTS, and define the creation of deliverable documents as backlog items in the sprint. Expect to use at least 10% of your time in VSTS to give visibility to others.

Conduct Solution testing

Quality is a vital aspect, and everybody in the team owns quality, including developers, managers, product owners, user experience advocates, and customer project members. It is vital that solution testing is a customer responsibility and that the testing is structured and planned accordingly.

VSTS provide rich and powerful tools everyone in the team can use to drive quality and collaboration throughout the implementation process. The easy-to-use, browser-based test management solution provides all the capabilities required for planned manual testing, user acceptance testing, exploratory testing, and gathering feedback from stakeholders.

Creating test plans and performing tests is one of the most vital elements in the iterative implementation. Please read the following URL for UAT testing: https://www.visualstudio.com/en-us/docs/test/manual-exploratory-testing/getting-started/user-acceptance-testing . Building a test plan is therefore a mandatory step, and this ensures that the defined acceptance criteria have been met.

The following documents are the input to the solution testing:

Flow Test Script
UAT Test Script by Function
UAT Test Script by Role
UAT Test Script Details

Test & Feedback

Visual Studio Team Services Marketplace contains a ton of jewels, and one add-in that can accelerate testing and feedback is the Test & Feedback extension to VSTS.

When installing it, you get a small icon in Chrome, where test and feedback can be given.

When setting it up, you just point it to the VSTS site. Then you are ready to start giving feedback, and to collect backlog items, bugs or tests, just click on the play button.

While navigating and taking screenshots, notes and video, it all gets recorded, with URL, time etc.

When done with the recording then create a bug, test or create a test case:

After saving the bug, I can see that it has been created in VSTS:

I now have a complete bug report in VSTS that the consultants can start to process, to identify whether this is a bug or an “as designed” feature.

Microsoft Tools available for a Dynamics 365 project.

When working in a project, it is important to know that Microsoft tools and services are tightly connected and that each tool can simplify and enrich the user experience and efficiency. The most common tools can be seen in the following figure. Note also that there are powerful integrations between these tools, and this section will provide some small tips on how to make them work together.

Having a clear understanding of the tools available can speed up implementations, and also give better visibility to all stakeholders. In the following topics, some of these benefits are discussed.

Microsoft VSTS: Visual Studio Team Services

Create a VSTS site at http://VisualStudio.com. For internal projects, create it using your domain account. For customer projects, it is recommended to create the site on a customer-controlled domain and then add the domain users as guest users. Other elements in relation to VSTS have been covered earlier in this document.

Who uses it? All implementation project members, both from EG, the customer and 3rd-party vendors.
When to use it? Every day and in all SCRUM meetings.
Pricing 5 users free, stakeholders free. A paid user is $6/month.
Members with Visual Studio subscriptions don’t need licenses. https://www.visualstudio.com/team-services/pricing/

Microsoft Excel: Upload and Maintain

Microsoft Excel can be used to import and publish the structure into VSTS when the Visual Studio Community edition is installed locally on your PC. This makes it possible to extract all fields and values by using a VSTS-defined query.

A process model may then be imported, and a best-practice product backlog is ready to be processed. For step-by-step instructions on how to use Excel with VSTS, take a look at https://www.visualstudio.com/en-us/docs/work/office/bulk-add-modify-work-items-excel

Who uses it? Solution Architects and vertical responsible.
When to use it? In the start, when uploading process models and WBS’s as a start.
When mass updating backlog items and tasks.
Pricing Office 365 prices. https://products.office.com/en-us/compare-all-microsoft-office-products
Visual Studio Community edition is free.

Microsoft Project: Plan and Breakdown

Microsoft Project is a vital tool for streamlining quotations, WBS and resource planning. Built-in vertical templates, familiar scheduling tools, and access across devices help project managers and teams stay productive and on target. Microsoft Project is also directly integrated with VSTS, and created backlogs and tasks/activities can be exported to Dynamics 365 for Operations, creating a complete end-to-end process covering “from quote to cash”.

“Plan-the-work” and “Work-the-plan” are essential activities where all stakeholders can participate and cooperate, ensuring that we deliver what is planned and that the invoice the customer receives corresponds to the agreement and contract. Having predefined WBS structures in Microsoft Project simplifies project planning, and VSTS is automatically updated according to how the planning is performed.

 

Who uses it? Presales, Sales and Project management.
When to use it? Microsoft Project is excellent for handling WBS structures when planning and quoting a project. Microsoft Project is also used for planning resources and for reaching project deadlines. For more information on how to connect VSTS and Microsoft Project, take a look at https://www.youtube.com/watch?v=GjYu5WmcQXo
Pricing 30$/user/month for Project Online Professional
https://products.office.com/en-us/project/compare-microsoft-project-management-software?tab=tabs-1

Microsoft Outlook: Inform and Alert

Some stakeholders do not want to go deep into VSTS, or to extract information from Excel/Project. When tasks are assigned they want to be informed, and when issues are resolved, they want to be notified. Setting up notifications in VSTS solves this requirement and will keep project participants informed of any changes. The email also contains a URL directly to the task/backlog item.

Setting up notifications is done in VSTS, and individual filtering can be defined.

Who uses it? All project participants receive assigned notifications. Project managers and solution architect receive all notifications.
When to use it? When Outlook is used to keep participants informed.
Pricing Outlook included with Office 365 prices. No additional costs.

Microsoft Teams: Discuss and Involve

Informal communication is vital for any project. Tools like Skype for Business will take care of meetings and screen sharing, but Microsoft Teams gives flexible communication on all platforms and keeps everyone in the loop. Users can see content and chat history anytime, including team chats with Skype that are visible to the whole team. Private group chats are available for smaller group conversations. Microsoft Teams can also function as the center point, with direct tab pages towards VSTS, Dynamics 365 Home, LCS, SharePoint etc. Since this September, Microsoft Teams supports guest users, and since these sites normally are on the customers' tenants, we consultants log in with our company email addresses.

The VSTS Kanban board is easily accessible from Microsoft Teams.

Who uses it? Project participants who need informal communication and the ability to work asynchronously with a discussion history.
When to use it? When more direct communication is needed, and especially for developers.
Pricing Teams normally included with Office 365 prices. No additional costs.

Microsoft SharePoint online: Documents and Archive

Even in a highly interactive and iterative environment, there is a need for documents, especially for deliverable documents. For this, SharePoint Online is used to store, track and develop the documentation. The internal folder structure is optimized for the sales process and contains commercially binding documents. The SharePoint Online site mentioned here is the one that is the customer's property. The following document structure can be recommended.

After the project is delivered, the SharePoint site will remain as the documentation together with the VSTS site.

Who uses it? Project participants involved in a project, that needs to create or use formal documentation and deliverable.
When to use it? When working with specific deliverables.
Pricing SharePoint is included with recommended Office 365 E3 prices.

Microsoft Flow and PowerApps: Workflow and Apps

Microsoft Flow and PowerApps are quite new technologies in the Microsoft office family. The idea of bringing these tools into the scope, is to be able to have process and workflow automation in the implementations. PowerApps is also a great tool for data collection in testing and for getting feedback.

Some examples of Microsoft Flow:

  • Streamline approvals by sending files with approval requests
  • An “I’m sick” button that informs colleagues and blocks the calendar

Some examples of PowerApps:

Who uses it? Superusers and Architects
When to use it? Used for automating tasks and to create fast simple prototype apps that can assist in the implementation
Pricing Flow and PowerApps are included in a Dynamics 365 Plan 2 license.

I hope this blog post gives some insight into the digitalization process that partners are now using in Dynamics 365 implementations. The Microsoft sites contain much more information, and I recommend exploring more of the Microsoft technology stack that is available for Dynamics implementations.

Disclaimer: The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of Microsoft, my employer EG or other parties.

 

Common Data Service(CDS); Talk the talk, but not walk the walk yet

With every new release and platform update we see a clear Microsoft commitment to support deeper integration across the D365 portfolio. It’s still in the early stages, but we see the direction. New developments, such as parts of Dynamics 365 for Talent, are using the CDS. I think that in the future we will see more business apps utilizing the CDS as the data storage. The benefit of the common data model is that applications can work against data without needing to explicitly know where that data is coming from. To see what the Microsoft business platform is, take a look at https://businessplatform.microsoft.com

The CDS also has an important role in both process and data integration between the Sales (CRM) and Operations (ERP) apps, and the current status after the July release and Update 9 is that we have 6 templates that we can use to test some scenarios with D365. The Business platform admin center is where the CDS data integrations are set up. You can reach it from https://admin.businessplatform.microsoft.com or from https://admin.powerapps.com

The first thing that needs to be set up is connection sets. A connection set is a collection of two or more connections, organization mapping information, and integration keys that can be reused across projects.

In this case I have set up an integration where data flows from D365 for Finance and Operations → CDS → D365 for Sales:

Then the organization IDs are mapped across the three services.

And finally the integration keys.

After the connection sets have been set up, the CDS knows how to connect to the different systems. We can then create an integration project.

We can then select between the current 6 templates.

I have only been able to test Accounts and Products. The Sales Quotes template did not work for me (but it is also currently in preview).

After the integration project has been created, it is possible to add more integration tasks and make changes to the mapping. If there are issues with the mapping, or you need to map additional fields, this will show in the “Issues” column.

In the mapping there are transformation functions, like this one where the item type is changed when transferred from CDS to Dynamics 365 for Sales:

The integration can also be scheduled to run at specific times:

As some unofficial benchmarking: I managed to transfer 200 products to CRM in 40 seconds.
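For comparing such numbers between runs, it helps to normalize them to a throughput rate. A trivial sketch (the 200/40 figures are the informal observation above, nothing official):

```python
def throughput_per_minute(records: int, seconds: float) -> float:
    """Records synchronized per minute, from an observed batch run."""
    return records / seconds * 60

# Observed: 200 products transferred to CRM in 40 seconds.
print(throughput_per_minute(200, 40))  # 300.0
```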

Summary:

We are definitely going in the right direction, but we are not yet ready to “walk-the-walk”. Microsoft currently only supports synchronization in one direction, but bi-directional synchronization is on the roadmap. This is important to support “multi-master” scenarios where the same fields, like Customer Name, can be authored on either side. It seems Microsoft is focusing on providing business scenarios like “Prospect-to-Cash”, and what we mainly have now is a preview of how this will look in the future. The mapping from the templates is not complete and needs more work.

My final comment is that in the future we will have a very easy-to-use system for connecting the unified business operations apps. It was hoped that this business platform would be more ready with the July release, but it needs more releases to be useful in actual implementations. This feature needs to grow beyond the “Minimum-Viable-Product”. I hope and expect that we will see this fall into place through the autumn, and that the next release will have more mature integration templates. So far, good work Microsoft, but please hurry! To learn more, take a look at https://docs.microsoft.com/en-us/common-data-service/entity-reference/dynamics-365-integration.

Dynamics 365 CSP; What happens when a customer is not paying their monthly bill?

Disclaimer: In this blog post I would like to share my understanding on what is happening when customers no longer pay their bill for Dynamics 365. Please consult with your partner or Microsoft to get the actual and official interpretation.

First some definitions; Most mid-size customers will buy Dynamics 365 through a partner that is a CSP (Cloud Solution Provider). Larger corporations will have the opportunity to buy Dynamics 365 directly from Microsoft through a EA (Enterprise Agreement). The information here is related to the CSP way of purchasing licenses.

When buying Dynamics 365, most customers will receive a monthly bill from their CSP partner. But the great thing about the CSP model is that you may adjust the number of users for the next period. Dynamics 365 has a lower limit of 20 licenses, but above this the customer may make changes.

But keep in mind that even though you receive a bill for the upcoming month, there is still a commitment for the base subscription period. For Dynamics 365, the subscription period is normally 12 months. I think I finally understood why the name is Dynamics 365: the reason may be that you have to buy it for at least 365 days.

As stated earlier the customer normally receives a bill each month. But what happens when the customer stops paying the bills?

1. First, the customer is notified by their CSP that payments are missing, following the normal procedure.

2. The next step is that the CSP partner suspends the subscription. This is done by changing the status of the subscription to “Suspended”.

3. When a subscription status is changed to “Suspended”, this puts the subscription into a “data retention” mode. This means that end-users will not have access to any services, but administrators will still have access to the data associated with this subscription.

4. 60 days after a subscription is suspended, the subscription is moved to a “de-provisioned” state. At this point, all data is removed.
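The timeline in steps 3 and 4 is easy to compute. A small sketch, assuming the 60-day retention window described above (always confirm the current terms with your CSP or Microsoft):

```python
from datetime import date, timedelta

DATA_RETENTION_DAYS = 60  # suspended subscriptions keep data for 60 days

def deprovision_date(suspended_on: date) -> date:
    """Date when a suspended subscription is de-provisioned and data removed."""
    return suspended_on + timedelta(days=DATA_RETENTION_DAYS)

# Example: suspended on 1 October 2017.
print(deprovision_date(date(2017, 10, 1)))  # 2017-11-30
```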

The conclusion is therefore: pay your bill or lose your data.

When I think of it, it’s just like paying your electric bill: no pay, no power.

Dynamics 365; New VM type cut your Azure bill

When deploying your Azure-based VMs, the most common VM size is D13 with 8 cores and 56 GB RAM. This VM costs approx. 924 USD per month according to the pricing calculator.

Microsoft have made some new sizes available :

https://azure.microsoft.com/en-us/blog/price-reductions-on-l-series-and-announcing-next-generation-hyper-threaded-virtual-machines/

The new size is named D13 v2 Promo and costs 749 USD per month. If you have MSDN, the cost is further reduced to 429 USD/month.
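The saving is worth quantifying. A trivial sketch using the list prices quoted above (Azure prices change; check the pricing calculator for current numbers):

```python
def monthly_savings(old_price: float, new_price: float):
    """Absolute and relative monthly savings when switching VM size."""
    saved = old_price - new_price
    return saved, round(saved / old_price * 100, 1)

# D13 at ~924 USD/month vs D13 v2 Promo at ~749 USD/month.
print(monthly_savings(924, 749))  # (175, 18.9)
```

So the Promo size cuts roughly 19% off the monthly bill at these list prices.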

You cannot select this size in LCS, so you must log into your Azure portal after deploying and change the size there.

Nice

Dynamics 365 ideas

Microsoft have released a new site for posting and voting on ideas for Dynamics 365: https://ideas.dynamics.com

It is assumed that this replaces all other forums and sites, like Connect, Yammer etc., for suggesting new and exciting functionality. It covers the entire Dynamics 365 stack, and the concept is that each person can suggest and vote on up to 50 suggestions per forum. Microsoft have also added statuses on each suggestion, and additional comments can be added by registered participants.

A small suggestion to Microsoft on the site; Allow us to use our ADFS login, and not just our Live-ID login. (I guess I have to create a suggestion for this)

 

D365FO Channels, Insiders Preview and Update Policy

As announced with Update 4, Microsoft will release monthly updates so that new and existing environments can stay up-to-date with the latest innovations with the click of a button. Hopefully this makes it easier to stay on the newest platform. We are also assuming and hoping that this approach will in the future extend to cover business functionality (e.g. the Application Suite). A faster update cycle also means that more versions are currently in use at customers. As seen here, these are all the official releases that Microsoft have made available for Dynamics 365 for Operations. With a monthly update cycle, the list will grow quickly. Keeping track of versions does not give customers any actual value. But in a SaaS approach, creating faster and simplified updates of business functionality will require a better and more visible release policy, not one based on build numbers.

We need to make this upgrade and update experience easier to understand and easier to follow. The work that the Microsoft Office and Windows teams have done is a great example of something I think we should also have for Dynamics 365: the introduction of release channels.

Each update channel has a primary purpose and a cadence for feature, platform and security updates:

Preview Channel
Primary purpose: Provide users with the ability to evaluate the newest features of Dynamics 365 as soon as possible. The channel is only available through an insider program, and is not deployable into a production environment.
Feature updates: Weekly. Platform updates: Weekly. Security updates: Weekly.

First Channel
Primary purpose: Provide users with the newest features of Dynamics 365.
Feature updates: Monthly. Platform updates: Monthly. Security updates: Monthly.

Current Channel
Primary purpose: Provide users with the released and stable features of Dynamics 365.
Feature updates: Every 3 months. Platform updates: Monthly. Security updates: Monthly.

Deferred Channel
Primary purpose: Provide users with new features of Dynamics 365 only a few times a year.
Feature updates: Every 6 months. Platform updates: Monthly. Security updates: Monthly.
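The cadences in this proposal can be expressed as a simple scheduling rule. A hedged sketch of my own channel idea, not any actual Microsoft mechanism (it assumes the same day-of-month carries over):

```python
from datetime import date

# Feature-update cadence per channel, in months (sketch of the proposal above).
CADENCE_MONTHS = {"First": 1, "Current": 3, "Deferred": 6}

def next_feature_update(channel: str, last_update: date) -> date:
    """Next scheduled feature update for a channel; same day-of-month assumed."""
    months = CADENCE_MONTHS[channel]
    month0 = last_update.month - 1 + months
    return last_update.replace(year=last_update.year + month0 // 12,
                               month=month0 % 12 + 1)

print(next_feature_update("Current", date(2017, 10, 1)))  # 2018-01-01
print(next_feature_update("First", date(2017, 10, 1)))    # 2017-11-01
```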

 

Which channel you would like to be on should be a setting in LCS, and customers should be able to switch between the channels as wanted. Visually, the release and update schedule could then look something like this.

With the introduction of Microsoft AppSource, this would mean that ISVs could commit to a specific channel, and “front-runners” like myself would have early access to the newest and hottest features. Some customers are deadly serious about stability, and new features could potentially cripple business processes. In this way, customers can decide for themselves their own speed of adopting new functionality.

Dear Microsoft; Can we have something like this also for Dynamics 365?

D365FO – Test & Feedback

Visual Studio Team Services Marketplace contains a ton of jewels, and one add-in I like a lot is the Test & Feedback extension to VSTS:

When installing it, you get a small icon in Chrome, where test and feedback can be given.

When setting it up, you just point to the VSTS site :

And then you are ready to start giving feedback, and to collect backlog items, bugs or tests.

Let’s say I wanted the implementation team to know that some changes are necessary on the “All Customers” form. I then click on the feedback button, and click on “start recording”

While navigating and taking screenshots, notes and video, it all gets recorded, with URL, time etc.

When done with my recording I want to create a bug, test or create a test case:

In this case, I create a bug, and while I’m typing I can even see that there is one similar bug already reported.

After saving the bug, I can see that it has been created in VSTS:

I now have a complete bug report in VSTS that the consultants can start to process, to identify whether this is a bug or an “as designed” feature.

This means that in a Dynamics 365 project where VSTS is the central element, the feedback and testing just got simpler.

The Test & Feedback extension contains a lot more, but the rest you have to explore for yourself. Try out the video recording; that is really cool.


Warehouse Performance Power BI pack

Just a small reminder to my digital brain that Microsoft have released a Power BI pack aimed at the WMS industry. Here are some samples.

Inbound – Measure vendor delivery precision. Measure average put-away times for products and vendors, and measure how fast your workers are processing put-away work.

Outbound – Measure how many of the shipments are sent in full and on time. It provides the ability to measure early, late and on-time shipments in order to monitor outbound performance and ensure high customer service levels.


Inventory accuracy (the warehouse itself) – Every warehouse needs high inventory accuracy on locations in order to process shipments correctly. Measure inventory accuracy for locations and items based on inventory counting, with full visibility into discrepancies in quantity and percentage. It provides an easy way to monitor counting performance and inventory accuracy for items on locations.
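The location accuracy measure boils down to the share of counted locations where the counted quantity matched the expected quantity. A hedged sketch with made-up counting data (the Power BI pack computes this from the counting journals; this just illustrates the metric):

```python
def location_accuracy(counts):
    """Share (in %) of counted locations where counted qty matched expected qty."""
    matched = sum(1 for c in counts if c["counted"] == c["expected"])
    return round(matched / len(counts) * 100, 1)

# Hypothetical counting journal lines.
counting_journal = [
    {"location": "A-01", "expected": 10, "counted": 10},
    {"location": "A-02", "expected": 5,  "counted": 4},
    {"location": "B-01", "expected": 8,  "counted": 8},
    {"location": "B-02", "expected": 12, "counted": 12},
]
print(location_accuracy(counting_journal))  # 75.0
```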

 

Where can you find this package?

In the LCS shared asset library:

Thanks Yi Zhang !

The new Warehousing App

Microsoft have released the new warehouse app, and the setup instructions can be found here: https://ax.help.dynamics.com/en/wiki/install-and-configure-dynamics-365-for-operations-warehousing/

You can download the Windows App here: https://www.microsoft.com/en-us/store/p/dynamics-365-for-operations-warehousing/9p1bffd5tstm

The setup instructions are very good, and in 10 minutes you should be able to get the app working as expected on your Windows 10 machine. The app also has a demo mode that lets you try it out without having to connect it to an environment.

Here are some pictures for your pleasure.

Thank you Markus J

Retail process modeling; Divide and conquer

I normally don’t share much that is considered employer-specific IP/tools, but today I will make an exception. At EG we have for years been focusing on how to address the business processes for the retail industry, and how to name and classify these processes, by combining the APQC business process mapping and classification structure with an essential understanding of how to improve and implement retail business processes. This means we have a predefined approach for scoping and planning retail implementations. The key to this model is to ensure good scoping and planning of phased retail implementations based on the customer's actual processes.

At the top level of the EG Retail model, we group all epic processes into “Management processes”, “Operating processes” and “Support processes”, as seen in the following picture. We have then broken each process down into sub-processes (levels), pretty much according to APQC.

 

Level 1 – Operating processes

The Operating processes are the day-to-day processes taking place at a retailer. We have divided the level 2 processes into 5 specific areas, as seen in the figure below.

1. Category management is all about grouping products into hierarchies with similar properties and attributes. This makes it possible to give responsibilities and parameters on group levels, instead of on SKU level.

2. Sourcing and procurement is about making sure that the products are available in the stores/channels for sale. This means working with vendors and producers, and having clear planning strategies.

3. Retail logistics covers the processes that typically happen at the central warehouse, ensuring that when replenishment to stores is needed, it is sent at the right time.

4. Omni channels is about being available to customers on multiple platforms and throughout the customer's purchase experience. It stretches from brand awareness, store, web, mobile and loyalty to after-sales processes.

5. Store operations is what is happening at the physical store.

Each of these level 1 retail processes has been split into the following level 2 processes. In column 1 we have the parent process, and below it the sub-processes in the horizontal boxes.

We can look further into the category management processes and see the following level 3 sub-processes. The red boxes from level 1 have been moved to the first column in level 2, and the sub-processes are shown in the horizontal columns.

For each and every retail process we break the process down to level 3 or level 4, and we then also decide how to solve each of these sub-processes. This is done by color-coding the processes. As you can see in the following picture, most are solved in standard Dynamics 365, but some with 3rd-party products. There are also processes that are not covered by the currently available solution stack.

At level 3 we have mapped each of these processes into APQC and into the LCS Business Process Modeler. When we take the level 3 process called "Define categories", we have a relevant APQC process named 2.1.1, and this means that we (or APQC members) can extract some KPIs that allow us to define how this process is performing.

Together with APQC we can use these KPIs to measure how well this process is performing, and also compare the process with similar retailers using the same KPIs. This tells us whether the process needs to be improved to achieve more.

Microsoft released a new APQC library in November 2016 that is available in LCS, in which Microsoft has defined 3,774 business processes and 617 flow charts for Microsoft Dynamics 365 for Operations. This gives us a further ability to map the processes directly into Dynamics 365. Here I have searched for "category" to see which APQC and Dynamics 365 processes are supported.

Using the process mapping to create an implementation plan

When we are working with our customers, we quickly build a scope plan and define which processes we want to start with, and which to postpone to future projects. We can be clear about how quickly the ROI can come, and we can start on the business processes where the customer today has low-performing processes. In the sample scoping below, I show how we can start with the stores, then in project 2 enable the HQ supply chain/replenishment, and finally start a project where logistics to the stores is in scope.

This means we can do phased retail implementation projects within the retailer's budget. Each of the "boxes" also contains process descriptions, a VSTS backlog and task lists, UAT test scripts and workshop agendas. This means that when running a retail project, we don't have to start with a blank whiteboard.

In addition, the model has been mapped into Visual Studio Team Services. This means that the Retail model is also an implementation model that can be used by project managers, consultants, developers, customer super users and stakeholders.

 

I hope this gives you some ideas on how we are approaching the retail market from a business process standpoint, and delivering our implementation as predefined repeatable services where Azure, Office365, LCS, VSTS, ODM and all the other good Microsoft services are used to the full extent.

Retail is detail, and the future is bright :)

Check out KB 3206881 – New image preview control

Microsoft has released a hotfix for the document preview control, and it is actually quite cool. In the following picture you see that the preview pane has been updated with some new buttons.

Now we have ordinary functions like move, zoom and print, but we also have highlighting, blocking and some text functions.

This means we can make direct changes to the attached images, and this is interesting when we have scanned copies of invoices or any other document.

In the following picture, I have just highlighted some parts, and blocked some texts. I have also added a text of my own.

Why is this interesting? Because good developers are experts in "copy-with-pride" solutions. And we now have a new web-enabled control that allows us to create new extended solutions for handling scanned documents.

I expect that we will very soon see small invoice approval apps using this feature, available at a fraction of the price we have seen before.

Try it out :) It's cool.

 

Warning; Generate Demodata; Financial period close

Dynamics 365 for Operations has a nice feature for generating demo data.

Here is my $1000 tip!
DON’T USE IT IN ANY IMPLEMENTATION PROJECTS!

This feature is only meant for generating demo data in the Contoso dataset, and it will corrupt and delete any real data you may have created. If you take a look at the class LedgerPeriodCloseWorkspaceDataCreation, which generates month-end financial closing data, you will see that it only works against specific Contoso companies and personas defined in the Contoso company.

There is also a method executed at the beginning that simply deletes data, making sure any data you may have created is gone.

Why Microsoft has decided to include this "demo data" package in the implementation deployment, I don't understand…

…and if you wonder: yes, I made this mistake.

Try Dynamics 365 now

Microsoft is currently holding an online virtual Dynamics 365 launch party and I’m happy to see that Microsoft is delivering as promised. Take a look at it here:

What Microsoft has also made available is a trial experience of Dynamics 365 for Operations, available here. What you get is access to a 30-day multi-tenant trial experience where basic testing can take place. In my case I got company number 037, and I cannot access other trial users' companies :)

In this first release you have limited access to 3 basic processes, as defined in the trial experience task recordings. You can try to navigate around, but you have very limited access to create customers/vendors/products etc. Important: this is NOT a full-feature trial, and you NEED to follow the task guides as your guide to Dynamics 365.

Remember that Microsoft is constantly refining and improving the trial experience, and if you want a full-blown trial you need to contact a Dynamics 365 partner that can help you set this up. Later, Microsoft will release additional trial experiences and also support localized ones. Other industry-based trials, like Retail/POS experiences, are on the way.

Check it out!

My Dynamics 365 FastTrack experiences

If you have not heard about the Microsoft FastTrack program for Dynamics 365 onboarding, then this is the post for you. To put it simply: the FastTrack program is Microsoft's involvement after the licenses have been purchased, to get you up and running fast on the cloud platform.

It starts when the licenses have been purchased through the CSP portal (or through an EA agreement), and lasts until the live production system has been deployed.

When a Dynamics 365 deployment starts, we get a checklist of tasks that need to be completed as we move from one stage to another. The LCS implementation project looks a bit different from ordinary LCS projects.

As you can see here, there are a lot of checks that need to be confirmed before going live. In the process, some guidance is needed, and Microsoft gives this as a service included in the license. As the implementation moves forward, Microsoft conducts bi-weekly workshops, where each meeting has a predefined agenda with information and some room for discussion and guidance. The touchpoints are divided between actual workshops using Skype for Business and Tech Talks, which are a kind of webinar session.

In the FastTrack program there is a roles-and-responsibilities description that explains what is expected from the parties involved in a Dynamics 365 rollout.

I have been lucky enough to be involved in a complete cycle, and I have to say that I'm impressed with how this FastTrack program works. As Dynamics 365 is quite new, and the entire Dynamics ecosystem is trying to absorb the information made available, it is easy to get lost and to think that implementations are conducted in the same way as earlier. If you expect that some hardcore system administrator/developer can jump into the sandbox/production environments, then you are wrong. Now things have to happen in a sequence and follow predefined quality steps to ensure that we get a rock-solid production environment.

Our FastTrack contact has always been available and has given us the "light touch" on the shoulder to guide the implementation and expectations. Remember that FastTrack is not about business processes, master data and project management. That is still handled outside of this program.

A small and important reminder: remember that you have to purchase your implementation licenses, and remember that you can start small and ramp up your license count as needed.

 

Testing Microsoft Flow for CRM –> AX integration

A few days ago Microsoft made the Flow connector available for preview, and you can read more about it here. What I wanted was to see if I could make a very simplified flow where a customer is created in CRM and then transferred to Dynamics AX.

The flow therefore consists of the following steps: when a record is created in CRM, a customer is created in AX. After that, I wanted an email to be sent to me.

To test this flow, I created a Customer in CRM online.

Then I waited a few seconds, and the customer was visible in AX. I was very impressed.

I also received an email telling me that a new customer was created in AX from CRM, and that made me even happier.

When I went in and analyzed what happened, I could trace the entire integration in Flow, and also see how much time was spent processing each step. In this case, I saw that AX used 10 seconds to process the JSON/OData message, and spent 3 seconds sending me an email that the record was created.

 

Here are the steps I used to create this flow. First I select the Flow action “Dynamics CRM Online – When a record is created”.

Then I specify the organization and the entity name: Accounts

Next I add the action Dynamics AX Online – Create a record

Then I select the instance and which user to log in with. I also select the entity name: Customers, and choose to transfer only the account number and account name into the AX entity. Some of the other fields I choose to hardcode for simplicity.

The last step is to send an email to myself

A short summary:

Using Dynamics AX with Flow will certainly be the way forward for how we integrate AX with CRM and all kinds of other 3rd-party systems. It is still in preview, and the next thing we are waiting for is for Dynamics AX to become reactive, so that when a record is created or modified inside AX, this can trigger a flow. Microsoft has promised that this is on its way. Also remember that this tool has its current restrictions, and that we need to be patient and let Microsoft further develop and improve its capabilities. But for easy and simple integrations I would call this a unique opportunity to get rid of complex and time-consuming integrations. As long as you keep it simple, it works as intended.
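Under the hood, the Flow "Create a record" step makes an OData create call against the Customers data entity. As a rough illustration of what such a call looks like when hand-rolled (the instance URL and field names here are assumptions, not the exact ones Flow uses; check your own /data/$metadata document, and note a real call also needs an Azure AD bearer token):

```python
import json

# Hypothetical instance URL; replace with your own AX deployment.
AX_BASE = "https://yourax.cloudax.dynamics.com"

def build_customer_payload(account, name, group="10"):
    """JSON body for the Customers data entity.
    Field names are illustrative; verify them in /data/$metadata."""
    return {"CustomerAccount": account,
            "Name": name,
            "CustomerGroupId": group}

def create_customer_request(payload):
    """Return the pieces of the OData create call.
    Authentication (Azure AD bearer token) is omitted here."""
    return ("POST",
            AX_BASE + "/data/Customers",
            {"Content-Type": "application/json"},
            json.dumps(payload))

method, url, headers, body = create_customer_request(
    build_customer_payload("C-00042", "Contoso Retail Ltd"))
print(method, url)
```

The 10 seconds the trace attributes to AX is this request being validated and committed by the entity framework on the AX side.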

Thanks Microsoft, keep it coming!

Dynamics AX Retail Scale Unit

In the AX Licensing Guide from February 2016, Microsoft announced a new add-on for retail called Retail Scale Unit.

Retail Scale Unit

As part of our future offering, we are considering offering a scale unit (Retail Scale Unit) that will enable businesses to run in distributed environment across datacenters to support proximity to physical locations
as well as allow distributed storage and help scale out needs of retail and commerce operations. This offering will allow the ability to add one or more identical scale units that can meet the transactional compute needs of retail and commerce channels. Additional details coming soon.

Even though details have not yet been disclosed on what or how it works, it is now available on sites where you buy Microsoft Licenses. Even prices are available, and it is a service priced per month.

Try a Google search on "E3307B7FD0C149AE9B95E4707C9D1AD7" and you will see distributors that have this SKU/product in their assortments. Stay tuned, as more will be explained when Microsoft makes more information available, but this is great stuff for all retailers!

 

 

 

 

Dynamics 365; Hello CDM

The Common Data Model was today, as promised, made available in preview through PowerApps, giving us insight into how it works. You need to take a look at the following blog posts. Your entry point for starting to explore CDM is http://powerapps.microsoft.com

Let's jump past all the introductions and take a small look at the sample demo data made available when the CDM database is created. After the sample CDM database is created, you will have access to the entities here.

Then find the entity named Product, and click on "Open in Excel".

After logging in, I start to see some similarities to what we have in the new Dynamics AX. It's the same Excel app on the right side.

It is even the Contoso data, and as highlighted here I'm showing item 1000, the good old Surface Pro 128 GB :)

 

Now start your journey into the CDM. It will be the backbone and foundation of our entire Dynamics 365 stack.

 

 

Dynamics 365, PowerApps, Flow and Common Data Model

This summer at WPC, Microsoft unveiled the Dynamics cloud strategy by explaining their new initiative named Dynamics 365. Let me put it very briefly: IT ROCKS!

A good Q&A blog explaining it is this blog post from James Crowter. The essence is that Dynamics 365 will be available in 2 editions: Business (cloud-based NAV edition) and Enterprise (the new Dynamics AX, aka AX '7'). In addition, Microsoft has also launched AppSource, which can help you find the right business apps available from other ISVs/VARs. This is a great offer to customers, where 3rd-party apps and extensions can be previewed.

As the new name 'Dynamics 365' implies, there will be a tight connection to the Office 365 package. If there is one thing Microsoft is good at, it is cross-selling and building strong dependencies through the entire stack of Microsoft technology. This will further strengthen the offering. One concern is that the total offering could be regarded as an increase in costs. Very often we see customers comparing offers based on the wrong assumptions, where on-premises offers are compared with cloud and SaaS offerings. This gives the wrong perspective, because on-premises offers often don't include all the costs related to implementing and running the systems. What looks cheap today may in the longer run actually result in higher costs and the build-up of technological debt. When making the classic tradeoff decisions in technology, make sure you understand the implications.

Dynamics 365 is more than just a rebranding; the introduction of the new Common Data Model (CDM) is the glue (database) that will stick all the pieces/entities together. We can expect that in the future, all the components will work together across the ordinary product lines as we know them today. Customers will download an app, and won't care whether they have a Business or Enterprise edition of Dynamics.

CDM will over time make sure that Microsoft PowerApps enables users to create applications for Windows, iOS and Android mobile devices. Using these apps, you can create connections to common SaaS services, including Twitter, Office 365, Dynamics 365, Dropbox and Excel. Making all kinds of apps will be easier, and in many cases will not even involve any coding.

My Dynamics friends, please try out Microsoft PowerApps, because this is a central element in the future of Dynamics 365. Also check out Microsoft Flow, to understand how CDM will in the future enable the flow of data and processes between all components in the Dynamics 365 and Office 365 landscape.

Again we have a lot to learn, and I'm amazed how fast the transition to a cloud-first and mobile-first business environment is going. This change will also have ripple effects on the entire ecosystem. New technologies require new organizational approaches and new workforce skills and knowledge. I assume that we will again see consolidations and mergers among the traditional ERP vendors, where traditional web and .NET consultancies are consolidated under the Dynamics 365 umbrella. We can also assume that smaller ERP vendors are simply too small to master all these new technologies, and will slowly fade away. Soon, most of our business processes will be handled on our mobile phones, backed by the cloud.

And remember, your best bet is to learn!

How I saved thousands of dollars on my Azure environments!

I just love such headlines, because it instantly attracts attention.

But in this case it is actually true. And Microsoft even wants us to do this. I want to show how to automatically shut down and start up environments in Azure, so that you are not spending more than needed. This post is for newbies; experts will surely bombard the comment section with suggestions on how to make it even better.

In this example I have 4 environments running in Azure, and the machine type I prefer is the D13_v2. This will cost me 3696 USD per month if I just let them stay on for the full 744 hours per month.

But I only plan to use them 07:00 to 17:00, Monday to Friday. This is 200 hours per month, which will cost just 993 USD :) A lot of fun can be had with the extra credits.
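The arithmetic behind those numbers is easy to sanity-check, assuming the cost scales linearly with VM-hours (which is how classic Azure VMs are billed):

```python
# The post's own numbers: 4 D13_v2 VMs, running 24/7 = 744 h/month,
# costing 3696 USD in total.
full_month_cost = 3696.0
full_month_hours = 744
office_hours = 10 * 5 * 4  # 07:00-17:00, Mon-Fri, ~4 weeks = 200 h

office_cost = full_month_cost * office_hours / full_month_hours
saving = full_month_cost - office_cost
print(f"office-hours cost: {office_cost:.0f} USD")
print(f"monthly saving:    {saving:.0f} USD")
```

This lands at roughly the 993 USD quoted above (993.5, to be precise), a saving of about 2700 USD per month.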

So what is the next step? The trick is to use an Azure PowerShell runbook. Here is the step-by-step instruction on how to set it up:

1. Log into Azure, and open Azure Automation.

2. Add an Automation Account.
    Create a name, like “TurnOffOnVM”.
    Select the subscription, and whether a resource group should be created. Also choose whether you want an Azure Run As account. (I didn't bother with that, since I have no important stuff in these environments.)

3. Then create an asset named "automation" holding the credentials that will run the shutdown/start-up scripts. The credentials you use must have the rights to run scripts and to start/stop VMs.

4. Let's create 2 runbooks that hold the scripts and schedules for the start and stop scripts.

5. Use the "PowerShell Workflow" type.

 

6. Let’s put in the “Start script”. It’s done here

 

I have removed my VM-names in this example.

If you wonder what your VM name is, it is the computer name, that can be seen here:

Here is a copy-paste version of the Start-up script:

workflow StartVM
{
    # Log in with the stored credential asset
    $cred = Get-AutomationPSCredential -Name "automation"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "Microsoft Azure Enterprise"

    $VMs = Get-AzureVM

    foreach ($VM in $VMs)
    {
        if ($VM.Name -In "VMName1", "VMName2", "VMName3", "VMName4")
        {
            # Only start machines that are not already running
            if ($VM.PowerState -ne "Started")
            {
                Start-AzureVM -Name $VM.Name -ServiceName $VM.ServiceName -ErrorAction Continue
            }
        }
    }
}

7. Let's put in the "Stop script". It is basically the same procedure as creating the start script, so I just add the copy-paste version of the script.

workflow StopVM
{
    # Log in with the stored credential asset
    $cred = Get-AutomationPSCredential -Name "automation"
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "Microsoft Azure Enterprise"

    $VMs = Get-AzureVM

    foreach ($VM in $VMs)
    {
        if ($VM.Name -In "VMName1", "VMName2", "VMName3", "VMName4")
        {
            # 'ReadyRole' means the classic VM is up and running
            if ($VM.Status -eq 'ReadyRole')
            {
                Stop-AzureVM -Name $VM.Name -ServiceName $VM.ServiceName -Force
            }
        }
    }
}

 

Remember to press the "Publish" button on the scripts :)

8. Let’s create a schedule (one for the Start runbook, and one for the stop runbook)


9. You can now monitor the start/stop scripts:

 

10. Go party with all the credits you have saved! And if you see me, and use this script, buy me a beer :)

 

Happy DAX'ing :)

The most important AX.HELP page

Always keeping an eye on what's happening is important. AX.HELP is growing and becoming the number one center for understanding the new AX. I want to give you the most important page: https://ax.help.dynamics.com/en/wiki/help-get-started/ Read it!

What I did was set up an RSS feed to get all the news and new articles; the address is https://ax.help.dynamics.com/en/revisions/

Setting this up in Outlook is easy. Right-click on the RSS Subscription, and add https://ax.help.dynamics.com/en/revisions/

You will then get an RSS message for each new post and article. In 5 minutes every day you will get an overview of what has been published and updated. No more slow searching, and you will quickly be the "go-to" expert who knows it all.
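If you prefer scripting over Outlook, the same feed can be polled programmatically. A minimal sketch using only Python's standard library (assuming the feed follows the usual RSS 2.0 `<item>`/`<title>` convention):

```python
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://ax.help.dynamics.com/en/revisions/"

def latest_titles(xml_text, limit=5):
    """Extract the newest item titles from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    titles = [item.findtext("title", default="")
              for item in root.iter("item")]
    return titles[:limit]

# Example with a tiny inline feed; swap in
# urllib.request.urlopen(FEED_URL).read() for the real thing.
sample = """<rss version="2.0"><channel>
  <item><title>Help get started</title></item>
  <item><title>Office integration troubleshooting</title></item>
</channel></rss>"""
print(latest_titles(sample))
```

Running this sketch daily gives you the same 5-minute overview without opening Outlook.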

Happy DAX’ing

 

 

 

New Dynamics AX – Pimp up your form-letters with document branding

The new Dynamics AX never ceases to surprise me, and every day I find new possibilities and solutions. Microsoft has made available a new set of form letters, like the purchase order, sales confirmation, invoice etc., and installing them is optional. What is new is that they are much nicer and more modern, but they are missing country-specific requirements. Microsoft calls them "modern reports", and you can read about them here.

But the main topic of this blog is about how to pimp-up your form letters, with document branding. The following report is the Purchase order in modern design, and where I have added a logo and some colors.

The menu item for controlling this is found under Organization → Setup → Document branding

We have the following menu items:

Document brands is just the identifier, like company ID etc.

Document images is a container for logos etc.

Brand details is where we can override per form letter and design, select colors, and override addresses and contact information.


 

I expect Microsoft has much more coming, so stay tuned :)

AX RTW My ODATA and JSON journey – Part III

Now the fun begins, so let's develop! The following post is only meant for hardcore Dynamics AX technical consultants :)

In previous posts I wrote about how to access Dynamics AX data and metadata through OData using only Internet Explorer. In those scenarios we were only fetching data from Dynamics AX. This time, we will use Visual Studio to publish data into Dynamics AX by accessing the OData services. I must let you know that I consider myself a newbie at writing C# code, and the following post is only meant as a directional guide for you to start exploring on your own.

What you need to make this happen is:

  1. Visual Studio 2015
  2. Administrator access to Azure AD
  3. A deployed New AX on Azure

What I wanted to achieve here is to be able to add vendors from a C# program. A non-Dynamics AX developer may have no idea of the inner structure of AX, but they can be given access to the metadata, and based on this metadata it should be possible to create CRUD integrations. One issue with Visual Studio is that it is not possible to consume OData services directly, so we need to generate a proxy library. The MSDN OData v4 Client Code Generator is the best way of doing this, because it will generate wrapper classes for the data entities. To speed things up a bit I found AX-LAB12, where Microsoft shows how to import a BOM; here I found the framework that we can use. This AX-LAB12 contains a Word document that is good for understanding how to set this up. I'm "stealing" the first 4 classes from there.

The AuthenticationUtility is the class that makes sure we are authenticated with Azure AD, and that we are logged in with the right user. In this class you can hardcode the user/password, the tenant and the ActiveDirectoryClientAppId.

The next step is to generate the OData proxy. This is done in the Microsoft.Dynamics.DataEntities project. It basically means creating a bunch of classes that reflect all the metadata. This gives us classes, so that we can assign values to OData fields, execute methods etc. But first we must specify where all the metadata can be downloaded from. In the picture below, you see that this is just a hardcoded string in the OdataProxyGenerator.tt file.

Then right-click as shown below, and select "Run Custom Tool".

This will then download all the metadata from the published data entities in Dynamics AX, and create one class per data entity. It takes a few minutes, and it creates thousands of classes.

Since we want to create vendors, it is interesting to see how the Vendor data entity looks in AX, and what the generated C# proxy class looks like:

As you see, we are consuming the OData data entities in Visual Studio, which lets us access fields and methods just as we are used to in X++, and all this by only generating proxy classes from the OData metadata.

Then I can start developing against the OData proxy classes, and the field and method lookup that we are used to in X++ is working. As seen in the following picture, I'm declaring vendVendorEntity of the type Vendor, which has the same structure as defined in the data entity.

My complete code for creating a vendor using OData is therefore:

I build and run:

I then check AX to see if the vendor is created:

It works :)

Let's see what happens if I change the code and select a vendor group that does not exist:

It correctly refuses to create the vendor :)

The conclusion:

The ability to perform CRUD operations using OData changes the game. External non-Dynamics developers can create apps and integrations through the OData services, regulated through security and validation. They don't need to know the internal structure of Dynamics, because it is exposed through the metadata service. Dynamics AX is truly a game changer.
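The entry points a non-Dynamics developer needs are small: the metadata document that describes every published entity, and the entity sets themselves. A sketch of the URL conventions, independent of C# (the instance name and entity keys are assumptions, and Azure AD authentication is omitted):

```python
# Hypothetical instance; replace with your own deployment URL.
AX_BASE = "https://yourax.cloudax.dynamics.com"

def metadata_url():
    """The OData $metadata document: the same schema the
    OData v4 Client Code Generator turns into proxy classes."""
    return AX_BASE + "/data/$metadata"

def entity_url(entity, **keys):
    """URL for an entity set, optionally addressing one record,
    e.g. entity_url("Vendors", dataAreaId="usmf",
                    VendorAccountNumber="V-001")."""
    if keys:
        key = ",".join(f"{k}='{v}'" for k, v in keys.items())
        return f"{AX_BASE}/data/{entity}({key})"
    return f"{AX_BASE}/data/{entity}"

print(metadata_url())
print(entity_url("Vendors"))
```

Any HTTP-capable language can consume these endpoints; the C# proxy classes above are a convenience layered on exactly this surface.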

Happy DAX'ing :)

 

 

New Dynamics AX and the Excel Add-on

When using the «Open in Excel» (Dynamics Office Add-in) feature in the new Dynamics AX RTW, you may have some trouble opening it in Excel.

Especially if you have a corporate login, like me. It then seems that the login fails.

 

Microsoft has upgraded the Dynamics Office Add-in, but existing demo data (Contoso) may also need to be changed.

Then the connector seems to be working (at least for me).

Also take a look at https://ax.help.dynamics.com/en/wiki/office-integration-troubleshooting/

Happy DAX’ing

New Dynamics AX On premise = Azure Stack

As we know, deploying the new Dynamics AX basically comes in 3 different flavors. I wanted to explain a bit what this means and what I have found. The information here should be double-checked with your partners and with Microsoft. Also remember that this is all very fresh technology, and that things may change quickly, as much is in early release and preview.

 

  1. AX Public cloud – Black-box, maintained by Microsoft in Azure and it just works.
    The public cloud "edition" was the first platform the new Dynamics AX was released on. In the public cloud, Microsoft personnel deploy and monitor the instances. Customers and partners have no technical access to the production environments. Data and code (like customizations) are created as packages and uploaded into LCS, and Microsoft deploys them to the production environment according to maintenance windows. Customers pay a monthly fee per user that includes licenses, a production environment with high availability and disaster recovery, and some sandbox environments (for testing and dev). The customer doesn't have to consider how to scale or what kind of virtual machines are needed; this is taken care of by Microsoft. Customers must expect to pay at least 110,000 USD per year for this. My assessment is that this actually is a very good offer, because it includes many of the services and licenses that we don't normally consider when evaluating the costs of operating an ERP system. I think smaller customers (50-250 users) would benefit from this scenario.
  2. AX Private cloud – Maintained and deployed by customer/partner, but still on Azure.
    Private cloud is 100% running in Azure. Private just means that Microsoft is not deploying and monitoring the instances. In this scenario you purchase AX licenses, and you purchase Azure services and deployments; basically 2-3 invoices :) You scale the VMs up according to your needs, and it is your own responsibility. Typically a partner can help out, and you will probably have to purchase service agreements to monitor and maintain your Azure-deployed instances. Will this be cheaper than the public cloud offer? If you compare apples with apples, I don't think so. There are many hidden costs, and if you sum them up, at least my internal calculations show that this offer can quickly be 20% more expensive than the public cloud offer. The private cloud offers flexibility, but it demands a very knowledgeable technical department/partner. You can decide more yourself within the boundaries of Azure. I expect that larger customers (250+ users) would like to go for this scenario.
  3. AX On-Premise and Azure Stack – For those that have a datacenter to spare

    Azure Stack is the new hybrid cloud platform product that enables organizations to deliver Azure services from their own datacenters. You get cloud services, yet maintain control. You decide where to keep your data and applications: in your own datacenter or in others'/Azure. You will still pay for the AX licenses, but you will also have to pay for your own hardware. There is one problem: it is not released yet. We are waiting for Windows Server 2016 with Azure Stack, and SQL Server 2016; these are still in technical preview. But for those (like me) that like to try things out, you can actually download it from https://azure.microsoft.com/en-us/overview/azure-stack/ . If you wonder what kind of machinery is needed, take a look here (basically 16 cores, >128 GB RAM and a few TB of disk). It will be a bit difficult to run Azure Stack on my portable PC :) Also remember that there will still be lots of services that have to be in the cloud. I assume this option will be selected by large enterprises (1000+ users) and by hosting providers/ASPs.

And remember that what I write here is not fact, but just my interpretation of how it may turn out.

Happy DAX'ing :)