Business Central Planned Service Order Calendar 🗓️

If you have worked with the Service Management area of NAV/BC then you will have greater insight into the upcoming words and images I’ve thrown together. Within BC service you can log a service contract and assign items to be serviced on a routine basis.

The image here shows some key fields which drive that functionality – “Service Period” is a date formula field showing how often we expect a planned service (here it is 1M to indicate once a month). “Next Planned Service Date” and “Contract Expiration Date” then act as the start and end to the planned services. It is possible to not set an expiration date. For the “Calendar” idea I am about to lay out you need that data in place – or you adapt the code/BC data to use a defined planning horizon (end date).

All seems fine here I’m sure, but the mild inconvenience behind this post is that only one “planned” service order can exist at a time. BC uses the below processing-only report – “Create Contract Service Orders” – to create the “planned” service orders. There are another two fields used within this routine called “Last Planned Service Date” and “Last Service Date”. These act as the control to ensure you have just the “required” service order generated by the routine.

*Note that this routine does check the service management setup to see how far into the future it can create service orders for.

I’m not bringing this up because I have an issue with standard. Having recently discussed this with prospective users of BC, they want to use a calendar view to aid with resource planning. BC can only handle resource allocations once the service order exists. So what do you do in this case? One possible option is to look at a reporting-based solution and just let BC get on with creating the required service orders as and when it needs to – that certainly stops loads of clutter building up in the tables which hold the data.

The part of BC that has a vague similarity to what I want is Deferral Schedules. Confused? 😵 Let me show this:

Before a deferral is posted you can preview what it will produce. In the case of the example it is for 6 months and it has produced data for that – but only in this view. This is in effect what I think would be useful with service contracts so you can plan from that. I’m opting to produce something in Power BI on this occasion so I can display data in a matrix visual. This means a page (API, or a regular page published as a web service) is required. For this I’ve taken some inspiration from how the standard BC API handles Sales Order Lines. The table the API is based on is a temporary table and it isn’t even the Sales Order one:
For me the page isn’t that straightforward in how it eventually gets populated, but it does it with the assistance of a codeunit in the OnFindRecord trigger of the page. For my page I have adopted a similar approach – but kept it simple. A link to the code is at the end of the post.
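To give a flavour of the approach, here is a minimal sketch of an API page built over a temporary buffer table, where the planned dates are rolled forward with the “Service Period” date formula inside OnFindRecord. The buffer table and object names are hypothetical and kept deliberately simple – the real objects are in the linked repository.

// Minimal sketch only - "Planned Service Buffer" is a hypothetical temporary
// table with an entry no., contract no., service item no. and planned date.
page 50100 "Planned Service Calendar API"
{
    PageType = API;
    APIPublisher = 'demo';
    APIGroup = 'serviceCalendar';
    APIVersion = 'v1.0';
    EntityName = 'plannedService';
    EntitySetName = 'plannedServices';
    SourceTable = "Planned Service Buffer";
    SourceTableTemporary = true;
    DelayedInsert = true;
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(contractNo; Rec."Contract No.") { }
                field(serviceItemNo; Rec."Service Item No.") { }
                field(plannedDate; Rec."Planned Date") { }
            }
        }
    }

    trigger OnFindRecord(Which: Text): Boolean
    begin
        // Build the future planned dates once, then let the platform page through them
        if not DatesBuilt then begin
            BuildPlannedDates();
            DatesBuilt := true;
        end;
        exit(Rec.Find(Which));
    end;

    local procedure BuildPlannedDates()
    var
        ServiceContractLine: Record "Service Contract Line";
        PlannedDate: Date;
        EntryNo: Integer;
    begin
        ServiceContractLine.SetFilter("Next Planned Service Date", '<>%1', 0D);
        if ServiceContractLine.FindSet() then
            repeat
                PlannedDate := ServiceContractLine."Next Planned Service Date";
                // Roll forward by the Service Period until the contract expires
                // (swap the expiration date for a fixed planning horizon if it is blank)
                while (PlannedDate <> 0D) and
                      (PlannedDate <= ServiceContractLine."Contract Expiration Date")
                do begin
                    EntryNo += 1;
                    Rec.Init();
                    Rec."Entry No." := EntryNo;
                    Rec."Contract No." := ServiceContractLine."Contract No.";
                    Rec."Service Item No." := ServiceContractLine."Service Item No.";
                    Rec."Planned Date" := PlannedDate;
                    Rec.Insert();
                    PlannedDate := CalcDate(ServiceContractLine."Service Period", PlannedDate);
                end;
            until ServiceContractLine.Next() = 0;
    end;

    var
        DatesBuilt: Boolean;
}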
This is the kind of view I was able to generate because I have my API page producing data for future dates based on the “Service Period” defined per service contract line. Note that BC allows for service items to appear on multiple service contracts. So where I have a count of 2 (yellow dots) the service item is scheduled for two different services that month. It’s something standard BC just can’t generate right now and it gives so much more information to a user. I’m sure there are some other use cases for this concept – interested to see what others come up with. You could produce a matrix page in BC with the concept too 👍

Link to my code: https://github.com/JAng13sea/Blogs/tree/master/Service%20Calendar

Business Central Dynamic Bin Ranking 🚮

A business recently explained to me that they were running a dynamic bin ranking method, to enhance the likelihood of a correct pick walk rather than necessarily an efficient one. They had visited an Amazon fulfillment centre, which used something similar, and figured if it is good enough for Amazon then it is good enough for them! It’s worth reading into some of the features of an Amazon warehouse, like this article: https://www.allaboutlean.com/amazon-fulfillment-3/

As well as deciding bin ranking needed to be dynamic, the business in question also ensured items were in scattered bins across the warehouse. This was to remove the likelihood of picking the wrong item – for example a pair of shoes would be in a bin next to a coat or trousers – hence the “correct pick” statement from earlier. In BC the “Bin Ranking” helps to control three functions: putting stock away, bin replenishment and pick line sorting. Due to this scenario it is important to ensure any “dynamic” facility sticks to specific numbering so that all of those functions remain intact. Need more info on bin ranking? Check out the standard docs about it: https://docs.microsoft.com/en-gb/dynamics365/business-central/design-details-warehouse-setup

The general rule of thumb with bin ranking is that your pick face has the highest rank – the bins which are easiest to pick from (close to the floor or near the packing area) are given the highest rank number value. Therefore dynamically ranking bins in the way I’m about to explain might not fit the warehouse layout. You could certainly produce this routine in AL, potentially with a processing-only report, and run it on the job queue. I have opted for Power Automate in this case to see if it could handle a scenario where multiple tables are checked to determine a result. It also provides a single place for all the business logic and any data references it makes. I understand these justifications won’t be enough for everyone, but bin ranking isn’t for everyone either. Some techniques used might be helpful for other scenarios – like making use of a flowfilter with a query API or not using SystemID as the “row ID” for the BC connector in Power Automate.

Power Automate being chosen doesn’t make this a development-free solution. A link to the code is at the end, but you will need an API page for the bin table, an API page for the bin content table and an API query for the item table. The image gallery below shows each step of the flow used, with text added on each image for context or details on expressions/code:
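As an example of the query piece, here is a hedged sketch of what an API query with a flowfilter could look like – the column selection is illustrative and may differ from the objects in the linked repository:

query 50101 "Item Inventory API"
{
    QueryType = API;
    APIPublisher = 'demo';
    APIGroup = 'warehouse';
    APIVersion = 'v1.0';
    EntityName = 'itemInventory';
    EntitySetName = 'itemInventories';

    elements
    {
        dataitem(Item; Item)
        {
            column(no; "No.") { }
            column(description; Description) { }
            // Flowfield which respects the flowfilter below
            column(inventory; Inventory) { }
            // Flowfilter exposed to the caller, e.g. $filter=locationFilter eq 'MAIN'
            filter(locationFilter; "Location Filter") { }
        }
    }
}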

Link to the code for the API pages and API query: https://github.com/JAng13sea/Blogs/tree/master/Dynamic%20Bin%20Ranking

Business Central Demand Forecast Save Filters 💾

A while ago I wrote this blog https://joshanglesea.wordpress.com/2020/07/18/demand-forecast-needs-something-extra/ and more recently this has landed in BC: https://docs.microsoft.com/en-us/dynamics365-release-plan/2022wave1/smb/dynamics365-business-central/new-ui-entering-demand-forecasts-add-support-variant-code-other-improvements so the timing is right to review.

This is currently a feature which needs enabling before you can use it (📝note it is auto enabled Q2 2023):

It’s a great set of improvements, so my previous blog can be ignored from this point on. However, do I have everything I need to work quickly and effectively? That’s what I aim to question with this new post by posing an additional design idea.

When a user leaves the page it remembers the previous filters that were in place, which I really like 😍. It got me thinking, though, about saving other filters and then providing the ability to jump between those. The idea being that you have given a user greater flexibility with filters, so why not further enhance that experience? In standard BC it would be similar to the ability to save general journal templates, or segment criteria if you know CRM in BC.

For Demand forecasts I was thinking something like this:

The saving process could then be like this:

A user could be offered the chance to alter filters and apply them to other forecasts too:

If you think this would be a good addition please comment, like the post or use the voting buttons:

I’ll look at posting it as an idea if others think it’s worthwhile. Hopefully we’ll see if it makes it to the base app in the future 👍 If you don’t want to wait around for that here is the code which generates the experience shown: https://github.com/JAng13sea/Blogs/tree/master/Demand%20Forecast%20Saved%20Filters

Business Central Attach Email Body 📎📧

I recently did a post about how BC works with emails for CRM mostly. Check it out here: https://joshanglesea.wordpress.com/2022/07/01/business-central-crm-email-%f0%9f%93%a8/
This got me thinking about another scenario which I get asked about from time to time by prospective clients. If a customer/contact emails something important can you then attach that to a record in BC?

Now, I accept that attaching in BC is a nice user feature, but I also appreciate that long term it will build up the database size. I’m leaning towards the practical side of the argument in that checking emails in the interaction log entries isn’t that intuitive (check the referenced blog to understand more). Checking the attachments of a document or master record is, though – especially for critical information like a purchase order reference for a sales order. My idea for a solution stems from the wave 1 2022 release of this functionality – to save you checking, it allows users to take email attachments into BC attachments for records: https://docs.microsoft.com/en-us/dynamics365-release-plan/2022wave1/smb/dynamics365-business-central/outlook-add-in-add-attachments-emails-directly-business-central-documents

If you have used the BC outlook add-in you will probably know about the “Suggested Line Items” feature. This is where you choose to create a new sales document and it reads the email body to find matches for items in the company. I’m mentioning this because the two features I’ve brought up resulted in my answer.
My scenario with an example: here is an email exchange between Tim, who is a BC user, and his customer. They want to progress with the order and have responded with their Purchase Order reference.
Using the BC Outlook add-in the user finds the Sales Order and follows the steps shown above – clicks on the ellipsis icon, scrolls to locate “Attachments”, and finds a new button called “Attach email body”.
This will add the HTML of the email as an attachment record. I used the current date and time in conjunction with the email subject to create the name of the attachment. You could use something else of course.
Users aren’t able to view the attachments in the BC Outlook add-in – be it HTML or not. This is the message which pops up if they try.
However, in the BC web client a user can work with the attachment as normal by clicking on it to download, share or open in OneDrive. Given it is an HTML file I have added an extra option, which makes the most of a standard feature. The new button is “Preview Email” and it displays the HTML attachment. Granted it’s not perfect, but it’s a quick and effective way of viewing the details without having to go rooting around in the interaction log entries.
I’ve made it so the “Enabled” property of the page action only works with files that have an HTML “File Extension”.
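For anyone curious how that per-record enabling can be done, here is a rough sketch – the action body and object names are illustrative, not the exact code in the repository:

pageextension 50102 "Doc. Attachment Preview Ext" extends "Document Attachment Details"
{
    actions
    {
        addlast(Processing)
        {
            action(PreviewEmail)
            {
                Caption = 'Preview Email';
                ApplicationArea = All;
                Image = ViewDetails;
                // Only lights up for HTML attachments
                Enabled = IsHtmlFile;

                trigger OnAction()
                begin
                    // open the HTML preview of the attachment here
                end;
            }
        }
    }

    trigger OnAfterGetCurrRecord()
    begin
        IsHtmlFile := LowerCase(Rec."File Extension") = 'html';
    end;

    var
        IsHtmlFile: Boolean;
}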

Like what you see? Want to try it out? You can find the code here: https://github.com/JAng13sea/Blogs/tree/master/Email%20Body

I attempted to make the HTML nicer by using https://www.hougaard.com/wysiwyg/ but the result was the same as the standard preview. If like me you want to have the preview formatted correctly you will need to try a different approach – which is another copy of how standard BC does it.

Here is how the preview page looks when it is passed the text value from a report object instead – which is then saved as HTML. Code for this one is on github too.

Business Central CRM Email 📨

During demo engagements I often get asked to demo BC CRM. This can be because it’s a good fit or a best of breed CRM is currently out of reach. As part of this they want to know how much can be done with emails – the bread and butter of CRM. There are a few different uses of the term “Email” in BC and how a user can send or review emails. I had to create a guide on each area so thought it might be worth a share. Given this is CRM focused it is worth noting that these features work with contacts*. I will cover the following email situations:

  • Outlook synchronisation (this is the main topic to cover)
    • From Outlook add-in
    • From within BC client
  • Create Interactions
  • Send By Email
  • Segments
  • Sending documents (*this works without contacts)

Outlook synchronisation

For details on how to setup this up check out this video: https://youtu.be/LXum6xzpRxE

Once the main setup has been done each user will be set up like so and will have a new folder added to Outlook called “Queue”. Jump to the bottom of the blog for new user setup tips.

If contacts in BC are a new concept, the easiest way to get up and running is using the BC Outlook add-in. Viewing a brand new contact through the lens of the BC Outlook add-in will prompt you to create a contact for the email address. This is the simplest way to get moving with contacts if you don’t have them in place already:

This creates a contact record – which is essentially a copy of the contacts you store in Outlook:

Respond to the email in Outlook as you would usually do. For both the original email and the response, entries are placed in the special “Queue” folder:

The “Queue” folder can be emptied if the above codeunit is running on the job queue.

Once the job queue has run it will bring in the emails to the interaction log entries:

*Note the “Salesperson Code” depicts who the email is to/from. The “User ID” is in fact the user which originally set up the integration.

If a user chooses “Show Attachments”

It will display this message:

Which opens a new tab with the email from Outlook

If you reply in Outlook, then the email will be available to view in the Interaction Log Entries – which you can open as shown before:

Detail is all available on the contact card:

What about if you use the “Send Email” feature in BC?

I’m choosing it from the Outlook add-in here but it is the same in the BC web client too

That also goes into the Outlook “queue” folder. The “Send Email” feature allows you to send as a different email address (the from address). If that isn’t a user synchronised with BC then the email will not go to the interaction log entries.

When done with the “current user” the email will appear in both the “Sent Items” and the “Queue” folders:

Once an email has been passed to BC the “Queue” folder will no longer contain the email. Because you have done this in BC you can also see the entries in another location, other than the Interaction Log Entries:

The Interaction Log Entries also has a copy just like before

The “Queue” is accessible by other users so you can see email traffic that has gone to BC defined contacts:

Different user accounts inbox view

What about seeing the BC details from Outlook? From the add-in you can access interaction log entries in a couple of places:

Drilldown on the

Create Interactions

“Create Interactions” is an action users see in scenarios where contacts are in use. It is available directly on the contact, opportunities, tasks or on the interaction log entries page. When using the feature you are effectively calling upon ready to go email or attachment templates.

Here is a basic free type email interaction to a contact
This pops open the BC email page before sending and adds it to the contact card in the “History” section:
The “Queue” folder will receive an entry too

What’s confusing about this is that you end up with two interaction log entries. One is the one instantly created by using the feature and the second is part of the Outlook synchronisation:

If the user clicks on the “Attachment” for the Outlook entry it pops up the message from earlier.

If you choose the non-Outlook one it opens a page like this:

How do you tell the difference? Filter on the “Email Logged” field being “TRUE”.

Segments

There is an exception to this rule: if you use the “Segments” feature, which uses interactions, you do not get the duplication in the interaction log entries.

Here are some previously logged segments where specific contacts have been targeted. This can be for sales, marketing or customer service purposes.
When a user chooses an entry it will just show it in the preview page rather than opening Outlook.

Sending Documents

The last email scenario a user can perform is for documents. A Posted Sales Invoice, for example, has two email-related buttons. They both ultimately result in the same thing, however the first one (to the left) asks an extra question about the output.

Extra page which is shown when using the 1st “Send” email option

This will also create an interaction log entry:

And show it in your users “Sent Items”:

New User Setup

If a new user wants to have their emails synched then they will first need to be set up in the user setup page and have a “Salespers./Purch. Code” assigned to them.

Following on from that the user will need to have their account synched. Access the above page from the earlier shown “Exchange Sync. Setup”. Add the “Queue” folder by choosing “Add Favourite” in Outlook. The “Queue” folder will be one of the options available to the user. Don’t be surprised if the new user doesn’t see any results in the Outlook “Queue” folder straight away. Other users who have been connected for a while will see their activity though:
If the user doesn’t get access to the folder then you can log on to Exchange and force the issue by adding them to the public folder.
New entries have the “JA” Salesperson Code

Business Central Approval Out of Office Check 🌴📧

This is a FAQ when the topic of approvals comes up with prospects or customers. Solving this conundrum requires you to determine where you will get a substitute or replacement approver from. In my case I have used the approval user setup in BC. To achieve this a custom API page needs to be generated. If you can’t do that then check out AppSource for apps which help – at the time of writing the Simple Object Designer is a good one. The process for building the flow can be seen in the image gallery below. The first two steps are just the regular BC workflow approval trigger and get record action. I have only done one out-of-office mailbox check but you can add more using the format I utilise. If you need an overview on the basics of using Power Automate for approvals check out this post: https://joshanglesea.wordpress.com/2022/04/29/business-central-customer-approval/
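For reference, a minimal sketch of the kind of API page I mean is below – it exposes the approval user setup (the “User Setup” table) so the flow can look up the substitute and their email. The field selection is an assumption of what the out-of-office check needs:

page 50103 "Approval User Setup API"
{
    PageType = API;
    APIPublisher = 'demo';
    APIGroup = 'approvals';
    APIVersion = 'v1.0';
    EntityName = 'approvalUserSetup';
    EntitySetName = 'approvalUserSetups';
    SourceTable = "User Setup";
    DelayedInsert = true;
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(userId; Rec."User ID") { }
                field(approverId; Rec."Approver ID") { }
                field(substitute; Rec.Substitute) { }
                field(email; Rec."E-Mail") { }
            }
        }
    }
}

The flow can then filter this endpoint on the approver, check their Outlook automatic replies status, and fall back to the substitute’s email when they are out of office.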

There are a few expressions that get used and I’ve added notes on all of the images of what expression is being used at that point:

I’ve exported the flow and you can grab it for import instead of building from scratch. The JSON format is available too if you use Azure logic apps instead of Power Automate: https://github.com/JAng13sea/Blogs/tree/master/Out%20of%20Office%20Approver%20Check

Business Central Horizontal page tabs ➡️

Back in the days when BC was NAV this was your typical card/document page:

Take note of the horizontal tabs as opposed to how BC currently does things with vertical tabs. It’s worth considering that this type of design can be seen in other ERP systems and it’s an efficient way of moving through details without scrolling. The switch over itself happened way before BC was even an idea:

This is where the history lesson stops and the new wave comes in. I’ve been dealing with a prospective client who uses an old version of AX and the switch to vertical tabs lacked some appeal. My initial thought of course was PowerApps but I wanted to challenge myself with BC page design. This blog is to show a concept which I think is nice but maybe on the novelty side for some. This is how it looks and functions on a card page for items:

You could argue that page actions could have been used but I felt this didn’t make it completely clear which “tab” the user is looking at if they have to marry the page part caption and page action caption together.

The main thing to share here is the code, so please check it out if you would like to try this on a page you think it fits. So which is faster? Horizontal tabs or BC FastTabs? Who wins the fight? You decide! 😏

page 50204 "Horizontal Tab Item Card"
{
    Caption = 'Item Query';
    PageType = Card;
    SourceTable = Item;
    Editable = true;
    InsertAllowed = false;
    DeleteAllowed = false;

    layout
    {
        area(content)
        {
            group(ItemTabs)
            {
                Caption = '';
                grid(buttonrow)
                {
                    GridLayout = Rows;
                    field(GeneralTab; General)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = GeneralStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::General;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(OnHand; OnHand)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = OnHandStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::OnHand;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(SaleOrd; SaleOrd)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = SaleOrdStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::SaleOrd;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(PurchOrd; PurchOrd)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = PurchOrdStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::PurchOrd;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(Transac; Transac)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = TransacStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::Transac;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(Website; Website)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = WebsiteStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::Website;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                    field(Stats; Stats)
                    {
                        ApplicationArea = All;
                        ShowCaption = false;
                        DrillDown = true;
                        StyleExpr = StatsStyleText;
                        trigger OnDrillDown()
                        begin
                            VisOption := VisOption::Stats;
                            ChangeStyleExpr(VisOption);
                            CurrPage.Update(false);
                        end;
                    }
                }
            }

            part(AXItemCard; "Item Card General")
            {
                Visible = VisOption = VisOption::General;
                ApplicationArea = All;
                SubPageLink = "No." = FIELD("No."),
                              "Date Filter" = FIELD("Date Filter"),
                              "Global Dimension 1 Filter" = FIELD("Global Dimension 1 Filter"),
                              "Global Dimension 2 Filter" = FIELD("Global Dimension 2 Filter"),
                              "Location Filter" = FIELD("Location Filter"),
                              "Drop Shipment Filter" = FIELD("Drop Shipment Filter"),
                              "Bin Filter" = FIELD("Bin Filter"),
                              "Variant Filter" = FIELD("Variant Filter"),
                              "Lot No. Filter" = FIELD("Lot No. Filter"),
                              "Serial No. Filter" = FIELD("Serial No. Filter");
            }

            part(SalesOrdsPart; "Sales Orders List Part")
            {
                Visible = VisOption = VisOption::SaleOrd;
                ApplicationArea = All;
                SubPageLink = "No." = FIELD("No.");
            }
            part(PurchOrdsPart; "Purchase Order Line ListPart")
            {
                Visible = VisOption = VisOption::PurchOrd;
                ApplicationArea = All;
                SubPageLink = "No." = FIELD("No.");
            }
            part(Transactions; ItemTransactionsCardPart)
            {
                Visible = VisOption = VisOption::Transac;
                ApplicationArea = All;
                SubPageLink = "Item No." = FIELD("No.");
            }
            part(SKuList; "SKU List")
            {
                Visible = VisOption = VisOption::OnHand;
                ApplicationArea = All;
                SubPageLink = "Item No." = field("No.");
            }
            part(picture; "Item Picture")
            {
                Visible = VisOption = VisOption::Website;
                ApplicationArea = All;
                SubPageLink = "No." = field("No.");
            }
        }
    }

    trigger OnOpenPage()
    begin
        VisOption := VisOption::General;
    end;

    procedure ChangeStyleExpr(VisOption: Option General,OnHand,SaleOrd,PurchOrd,Transac,Picture,Stats)
    begin
        ResetTabs();
        case VisOption of
            VisOption::General:
                GeneralStyleText := 'StrongAccent';
            VisOption::OnHand:
                OnHandStyleText := 'StrongAccent';
            VisOption::PurchOrd:
                PurchOrdStyleText := 'StrongAccent';
            VisOption::SaleOrd:
                SaleOrdStyleText := 'StrongAccent';
            VisOption::Stats:
                StatsStyleText := 'StrongAccent';
            VisOption::Transac:
                TransacStyleText := 'StrongAccent';
            VisOption::Website:
                WebsiteStyleText := 'StrongAccent';
        end;
    end;

    local procedure ResetTabs();
    begin
        GeneralStyleText := 'Standard';
        OnHandStyleText := 'Standard';
        SaleOrdStyleText := 'Standard';
        PurchOrdStyleText := 'Standard';
        TransacStyleText := 'Standard';
        WebsiteStyleText := 'Standard';
        StatsStyleText := 'Standard';
    end;


    var
        General: Label 'General';
        OnHand: Label 'On Hand';
        PurchOrd: Label 'Purchase Orders';
        SaleOrd: Label 'Sales Orders';
        Transac: Label 'Transactions';
        Website: Label 'Picture';
        Stats: Label 'Statistics';
        [InDataSet]
        VisOption: Option General,OnHand,SaleOrd,PurchOrd,Transac,Picture,Stats;
        [InDataSet]
        GeneralStyleText: Text;
        [InDataSet]
        OnHandStyleText: Text;
        [InDataSet]
        PurchOrdStyleText: Text;
        [InDataSet]
        SaleOrdStyleText: Text;
        [InDataSet]
        TransacStyleText: Text;
        [InDataSet]
        WebsiteStyleText: Text;
        [InDataSet]
        StatsStyleText: Text;
}

Business Central select document approver 🔍👤✅

This post came about due to the below question on the Dynamics Community forum:

Here is how I would solve this one with Power Automate. Initially use one of the templates available on power automate for creating document approvals. This gives you the perfect head start. In BC itself you need to decide on using one of two standard fields. On sales and purchase documents you have either salesperson/purchaser code or assigned user ID. The latter is my favoured option. The list is maintained in a better way and you don’t have to worry about character limits. Is it a problem that a user could choose themselves? Read on to find out 😏

Both of the mentioned fields are already in the data available from Power Automate – not ruling out a custom field but not including it in my post.

Four things to do in Power Automate once you have the template pointing to the correct database and company etc.:

The “assigned to” property of the approvals action needs a valid email address for a user in the directory. Simply hard code your internal email domain after the user value. Purchaser/salesperson codes might not follow a username-type logic, forcing you down the assigned user ID route.
Three alterations remain from the template. I’ve moved the approval action into the “if yes” branch of condition 2. This is where you check a self-approval isn’t being done – the approver email must not match the sender email. It makes sense to add a rejection action in the “if no” part of the condition. Repeat this for condition 3 of the template – the difference being that that one is the main rejection step.

There is now a dependency on the assigned user ID field being populated. Arguably a new condition should be added so you can reject and email/Teams message the sender and have them populate the field. I’ve opted against that and adjusted the workflow record in BC instead. This will stop the approval request from triggering, so well-trained users will spot what is missing.

Go to the workflow record and access the “On condition” of the top row in the workflow steps.
I’ve added a filter for my chosen field of “assigned user ID”

Business Central Customer Approval with Power Automate 🏢✅

A recent Microsoft Dynamics Community forum post has prompted this blog:

In this post I will give an answer but also review briefly why I chose Power Automate over regular BC workflow.

BC workflow has a template for credit limit changes. That was my initial suggestion, but it’s the requirement around blocking the customer which stopped me. Yes, BC workflow has record restrictions, but they only stop postings. The blocked field on the customer stops document creation like orders or invoices. BC workflow doesn’t allow for other fields to be updated – it only allows for the changed one to be reverted back and altered on approval. The Power Automate route, however, will allow you to make other alterations, or other actions, with greater ease. I’ve used a different field, but here is how I’ve tackled this.

In Power Automate use the template for customer approval. This will add 80% of what is needed for the solution. Fill in any of the areas that are blank, like which database and company etc. Upon saving the flow it will then pass a workflow record to BC. You will find it in the regular BC workflows list:

The “When Event” will of course be about an approval being triggered:

That will be one of the things to alter along with a specific data condition. All to be done here in BC. The rest is for Power Automate to handle 🤘🏼
Change the when event to “A customer record is changed”. Then drill down on the “On Condition” to set up what you need for your circumstance.
In my case I only want to handle records which meet my filter conditions and where the “Gen. Bus. Posting Group” is changed. I wouldn’t go without filters, or you risk firing off approvals at daft times like when manually creating a record – step-by-step field population counts as a modification.

That’s the BC setup done ☑️ just the Power Automate parts now 😊

Two additions to the original template are required, and they are nearly the same. Choose the “update record” action from the Power Automate BC connector. Use the Id from the “Get Record” action above it. Choose the customer table and add the word All to the blocked field.
Within the “Yes” condition, for when the request is approved, add the same action. This time the Blocked field needs to have a single space. This will revert it back to blank – which is an unblocked state.

Business Central Multi Company Approvals ✅❌🔀

I have some things to state upfront, as there are justifications for why the solution is shaped in a certain way. The premise of the post is to solve the fact that a user who works across many companies might have many approvals. The standard UX is a big driver for why I’ve chosen the topic. If using out-of-the-box BC approvals you can get emails. However, they don’t have any action buttons.

So you’re forced to login. You’d have to do that for all companies you have approvals for. If you’re using Power Automate approvals you do have action buttons in emails. You do have to read every email though to access them.

Is there a better way?! 🤔

I figured the “newish” company hub app would be a good starting point. Have you seen it? A user can view all the user tasks they have in each company they have access to. Sounds close to what I am thinking 🧐. However, I found the table for this isn’t extendable. The method for how it gathers user tasks seemed fairly involved. I liked the idea of shoehorning it in there but felt that an easier route existed. Access to the detail from the role centre is certainly the way to go though. But access elsewhere is fine too.

Company Hub view users can access

My answer is to use a PowerApp embedded in a Power BI report. Handles plenty of the circumstances required. Here is a version of the app running through some basic functions it could offer to a user:

I have added a few other considerations at the end of the post. I will cover off the main steps I took to create the above experience – **which covers approvals controlled by BC Workflows only**. A full explanation of how the entire thing was built would be best done by video. In the absence of that I will adopt a high-level approach for readability, calling out key components.

Components required and where I got some of my details. The order of them is important:

Code for the new api page can be copied from here: https://github.com/JAng13sea/Blogs/blob/master/Multi%20Company%20Approvals/Pag57250.MultiCompApprovalActions.al

Why do we need the custom actions? Doesn’t the BC connector for Power Automate already have those? Yes, they do. However, they are for approvals done via Power Automate only. I am yet to figure out a company based difference in the Dataverse table to bring this type of experience solely for approvals done via Power Automate. **Watch this space**

Use the new API page and the companies endpoint as queries in your PBI report file. Follow Steven Renders’ blog to ensure the approvalActions query covers all companies.
Create a relationship between the two tables with the “Company Name” value being the link
So that a user only sees their approvals add in security so the approverEmail must match up with the user that is logged in and using the report.

Now the Power Apps visual can be added to the report. Use this guide if it is a new experience: https://docs.microsoft.com/en-us/power-apps/maker/canvas-apps/powerapps-custom-visual

A copy of my PowerApp and the power automate flow used can be found here: https://github.com/JAng13sea/Blogs/blob/master/Multi%20Company%20Approvals/MultiCompanyApprovalsApp_20220525155716.zip

One of the big takeaways from the app design is that the user works with a collection instead of the direct data from the PBI report. When the app starts the initial collection is made and that is displayed to the user.
The other crucial components are the action buttons for approving, rejecting and substituting. Each button is the same and only the “Type” value is altered on each. The idea being that a new collection containing only the entries we need to action is taken and sent to Power Automate. The original collection, which is what the user sees, is then updated. This is because the PBI report doesn’t get refreshed constantly – it will be on a defined schedule:

Now for more detail on the Power Automate flow. In a standard scenario the connector for BC is not dynamic to a company. In this scenario it has to be. It is generally possible to pass custom text to connectors in Power Automate, and that text can be based on variables you set up – the same as setting a header on an HTTP call.

Conclusion: This method certainly falls into the low-code bracket, but I also think it has caveats which might not work if you have to scale it out to many users. Users do often set aside a time in their working day or week to do approvals, so having a one-stop-shop approach is a bonus. The refreshing of the data is a drawback because it is held together and formed by the PBI report. On the other hand, it is such a small and discreet thing a user has to do, so living with some level of drawbacks might suffice because the standard experience is far worse IMO. The time and effort to approach this in AL is probably worth it in the long run, and it’s an obvious thing to have in the “Company Hub” app. The nicest takeaway is the ability to make the BC connector in Power Automate company dynamic. We’ll have to see what other scenarios crop up where that could be used 😏

I think this should be in the Company Hub app and I have logged it as an idea which will need votes: https://experience.dynamics.com/ideas/idea/?ideaid=658e8d1c-e0dc-ec11-bea1-0003ff45e08b

Business Central automated pricing group update 💷

The aim of this blog is to discuss a concept built around a specific requirement customers/prospects have been mentioning. Annually some businesses will review the level of sales with a customer to determine what band/bracket/tier/group they fall into for things like pricing, discounts or payment terms. I will use this post to tackle the idea of pricing groups i.e. spend X in a year and the customer falls into price group A, B or C. This means the customer could go up or down on those pricing groups based on their annual spend.

The chosen tool for performing this is ultimately Power Automate but some components are needed in BC. I have taken some inspiration for the final solution from @MaryThompson_WM with this video: https://youtu.be/THZeSnH-agc and @eHougaard with the Simple Object Designer: https://www.hougaard.com/product/simple-object-designer/ With both in play I was able to meet the usual objective of keeping this low code 😁

My target field in this scenario is the “Customer Price Group”. It is a field which is not available in the standard customer api page – hence why I needed the Simple Object Designer to expose it for updating. The app allows you to create custom API pages for nearly all tables in the database. It is a guided wizard type experience, making it easy for many types of users to pick up and get out what they need.
On the customer record there is a field called “Sales (LCY)” which shows the sales for a defined period of time. Here I have a “Date Filter” applied to manipulate the value. This is something you can expose using the standard API page for customerSales. Note it only returns values if the filter criteria are met – i.e. it doesn’t work like the customer API page, which returns many results; it’s more targeted. It is this value which will tell me if the customer is eligible to move into a different price group or not. Here is an example of the endpoint with the filter and the output:
{{BCurl}}/api/v2.0/companies({{companyId}})/customerSales?$filter=dateFilter_FilterOnly ge 2017-12-31T00:00:00.0000000Z and dateFilter_FilterOnly le 2021-12-16T00:00:00.0000000Z

{
  "customerId": "8a11a07d-09cd-4080-b973-77d99d5862b9",
  "customerNumber": "10000",
  "name": "Adatum Corporation",
  "totalSalesAmount": 50922.9,
  "dateFilter_FilterOnly": null
}
To remove the need to do any modifications in BC I have used an Excel sheet with an Excel table to store the data I want to be assessed. The codes here match the values in BC and are manually maintained.
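Purely as an illustration (the values below are made up), the Excel table holds something along these lines – a price group code plus the minimum and maximum annual sales that qualify for it, which is what the flow compares the “Sales (LCY)” value against:

Customer Price Group,Min Amount,Max Amount
C,0,24999.99
B,25000,99999.99
A,100000,9999999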

The Power Automate flow now has all of the required data components to work. Merely a case of constructing it. Check the image gallery for each step. Notes have been assigned on the flow action if pertinent.

This is the if statement in the "Set AmtBoolean" flow action:

if(and(lessOrEquals(outputs('Excel_Min_Amount_to_Float'), outputs('Total_Customer_Sales_for_Year')), greaterOrEquals(outputs('Excel_Max_Amount_to_Float'), outputs('Total_Customer_Sales_for_Year'))), true, false)

I’ve exported the flow and added it here if you don’t want to follow the steps I’ve laid out: https://github.com/JAng13sea/Blogs/blob/master/BusinessCentral-UpdateCustomerPricingGroup_20220422201332.zip

Business Central Data Dictionary 📖

I’ve been asked about this topic a few times recently, by colleagues and customers – where do I get a Business Central data dictionary or database schema from? This is a very brief post on how I tackled this. My default answer has typically been Jet Reports because the sample reports come with one. You don’t need Jet Reports to have access to it. Just go to the website and download it – https://insightsoftware.com/sample-reports/data-dictionary/. It is saved in such a way that it is just a plain Excel file, and looks much like this:

This could fulfill your needs perfectly well, however you can’t refresh it unless you have Jet Reports. What if you have custom tables or tables from AppSource extensions? Is there another free method to get hold of the database tables and fields? For this we will use a Power BI report. The big caveat is that we will need to customise the system ever so slightly. There are two ways you can go here: either create API pages (best practice) or create standard list page objects. I did both and oddly had much faster results from the non-API pages – weird 😖. To the extent that I built my below sample from the standard list pages:

Something along these lines could be the final result. The PBIX I have created is also available to try out if you don’t want to build the pages or use the Jet Report suggestion.
I have taken complete inspiration from the Jet Report example and created pages in the report based on theirs.
This is just a small subset of what could be generated from these two system tables.
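To show what I mean by the standard list page route, here is a rough sketch of a page over the Field virtual table (a companion page over a table-list system table such as “Table Metadata” gives the other half). Object names are illustrative; once published as a web service, Power BI can read it over OData:

page 50104 "Data Dictionary Fields"
{
    PageType = List;
    SourceTable = "Field";
    Editable = false;
    UsageCategory = Lists;

    layout
    {
        area(Content)
        {
            repeater(Fields)
            {
                field(TableNo; Rec.TableNo) { ApplicationArea = All; }
                field(TableName; Rec.TableName) { ApplicationArea = All; }
                field(FieldNo; Rec."No.") { ApplicationArea = All; }
                field(FieldName; Rec.FieldName) { ApplicationArea = All; }
                field(Type; Rec.Type) { ApplicationArea = All; }
                field(Length; Rec.Len) { ApplicationArea = All; }
            }
        }
    }
}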

Code for the two pages (in both styles) and the PBIX I took the screen grabs from are available on GitHub here: https://github.com/JAng13sea/Blogs/tree/master/Data%20Dictionary

How is the Power BI option free? The entry level account is free: https://docs.microsoft.com/en-gb/dynamics365/business-central/admin-powerbi-setup and that is all I had to build the report.

Send Attachments/Documents to Business Central with Power Automate 📎

“We have APIs now – no need for physical copies!”

A while back I did a post on the attachments API for BC: https://joshanglesea.wordpress.com/2021/01/25/business-central-attachmentsapi/

I only talked about using it with Postman or another REST client of your choosing. What about using it with Power Automate instead? I will caveat the fact I hit a tonne of errors trying to get this to work with the latest API pages at the time of writing, v2.0. But we are all about the working version in this post!

Let me set the scene on how I have approached this topic. I want to take an email with an attachment – which is not an image – and have it create a purchase invoice for the vendor emailing me. The purchase invoice will have a record of the attached document from the email. I will also store the attachment in OneDrive – it could be deleted at the end if you wish.

My starting point is with this trigger. A shared mailbox is possible too.

I’ll break this post into chunks because the flow has a few steps to it. Challenge 1 – the vendor to create the invoice for. In my case I wanted to have a level of dynamism in how the document in BC knows what vendor/customer to use. In your case you might not find this possible and have to revert to a placeholder record. I’ve opted to take the first 3 characters of the email domain – for example if the e-mail address is Josh.Anglesea@somecompany.com the filter I will use is “Som” – yes, we convert the 1st character to uppercase too. A fair few actions for this first challenge, so check the image gallery and the embedded code.

I found this post helpful to solve this challenge: https://powerusers.microsoft.com/t5/Building-Flows/extract-s-string-between-two-characters/td-p/80830

//Index_of_@ 
add(int(indexOf(outputs('Email_From'),'@')),1)

//Index_of_.co
int(indexOf(outputs('Email_From'), '.co'))

//Vendor_Name_position
sub(outputs('Index_of_.co'),outputs('Index_of_@'))

//Vendor_Name_Filter_Value
substring(outputs('Email_From'), outputs('Index_of_@'),outputs('Vendor_Name_position'))

//Filter_in_Proper_Case
concat(toUpper(substring(outputs('Vendor_Name_Filter_Value'),0,1)),substring(outputs('Vendor_Name_Filter_Value'),1,2))

Challenge 2 – avoiding pesky extra attachments on the e-mail. Because we are pulling this directly from an e-mail you can get all the images from the e-mail signature too 🙄 which means we will need to loop through until we get the typical file type we’re expecting. In my case I’ve said anything which is not a .png – change yours as you need to.

To be able to assess what the file type is we need to save it somewhere first. OneDrive was nice and easy here and the “Create File” action passes back the full file name. Simply a case of adding in a condition to check the property. You could have multiples here. You will then get two lanes – what happens if the file does not end with “.png” and what happens if the file does end with “.png”

Challenge 3 – get the file content from OneDrive and add it to BC. Yes this is the main part of the blog but it’s a challenge mainly because the BC connector for Power Automate does not play ball ⚾ Instead I have used the HTTP connector. You will need to use this post to help with the authentication part of using the HTTP connector: https://joshanglesea.wordpress.com/2022/03/01/business-central-oauth2-for-power-automate-%f0%9f%9b%82/

I’ll use a gallery of images with comments to explain each of the subsequent steps

//Expression to get the first vendor found in BC with the earlier created filter:
outputs('Find_Vendor_to_Invoice')?['body/value']?[0]?['number']

//Parse JSON Schema for the POST action response
{
    "type": "object",
    "properties": {
        "@@odata.context": {
            "type": "string"
        },
        "@@odata.etag": {
            "type": "string"
        },
        "parentId": {
            "type": "string"
        },
        "id": {
            "type": "string"
        },
        "fileName": {
            "type": "string"
        },
        "byteSize": {
            "type": "integer"
        },
        "content@odata.mediaEditLink": {
            "type": "string"
        },
        "content@odata.mediaReadLink": {
            "type": "string"
        },
        "lastModifiedDateTime": {
            "type": "string"
        }
    }
}
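For orientation, the two HTTP calls at the heart of challenge 3 look roughly like the sketch below – the exact URL segments (including the API version) depend on what worked in your environment, and the parentId comes from the purchase invoice created earlier in the flow:

// 1. Create the attachment record against the purchase invoice
POST {{BCurl}}/api/<version>/companies({{companyId}})/attachments
Content-Type: application/json

{
  "parentId": "<id of the purchase invoice created earlier>",
  "fileName": "<file name returned by the OneDrive Create File action>"
}

// 2. Upload the file bytes to the link returned in the response above
PATCH <value of content@odata.mediaEditLink from the Parse JSON step>
Content-Type: application/octet-stream
If-Match: *

<file content from the OneDrive Get file content action>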
I decided to wrap things up by deleting the file from OneDrive now that it will exist in BC too. Are you reading this and thinking “Hey! I got the standard BC connector to work for this!”. Please get in touch if so, or get the steps shared!👍

Business Central CSV import with Power Automate 📥

There are plenty of things I try for blog posts that don’t make the cut. This was almost one of them. I’ll spare you the details on why exactly (until the end) so that we can keep focused on the good parts. My inspiration for this post came from watching this video: https://www.youtube.com/watch?v=2TIolfmtWQM, where an Excel script and a neat trick in Power Automate allow for a CSV file to be converted into an Excel table. The approach itself is very elegant and I’ve used the majority of it to my benefit. I recommend watching the video as I’ll skip over parts covered by it. The Excel script feature is truly awesome 😎

Let’s lay out a scenario where a supplier/vendor will send a CSV file with new items, with some basic details, so you can have them created for you in BC. My super simple CSV for this example:

Barcode,Name,UoM,Price,RRP
4545FB9797,Football,PCS,2.75,8
4545BB1324,Basketball,PCS,3.26,10.5
4545TB1344,Tennis Ball,Pack,0.95,3
4545SB9976,Squash Ball,Pack,0.78,3
4545RB4457,Rugby Ball,PCS,3.25,10.99

As mentioned I’m skipping some details, due to the recommended video, but once you have reached the Excel script stage you will want to add a few lines to cater for an extra row which is created and a caption change to one header value. Here is my script:

function main(workbook: ExcelScript.Workbook, lineCSV: string[]) {
  let selectedSheet = workbook.getActiveWorksheet();
  const alllines = lineCSV;
  let counter = 1;
  for (let line of alllines) {
    if (line.includes(",")) {
      let items = line.split(",");
      selectedSheet.getRange("A" + counter + ":E" + counter).setValues([[items[0], items[1], items[2], items[3], items[4]]]);
      counter++;
    }
  }
 
  workbook.addTable(selectedSheet.getRange("A1:E" + counter), true).setName("VendorItems");

  // Delete the last blank line
  selectedSheet.getRange(counter + ":" + counter).delete(ExcelScript.DeleteShiftDirection.up);
  // Retype the header, otherwise the value is RRP\r
  selectedSheet.getRange("E1").setValue("RRP");
}

From this point onward the video details are complete and it is time to think about extracting the data from the newly created Excel file to BC.

If you check over image 3 of the gallery you will notice that the “unitCost” and “unitPrice” values make use of an expression. This is to convert the data to the correct type. With it being CSV everything is text. This is an example of the expression to convert to decimals:

float(items('Apply_to_each_2')['RRP'])

All seems too simple, right? Well, yes, in hindsight it sort of is. However, I will share the thing that tripped me up. The suggestion from the video is to use a compose action which holds a carriage return. The end result of using that idea (with my data) looks like this:

  "Barcode,Name,UoM,Price,RRP\r",
  "UVH72WFZ5EO1,facilisis,PCS,$25.33,55.93\r",
  "SVP51SRG4WY1,sodales,PCS,$85.55,50.16\r",
  "2RBS17JLI4CT,at,PCS,$71.72,29.32\r",
  "3YNT44PPY7LW,metus. In,PCS,$9.71,65.62\r",
  "4OGL41FFB6NB,fermentum convallis,PCS,$94.99,30.4\r",
  ""

Take note of the carriage returns, which are the “\r” parts at the end of each row. Even after the data is converted into an Excel table those carriage returns remain in place. Remedy number 1 is the extra line added to the script to rename the last column. Remedy number 2 is then the float() expression. I of course did not stumble across remedy number 2 that quickly, and tried some useless string manipulation 🤐. The performance of the import isn’t too bad. I tried a batch of 5000 records and it took 4.5 minutes to process. Bearing in mind it is an automated process, those aren’t 4.5 minutes I lost in the space-time continuum ⏳

The final thing I would do with this scenario is to move the file you have imported to an archive folder. I think for my scenario to work better you would need a custom api page. I think the catalogue item table is probably a better place for the type of data I was working with 🤷

Business Central OAuth2 for Power Automate 🛂

Generally when writing a post I quickly explain the scenario and get straight to the solution. I need to break that rule here to avoid confusion. As the title suggests, this post is about using OAuth2 with Power Automate – it also applies to Azure Logic Apps. However, if you have Power Automate flows using the BC connector you are already covered. If you have custom connectors for BC data which use OAuth2 for authentication, you are also covered. So what’s the point of this blog, you ask? If you use the HTTP connector in your flows to interact with BC data – you have opened the right blog post 👍🏼. Why does this post matter? Well, at the date of writing, 01/03/22, basic authentication for BC web services is being deprecated (https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/upgrade/deprecated-features-platform). OAuth2 is the replacement. If you can move away from using the HTTP connector then please do – that way you can just natively pick up the correct authentication. I’d say there are two main reasons you wouldn’t move things to the BC connector: you have published a codeunit as a web service, or you use query objects (which aren’t the API type). Lesser reasons might be that you don’t have development resource or want to avoid the cost of refactoring. After all, Power BI gets to use legacy web services, so Power Automate should get to as well, right? For Power Apps you need to use a custom connector or the BC connector. You might have Power Automate flows which run from Power Apps – in which case, if they use the HTTP connector, the same explanation as above applies. Woah! 😮 that was a wordy intro. If you’re still with me let’s get to the good bit…

I will admit early on that I only came across my answer thanks to some brilliant blogs! I highly recommend reviewing Arend-Jan’s posts in particular if you have little background on working with/setting up OAuth2 for BC. Links:

The solution is to have a standalone flow which will retrieve the access token. You will use this flow as a child to your main PA flow. This pattern works well as you don’t have to alter your existing PA flows using the HTTP connector that much. Let’s get the child flow out of the way first as that’s the longer part.

Here is the text string from the compose action mentioned in the gallery of images. Replace the your_client_ID and your_client_secret parts to form the correct string:

concat('grant_type=',encodeUriComponent('client_credentials'),'&scope=',encodeUriComponent('https://api.businesscentral.dynamics.com/.default'),'&client_id=',encodeUriComponent('your_client_ID'),'&client_secret=',encodeUriComponent('your_client_secret'))
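That string becomes the body of an HTTP POST to the Azure AD token endpoint – roughly like this, with your own tenant ID substituted in; the access_token property of the response is what the child flow hands back to the parent:

POST https://login.microsoftonline.com/<your_tenant_ID>/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

<output of the compose action above>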

Now we have the child flow up and running let’s plumb it in to a parent flow 🧑‍🔧 I will go through each step with images again.

It’s a fairly elegant way to slot in what is needed for the PA flows this might apply to. There are a number of my own previous blog posts which need this solution – like this one: https://joshanglesea.wordpress.com/2020/10/19/business-central-month-end/

To make life easy I’ve created a template of the child flow which you could import into your environment. There are setups that need dealing with before it will work though: https://github.com/JAng13sea/Blogs/tree/master/Template%20Get%20Access%20Token%20PA%20Flow

Business Central Service Management Email Queue 📨

The service management module of Business Central has an emailing feature which is designed to inform internal employees if service level agreements (SLAs) are being missed. It also has a feature to update a customer on completion of a repair for a specific service item. The standard documentation on this area isn’t that great. Therefore I will explain in brief more about the feature and the required setups to make the most of it. At the time of writing this area is not part of the enhanced email capabilities and has little scope for configuration – an area I hope improves in the future.

Part 1 is the Internal warning emails:

In the service management setup page you can determine the duration of time (in hours) that a series of warnings should fire off to internal staff members. In the “Send <> Warning To” fields just add the email address. It could be the same person each time or it could be a group email, the choice is yours.

Now that’s in place how is it being calculated that something has hit a “warning”? Over on a service contract we start to piece the puzzle together:

Each service item line can have a differing “Response Time (Hours)”. Note you can set this at header level too – in the “Service” FastTab. In my case the time to respond is seconds – I like to set the bar high 😂 It’s worth noting here that a service item must be part of a service contract for the SLA to count.

Once a service order is logged we have a date and time for that taking place – “Order Date”, “Order Time”.

This data, coupled with the “Response Time (Hours)” value on the order (which just so happens to come from the service contract), gives us the expected SLA. For example, an order logged at 09:00 with a two hour response time is expected to be responded to by 11:00 the same day.

You can see the detail on the line level of the service order. Note it is the calculated value. In an earlier screenshot there is a field called “Warning Status” – this is where we can see which warning message has been sent. In my case it will jump straight to the third warning. Of course in a real life scenario it will step through as the clock ticks. Worth noting that each message will only ever happen once. So, if a user manually resets the “Warning Status” but a message about it has already been sent then no new message will appear.

**Emails will only send if the Service Order has a status of Pending and the service order line has a “Repair Status Code” which is marked with “Initial” true.** The out of the box setup here is worth sticking to if you can, but if you have your own repair status codes then this is the setup I am referring to:

In this case the service order line must be INITIAL for a warning email to send, which effectively means no one has started working on it yet. The interesting part about this is when you have a multi line service order and you start work on it, the overall “Status” of the order changes. Again I will state only “Pending” service orders get warnings. This is a good use case for the setup option “One Service Item Line/Order” – which would then give you greater control over getting warnings for all service items 👍

So, what do we need for the emails to start flying out? Two codeunits added to the job queue entries page. Set them up to run as you need.

An example of the data in the service email queue:

The thing we can’t do is configure the “Subject Line” – I would like that to change in future releases. If we need different detail then a customisation is needed here 😐 The table has a “Body Line” field which then produces the below email:

Part 2 – Customer notifications

A customer can be kept informed of the order with email messages when the codeunits from part 1 are in place. On the header of the service order there is a field called “Notify Customer” with a series of set options. Unfortunately, as standard the field is blank when the header is created. You would expect it to be linked to a customer record or even set at service order type level, but no… This means we have to set this manually per order, without customisation. You will also need the “Email” field to be populated, as standard it is blank too 👀 – you can see what I mean about wanting this area to catch up with the enhanced email features:

To trigger an email to the chosen address you must set the “Repair Status Code” to a value which sets the overall service order “Status” to Finished.

Upon doing that you will get this kind of message:

Unlike the warning messages this one will fire off multiple times if you chop and change the repair status code:

The final email looks like this. Again, there are currently no configuration options for changing the email body…

There are some limitations in the standard offering, but it could be a nice thing to use if you have this module. The details mentioned are also applicable to older versions of NAV which have the same module.

Business Central Inventory/Sales Analysis – Item Group Dimension Code❓

I’ll be the first to admit that when I can’t locate a report in BC I will knock something up in Jet Reports or Power BI. Nothing wrong with that approach, as you often end up with something that has nicer visuals and better aggregation. However, when possible shouldn’t we push the boundaries with standard? When it comes to inventory or sales analysis what can we do? Finance have account schedules. Well, if you weren’t aware, inventory/sales analysis has a similar set of options. Search for “Sales Analysis Report” or “Inventory Analysis Report”. There are some good blogs explaining the necessary setups for these – just search for “business central sales analysis report” to find them. Just think of account schedules but for item ledger entries/item budget entries. I’m skipping the basics in this post and attacking a specific scenario.

The part I want to focus on is shown below:

The “Item Group Dimension Code” is part of the inventory setup page. What does it do? Why should I use it? How can I maintain the data in an easier way? All relevant questions that I will try to address. My aim in this blog is to tackle the usage of this field and the relating data.

The first point where this could be used is when building a set of sales analysis line templates:

A user can create a view where item groups are aggregated to give a top level summary of performance or other metrics of their choosing.

If I choose the “Insert Item Groups” option I get the following error message:

This ties back to the earlier screen grab around the “Inventory Setup” field. But that is just a regular dimension code. How do I make it so I can tie that dimension back to data users recognise on the item? Surely it should be related to item categories or a different field? Maintaining both item categories and a standalone dimension is an admin headache if not tackled correctly. Or do I create something which is truly standalone and then expect a user to fill that detail in on an item record too? Such a predicament 🤔

In my case I have opted for the Item Category as the field of choice to flesh out what my dedicated “Item Group” dimension will have. I haven’t included it in my solution, but I chose this field because it has a parent/child aspect which lends itself well to dimension values. The solution for any new items is to use Power Automate to create a default dimension for the item record when it is created. Any existing item records could be resolved with a variation on this or by using a more standard method like a configuration package. Check out this blog post to see how I solved this: https://joshanglesea.wordpress.com/?p=714

This nicely leads on to a further predicament. If you intend to use this feature then you need to have the dimension set up in advance of any postings. The current “Dimension Correction” feature of BC does not cover item ledger entries – so all prior postings are out of scope unless corrective action, in another form, is taken. My suggestion would be to start it in a new sales year to avoid conflicts. The reason being that you can plug a sales/inventory budget into the data – a very similar theme to account schedules with a budget vs. actual comparison.

This is exactly what I did. I created a new sales budget and then built it against the new “Item Group” dimension, determining an amount of sales per grouping.

Generate a sales analysis view and use the “Item Group” dimension

If you intend to use the budget area ensure you “Include Budgets” so that data pulls through as well. Note I have left the “Update on Posting” off as it can be a system overhead. Common practice is to have that running as a job queue entry.

Now that the “Item Group” dimension is set up it is possible to use the earlier page action we got an error on to flood the template with data:

Use the “Sales Analysis Report” to then throw together each of the elements:

Which produces a visual like below. Again, perform a wider search on the inner workings of the reporting capabilities. My aim was purely to show how to get the “Item Group” aspect working:

It is possible to apply some of the logic you’ve read here to the fields shown below. If we track back to the Sales Analysis Lines setup you can set up reporting based on the dimension values within:

*Customer Group Dimension Code and Salesperson Dimension Code

Business Central – Automatically add default dimensions with Power Automate ☑️

A scenario came up in a different blog post where I needed this solution: https://joshanglesea.wordpress.com/?p=681

What was the aim of the scenario? I required a default dimension to be added to an item record based on a different field value. I’d made the decision to use the Item Category as a dimension so I could use it for inventory/sales reporting. This solution could be applied to other scenarios as you see fit.

  • The starting point for my scenario was to use the creation of an item record as the trigger.
  • A variable to store the ID value for the dimension in the company you intend to use.
  • A delay, because you are using a “creation” trigger, so you need to ensure, as best you can, that the user has keyed in the field you will use – in my case it is the Item Category Code.
  • Get the detail of the record which has been created.
  • The Find Records action is then used to check whether the Item Category Code exists as a dimension value code for the dimension being used – a rough example of the filter follows this list.
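As a rough illustration of that last bullet, the Find Records action can take an OData style filter against the dimension value records. The shape below is an assumption on my part – check the field names your version of the connector exposes – with the literal replaced by the Item Category Code dynamic content from the earlier get step:

code eq '<Item Category Code from the get record step>'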

Check out the below gallery for a view on each step of the flow I took. I have added a note on each step for explanation purposes.

The interesting takeaway from using the “Find Records” action of the BC connector was that you need to handle the possibility of it throwing back an error. This is where the “scope” facility of Power Automate is supremely useful 😁

An example of the result from using this flow. 👍 no manual input required. Plenty of other use cases for this for sure.

Download the flow if you like: https://github.com/JAng13sea/Blogs/blob/master/CreateaDefaultDimension-ItemGroupCode.zip

Business Central Multi Journal Line Dimension Update 🔃

In Business Central we have a few areas where dimension updates take place for many records at a time. For instance, when assigning default dimensions there is a method to apply the same values to many records. Or when working on a document, like a purchase order, you can set the dimensions on the header and BC asks to pass them to the lines. With a journal line though it is very much an individual way of working and a user relies heavily on default dimension values being inherited from related records. Granted, edit-in-Excel can be seen as the remedy here as you can make updates to the shortcut dimension columns en masse. What about when it’s not a shortcut dimension? Or the user doesn’t have access to edit-in-Excel (no O365 or the add-in is blocked)?

As the gif loops through, notice that all dimensions can be chosen regardless of whether they are on the page or not (shortcut dimensions). If the line already has previous dimension values it keeps them and the new ones are added.
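For anyone curious about the mechanics, here is a simplified sketch of one way this could work in AL, leaning on the standard DimensionManagement codeunit. The names are mine and this is not the exact code – the working objects are in the repo linked at the end of the post:

procedure ApplyDimensionsToSelectedLines(var GenJnlLine: Record "Gen. Journal Line"; var SelectedDimSetEntry: Record "Dimension Set Entry" temporary)
var
    DimMgt: Codeunit DimensionManagement;
    TempDimSetEntry: Record "Dimension Set Entry" temporary;
begin
    if GenJnlLine.FindSet(true) then
        repeat
            // Start from the dimensions already on the line so existing values are kept
            TempDimSetEntry.Reset();
            TempDimSetEntry.DeleteAll();
            DimMgt.GetDimensionSet(TempDimSetEntry, GenJnlLine."Dimension Set ID");

            // Add the values picked on the selection page, overwriting any matching codes
            if SelectedDimSetEntry.FindSet() then
                repeat
                    TempDimSetEntry.SetRange("Dimension Code", SelectedDimSetEntry."Dimension Code");
                    if TempDimSetEntry.FindFirst() then begin
                        TempDimSetEntry.Validate("Dimension Value Code", SelectedDimSetEntry."Dimension Value Code");
                        TempDimSetEntry.Modify();
                    end else begin
                        TempDimSetEntry.Init();
                        TempDimSetEntry."Dimension Set ID" := GenJnlLine."Dimension Set ID";
                        TempDimSetEntry.Validate("Dimension Code", SelectedDimSetEntry."Dimension Code");
                        TempDimSetEntry.Validate("Dimension Value Code", SelectedDimSetEntry."Dimension Value Code");
                        TempDimSetEntry.Insert();
                    end;
                until SelectedDimSetEntry.Next() = 0;
            TempDimSetEntry.SetRange("Dimension Code");

            // Store the merged set back on the line and refresh the shortcut dimension columns
            GenJnlLine."Dimension Set ID" := DimMgt.GetDimensionSetID(TempDimSetEntry);
            DimMgt.UpdateGlobalDimFromDimSetID(
                GenJnlLine."Dimension Set ID",
                GenJnlLine."Shortcut Dimension 1 Code",
                GenJnlLine."Shortcut Dimension 2 Code");
            GenJnlLine.Modify(true);
        until GenJnlLine.Next() = 0;
end;

On the journal page the action would typically call CurrPage.SetSelectionFilter(Rec) first so only the highlighted lines are passed in.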

The Selection page is based on the dimension set entry table so it handles the errors for you. For example if a user tries to add the same Dimension Code twice:

For a compare and contrast with edit-in-Excel, the user does not get the dimension captions or a lookup. I’m not attempting to have a versus battle with edit-in-Excel, just stating the idea may have some merit.
Feels like it nicely complements the “Journal Check” functionality by letting you action this for more than one line at a time too.

And what about if you update a default dimension and don’t want to add the Account No. again to validate the data? In my example “DEPARTMENT” has been made Code Mandatory for this G/L Account:

A test with 5000 journal lines proved my approach is faster than edit-in-Excel by some distance. With edit-in-Excel I clocked about 3.5 mins to update just one dimension code, which is completely fair given the data is taken outside of BC and then pushed back. Doing the same with my idea takes considerably less time for that volume of lines:

It is something I logged as an idea a while back but wasn’t sure how to go about creating myself: Microsoft Idea · Add dimension values to multiple journal lines at the same time (dynamics.com)

Give it a try for yourself or vote for the idea (if you like it 🙈) in the hope it could be part of the standard product 👍

https://github.com/JAng13sea/Blogs/tree/master/MultiDimLines

Business Central Team Member Item Data Insert Hack ☠️

Emphasis on the hack part here. Right now this is a possibility so why not share it – will it stay that way? No guarantees. The inspiration is born out of the fact that a team member user license is not able to insert item data (amongst other data such as Customers and Vendors). The “New” button is greyed out for that user type. Now, many businesses like to use a team member license for raising purchase quotes. In the scenario where you are requesting to purchase an item which is not on the system yet, you want to give access but keep control. My idea is to satisfy both sides of that with a simple solution – an HTTP request and no tables.

No ability to add an item or delete an item as a Team member licensed user can be seen above

To resolve this I’ve opted for a mix of AL and Power Automate. The process flow will be like this:

  • Create a page that is not related to a table – so no conflict on entering data into tables
  • Have the page raise an HTTP POST request to a defined endpoint in Power Automate
  • Pick up the data from the endpoint in Power Automate and use it for the resulting business process
  • As Power Automate has plenty of connectors you could have an approval process, Teams message with an adaptive card, details written to SharePoint etc.
The UI design is simple. Have a new page action from the Purchase Quote which opens a NavigatePage page type.
Add the fields you want the user to provide. Note this does use table relations but it does not save the data in a table – it is all just in memory on that instance of the page. I’ve gone for a small number of fields but the final version would cover more details. Dedicated buttons are then available to either halt the process altogether (Cancel) or to progress and send the data on.
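To give a flavour of that send step, here is a minimal sketch of what the action could call – the procedure name, field lengths and flow URL are all placeholders, and the finished version is in the repo linked at the end:

local procedure SendToFlow(ItemDescription: Text[100]; ItemUoM: Code[10]; ItemCategory: Code[20])
var
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    Response: HttpResponseMessage;
    Body: JsonObject;
    BodyText: Text;
    FlowUrl: Text;
begin
    // The HTTP POST URL copied from the Power Automate trigger – store it somewhere sensible (setup table, Isolated Storage, etc.)
    FlowUrl := 'https://prod-xx.westeurope.logic.azure.com/workflows/...';

    // Shape the payload to match the JSON schema defined on the flow trigger
    Body.Add('ItemDescription', ItemDescription);
    Body.Add('ItemUoM', ItemUoM);
    Body.Add('ItemCategory', ItemCategory);
    Body.WriteTo(BodyText);

    Content.WriteFrom(BodyText);
    Content.GetHeaders(Headers);
    Headers.Remove('Content-Type');
    Headers.Add('Content-Type', 'application/json');

    // Needs "Allow HttpClient Requests" enabled for the extension (see the next step)
    if not Client.Post(FlowUrl, Content, Response) then
        Error('Could not reach the Power Automate endpoint.');
    if not Response.IsSuccessStatusCode then
        Error('The request was rejected: %1 %2', Response.HttpStatusCode, Response.ReasonPhrase);
end;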
Enable “Allow HttpClient Requests” in the settings of the extension.
Create a flow in Power Automate and use the above trigger in a manual flow. Set it to expect a POST. In the request body add a JSON object and each of the fields set up to be sent from BC. My full JSON body is below:
{
    "type": "object",
    "properties": {
        "ItemDescription": {
            "type": "string"
        },
        "ItemUoM": {
            "type": "string"
        },
        "ItemCategory": {
            "type": "string"
        }
    }
}
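For illustration, a request sent from BC that matches this schema would look something like this (the values are just examples):

{
    "ItemDescription": "Office chair - mesh back",
    "ItemUoM": "PCS",
    "ItemCategory": "CHAIR"
}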
Add whatever additional steps you need to have in the flow. Eventually you can use the BC connector and the “Create Record” action. Note the dynamic content from the HTTP trigger. The Power Automate flow will need to run as an Essentials or Premium licensed user – it is that user who performs the insert/create.

Here is how the experience can look. Note I just create the item right after requesting it:

Try it for yourself: https://github.com/JAng13sea/Blogs/tree/master/Team%20Member%20Hack