D365 Business Central, Power Automate

Business Central Power Automate Document Approvals – Comments 📝

When Power Automate document approvals are enabled for Business Central we can leverage the Approvals connector or another platform. In either scenario, any comments an approver makes stay between the requester and the approver. In most cases this is fine. However, compare this to the standard OOTB document approvals for BC, where we get the full comment history… so why should we settle for less just because we’ve adopted a different platform for approvals? Plus, the fact that a Power Automate approver doesn’t have to be a BC user means we have no real record of who approved. Couple this with the fact that the user assigned to the Power Automate BC connector shows as the “approver” in BC:

To anyone in BC this looks like a self-approval – but it won’t have been 😬

To cater for most scenarios we need a way to log comments across all the possibilities: Sales, Purchase, Customers, Vendors and Items – all of these areas can have approvals in Power Automate for BC, and all of them have a standard “Comments” table we can use. I’m avoiding the standard approval comments because that page action doesn’t exist when you use Power Automate BC document approvals:

Instead I will use the regular “Comments” area for each piece of data:

There are many ways, as always, to solve this, but I’m opting for the quickest. I did consider creating a codeunit to act as a broker for each comments area, but when you can, it’s great to stick to standard objects and tools! For instance, if you’re not familiar with “Page Inspection”, check it out – I used it to determine the page numbers for all the areas I eventually need to cover: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/devenv-inspecting-pages?tabs=page

It’s simply a case of publishing three pages as web services:
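
If you’re wondering which three, a hedged pointer: in my environment Page Inspection pointed at the standard comment sheets – page 124 “Comment Sheet” (which covers Customer, Vendor and Item via its “Table Name” field), page 66 “Purch. Comment Sheet” and page 67 “Sales Comment Sheet” – but do verify the numbers in your version. Once published, each service gets an ODataV4 endpoint roughly like this (the service name is whatever you type on the Web Services page):

    https://api.businesscentral.dynamics.com/v2.0/<tenantID>/<environment>/ODataV4/Company('<companyName>')/Comment_Sheet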

So what kind of data do we have to send to the service so it attaches to the correct record? I will bundle Customer/Vendor/Item together as the protocol is the same – you just swap out the table name. Sales and purchase we’ll handle separately.
    //Customer, Vendor or Item can use a minimum JSON body of the below. Just swap out the "Table_Name" and "No"
    {
        "Table_Name": "Item",
        "No": "1896-S",
        "Comment": "Test"
    }

    //Purchase Documents – just swap out the "Document_Type" and "No"
    {
        "Document_Type": "Invoice",
        "No": "107209",
        "Document_Line_No": 0,
        "Comment": "Test Comment"
    }

    //Sales Documents – just swap out the "Document_Type" and "No"
    {
        "Document_Type": "Invoice",
        "No": "107209",
        "Document_Line_No": 0,
        "Comment": "Test Comment"
    }
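
To sketch how one of those lands at the published service (the service and company names here are placeholders, and authentication is omitted):

    POST https://api.businesscentral.dynamics.com/v2.0/<tenantID>/<environment>/ODataV4/Company('<companyName>')/Comment_Sheet
    Content-Type: application/json

    {
        "Table_Name": "Item",
        "No": "1896-S",
        "Comment": "Test"
    }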

So, that’s BC organised – what about Power Automate? I have opted for an OOTB BC template, the Purchase Invoice approval, in Power Automate. The posting of comments, for me at least, can go in two possible places. One is right after the “Start an approval” action

Or you could place them after the condition so that you can add more context to the comment. In my case I’ve used a concat() function to add an emoji
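
Something like this, as a hedged sketch – the action name and the way you pull the approver’s comment are placeholders for whatever your flow actually uses:

    concat('✅ Approved – ', body('Start_an_approval')?['responses']?[0]?['comments'])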

Here is the result:

A minor drawback of using the standard tables/pages here is that the comment table(s) may well be editable by the user. Permissions can reasonably control this, as the user who posts the comment is the one assigned to the BC Power Automate connector. Of course, permission sets aren’t the only standard choice – you could use the change log or field monitoring too. If that doesn’t sit well with you then I’d suggest adding a boolean field to control whether the comment record can be altered, and adding it to the JSON body you send once it’s available. Final thing: the “Comment” field in the standard tables only holds 80 characters – but that’s the same as the standard BC approvals comment field – so no essay-long approval comments folks 😜

D365 Business Central, Power Automate

Business Central Power Automate Document Approvals – Cancel ❌

Have you noticed that when using Power Automate for BC document approvals you have the same approval buttons? One for sending and one for cancelling. The send one works fine. Cancel, on the other hand, isn’t doing what we need: the approvers will still have an outstanding approval and the flow will keep waiting. Eventually, when it does resume, it will fail because the request has been cancelled BC side. And if no one approves, and the rest of the flow never runs, you have a flow run hanging around until it expires… These things are OK, I just think they make the administration more difficult.

My goal with this post is to have the standard cancel action to actually cancel my Power Automate approval and my Power Automate flow run. Full housekeeping 😄🏡. Let’s cover some basics first:

The approval entries can be viewed by selecting the “Approvals” button. If using Power Automate, a different page will appear. It shows a single entry each time an approval is sent, and that entry is adjusted when the approver accepts or rejects the request.
Standard BC document approvals would show a line for each approval entry created. With Power Automate we don’t have that detail in BC – this is as much as we get.

With Power Automate you might be using different methods to get the approval dealt with – MS Approvals, Teams, Outlook, or others. I will solve the MS Approvals scenario as it’s the most common.

So what do we need? The first thing is a record of information from Power Automate back in BC. So, for starters, a table extension to the “Workflow Webhook Entry” table – this is where the data from the “Flow Entries” page (above) lives. There are 3 fields, as I want to capture several attributes so I can cancel all the relevant parts.

tableextension 50100 PA_Approvals extends "Workflow Webhook Entry"
{
    fields
    {
        field(50100; "Power Automate Approval ID"; Guid)
        {
            Caption = 'Power Automate Approval ID';
            DataClassification = CustomerContent;
        }
        field(50101; "Power Automate Flow ID"; Guid)
        {
            Caption = 'Power Automate Flow ID';
            DataClassification = CustomerContent;
        }
        field(50102; "Power Automate Run_ID"; Text[100])
        {
            Caption = 'Power Automate Run ID';
            DataClassification = CustomerContent;
        }
    }
}

Alongside our new fields we are going to need an API page so we can make updates to the BC data. I’m just showing the properties as this is the important part; I made only my new fields editable.

page 50100 PA_Approvals
{
    APIGroup = 'jaGroup';
    APIPublisher = 'ja';
    APIVersion = 'v2.0';
    Caption = 'paApprovals';
    DelayedInsert = true;
    EntityName = 'paApproval';
    EntitySetName = 'paApprovals';
    PageType = API;
    SourceTable = "Workflow Webhook Entry";
    ODataKeyFields = "Workflow Step Instance ID";
    ModifyAllowed = true;
    InsertAllowed = false;
    Permissions = tabledata "Workflow Webhook Entry" = RIMD;

    // layout omitted for brevity – it exposes the key field plus the three
    // new fields, with only the new fields set to Editable
}
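
For reference, with those properties the page surfaces at an endpoint of this shape (this follows directly from the APIPublisher/APIGroup/APIVersion/EntitySetName values above):

    https://api.businesscentral.dynamics.com/v2.0/<tenantID>/<environment>/api/ja/jaGroup/v2.0/companies(<companyID>)/paApprovals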

More AL dev follows, but let’s tackle the Power Automate document approval next. Start off with the standard template for whichever document you need this for – I’ve chosen Purchase Orders.

I’ve kept the BC connector steps in to show the start of the flow. The changes and additions come after the BC connector steps; the rest of the flow follows the standard template.

The compose action “Workflow ID” contains the expression workflow()?['name'] so that I can record the GUID of the workflow. The “Workflow Run ID” compose contains the expression workflow()?['run']['name'] so that I can record the text string name of the flow run. The template for a BC document approval comes with a “Start and wait for an approval” action. I’ve removed this and added “Create an approval” so that I can find out and record the ID of the created approval. You then need “Wait for an approval” so the flow has the response. The final change/addition is the HTTP step.

I used this instead of the BC connector – where you can select custom API pages – as I got an error about permissions. I believe this is due to the If-Match * header being required on the request. Add the URL of the API page and reference the “Workflow Step ID” from the BC connector – this filters on that record only. The request then passes a body to the three new fields. Note that each of the values comes from the previously mentioned changes/additions to the document approval template.
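
As a sketch, the HTTP action ends up along these lines – the JSON property names depend on how you declared the fields on the API page, so treat them as placeholders:

    PATCH https://api.businesscentral.dynamics.com/v2.0/<tenantID>/<environment>/api/ja/jaGroup/v2.0/companies(<companyID>)/paApprovals(<Workflow Step ID from the BC trigger>)
    If-Match: *
    Content-Type: application/json

    {
        "powerAutomateApprovalID": "<id from the Create an approval action>",
        "powerAutomateFlowID": "<output of the Workflow ID compose>",
        "powerAutomateRunID": "<output of the Workflow Run ID compose>"
    }
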
The result will be the values of the specific Power Automate Flow back in BC.

So all that remains is the cancellation. For the data in BC to end up back in Power Automate a codeunit is required:

codeunit 50100 PA_CancelApproval
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Workflow Webhook Management", 'OnCancelWorkflow', '', false, false)]
    local procedure CancelPowerAutomateApprovalEntry(WorkflowWebhookEntry: Record "Workflow Webhook Entry")
    begin
        // Nothing to cancel if this entry was never linked to a Power Automate approval
        if IsNullGuid(WorkflowWebhookEntry."Power Automate Approval ID") then
            exit;
        SendToFlow(WorkflowWebhookEntry."Power Automate Approval ID", WorkflowWebhookEntry."Power Automate Flow ID", WorkflowWebhookEntry."Power Automate Run_ID");
    end;

    local procedure SendToFlow(ApprovalID: Guid; FlowID: Guid; RunID: Text)
    var
        client: HttpClient;
        cont: HttpContent;
        header: HttpHeaders;
        response: HttpResponseMessage;
        Jobject: JsonObject;
        tmpString: Text;
    begin
        // Build the JSON body the cancellation flow expects
        Jobject.Add('ApprovalID', ApprovalID);
        Jobject.Add('FlowID', FlowID);
        Jobject.Add('RunID', RunID);
        Jobject.WriteTo(tmpString);
        cont.WriteFrom(tmpString);
        cont.GetHeaders(header);
        header.Remove('Content-Type');
        header.Add('Content-Type', 'application/json');
        // POST to the HTTP trigger URL of the cancellation flow
        client.Post('<Add your URL here>', cont, response);
    end;
}

The idea here is that another Power Automate flow will be set up to receive a POST. Note that you will need to add that flow’s HTTP trigger URL to the codeunit for this to work.

Here are the main steps to the flow. Note that the final step is a custom connector. Use this excellent blog to create/download the connector: Everything to know about Power Automate REST API to manage and administer your flows – Mohamed Ashiq Faleel

The URL shown here is the one you need in the codeunit. Add it to a table if you would rather store it somewhere visible. Add a schema for the expected JSON body – a link is available to paste in a sample payload*
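
A sample payload, for reference – this mirrors what the codeunit above sends, with placeholder values (note the GUIDs arrive wrapped in braces, which is why the substring() trick below is needed):

    {
        "ApprovalID": "{00000000-0000-0000-0000-000000000001}",
        "FlowID": "{00000000-0000-0000-0000-000000000002}",
        "RunID": "08585551669309704925538246862CU12"
    }
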
Locate the Dataverse connector and choose the Approvals table. This will remove the approval from a user’s open queue 👍 The “Row ID” has an expression because the JSON contains a GUID wrapped in braces and Dataverse doesn’t like that 🙈 – to bypass it use substring(triggerBody()?['ApprovalID'],1,36). The same thing is needed for the FlowID GUID.
The user still keeps an audit trail though. Note the “Outcome” is “Canceled”.
The flow run history will show you any runs legitimately waiting on an approver’s review with a “Status” of “Running”. Any that have been passed to the cancellation flow are dealt with accordingly. 🏡 house back in order 😅

A webhook could have been used here as well. I opted not to because you need something in place to renew the webhook, and it would have to POST the information to the flow I chose anyway. Hardcoding the endpoint is a bit 🤮 but it is simple.

D365 Business Central, Power Automate

Business Central CSV export using Power Automate

A CSV export from Business Central (or NAV previously) is nothing new or exciting. Why blog about it then? Well, let’s tackle the topic using a different solution: a CSV export using Power Automate with Business Central data. It could be argued that, providing you have the relevant licensing, this method removes the need for 3rd party tools.

My scenario will be basic but the application is broad. I’ll use a web service (API page or query is fine too) to expose item data showing some fields. I’ll filter the web service to keep my results condensed and only select the fields I need. The goal: retrieve the required data from BC and create a CSV file on a recurring basis. The file should be added to a chosen storage location.

Power Automate has a recurrence type flow which is suitable for this scenario. To ensure better error trapping and subsequent actions I am making use of the “Scope” feature in my flow, and I will then add the actions I need. Two more scopes follow and will only run if the 1st scope succeeds or fails – one scope for each possible outcome.

“Get BC Data” is the first scope block. Inside it we have: an HTTP call to a defined web service – details below; Parse JSON so we can understand the schema; and finally a compose action. More on that as you read on.

The HTTP call in my case is using the Item Card, but this is adaptable to other data. The main thing I found useful here is the ability to use OData filtering to limit the results. An example of the URL, with filtering and the ability to select specific fields: https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Production/ODataV4/Company(<companyName>)/Item_Card?$filter=Inventory%20gt%200&$select=No,Description,Unit_Price,Item_Category_Code,Inventory

I also liked using an “Accept” header on the HTTP request with this value: application/json;odata.metadata=none. It’s not terribly exciting, but when your intention is to read data only (I don’t want to insert or modify) it removes some of the data you won’t be using, like the etag (typically used for tracking modifications). I’d say the majority of the work in this entire process is with the BC data endpoint. You might find a custom API page is needed – for example, to get all the fields you want from each of the tables you need them from.

Most of what I’ve covered thus far is fairly normal in Power Automate. The worthwhile part of this post comes in explaining the use of a compose action – which in the picture is titled “Array for CSV”. Why do we need it? The action that follows for creating a CSV table (a standard feature in Power Automate) expects an array, whereas a BC web service will typically output a JSON object. Without going into extensive detail on JSON objects and arrays, just think about the fact that you need the part in between the [ ] (square brackets). A simple expression takes care of converting the JSON object you get from BC to an array, which can then be converted to a CSV. Create a compose action and add the expression body('Parse_JSON')?['value'] – this extracts just the array part, which is typically titled “value” when we retrieve data from a BC web service. An example can be seen below:

{
    "value": [
        {
            "No": "150A4444A",
            "Description": "ProBook 450",
            "Item_Category_Code": "LAPTOP",
            "Inventory": 22,
            "Unit_Price": 900
        }
   ]
}
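
Once the compose action runs, only the array between the square brackets remains – exactly what the CSV step wants:

    [
        {
            "No": "150A4444A",
            "Description": "ProBook 450",
            "Item_Category_Code": "LAPTOP",
            "Inventory": 22,
            "Unit_Price": 900
        }
    ]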

The remainder of the flow is super simple. I’ve used another scope block and adjusted its settings to run only if the first BC data block was successful. Use the built-in “Create CSV table” action and give it the output of the earlier compose action with the array you need – this action only accepts an array. Then you can choose where the file should be output. I chose to create a SharePoint file – ensure you put .csv at the end of the file name.

This is a flexible solution where you can quickly have automated or on-demand CSV exports working for your solution. It is much more rapid than the full BC method, which would involve job queue tasks. You can of course layer in better notifications as well, which is why I opted for the Scope feature, where you can alter the “Configure run after” setting for a previous action.

Here you can see that I will run the “Send Failure Notification” if any of the steps in the “Get BC Data” scope block fail.

D365 Business Central, Power Automate

Business Central line level approvals

Part 1 – Using one line value

What am I getting at here? Think about a document approval scenario where you need, let’s say, one shortcut dimension to act as the driver for who approves a line. This could be a cost centre, a specific project, maybe even a department. Standard BC requires you to create multiple workflow records to handle different filtering, and personally I’ve found that “Header” focused fields are the main ones which get checked. For line level approvals this is a challenge for performance, as all workflows will be checked. Is there another way? Power Automate provides a fresh view on the topic: you are able to ignore the rigid nature of the BC workflow engine and come up with something new. Couple that with the fact you have more notifications on offer and it’s an intriguing proposition.

“Department Code” represents the value I want to create approvals for.

Let’s work with the business process being that you need approval from the department head on a line if a specific department value appears. In my example I will use a global dimension, but in theory you can adapt this to any line field exposed by an API or OData page. There is a set of document approval templates available with Power Automate for the BC connector, but they won’t be used here.

The initial event trigger of “when a document approval is requested” is the starting position of the flow, naturally followed by GETting the details of the record. From this point onwards it’s all about the line data, and working it so we can cycle through accordingly and get approvals. In my scenario I have a multi-line document which has repeating “Department” values. I want to create a single approval per department, not a single approval per line – the approver is approving the total for their department.

The principal mechanism that makes this flow produce the desired results is counting the approval results. If we have a total of unique values – departments in my case – then I need to match that with approval results to get an approved document. So 3 unique values means 3 approval results are needed for an approved document.

Parse JSON is used after the HTTP call to get the line detail. This enables us to pass those values to other areas. So that I have a simple array to work with, I select only the dimension value from the parsed JSON, creating a new clean array. Given we only want unique values – and our current example has 7 lines with dimension data – it is necessary to use an expression. I am using the “union” expression to compare the array from the select to itself. This results, in my case, in 3 unique values.
After this we want two variables, which we will compare later on. Set them as above and use the “length” expression to count the number of values created by the previous “Unique Dimensions” compose action.
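
As a hedged sketch – the action names here stand in for whatever yours are called – the “Unique Dimensions” compose deduplicates by comparing the select output to itself:

    union(body('Select_Departments'), body('Select_Departments'))

and the “Unique_Count” variable is then set with:

    length(outputs('Unique_Dimensions'))
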
The mentioned “Outputs” is from the Unique Dimensions compose action. We are using an “apply to each” to loop through the results of that. The “Switch” then checks the value and performs the necessary action.

I draw some comparisons with the standard workflow in terms of maintenance with a “Switch” statement. This is where I check for results on my chosen line level field value; if I get a match then I send an approval to the relevant person. 📝 Note that we could look up the approver recipients from a setup table if needed. I feel like how it is currently set up resembles some of the standard BC workflow setup.

The condition in each switch branch approval is to count the approval result and add it to the earlier set “Approval_Counter” variable. Note that we take no action if the approval request is rejected.
Outside of the “apply to each”, create another condition. Compare the “Approval_Counter” with the “Unique_Count” variable. If they match then you can complete the BC Approve action; if they don’t, complete the BC Reject action.

If you like this pattern please comment – submitting it as a template for more users to gain access might be worthwhile 👍🏼 If you want to avoid doing all the steps described, the template is available for download here: https://github.com/JAng13sea/Blogs/tree/master/Line%20level%20approvals

Want to see it in action? Check out my video from Scottish Summit 2021 where I demo it! https://youtu.be/ldOnzRySzcY

D365 Business Central

Business Central Attachments with standard API – Part 1 Basic Explanation

A nice feature which is explained fairly well on the Microsoft Docs site, but it is especially nice to have a working example (https://docs.microsoft.com/en-us/dynamics-nav/api-reference/v1.0/api/dynamics_attachment_patch). The attachments API itself actually points to the “Incoming Documents” feature of BC, so the terminology might catch you out. The scope of the API therefore covers whatever you can use “Incoming Documents” for: Purchase Invoice/Credit, Sales Invoice/Credit and Journal lines.

You need two HTTP calls to make this one work, and this is the pattern:

Firstly you POST a simple line to say what you are attaching to, i.e. our Invoice/Credit record or our journal line. This must be the ID or SystemId value of that record. How this plays out in a practical sense will depend on how you implement the API in your solution. In the scenario where you create a purchase invoice and then have an attachment latched against it, you will need to create the purchase invoice record first – a follow-up post on this is coming. Here is how the first action looks:

https://api.businesscentral.dynamics.com/v2.0/<tenantID>/uat/api/v1.0/companies(companyID)/attachments

Body:

{
    "parentId": "5a086bc7-195f-eb11-89f9-0022481ab2d5",
    "fileName": "Example.pdf"
}

This is how it looks in an HTTP client like Postman. Note that the “fileName” property is just a value you choose – so if this was part of a solution you might use the invoice number or the document no. of the journal line you created before this step.

Now the PATCH part is where the order of this becomes very important. You need a value from the POST to do the PATCH… 😮

This is the response I get from doing the POST command. It is the “id” value which is new to the story, and it is how I perform the actual attachment process. The POST is just to create the Incoming Document record… penny drops 🙈

Once the above is known, the PATCH command shown in Microsoft Docs becomes very easy to follow. Here is the command: PATCH businesscentralPrefix/companies({companyId})/attachments(parentId={parentId},id={attachmentId})/content. Simply add the information we now know – so in my case, with the above screenshot, my “parentId” is 5a086bc7-195f-eb11-89f9-0022481ab2d5 and my “id” is 7a1f7c6c-1e5f-eb11-89f9-0022481ab2d5. Notice that the odata.etag has been provided by the previous POST step, so you can add that in as well for the “Headers” of the PATCH command. Here is what I mean:

The etag is there to track changes – to make sure someone else hasn’t modified the record since you last got it. It is possible to use a wildcard (If-Match: *) here if needs must.
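
Putting it together, the raw request looks roughly like this – the GUIDs are the ones from the POST response above, and the body is the binary file itself:

    PATCH https://api.businesscentral.dynamics.com/v2.0/<tenantID>/uat/api/v1.0/companies(companyID)/attachments(parentId=5a086bc7-195f-eb11-89f9-0022481ab2d5,id=7a1f7c6c-1e5f-eb11-89f9-0022481ab2d5)/content
    If-Match: *

    <binary file content>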

In Postman at least, the body of the PATCH command is something like below, but in a real solution you just need to make sure you pass a binary file:

In Postman, if you are testing this, you can select a file to work with and it will pop open your file explorer.

Once each of the mentioned pieces is in place you can go ahead with the PATCH. You should get a 204 status response, but there will be no body in the output.

When you head over to BC, though, you will see that your record – the one whose ID we started with – now has an “Incoming Document” record against it.
Here is the record in the Incoming Documents table. Once posted, a user will be able to view the document as well.

D365 Business Central, PowerApps

Business Central Images to a Canvas Power App Gallery

For those who missed it I did a session for BC Tech Talk last year:

During that session I showed how to get a single image for an item record with an API call (35 mins and beyond). The method was very simple: create a custom connector for the standard Picture API and just add an extra bit of text, similar to this: https://api.businesscentral.dynamics.com/v1.0/8d0632b1-47bf-41fe-ba13-a838adf98c48/sandbox/api/v1.0/companies({companyID})/items({itemID})/picture({pictureID})/content

The green text (the /content portion on the end) pulls the image directly into an image control in the canvas app 👍. I mention in the video that you could have the picture images in the main gallery view – it will be slower this way, but I was asked to at least explain how. The standard API isn’t that helpful here – VERY HAPPY TO BE TOLD HOW TO USE IT CORRECTLY – the endpoint doesn’t interact in the same way as an HTTP request would; it wants to produce a table of data, and no matter how I work with the connector with the available formulas I just get errors. You can see from the Power Users forum this is a similar theme for others:

To get around this a new page is needed with some extra capability. The end goal here is to create a custom connector for use in PowerApps that gives the base64 image value for BC item records. I took inspiration for the solution from this blog, which I’ve adapted slightly for the latest version (at the time of writing 17.2 – it may have changed since): How to get picture from MediaSet through standard APIs. – Dynamics 365 Business Central Community

The final code is on GitHub – https://github.com/JAng13sea/Blogs/tree/master/ItemImages – but in essence you need a new API page for the item table, with a field populated by a codeunit returning the picture converted into base64 text:

    procedure "Pic as JSON"(ItemNo: Code[20]) : Text;
    var
        Item: Record Item;
        TenantMedia: Record "Tenant Media";
        PicText: Text;
        PicInstr: InStream;
        JObject: JsonObject;
        JToken: JsonToken;
        TempBlob: Codeunit "Temp Blob";
        PicOStr: OutStream;
        Base64: Codeunit "Base64 Convert";
    begin
        Item.Get(ItemNo);
        If Item.Picture.Count = 0 then
            exit('');
        TenantMedia.Get(Item.Picture.Item(1));
        TenantMedia.CalcFields(Content);
        if TenantMedia.Content.HasValue then begin
            Clear(PicText);
            Clear(PicInstr);
            TenantMedia.Content.CreateInStream(PicInstr);
            PicText := Base64.ToBase64(PicInstr);
            JObject.Add('picture',PicText);
            JObject.SelectToken('picture',JToken);
        end;
        exit(JToken.AsValue().AsText());
    end;
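
For the page side, here’s a minimal sketch of how that codeunit could feed the API field – all the object and variable names below are my own placeholders, not from the repo:

page 50105 "JA Item Images"
{
    PageType = API;
    APIPublisher = 'ja';
    APIGroup = 'jaGroup';
    APIVersion = 'v1.0';
    EntityName = 'itemImage';
    EntitySetName = 'itemImages';
    SourceTable = Item;
    DelayedInsert = true;
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(records)
            {
                field(no; Rec."No.") { }
                field(description; Rec.Description) { }
                // populated per record by the codeunit above
                field(picture; PictureBase64) { }
            }
        }
    }

    trigger OnAfterGetRecord()
    begin
        PictureBase64 := ItemImageMgt."Pic as JSON"(Rec."No.");
    end;

    var
        ItemImageMgt: Codeunit "Item Image Mgt."; // hypothetical codeunit holding "Pic as JSON"
        PictureBase64: Text;
}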

Note that it has been added into a JSON structure to make it easy for other solutions to read it. From here you will need to create a custom connector for Power Platform. I did another video on this so check it out if you need guidance on that part:

Once the connector is in place and you have pulled the data from BC into a collection in PowerApps, you can address the image value in the gallery. When using a gallery in PowerApps a predefined method exists for referencing each row of the gallery – ThisItem. The value passed from BC is in base64, but it needs decoding in PowerApps for an image control to display it. In the image control add the text "data:image;application/octet-stream;base64," & ThisItem.picture

Depending on the volume of items with images this type of call could be very long, so tread with caution. If you happen to be using images with other data, or storing other blobs, this pattern is a way to get the data into PowerApps.

D365 Business Central

Business Central 2020 Wave 2 Enhanced Email Capabilities – How to setup

If you read over the latest changes shipped with version 17.1 you would have seen the enhanced email capabilities mentioned. This can be enabled early by checking out the feature management page:

Just doing this, though, doesn’t mean you can start to enjoy the new features straight away – there is some setup to handle. A lot of the required setup is described very well on the docs.microsoft.com site: https://docs.microsoft.com/en-us/dynamics365/business-central/admin-how-setup-email

However, that doesn’t always have all the steps in enough detail. The end goal of this functionality is to have dedicated email accounts for specific scenarios that need emailing from the system. Here is a configured list to illustrate where we want to end up:

All of the documents shown will now have the “From” address of the accounts mailbox. This means a contact is more likely to reply directly to this group email rather than to a direct contact 👍

Before getting started in Business Central, go to the Office 365 admin centre and add shared email accounts for each of the scenarios that can be covered. Out of the box this includes documents based on sales, purchase and service, plus internal messages like approval notifications (it can be extended to cover more – I’ll save that for another time). These types of mailboxes therefore make sense:

The majority of the connection setup is done through a wizard where you either pick the current user account, so you can send emails as your logged-in BC user, or choose one of the above shared mailboxes. The current user method is very simple, so I will cover the shared mailbox route.

Search for “Set up Email” and cycle through the wizard to the page shown (page 2), then choose the shared mailboxes option. Note that if you’re not a user with the SUPER permission set assigned, you will need a new permission set called EMAIL SETUP, which is available in the main list.
Add the details of one of the shared mailboxes from the Office 365 admin centre. Finish the wizard and test the connection. Repeat this step for the other accounts that you need.
Once you have added all the accounts, if you search for “Email Accounts” you can see all of the available ones to work with. Note that the legacy SMTP connection will already have been created for you, and it will be the “default”. There is an action to reassign the default available from this page, as well as actions for testing the accounts or composing a non-document-related email with that account.
The next area of interest on this page is under the “Navigate” tab, where you can see email activity for that account from BC and which “Email Scenarios” are assigned to the email account.
Simply choose the documents you want to be sent from that email account
From that point onwards your chosen account will be used for that document. Any previous email dialog screens have been replaced with this one. If you have email layouts setup they will work too.
On the role centre you will notice a new activity cue. The draft cue means that users are asked to save as a draft or to discard the email – much like a regular email experience.

I’ve configured this in two separate tenants thus far (as of 09/12/20) and the experience has been fairly slick. The only issue I experienced was to do with an Exchange configuration where no Office 365 connector was in place. If you are repeatedly given a message in BC about not being able to access the “Set up Email” page, jump over to the Exchange admin centre and see the “Connectors” page, which currently lives under the “Mail Flow” area:

Users are able to search for “Compose an Email” to access the email editor with no document scenario assignment:

It would have been nice if the “Attach File” option let you search documents stored in BC, but that is currently not the case. Right now it lets you choose a file from a locally accessible location.

A much richer experience overall. Look forward to seeing what others do to enhance this.

D365 Business Central, Power Automate

Business Central Batch post and e-mail with Power Automate – no code

Batch posting has been in the product for some time, but I am not seeing a clear way (happy to be wrong) to batch post and e-mail. Here are my options (below) as a user conducting the posting. Yes, I have post via the job queue as well, but from what I’ve tested this isn’t sending my e-mails for me. Even the print option yields no results here – I thought it would at least use the “Document Sending Profile” on the customer record.

Even when you use “Post Batch” you get nothing about e-mailing? 😬 Not forgetting the workflows in BC have no option for “Post & Send” – it’s just “Post”. Usually these gaps would need filling with a code modification, but not in this blog.

Where there is a will there is a way and my idea here is to use Power Automate to do the heavy lifting. On this occasion I will use a manual trigger but if this was a production ready scenario I would use a scheduled type flow.

The connector for BC in Power Automate is fairly small, but I’m sure it will get better over time. There is a slight restriction though: the action for posting and emailing exists, but it works on a single invoice, like it does in the regular UI. So the first thing I need to do is fire some data I can use into Power Automate from BC. I will of course need the GUID “id” reference of the sales invoice, as that’s what the BC Power Automate connector likes. The page that fits the bill here is available as a web service already – I will use the top one in the list, 2811. This page is a mixture of open and posted sales invoices, so some filtering is needed:

Example of the payload from this page:

I’m adding a filter to the OData query so I only post invoices that are ready to be posted. In the case of this page the “status” values are different to regular BC: I have a choice between draft (aka “Open”) or open (aka “Released”). So I have added ?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200 to the end of the web service URL I got from BC. The reason being the feature in the Power Automate BC connector restricts me to draft invoices only, and I don’t want to post invoices with a 0 value. I will log something on the Power Automate forum about the connector wanting “Open” invoices rather than “Released” – backwards logic otherwise; what if I’m using approvals, for instance? I think this is possibly wrong, so I have posted this to find out more: https://powerusers.microsoft.com/t5/Connecting-To-Data/Is-the-Business-Central-Post-amp-Send-Invoice-action-correct/m-p/637081#M9492
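
For illustration, the full URL takes this shape – the service name is whatever you chose when publishing page 2811 as a web service:

    https://api.businesscentral.dynamics.com/v2.0/<tenantID>/<environment>/ODataV4/Company('<companyName>')/<serviceName>?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200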

Anyway….

Our flow starts to take shape like this. Check out my last blog for a more detailed account of the parse JSON part (https://joshanglesea.wordpress.com/2020/07/22/power-automate-purchase-order-alerts-to-vendor/)

Once we add the “id” from the parse JSON step, you will get an “Apply to each” wrapped around the chosen BC action, which is exactly what we want (it happens automatically). Remember our goal – batch post and e-mail invoices – so we are better placed than standard.

In my case a small batch of invoices has been handled and the emails dished out accordingly. I added an extra step to update the posting date as well, just so it takes something useful from the existing “Batch Posting” feature 👍

D365 Business Central

Business Central Demand Forecast

Have you ever looked at the demand forecast in D365 BC and thought it’s missing features which would actually make it usable? I mean look at it… but actually look at it:

Yes, I can toggle the matrix view to alter the dates and change to component items instead of sales, but if I want to forecast, surely I need to do it for items that actually need manufacturing first! After all, that is the premise of it: you populate it and then it gets used as part of MRP and MPS. How can we change this situation so the page is actually friendly and a user can apply sensible filters? The initial thought would be to just personalise it – oh wait, you don’t get any fields to choose…

Well, the answer lies in the make-up of the page object, which as it turns out is based on the Item table:

So our goal here is to add sensible fields to the page that can be used for better filtering when devising a demand forecast. This might seem overly simple as an idea, but the value comes from the fact that the item table, especially for a manufacturer, can become very large indeed. You will have items that get blocked, and items that have been replaced by others or are now defective. The list in its current state will, over time, become too difficult to work with.

The method to achieve this goal is simple: create a page extension for page 9245 and throw in the fields the user is most likely to plan with.
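
A minimal sketch of the idea – the field choices are examples, and the anchor control name is a guess (check the real control names in page 9245 with Page Inspection):

pageextension 50140 "Demand Forecast Filters" extends "Demand Forecast"
{
    layout
    {
        // "Matrix" is a placeholder anchor – swap in the actual control name from page 9245
        addlast(Matrix)
        {
            field("Item Category Code"; Rec."Item Category Code")
            {
                ApplicationArea = All;
            }
            field("Vendor No."; Rec."Vendor No.")
            {
                ApplicationArea = All;
            }
            field(Blocked; Rec.Blocked)
            {
                ApplicationArea = All;
            }
        }
    }
}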

Other fields can be added of course but straight away a user has something way more friendly:

For more information on demand forecasting check this out: https://docs.microsoft.com/en-gb/dynamics365/business-central/production-how-to-create-a-forecast

Code available here: https://github.com/JAng13sea/Blogs/tree/master/Demand%20Forecast

D365 Business Central

Business Central Gen. Jrnl. Imports with Data Exchange Definitions

Have you seen this button on the general journal page and thought that’s useful?

Some of you might not have seen it though as the button only appears if you complete some setup in the General Ledger Setup page:

The above field shows a list of Data Exchange Definitions with the type “Payroll Import”. So it is fit for purpose, but only usable for payroll – and, more importantly, just one import file structure. For demos I have often mocked up a Data Exchange Definition as a payroll import type and just passed over a CSV file. This is one of the many ways a journal can have external data added to it, and it’s fine for one-off demo sessions, but how can this idea be taken seriously in a project/production scenario? The answer lies in how the current “Payroll Import” button works and whether we can change it for our situation. My end goal here is to have the user pick from a range of different data exchange definitions which work for different CSV files. In the past, customers may have had bespoke import routines passing data through the Excel buffer. If they can compromise on having the file converted to CSV, this could be a worthwhile solution!

Over in the base app the “Payroll Import” action button looks like this:

So it is only ever looking at that setup page to know which data exchange definition to use. It also has its own codeunit to run the process. So the component parts are small and we can adjust them with ease for our end goal.

Very simply, a page extension is needed for the general journal, and you can go with something like the below to add the type of page action we want (see the sketch that follows). In my case I have opted to use the “Generic Import” type on the data exchange definition – which I have set as a global variable on the page.
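
A hedged sketch of that action – the renamed codeunit and its entry point are stand-ins for whatever your copy of the base-app import codeunit ends up being called:

pageextension 50150 "Gen. Jnl. DED Import" extends "General Journal"
{
    actions
    {
        addlast(processing)
        {
            action(ImportWithDataExchDef)
            {
                Caption = 'Import with Data Exchange Definition';
                ApplicationArea = All;
                Image = Import;

                trigger OnAction()
                var
                    DataExchDef: Record "Data Exch. Def";
                    DEDJnlImport: Codeunit "DED Jrnl. Import"; // hypothetical renamed copy of the base-app codeunit
                begin
                    // Let the user pick from "Generic Import" definitions only
                    DataExchDef.SetRange(Type, DataExchDef.Type::"Generic Import");
                    if Page.RunModal(0, DataExchDef) = Action::LookupOK then
                        DEDJnlImport.ImportFileToJournal(Rec, DataExchDef); // hypothetical entry point
                end;
            }
        }
    }
}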

Codeunit-wise, I have done a complete copy of the base app one and then renamed the functions. Very easy indeed. The process itself works fine, so no need to change anything – it is more about what you offer the user when they make that button selection.

The end result is a dynamic journal import selection. The user can now maintain and update a data exchange definition, or add new ones as new journal import requirements come along! 👍😁

Don’t know much about data exchange definitions? Check this out: https://docs.microsoft.com/en-gb/dynamics365/business-central/across-how-to-set-up-data-exchange-definitions

Code available here: https://github.com/JAng13sea/DED-Jrnl-Import