Business Central Dynamic Bin Ranking 🚮

A business recently explained to me that they were running a dynamic bin ranking method to enhance the likelihood of a correct pick walk, not necessarily an efficient one. They had visited an Amazon fulfillment centre, which used something similar, and figured that if it is good enough for Amazon then it is good enough for them! It is worth reading into some of the features of an Amazon warehouse, like this article: https://www.allaboutlean.com/amazon-fulfillment-3/

As well as deciding that bin ranking needed to be dynamic, the business in question also ensured items were scattered across bins throughout the warehouse. This was to reduce the likelihood of picking the wrong item – for example, a pair of shoes would be in a bin next to a coat or trousers – hence the “correct pick” statement from earlier. In BC the “Bin Ranking” helps to control three functions: putting stock away, bin replenishment and pick line sorting. Given this scenario, it is important to ensure any “dynamic” facility sticks to specific numbering so that all of these functions remain working. Need more info on bin ranking? Check out the standard docs about it: https://docs.microsoft.com/en-gb/dynamics365/business-central/design-details-warehouse-setup

The general rule of thumb with bin ranking is that your pick face has the highest rank: the bins which are easiest to pick from (close to the floor or near the packing area) are given the highest rank value. Dynamically ranking bins in the way I’m about to explain might therefore not fit every warehouse layout. You could certainly produce this routine in AL, potentially as a processing-only report run on the job queue. I have opted for Power Automate in this case to see if it could handle a scenario where multiple tables are checked to determine a result. It also provides a single place for all the business logic and any data references it makes. I understand these justifications won’t be enough for everyone, but bin ranking isn’t for everyone either. Some of the techniques used might be helpful for other scenarios – like making use of a FlowFilter with a query API or not using SystemId as the “row ID” for the BC connector in Power Automate.

Choosing Power Automate doesn’t make this a development-free solution. The link to the code is at the end, but you will need an API page for the bin table, an API page for the bin content table and an API query for the item table. The image gallery below shows each step of the flow used. I’ve added text to each image with context or details on the expressions/code:

Link to the code for the API pages and API query: https://github.com/JAng13sea/Blogs/tree/master/Dynamic%20Bin%20Ranking
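For illustration, here is a minimal sketch of what the bin API page could look like – the object number, publisher/group values and entity names are my own placeholders rather than the ones in the repository. The key part is exposing “Bin Ranking” so the flow can write the recalculated value back:

page 50140 "Bin Ranking API"
{
    PageType = API;
    APIPublisher = 'blog';
    APIGroup = 'warehouse';
    APIVersion = 'v1.0';
    EntityName = 'bin';
    EntitySetName = 'bins';
    SourceTable = Bin;
    DelayedInsert = true;

    layout
    {
        area(Content)
        {
            repeater(records)
            {
                field(locationCode; Rec."Location Code") { Editable = false; }
                field(code; Rec.Code) { Editable = false; }
                // The value the Power Automate flow recalculates and PATCHes back
                field(binRanking; Rec."Bin Ranking") { }
            }
        }
    }
}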

Business Central Approval Out of Office Check 🌴📧

This is a frequently asked question when the topic of approvals comes up with prospects or customers. Solving this conundrum requires you to determine where you will get a substitute or replacement approver from. In my case I have used the approval user setup in BC. To achieve this a custom API page needs to be generated. If you can’t do that then check out AppSource for apps which help – at the time of writing the Simple Object Designer is a good one. The process for building the flow can be seen in the image gallery below. The first two steps are just the regular BC workflow approval trigger and get record action. I have only done one out-of-office mailbox check but you can add more using the format I utilise. If you need an overview on the basics of using Power Automate for approvals check out this post: https://joshanglesea.wordpress.com/2022/04/29/business-central-customer-approval/
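If you can build the custom API page yourself, a minimal sketch could look like the below. I’m assuming the flow only needs to read the setup; the object number and publisher/group values are placeholders:

page 50141 "Approval User Setup API"
{
    PageType = API;
    APIPublisher = 'blog';
    APIGroup = 'approvals';
    APIVersion = 'v1.0';
    EntityName = 'approvalUserSetup';
    EntitySetName = 'approvalUserSetups';
    SourceTable = "Approval User Setup";
    DelayedInsert = true;
    Editable = false;

    layout
    {
        area(Content)
        {
            repeater(records)
            {
                // The user an approval is normally assigned to
                field(userId; Rec."User ID") { }
                // The fallback approver the flow can switch to when out of office
                field(substitute; Rec.Substitute) { }
                field(approverId; Rec."Approver ID") { }
            }
        }
    }
}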

There are a few expressions that get used and I’ve added notes on all of the images of what expression is being used at that point:

I’ve exported the flow and you can grab it for import instead of building from scratch. The JSON format is available too if you use Azure logic apps instead of Power Automate: https://github.com/JAng13sea/Blogs/tree/master/Out%20of%20Office%20Approver%20Check

Business Central select document approver 🔍👤✅

This post came about due to the below question on the Dynamics Community forum:

Here is how I would solve this one with Power Automate. Initially use one of the templates available in Power Automate for creating document approvals. This gives you the perfect head start. In BC itself you need to decide on using one of two standard fields. On sales and purchase documents you have either the salesperson/purchaser code or the assigned user ID. The latter is my favoured option: the list is maintained in a better way and you don’t have to worry about character limits. Is it a problem that a user could choose themselves? Read on to find out 😏

Both of the mentioned fields are already in the data available from Power Automate – not ruling out a custom field but not including it in my post.

Four things to do in Power Automate once you have the template pointing to the correct database and company etc.:

The assigned to property of the approvals action needs a valid email address for a user in the directory. Simply hard-code your internal email domain after the user value – see the expression sketch after these steps. Purchaser/salesperson codes might not follow a username type logic, forcing you down the assigned user ID route.
Three alterations remain from the template. I’ve moved the approval action into the “if yes” of condition 2. This is where you check a self-approval isn’t being done – the approver email must not match the sender email. It makes sense to add a rejection action in the “if no” part of the condition. Repeat this for condition 3 of the template – the difference being that it is the main rejection step.
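For the domain hard-coding, something like the below could sit in the “Assigned to” property. The action name and the assignedUserId path are assumptions based on my own flow, and contoso.com stands in for your internal domain:

concat(outputs('Get_record_(V3)')?['body/assignedUserId'], '@contoso.com')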

There is now a dependency on the assigned user ID field being populated. Arguably a new condition should be added so you can reject and email/Teams message the sender and have them populate the field. I’ve opted against that and adjusted the workflow record in BC instead. This will stop the approval request from triggering, so well-trained users will spot what is missing.

Go to the workflow record and access the “On condition” of the top row in the workflow steps.
I’ve added a filter for my chosen field of “assigned user ID”

Business Central Customer Approval with Power Automate 🏢✅

A recent Microsoft Dynamics Community forum post has prompted this blog:

In this post I will give an answer but also review briefly why I chose Power Automate over regular BC workflow.

BC workflow has a template for credit limit changes. That was my initial suggestion but it’s the requirement around blocking the customer which stopped me. Yes, BC workflow has record restrictions but they only stop postings. The blocked field on the customer stops document creation like orders or invoices. BC workflow doesn’t allow for other fields to be updated – it only allows the changed one to be reverted back and applied on approval. The Power Automate route, however, will allow you to make other alterations, or take other actions, with greater ease. I’ve used a different field but here is how I’ve tackled this.

In Power Automate use the template for customer approval. This will add 80% of what is needed for the solution. Fill in any of the areas that are blank, like which database and company etc. Upon saving, the flow will then pass a workflow record to BC. You will find it in the regular BC workflows list:

The “When Event” will of course be about an approval being triggered:

That will be one of the things to alter along with a specific data condition. All to be done here in BC. The rest is for Power Automate to handle 🤘🏼
Change the when event to “A customer record is changed”. Then drill down on the “On Condition” to set up what you need for your circumstance.
In my case I only want to handle records which meet my filter conditions and where the “Gen. Bus. Posting Group” is changed. I wouldn’t go without filters or you risk firing off approvals at daft times, like during manual record creation – step-by-step field population counts as a modification.

That’s the BC setup done ☑️ just the Power Automate parts now 😊

Two additions to the original template are required, and they are nearly the same. Choose the “update record” action from the Power Automate BC connector. Use the Id from the “Get Record” action above it. Choose the customer table and add the word “All” to the blocked field.
Within the “Yes” condition, for when the request is approved, add the same action. This time the Blocked field needs to hold a single space. This will revert it back to blank – which is an unblocked state.

Business Central Multi Company Approvals ✅❌🔀

I have some things to state upfront, as there are things to justify about why the solution is shaped in a certain way. The premise of the post is to solve the fact that a user who works across many companies might have many approvals. The standard UX is a big driver for why I’ve chosen the topic. If using out-of-the-box BC approvals you can get emails. However, they don’t have any action buttons.

So you’re forced to login. You’d have to do that for all companies you have approvals for. If you’re using Power Automate approvals you do have action buttons in emails. You do have to read every email though to access them.

Is there a better way?! 🤔

I figured the “newish” company hub app would be a good starting point. Have you seen it? A user can view all user tasks they have in each company they have access to. Sounds close to what I am thinking 🧐. However, I found the table behind this isn’t extendable, and the method for how it gathers user tasks seemed fairly involved. I liked the idea of shoehorning it in there but felt that an easier route existed. Access to the detail from the role centre is certainly the way to go though. But access elsewhere is fine too.

Company Hub view users can access

My answer is to use a PowerApp embedded in a Power BI report. Handles plenty of the circumstances required. Here is a version of the app running through some basic functions it could offer to a user:

I have added a few other considerations at the end of the post. I will cover off the main steps I took to create the above experience – **which covers approvals controlled by BC Workflows only**. A full explanation of how the entire thing was built would be best done by video. In the absence of that I will adopt a high-level approach for readability, calling out key components.

Components required and where I got some of my details. The order of them is important:

Code for the new api page can be copied from here: https://github.com/JAng13sea/Blogs/blob/master/Multi%20Company%20Approvals/Pag57250.MultiCompApprovalActions.al

Why do we need the custom actions? Doesn’t the BC connector for Power Automate already have those? Yes, it does. However, they are for approvals done via Power Automate only. I am yet to figure out a company-based difference in the Dataverse table to bring this type of experience solely for approvals done via Power Automate. **Watch this space**

Use the new API page and the companies endpoint as queries in your PBI report file. Follow Steven Renders’ blog to ensure the approvalActions data is pulled for all companies.
Create a relationship between the two tables with the “Company Name” value being the link.
So that a user only sees their approvals, add row-level security so the approverEmail must match the user that is logged in and using the report.
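As a sketch, the usual Power BI row-level security pattern for this is a role with a DAX filter on the approvals table – approverEmail being my assumed column name:

[approverEmail] = USERPRINCIPALNAME()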

Now the Power Apps visual can be added to the report. Use this guide if it is a new experience: https://docs.microsoft.com/en-us/power-apps/maker/canvas-apps/powerapps-custom-visual

A copy of my PowerApp and the power automate flow used can be found here: https://github.com/JAng13sea/Blogs/blob/master/Multi%20Company%20Approvals/MultiCompanyApprovalsApp_20220525155716.zip

One of the big takeaways from the app design is that the user works with a collection instead of the direct data from the PBI report. When the app starts the initial collection is made and that is displayed to the user.
Other crucial components are the action buttons for approving, rejecting and substituting. Each button is the same and only the “Type” value is altered on each. The idea is that a new collection of only the entries we need to action is taken and sent to Power Automate. The original collection, which is what the user sees, is then updated. This is because the PBI report doesn’t get refreshed constantly – it will be on a defined schedule:
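A hedged Power Fx sketch of that pattern – the collection, flow and column names here are placeholders taken from my description rather than the exact app in the download:

// App OnStart – snapshot the rows the Power BI report passes in
ClearCollect(colApprovals, PowerBIIntegration.Data);

// Approve button – gather the selected entries, hand them to the flow,
// then update the local collection so the UI reflects the action
ClearCollect(colToAction, Filter(colApprovals, Selected));
MultiCompanyApprovalsFlow.Run(JSON(colToAction));
UpdateIf(colApprovals, Selected, { Status: "Approved" });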

Now for more detail on the Power Automate flow. In a standard scenario the connector for BC is not dynamic to a company; in this scenario it has to be. It is generally possible to pass custom text to connectors in Power Automate, and that text can be based on variables you set up – the same as setting a header on an HTTP call.

Conclusion: this method certainly falls into the low-code bracket but I also think it has caveats which might not work if you have to scale it out to many users. Users do often set aside a time in their working day or week to do approvals, so having a one-stop-shop approach is a bonus. The refreshing of the data is a drawback because of it being held together and formed by the PBI report. On the other hand it is such a small and discreet thing a user has to do, so living with some level of drawbacks might suffice because the standard experience is far worse IMO. The time and effort to approach this in AL is probably worth it in the long run and it’s an obvious thing to have in the “Company Hub” app. The nicest takeaway is the ability to make the BC connector in Power Automate be company dynamic. Have to see what other scenarios crop up where that could be used 😏

I think this should be in the Company Hub app and I have logged it as an idea which will need votes: https://experience.dynamics.com/ideas/idea/?ideaid=658e8d1c-e0dc-ec11-bea1-0003ff45e08b

Business Central automated pricing group update 💷

The aim of this blog is to discuss a concept built around a specific requirement customers/prospects have been mentioning. Annually some businesses will review the level of sales with a customer to determine what band/bracket/tier/group they fall into for things like pricing, discounts or payment terms. I will use this post to tackle the idea of pricing groups i.e. spend X in a year and the customer falls into price group A, B or C. This means the customer could go up or down on those pricing groups based on their annual spend.

The chosen tool for performing this is ultimately Power Automate but some components are needed in BC. I have taken some inspiration for the final solution from @MaryThompson_WM with this video: https://youtu.be/THZeSnH-agc and @eHougaard with the Simple Object Designer: https://www.hougaard.com/product/simple-object-designer/ With both in play I was able to meet the usual objective of keeping this low code 😁

My target field in this scenario is the “Customer Price Group”. It is a field which is not available in the standard customer API page – hence why I needed the Simple Object Designer to expose it for updating. The app allows you to create custom API pages for nearly all tables in the database. It is a guided, wizard-type experience, making it easy for many types of users to pick up and get out what they need.
On the customer record there is a field called “Sales (LCY)” which shows the sales for a defined period of time. Here I have a “Date Filter” applied to manipulate the value. This is something you can expose using the standard API page for customerSales. Note it only returns values if the filter criteria are met – i.e. it doesn’t work like the customer API page where it returns many results; it’s more targeted. It is this value which will tell me if the customer is eligible to move into a different price group or not. Example of the endpoint with the filter and the output:
{{BCurl}}/api/v2.0/companies({{companyId}})/customerSales?$filter=dateFilter_FilterOnly ge 2017-12-31T00:00:00.0000000Z and dateFilter_FilterOnly le 2021-12-16T00:00:00.0000000Z

{
  "customerId": "8a11a07d-09cd-4080-b973-77d99d5862b9",
  "customerNumber": "10000",
  "name": "Adatum Corporation",
  "totalSalesAmount": 50922.9,
  "dateFilter_FilterOnly": null
}
To remove the need to do any modifications in BC I have used an Excel sheet with an Excel table to store the data I want to be assessed. The codes here match the values in BC and are manually maintained.

The Power Automate flow now has all of the required data components to work. Merely a case of constructing it. Check the image gallery for each step. Notes have been assigned on the flow action if pertinent.

This is the if statement in the "Set AmtBoolean" flow action:

if(and(lessOrEquals(outputs('Excel_Min_Amount_to_Float'), outputs('Total_Customer_Sales_for_Year')), greaterOrEquals(outputs('Excel_Max_Amount_to_Float'), outputs('Total_Customer_Sales_for_Year'))), true, false)

I’ve exported the flow and added it here if you don’t want to follow the steps I’ve laid out: https://github.com/JAng13sea/Blogs/blob/master/BusinessCentral-UpdateCustomerPricingGroup_20220422201332.zip

Send Attachments/Documents to Business Central with Power Automate 📎

“We have API’s now no need for physical copies!”

A while back I did a post on the attachments API for BC: https://joshanglesea.wordpress.com/2021/01/25/business-central-attachmentsapi/

I only talked about using it with Postman or another REST client of your choosing. What about using it with Power Automate instead? I will caveat the fact I hit a tonne of errors trying to get this to work with the latest API pages at the time of writing, v2.0. But we are all about the working version in this post!

Let me set the scene on how I have approached this topic. I want to take an email with an attachment – which is not an image – and have it create a purchase invoice for the vendor emailing me. The purchase invoice will have a record of the attached document from the email. I will also store the attachment in OneDrive – it could be deleted at the end if you wish.

My starting point is with this trigger. A shared mailbox is possible too.

I’ll break this post into chunks because the flow has a few steps to it. Challenge 1 – the vendor to create the invoice for. In my case I wanted a level of dynamism in how the document in BC knows what Vendor/Customer to use. In your case you might not find this possible and have to revert to a placeholder record. I’ve opted to take the first 3 characters of the email domain – for example if the e-mail address is Josh.Anglesea@somecompany.com the filter I will use is “Som” – yes, we convert the 1st character to uppercase too. A fair few actions for this first challenge, so check the image gallery and the embedded code.

I found this post helpful to solve this challenge: https://powerusers.microsoft.com/t5/Building-Flows/extract-s-string-between-two-characters/td-p/80830

//Index_of_@ 
add(int(indexOf(outputs('Email_From'),'@')),1)

//Index_of_.co
int(indexOf(outputs('Email_From'), '.co'))

//Vendor_Name_position
sub(outputs('Index_of_.co'),outputs('Index_of_@'))

//Vendor_Name_Filter_Value
substring(outputs('Email_From'), outputs('Index_of_@'),outputs('Vendor_Name_position'))

//Filter_in_Proper_Case
concat(toUpper(substring(outputs('Vendor_Name_Filter_Value'),0,1)),substring(outputs('Vendor_Name_Filter_Value'),1,2))

Challenge 2 – avoiding pesky extra attachments on the e-mail. Because we are pulling this directly from an e-mail you can get all the images from the e-mail signature too 🙄 which means we will need to loop through until we get the typical file type we’re expecting. In my case I’ve said anything which is not a .png – change yours as you need to.

To be able to assess what the file type is we need to save it somewhere first. OneDrive was nice and easy here and the “Create File” action passes back the full file name. Simply a case of adding in a condition to check the property. You could have multiples here. You will then get two lanes – what happens if the file does not end with “.png” and what happens if the file does end with “.png”
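For the condition itself, an expression along these lines works – assuming your OneDrive action is called “Create file”, whose outputs include the Name property:

endsWith(outputs('Create_file')?['body/Name'], '.png')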

Challenge 3 – get the file content from OneDrive and add it to BC. Yes this is the main part of the blog but it’s a challenge mainly because the BC connector for Power Automate does not play ball ⚾ Instead I have used the HTTP connector. You will need to use this post to help with the authentication part of using the HTTP connector: https://joshanglesea.wordpress.com/2022/03/01/business-central-oauth2-for-power-automate-%f0%9f%9b%82/

I’ll use a gallery of images with comments to explain each of the subsequent steps

//Expression to get the first vendor found in BC with the earlier created filter:
outputs('Find_Vendor_to_Invoice')?['body/value']?[0]?['number']

//Parse JSON Schema for the POST action response
{
    "type": "object",
    "properties": {
        "@@odata.context": {
            "type": "string"
        },
        "@@odata.etag": {
            "type": "string"
        },
        "parentId": {
            "type": "string"
        },
        "id": {
            "type": "string"
        },
        "fileName": {
            "type": "string"
        },
        "byteSize": {
            "type": "integer"
        },
        "content@odata.mediaEditLink": {
            "type": "string"
        },
        "content@odata.mediaReadLink": {
            "type": "string"
        },
        "lastModifiedDateTime": {
            "type": "string"
        }
    }
}
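For completeness, after the POST creates the attachment record the file bytes still have to be uploaded. A hedged sketch of the follow-up HTTP action, using the media link returned in the Parse JSON step above – the header choices are my assumptions based on the usual BC OData behaviour:

Method: PATCH
URI: <the content@odata.mediaEditLink value from the Parse JSON step>
Headers: If-Match: *
Body: <file content from the OneDrive "Get file content" action>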
I decided to wrap things up by deleting the file from OneDrive now that it exists in BC too. Are you reading this and thinking “Hey! I got the standard BC connector to work for this!”? Please get in touch if so, or get the steps shared! 👍

Business Central CSV import with Power Automate 📥

There are plenty of things I try for blog posts that don’t make the cut. This was almost one of them. I’ll spare you the details on why exactly (until the end) so that we can keep focused on the good parts. My inspiration for this post came from watching this video: https://www.youtube.com/watch?v=2TIolfmtWQM, where an Excel script and a neat trick in Power Automate allow for a CSV file to be converted into an Excel table. The approach itself is very elegant and I’ve used the majority of it to my benefit. I recommend watching the video as I’ll skip over parts covered by it. The Excel script feature is truly awesome 😎

Let’s lay out a scenario where a supplier/vendor will send a CSV file with new items, with some basic details, so you can have them created for you in BC. My super simple CSV for this example:

Barcode,Name,UoM,Price,RRP
4545FB9797,Football,PCS,2.75,8
4545BB1324,Basketball,PCS,3.26,10.5
4545TB1344,Tennis Ball,Pack,0.95,3
4545SB9976,Squash Ball,Pack,0.78,3
4545RB4457,Rugby Ball,PCS,3.25,10.99

As mentioned I’m skipping some details, due to the recommended video, but once you have reached the Excel script stage you will want to add a few lines to cater for an extra row which is created and a caption change to one header value. Here is my script:

function main(workbook: ExcelScript.Workbook, lineCSV: string[]) {
  let selectedSheet = workbook.getActiveWorksheet();
  const alllines = lineCSV;
  let counter = 1;
  for (let line of alllines) {
    if (line.includes(",")) {
      let items = line.split(",");
      selectedSheet.getRange("A" + counter + ":E" + counter).setValues([[items[0], items[1], items[2], items[3], items[4]]]);
      counter++;
    }
  }

  workbook.addTable(selectedSheet.getRange("A1:E" + counter), true).setName("VendorItems");

  // Delete the last blank line
  selectedSheet.getRange(counter + ":" + counter).delete(ExcelScript.DeleteShiftDirection.up);
  // Retype the header, otherwise the value is RRP\r
  selectedSheet.getRange("E1").setValue("RRP");
}

From this point onward the video details are complete and it is time to think about extracting the data from the newly created Excel file to BC.

If you check over image 3 of the gallery you will notice that the “unitCost” and “unitPrice” values make use of an expression. This is to convert the data to the correct type. With it being CSV everything is text. This is an example of the expression to convert to decimals:

float(items('Apply_to_each_2')['RRP'])

All seems too simple, right? Well yes, in hindsight it sort of is. However, I will share the thing that tripped me up. The suggestion from the video is to use a compose action which holds a carriage return. The end result of using that idea (with my data) is like this:

  "Barcode,Name,UoM,Price,RRP\r",
  "UVH72WFZ5EO1,facilisis,PCS,$25.33,55.93\r",
  "SVP51SRG4WY1,sodales,PCS,$85.55,50.16\r",
  "2RBS17JLI4CT,at,PCS,$71.72,29.32\r",
  "3YNT44PPY7LW,metus. In,PCS,$9.71,65.62\r",
  "4OGL41FFB6NB,fermentum convallis,PCS,$94.99,30.4\r",
  ""

Take note of the carriage returns, which are the “\r” parts at the end of each row. Even after the data is converted into an Excel table those carriage returns remain in place. Remedy number 1 is the extra line in your script to rename the last column. Remedy number 2 is then the float() expression. I of course did not stumble across remedy number 2 that quickly, and tried some useless string manipulation 🤐. The performance of the import isn’t too bad. I tried a batch of 5000 records and it took 4.5 mins to process. Bearing in mind it is an automated process, those aren’t 4.5 mins I lost in the space time continuum ⏳

The final thing I would do with this scenario is move the file you have imported to an archive folder. For my scenario to work better you would need a custom API page – I think the catalogue item table is probably a better place for the type of data I was working with 🤷

Business Central OAuth2 for Power Automate 🛂

Generally when writing a post I quickly explain the scenario and get straight to the solution. I need to break that rule here to avoid confusion. As the title suggests this post is about using OAuth2 with Power Automate – it applies also to Azure Logic Apps. However, if you have Power Automate flows using the BC connector you are already covered. If you have custom connectors for BC data which use OAuth2 for authentication, you are also covered. So what’s the point of this blog, you ask? If you use the HTTP connector in your flows to interact with BC data – you have opened the right blog post 👍🏼. Why does this post matter? Well, at the date of writing, 01/03/22, basic authentication for BC web services is being deprecated (https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/upgrade/deprecated-features-platform). OAuth2 is the replacement. If you can move away from using the HTTP connector then please do – that way you can just natively pick up the correct authentication. I say there are two main reasons you wouldn’t move things to the BC connector: you have published a codeunit as a web service, or you use query objects (which aren’t the API type). Lesser reasons might be that you don’t have development resource or want to avoid the cost of refactoring. After all, Power BI gets to use legacy web services so Power Automate should get to as well, right? For PowerApps you need to use a custom connector or the BC connector. You might have Power Automate flows which run from PowerApps – in which case, if they use the HTTP connector, the same explanation as above applies. Woah! 😮 that was a wordy intro. If you’re still with me let’s get to the good bit…

I will admit early on I only came across my answer thanks to some brilliant blogs! I highly recommend reviewing Arend Jan’s posts in particular if you have little background on working with/setting up OAuth2 for BC. Links:

The solution is to have a standalone flow which will retrieve the access token. You will use this flow as a child to your main PA flow. This pattern works well as you don’t have to alter your existing PA flows using the HTTP connector that much. Let’s get the child flow out of the way first as that’s the longer part.

Here is the text string from the compose action mentioned in the gallery of images. Replace the your_client_ID and your_client_secret parts to form the correct string:

concat('grant_type=',encodeUriComponent('client_credentials'),'&scope=',encodeUriComponent('https://api.businesscentral.dynamics.com/.default'),'&client_id=',encodeUriComponent('your_client_ID'),'&client_secret=',encodeUriComponent('your_client_secret'))
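For context, that string becomes the body of the HTTP action which requests the token. A sketch of the client credentials call, with the tenant ID left for you to fill in:

Method: POST
URI: https://login.microsoftonline.com/<your_tenant_ID>/oauth2/v2.0/token
Headers: Content-Type: application/x-www-form-urlencoded
Body: <the output of the compose action above>

The JSON response carries an access_token property – that is what the child flow should return, so the parent flow can add it to its own HTTP actions as an “Authorization: Bearer <token>” header.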

Now we have the child flow up and running, let’s plumb it into a parent flow 🧑‍🔧 I will go through each step with images again.

It’s a fairly elegant way to slot in what is needed for the PA flows this might apply to. There are a number of my own previous blog posts which need this solution – like this one: https://joshanglesea.wordpress.com/2020/10/19/business-central-month-end/

To make life easy I’ve created a template of the child flow which you could import into your environment. There are setups that need dealing with before it will work though: https://github.com/JAng13sea/Blogs/tree/master/Template%20Get%20Access%20Token%20PA%20Flow

Business Central Inventory/Sales Analysis – Item Group Dimension Code❓

I’ll be the first to admit that when I can’t locate a report in BC I will knock something up in Jet Reports or Power BI. Nothing wrong with that approach, as you often end up with something that has nicer visuals and better aggregation. However, when possible shouldn’t we push the boundaries of standard? When it comes to inventory or sales analysis what can we do? Finance have account schedules. Well, if you weren’t aware, inventory/sales analysis has a similar set of options. Search for “Sales Analysis Report” or “Inventory Analysis Report”. There are some good blogs explaining the necessary setups for these – just search for “business central sales analysis report” to find them. Just think of account schedules but for item ledger entries/item budget entries. I’m skipping the basics in this post and attacking a specific scenario.

The part I want to focus on is shown below:

The “Item Group Dimension Code” is part of the inventory setup page. What does it do? Why should I use it? How can I maintain the data in an easier way? All relevant questions that I will try to address. My aim in this blog is to tackle the usage of this field and the relating data.

The first point where this could be used is when building a set of sales analysis line templates:

A user can create a view where item groups are aggregated to give a top level summary of performance or other metrics of their choosing.

If I choose the “Insert Item Groups” option I get the following error message:

This ties back to the earlier screen grab around the “Inventory Setup” field. But that is just a regular dimension code. How do I make it so I can tie that dimension back to data users recognise on the item? Surely it should be related to item categories or a different field? Maintaining both item categories and a standalone dimension is an admin headache if not tackled correctly. Or do I create something which is truly standalone and then expect a user to fill that detail in on an item record too? Such a predicament 🤔

In my case I have opted for the Item Category as the field of choice to flesh out what my dedicated “Item Group” dimension will have. I haven’t included it in my solution, but I chose this field because it has a parent/child aspect which lends itself well to dimension values. The solution for any new items is to use Power Automate to create a default dimension for the item record when it is created. Any existing item records could be resolved with a variation on this or by using a more standard method like a configuration package. Check out this blog post to see how I solved this: https://joshanglesea.wordpress.com/?p=714

This nicely leads on to a further predicament. If you intend to use this feature then you need to have the dimension set up in advance of any postings. The current “Dimension Correction” feature of BC does not cover the item ledger entries – so all prior postings are out of scope unless corrective action, in another form, is taken. My suggestion would be to start it in a new sales year to avoid conflicts. The reason being that you can plug a sales/inventory budget into the data – a very similar theme to account schedules with a budget vs. actual comparison.

This is exactly what I did: I created a new sales budget and then devised it against the new “Item Group” dimension, determining an amount of sales per grouping.

Generate a sales analysis view and use the “Item Group” dimension

If you intend to use the budget area ensure you “Include Budgets” so that data pulls through as well. Note I have left the “Update on Posting” off as it can be a system overhead. Common practice is to have that running as a job queue entry.

Now that the “Item Group” dimension is setup it is possible to use the earlier page action we got an error on to flood the template with data:

Use the “Sales Analysis Report” to then throw together each of the elements:

Which produces a visual like below. Again, perform a wider search on the inner workings of the reporting capabilities. My aim was purely to show how to get the “Item Group” aspect working:

It is possible to apply some of the logic you’ve read here to the fields shown below. If we track back to the Sales Analysis Lines setup you can setup reporting based on the dimension values within:

*Customer Group Dimension Code and Salesperson Dimension Code

Business Central – Automatically add default dimensions with Power Automate ☑️

A scenario came up in a different blog post where I needed this solution: https://joshanglesea.wordpress.com/?p=681

What was the aim from the scenario? I required a default dimension to be added to an item record based on a different field value. I’d made the decision to use the Item Category as a dimension so I could use it for inventory/sales reporting. This solution could be applied to other scenarios as you see fit.

  • The starting point for my scenario was to use the creation of an item record as the trigger.
  • A variable to store the ID value for the dimension in the company you intend to use.
  • A delay, because you are using a “creation” trigger, so you need to ensure, as best you can, that the user has keyed in the field you will use – in my case the Item Category Code.
  • Get the detail of the record which has been created.
  • The find records action is being used to check if the Item Category Code exists as a dimension value code for the dimension being used (the create step that follows is sketched below).
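To round the list off, the final create step could post a body like this against the standard v2.0 defaultDimensions API for items – a hedged sketch, with the placeholders coming from the earlier steps:

{
    "parentId": "<item id from the trigger>",
    "dimensionId": "<variable holding the dimension's id>",
    "dimensionValueId": "<id returned by the find records step>"
}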

Check out the below gallery for a view on each step of the flow I took. I have added a note on each step for explanation purposes.

The interesting takeaway from using the “Find Records” action of the BC connector was that you need to handle the possibility of it throwing back an error. This is where the “scope” facility of Power Automate is supremely useful 😁

An example of the result from using this flow. 👍 no manual input required. Plenty of other use cases for this for sure.

Download the flow if you like: https://github.com/JAng13sea/Blogs/blob/master/CreateaDefaultDimension-ItemGroupCode.zip

Business Central Team Member Item Data Insert Hack ☠️

Emphasis on the hack part here. Right now this is a possibility so why not share – will it stay that way? No guarantees. The inspiration is born out of the fact a team member user license is not able to insert item data (amongst other data, such as Customers and Vendors). The “New” button is greyed out for that user type. Now many businesses like to use a team member license for raising purchase quotes. In the scenario where you are requesting to purchase an item which is not on the system yet, you want to give access but keep control. My idea is to satisfy both sides of that with a simple solution – an HTTP request and no tables.

No ability to add an item or delete an item as a Team member licensed user can be seen above

To resolve this I’ve opted for a mix of AL and Power Automate. The process flow will be like this:

  • Create a page that is not related to a table – so no conflict on entering data into tables
  • Have the page raise an HTTP POST request to a defined endpoint in Power Automate
  • Pick up the data from the endpoint in Power Automate and use it for the resulting business process
  • As Power Automate has plenty of connectors you could have an approval process, Teams message with an adaptive card, details written to SharePoint etc.
The UI design is simple. Have a new page action from the Purchase Quote which opens a NavigatePage page type.
Add the fields you want the user to provide. Note this does use table relations but it does not save the data in a table – it is all just in memory on that instance of the page. I’ve gone for a small number of fields but the final version would cover more details. Dedicated buttons are then available to either halt the process altogether (Cancel) or to progress and send the data on (a sketch of the sending code follows below).
Enable “Allow HttpClient Requests” in the settings of the extension.
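As a rough sketch of the sending part – the variable names mirror the JSON body below, the URL is your flow’s HTTP trigger address, and this is an assumption of shape rather than the exact code in the repository:

    local procedure SendToFlow()
    var
        client: HttpClient;
        cont: HttpContent;
        header: HttpHeaders;
        response: HttpResponseMessage;
        body: JsonObject;
        bodyText: Text;
    begin
        // Page-level variables hold the values the user keyed in
        body.Add('ItemDescription', ItemDescription);
        body.Add('ItemUoM', ItemUoM);
        body.Add('ItemCategory', ItemCategory);
        body.WriteTo(bodyText);
        cont.WriteFrom(bodyText);
        cont.GetHeaders(header);
        header.Remove('Content-Type');
        header.Add('Content-Type', 'application/json');
        // POST to the Power Automate "When a HTTP request is received" URL
        client.Post('<your flow HTTP POST URL>', cont, response);
    end;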
Create a flow in Power Automate and use the above trigger in a manual flow. Set it to expect a POST. In the request body add a JSON schema with each of the fields set up to be sent from BC. My full JSON schema is below:
{
    "type": "object",
    "properties": {
        "ItemDescription": {
            "type": "string"
        },
        "ItemUoM": {
            "type": "string"
        },
        "ItemCategory": {
            "type": "string"
        }
    }
}
Add whatever additional steps you need to have in the flow. Eventually you can use the BC connector and use the “Create Record” action. Note the dynamic content from the HTTP connector. The Power Automate flow will need to run as an essentials or premium user. It is that user who performs the insert/create.

Here is how the experience can look. Note I just create the item right after requesting it:

Try it for yourself: https://github.com/JAng13sea/Blogs/tree/master/Team%20Member%20Hack

Business Central Power Automate Document Approvals – Comments 📝

When a Power Automate document approval is enabled for Business Central we can leverage the approvals connector or another platform. In either of those scenarios, comments an approver makes will stay between the requester and the approver. Now in most scenarios this is fine. However, if we compare this to standard OOTB document approvals for BC then we get all the comment history… so why should we settle for less just because we’ve adopted a different platform for approvals? Plus, the fact the approver for Power Automate doesn’t have to be a BC user means we have no real grasp on who approved. Couple this with the fact the user assigned to the Power Automate BC connector shows as the “approver” in BC.

Looks like a self approval to anyone in BC – but it won’t have been 😬

Now, to cater for all scenarios, we will need something to enable the logging of comments across all possibilities. So I’m talking Sales, Purchase, Customers, Vendors and Items – all of these areas can have approvals in Power Automate for BC. All of them have a standard “Comments” table which can be used as well. I’m looking to avoid the standard approval comments because the page action doesn’t exist when you use Power Automate BC document approvals:

Instead I will use the regular “Comments” area for each piece of data:

There are, as always, many ways to solve this but I’m opting for the quickest. I did consider creating a codeunit to act as a broker for each comments area, but when you can it is great to use all standard objects and tools! For instance, if you’re not familiar with “Page Inspection” check it out. I used it to determine the page numbers for all the areas I need to eventually cover: https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/developer/devenv-inspecting-pages?tabs=page

Simply a case of publishing three pages as web services:

So what kind of data do we have to send to the service so it attaches to the correct record? I will bundle together Customer/Vendor/Item as the protocol is the same you just swap out the table number. Sales and purchase we’ll handle separately.
       //Customer, Vendor or Item can use a minimum JSON body of the below. Just swap out the "Table_Name" and "No"
       {
            "Table_Name": "Item",
            "No": "1896-S",
            "Comment": "Test"
        }

       //Purchase Documents just swap out the "Document_Type" and "No"
       {
           "Document_Type": "Invoice",
           "No": "107209",
           "Document_Line_No": 0,
           "Comment": "Test Comment"
       }

       //Sales Documents just swap out the "Document_Type" and "No"
       {
          "Document_Type": "Invoice",
          "No": "107209",
          "Document_Line_No": 0,
          "Comment": "Test Comment"
       }

So, that’s BC organised, what about Power Automate? I have opted for an OOTB BC template Purchase Invoice approval in Power Automate. The posting of comments, for me at least, can go in two possible places. One is right after the “Start an approval” action.

Or you could place them after the condition so that you can add more context to the comment. In my case I’ve used a concat() function to add an emoji.
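Something along these lines, assuming a single approver – the action name and the response path are my assumptions rather than lifted from the flow:

concat('✅ ', first(outputs('Start_an_approval')?['body/responses'])?['comments'])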

Here is the result:

A minor drawback of using the standard tables/pages here is that the comments table(s) could well be editable to the user. It is reasonable that permissions can control this, as the user who posts the comment is the one assigned to the BC Power Automate connector. Of course, permission sets aren’t the only standard choice – you could use the change log or field monitoring too. If that doesn’t sit well with you then I’d suggest adding a boolean field to control whether the comment record can be altered or not. You can add that to the JSON body you send once it’s available. The final thing being the “Comment” field in the standard tables only houses 80 characters – but that’s the same as the standard BC approvals comment field – so no essay-long approval comments folks 😜

Business Central Power Automate Document Approvals – Cancel ❌

Have you noticed that when using Power Automate for BC document approvals you have the same approval buttons? One for sending and one for cancelling. The sending one works fine. Cancel, on the other hand, isn’t doing what we need. The approvers will still have an outstanding approval and the flow will keep waiting to run. Eventually when it does run it will fail, as the request has been cancelled BC side. If no one approves and the rest of the flow doesn’t run then you have a flow run ongoing until it expires… These things are OK; I just think they make the administration more difficult.

My goal with this post is to have the standard cancel action to actually cancel my Power Automate approval and my Power Automate flow run. Full housekeeping 😄🏡. Let’s cover some basics first:

The approval entries can be viewed by selecting the “Approvals” button. If using Power Automate a different page will appear. It will show a single entry each time an approval is sent. That entry will be adjusted if the approver accepts the request or rejects the request.
Standard BC document approvals would show a line for each approval entry created. With Power Automate we don’t have that detail in BC. This is as much as we get.

With Power Automate you might be using different methods to get the approval dealt with – MS Approvals, Teams, Outlook, other. I will look to solve the MS Approvals scenario as it’s the most common.

So what do we need? Well, the first thing is a record of information from Power Automate back in BC. So for starters, a table extension to the “Workflow Webhook Entry” table. This is where the data from the “Flow Entries” page (above) lives. We have 3 fields, as I want to capture details on numerous attributes so I can cancel all the relevant parts.

tableextension 50100 PA_Approvals extends "Workflow Webhook Entry"
{
    fields
    {
        field(50100; "Power Automate Approval ID"; Guid)
        {
            Caption = 'Power Automate Approval ID';
            DataClassification = CustomerContent;
        }

        field(50101; "Power Automate Flow ID"; Guid)
        {
            Caption = 'Power Automate Flow ID';
            DataClassification = CustomerContent;
        }
        field(50102; "Power Automate Run_ID"; Text[100])
        {
            DataClassification = CustomerContent;
            Caption = 'Power Automate Run ID';
        }
    }
}

Alongside our new fields we are going to need an API page so we can make updates to the BC data. Just showing the properties as this is the important part. I made only my new fields editable

page 50100 PA_Approvals
{
    APIGroup = 'jaGroup';
    APIPublisher = 'ja';
    APIVersion = 'v2.0';
    Caption = 'paApprovals';
    DelayedInsert = true;
    EntityName = 'paApproval';
    EntitySetName = 'paApprovals';
    PageType = API;
    SourceTable = "Workflow Webhook Entry";
    ODataKeyFields = "Workflow Step Instance ID";
    ModifyAllowed = true;
    InsertAllowed = false;
    Permissions = tabledata "Workflow Webhook Entry" = RIMD;

More AL DEV follows but let’s tackle the Power Automate Document approval next. Start off with the standard template for whichever document you need this for. I’ve chosen Purchase Orders.

Kept the BC connector steps in to show the start of the flow. The changes and additions can be seen after the BC connector steps. The rest of the flow fits the standard template.

The compose action “Workflow ID” contains the expression workflow()?['name'] so that I can record the GUID of the workflow. The “Workflow Run ID” contains the expression workflow()?['run']['name'] so that I can record the text string name of the flow run. The template for a BC document approval comes with a “Start and Wait” approval. I’ve removed this and added the “Create an approval” so that I can find out and record the ID of the created approval. You then need the “Wait for an approval” so the flow has the response. The final change/addition is the HTTP step.

I used this instead of the BC connector – where you can select custom API pages – as I got an error about permissions. I believe this is due to the If-Match * being required as a header on the request. Add the URL to the API page and make reference to the “Workflow Step ID” from the BC connector – this filters on that record only. The request will then pass a body to the three new fields. Note that each of the values come from the previously mentioned changes/additions to the document approval template.
The result will be the values of the specific Power Automate Flow back in BC.

So all that remains is the cancellation. For the data in BC to end up back in Power Automate a codeunit is required:

codeunit 50100 PA_CancelApproval
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Workflow Webhook Management", 'OnCancelWorkflow', '', false, false)]
    local procedure CancelPowerAutomateApprovalEntry(WorkflowWebhookEntry: Record "Workflow Webhook Entry")
    begin
        if IsNullGuid(WorkflowWebhookEntry."Power Automate Approval ID") then
            exit
        else
            SendToFlow(WorkflowWebhookEntry."Power Automate Approval ID", WorkflowWebhookEntry."Power Automate Flow ID", WorkflowWebhookEntry."Power Automate Run_ID");
    end;

    local procedure SendToFlow(ApprovalID: Guid; FlowID: Guid; RunID: Text)
    var
        client: HttpClient;
        cont: HttpContent;
        header: HttpHeaders;
        response: HttpResponseMessage;
        Jobject: JsonObject;
        tmpString: Text;
    begin
        Jobject.Add('ApprovalID', ApprovalID);
        Jobject.Add('FlowID', FlowID);
        Jobject.Add('RunID', RunID);
        Jobject.WriteTo(tmpString);
        cont.WriteFrom(tmpString);
        cont.GetHeaders(header);
        header.Remove('Content-Type');
        header.Add('Content-Type', 'application/json');
        // POST the details to your Power Automate flow's HTTP trigger URL
        client.Post('<Add your URL here>', cont, response);
    end;
}

The idea here is that another Power Automate flow will be set up to receive a POST. Note that you will need to add the URL of your Power Automate flow to the codeunit for this to work.

Here are the main steps to the flow. Note that the final step is a custom connector. Use this excellent blog to create/download the connector: Everything to know about Power Automate REST API to manage and administer your flows – Mohamed Ashiq Faleel

The URL shown here is the one you need in the codeunit. Add it to a table if you would rather store it in a visual place. Add a schema of the expected JSON body – link available to paste in a sample payload*
Locate the Dataverse connector and choose the Approvals table. This will remove the approval from a user’s open queue 👍 The “Row ID” has an expression, as the JSON contains a GUID with speech marks and Dataverse doesn’t like that 🙈 to bypass it use substring(triggerBody()?['ApprovalID'],1,36). The same thing is needed for the GUID of the FlowID.
A user keeps an audit trail though. Note the “Outcome” is “Canceled”.
The Flow run history will show you any that are legitimately waiting on an approvers review with the “Status” of “Running”. Any that have been passed to the cancellation flow are dealt with accordingly. 🏡 house back in order 😅

A webhook could be used here as well. I opted not to because you need something in place to renew the webhook, and it would have to POST the information to the flow I chose anyway. Hardcoding the endpoint is a bit 🤮 but it is simple.

Business Central CSV export using Power Automate

A CSV export with Business Central (or NAV previously) is nothing new or exciting. Why blog about it then? Well, let’s tackle the topic using a different solution. This is a CSV export using Power Automate with Business Central data. It could be argued that, providing you have the relevant licensing, this method would stop us needing 3rd party tools.

My scenario will be basic but the application is broad. I’ll use a web service (API page or query is fine too) to expose item data showing some fields. I’ll filter the web service to keep my results condensed and only select the fields I need. The goal: retrieve the required data from BC and create a CSV file on a recurring basis. The file should be added to a chosen storage location.

Power Automate has a recurring type flow which is suitable for this scenario. To ensure better error trapping and subsequent actions I am making use of the “Scope” feature in my flow. I will then add the actions I need. Two more scopes take place after and will only run if the 1st scope succeeds or fails – 1 scope for each possible outcome.

“Get BC Data” is the first scope block. Inside that we have: an HTTP call to a defined web service – details below; Parse JSON so we can understand the schema; and finally a compose action. More on that as you read.

The HTTP call in my case is using the Item Card, but this is adaptable to other data. The main thing I found useful here is the ability to use OData filtering to limit the results. Example of the URL with the filtering and the ability to select specific fields: https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Production/ODataV4/Company(<companyName>)/Item_Card?$filter=Inventory%20gt%200&$select=No,Description,Unit_Price,Item_Category_Code,Inventory

I also liked using an “Accept” header in the HTTP request with this value: application/json; odata.metadata=none. It’s not terribly exciting but when your intention is to read data only (I don’t want to insert or modify) it removes some of the data you won’t be using, like the etag (typically used for tracking modifications). I’d say the majority of the work in this entire process is with the BC data endpoint. You might find a custom API page is needed – for example, to get all the fields you want from each of the tables you need them from.

Most of what I’ve covered thus far is fairly normal in Power Automate. The worthwhile part of this post comes in explaining the use of a compose action – which from the picture is titled “Array for CSV”. Why do we need it? Well, the action that follows for creating a CSV table (a standard feature in Power Automate) expects an array. In the case of BC you will typically get a JSON object back. Without going into extensive detail on JSON objects or arrays, just think about the fact you need the part in between [ ] (square brackets). A simple expression takes care of converting the JSON object you get from BC to an array, which can be converted to a CSV. Create a compose action and add the expression body('Parse_JSON')?['value']. This will extract just the array part, which is typically titled “value” when we retrieve data from a BC web service. An example of this can be seen below:

{
    "value": [
        {
            "No": "150A4444A",
            "Description": "ProBook 450",
            "Item_Category_Code": "LAPTOP",
            "Inventory": 22,
            "Unit_Price": 900
        }
   ]
}

The remainder of the flow is super simple. I’ve used another scope block and adjusted its settings to run only if the first BC data block was successful. Use the built-in “Create CSV table” action and take the output of the earlier compose action with the array you need – this action only accepts an array. Then you can choose where the file should output to. I chose to create a SharePoint file – ensure you put .CSV at the end of the file name.

This is a flexible solution where you can quickly have automated or on-demand CSV exports working for your solution. It is much more rapid than the full BC method, which would involve job queue tasks. You can of course layer in better notifications as well. This is why I opted for the Scope feature, where you can alter the “Configure run after” setting for a previous action.

Here you can see that I will run the “Send Failure Notification” if any of the steps in the “Get BC Data” scope block fail.

Business Central line level approvals

Part 1 – Using one line value

What am I getting at here? Well, think about a document approval scenario where you might need, let’s say, 1 shortcut dimension that you want to use as a driver for who approves the line. This could be a cost centre or a specific project, maybe even a department. Standard BC requires you to create multiple workflow records to handle different filtering. I’ve personally found that “Header” focused fields are the main ones which get checked. For creating line level approvals this is a challenge on performance, as all workflows will be checked. Is there another way? Power Automate provides a fresh view on the topic. You are able to ignore the rigid nature of the BC workflow engine and come up with something new. Couple that with the fact you have more notifications on offer and it’s an intriguing proposition.

“Department Code” represents the value I want to create approvals for.

Let’s work with the business process being that you need approval from the department head on a line if a specific department value appears. In my example I will use a global dimension but in theory you can adapt this to any line field exposed by an API or OData page. There is a set of document approval templates available with Power Automate for the BC connector but they won’t be used here.

The initial event trigger of “when a document approval is requested” is the starting position of the flow. Naturally followed by GETting the details of the record. From this point onwards it’s all about the line data and working it so we can cycle through accordingly and get approvals. In my scenario I have a multi-line document which has repeating “Department” values. I want to create a single approval per department, not a single approval per line. The approver is approving the total for their department.

The principal mechanism that makes this flow produce the desired results is counting the approvals result. If we have a unique total of values, so departments in my case, then I need to match that with approval results to get an approved document. So 3 unique values means 3 approval results to get an approved document.

Parse JSON is used after the HTTP call to get the line detail. This enables us to pass those values to other areas. So that I have a simple array to work with I select only the dimension value from the parsed JSON, creating a new clean array. Given we only want unique values, and our current example has 7 lines with dimension data, it is necessary to use an expression. I am using the “union” expression to compare the array from the select to itself. This results, in my case, in 3 unique values (see the expression sketch below).
After this we want to have two variables, which we will compare later on. Set them as above and use the “length” expression to count the number of values created by the previous “Unique Dimensions” compose action.
The mentioned “Outputs” is from the Unique Dimensions compose action. We are using an “apply to each” to loop through the results of that. The “Switch” then checks the value and performs the necessary action.
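As a sketch, the two expressions could look like the below – “Select_Departments” and “Unique_Dimensions” being my assumed action names:

// Unique Dimensions compose action - union of an array with itself de-duplicates it
union(body('Select_Departments'), body('Select_Departments'))

// Unique_Count variable - how many unique departments need an approval result
length(outputs('Unique_Dimensions'))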

With a “Switch” statement I draw some comparisons with the standard workflow in terms of maintenance. This is where I check for results on my chosen line level field value. If I get a match then I send an approval to the relevant person. 📝 Note that we could look up the approver recipients from a setup table if needed. I feel the current setup resembles some of the standard BC workflow setup.

The condition of each switch statement approval is to count the approval result and add it to the earlier set “Approval_Counter” variable. Note that we take no action if the approval request is rejected.
Outside of the “Apply to each”, create another condition. Compare the “Approval_Counter” with the “Unique_Count” variable. If they match then you can complete the BC Approve action. If they don’t, you can complete the BC Reject action.
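Assuming integer variables named as above, each branch where the response is Approve runs an “Increment variable” action on “Approval_Counter”, and the final condition can use an expression like:

  • Condition = equals(variables('Approval_Counter'), variables('Unique_Count'))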

If you like this pattern please comment. Submitting it as a template for more users to gain access might be worthwhile 👍🏼 If you want to avoid doing all the steps described then access the template for download here: https://github.com/JAng13sea/Blogs/tree/master/Line%20level%20approvals

Want to see it in action? Check out my video from Scottish Summit 2021 where I demo it! https://youtu.be/ldOnzRySzcY

Business Central Company Master Data Synchronisation with Power Automate

There are apps that handle this and I’ve seen a colleague’s version of it recently which is great. However, why code when you don’t have to? The limiting factor with no code is that there is no restriction on the other companies, i.e. other companies can still create data locally. Some solutions look to reduce this possibility. This is a basic version that doesn’t cover that 🤪

So here’s the goal – have one BC company create master data and pass it on to other companies. Have one company’s master data modifications passed to other companies. Have one company’s master data deletions result in deletions in other companies.

Note that each type of data action will be in an individual Power Automate Flow – i.e. one for Insert, one for modify and one for delete.

“Code when you don’t have to” isn’t quite what I’ve gone for, but it is certainly low code. My reasons are totally justified as you’ll see. This solution could be used on G/L Accounts, Customers, Vendors, Items, Dimensions, Fixed Assets, Posting Groups etc. I have chosen to demo it with G/L Accounts as you need a little extra there. If you check the standard API for accounts (g/l accounts) it only has a GET command, which is fine for one part of the process but no good after that. I’ve knocked up a page with the idea of using it as a web service – it could be done with an API page too. Either way, that gives us an endpoint to work with in Power Automate. The start of the flow will use an event based trigger from the regular BC connector, whether you are dealing with insert, modify or delete. Only after that will you look at the HTTP connector, which enables you to work with an endpoint URL like that of our web service page. Once you have that page you are able to POST, PATCH or DELETE. Insert and modify follow the same concept so I’ll cover just one. For DELETE something else is needed. Code can be found here: https://github.com/JAng13sea/Blogs/tree/master/Master%20Data%20Sync
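For illustration, the supplementary page could look something like this minimal sketch – the object ID, name and field selection here are my own assumptions (the real object is in the GitHub link above):

page 50100 "GL Accounts WS"
{
    PageType = List;
    SourceTable = "G/L Account";
    DelayedInsert = true;

    layout
    {
        area(Content)
        {
            repeater(Group)
            {
                // Expose whichever fields need to be kept in sync between companies
                field("No."; Rec."No.") { ApplicationArea = All; }
                field(Name; Rec.Name) { ApplicationArea = All; }
                field("Account Type"; Rec."Account Type") { ApplicationArea = All; }
                field("Direct Posting"; Rec."Direct Posting") { ApplicationArea = All; }
            }
        }
    }
}

Published as a web service (I’ll assume the name GLAccountsWS from here on) this gives a full CRUD ODataV4 endpoint per company.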

Each flow will start with the BC connector and use one of the 3 triggers we are interested in. Note that Power Automate refers to the actions as New (Insert), Changed (Modified) and Deleted (that’s the same 😁)

In my example with G/L Accounts not much information can be taken from the standard BC connector with a GET command. That is why I have the supplementary page in addition, allowing me to perform more CRUD actions.

The second part of the flow is to GET the details of the record which has been created or modified – the delete step does something slightly different, read on to find out. The HTTP connector is being used here to pull the required data. An ODATA filter has been added so that the correct record is shown. The standard BC connector will give you a “Row id” value that is dynamically displayed.
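As an illustrative sketch – the service name GLAccountsWS is my assumption, and exactly which field you filter on depends on what your page exposes – the GET could look like:

GET https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/Company('MASTER')/GLAccountsWS?$filter=No eq '10100'

In the real flow the filter value is the dynamic content from the trigger rather than a hard-coded number.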

Once the HTTP connector is dealt with for pulling the right information, the built-in Parse JSON action can be used. This will make each part of the response data from the GET command accessible dynamically, so we can produce a new JSON for patching or posting data into one of the other companies 👍 The Parse JSON action allows you to paste a schema. If you have tested the URI from the GET command you will have something to copy and paste.

Once a schema has been generated from the sample – note the button you can press – it will place a valid JSON schema in the main area of this action, like above.
Depending on how many companies this will be for, you need to create the above for each. Thankfully a copy and paste feature does exist and you can just change the company value. I have used an ODATA page here so the company name is accepted. For an API page use the company ID value.

An “Apply to each” starts each block that handles the companies being updated – this is due to the JSON we parse – it is just a loop, but it will only run once per record. Place the “value” into the “Apply to each” so it can iterate through the record. Add a new HTTP action and choose the relevant command (PATCH for modify, POST for insert, DELETE for delete). The URI here selects the “Number” of the record to be worked with, and that has come directly from the JSON that was parsed. Once your headers are entered you just need to build up a body. If you access the URI from a browser or with an HTTP test tool, grab a sample body. Remove the static elements and place in the dynamic JSON. As mentioned, the copy and paste feature speeds this process up for the other companies, where a change to the company name or ID is the only added step.
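A modify block for a second company could end up looking something like this sketch – company, service and field names are assumptions, and the If-Match header avoids ETag conflicts:

PATCH https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/Company('COMPANY2')/GLAccountsWS('10100')
Headers:
  Content-Type: application/json
  If-Match: *
Body (dynamic values from the Parse JSON step):
{
  "Name": "Sales of Retail",
  "Direct_Posting": true
}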

Create parallel branches for the other companies you want to update. Passing the data at the same time is fine, as in the back end they are all separate tables.

What has to happen for the delete option though? If we delete the data from the “master” company the reference information is lost, and of course per company the ID value will be different as that is a unique GUID. To aid this I have opted for an event subscriber in a codeunit on the OnDelete trigger of each table I intend to use the feature for. There is a supporting table which then holds the deleted information, so I have enough to use a primary key value in one of the other companies to do a delete.
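A sketch of that pairing – table and codeunit names and IDs are my assumptions, the real code is in the GitHub link above:

table 50101 "Deleted Master Data"
{
    fields
    {
        field(1; "Table No."; Integer) { }
        field(2; "Primary Key Value"; Code[20]) { }
        field(3; "Deleted At"; DateTime) { }
    }

    keys
    {
        key(PK; "Table No.", "Primary Key Value") { Clustered = true; }
    }
}

codeunit 50102 "Master Data Delete Log"
{
    // Log each G/L Account deletion so the flow can delete the twin records in the other companies
    [EventSubscriber(ObjectType::Table, Database::"G/L Account", 'OnAfterDeleteEvent', '', false, false)]
    local procedure OnAfterDeleteGLAccount(var Rec: Record "G/L Account"; RunTrigger: Boolean)
    var
        DeletedMasterData: Record "Deleted Master Data";
    begin
        if Rec.IsTemporary() then
            exit;
        DeletedMasterData.Init();
        DeletedMasterData."Table No." := Database::"G/L Account";
        DeletedMasterData."Primary Key Value" := Rec."No.";
        DeletedMasterData."Deleted At" := CurrentDateTime();
        if DeletedMasterData.Insert() then;
    end;
}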

A supporting page is then used for the initial GET command so the details can be retrieved about the deleted record. After that the structure of the flow remains similar.


Business Central Month End with Power Automate

A month end procedure is a regular topic to be covered when implementing finance with D365 BC. Some finance systems throw a user into a defined routine, but D365 BC is much less prescriptive. To remove the need for users to be experts, wouldn’t a defined routine be useful?

Here is the goal: create a Power Automate flow that will handle the month end procedure in BC with minimal user input. The principal thing here is that month end is routine, so why not have something to cater for it. The caveat is that job knowledge makes up a big portion of this finance process, so not all of it can be catered for. The scope is laid out in the next paragraph.

So what makes up a month end in BC and what is the scope of this procedure?

Our first pointer is around the User Setup, and this table could be home to a number of users that are in and out of scope for the procedure. So without any code the solution here is to define a “User Group” – a perfect standard system placeholder. The added benefit is this table has “Company Name” included, so if you want a differing solution per company you’ve got it! 😊

Add the users whose allow to/from posting dates should be changed by the procedure. People in the finance team, for example, may need longer to post, so you could define another group for them and create another step in the process to roll on their dates. The page for this is 9831, and you need to know this as a web service is needed – once published, move on to the next section.

Once page 9831 is published as a web service some ODATA filtering is needed to get what we need from the exposed data. I called my web service “MonthEndUsers” so I just add the orange part onto the end of the published web service URL: MonthEndUsers?$filter=User_Group_Code%20eq%20%27MONTH-END%27

Whilst on the web services page publish page 119 – User Setup – as this is needed too.

Head over to Power Automate and create a scheduled flow:

Flows run using UTC, so if you need a specific time zone to be used, check the advanced options of the “Recurrence” step once you have created the flow.

First thing needed is the dates for the posting from and to which will be dynamic each time. Add a “Compose” action for both of these and use the following formulas to get what we need:

  • Start of the month = startOfMonth(utcNow(),'yyyy-MM-dd')
  • End of the month = formatDateTime(subtractFromTime(startOfMonth(addToTime(utcNow(),1,'month')),1,'day'),'yyyy-MM-dd')

Add an HTTP action to the flow, paste in the ODATA URL with the filter, and use the GET method. Add a Parse JSON to grab the body from the HTTP action. Paste in a schema, which you can get from going to the URL in a browser and authenticating. Add an “Apply to each” control and use the “value” of the Parse JSON. The last step is to add two PATCH requests to the User Setup page. Why two, you ask? Well, the page as standard won’t allow you to have a “From” date ahead of the “To” date, and the body of an HTTP request is not dealt with in the order we send it, so to prevent errors two requests are needed: the first sets the “To” date and the second sets the “From” date.

The PATCH command needs specific syntax for filtering on the user record. At the end of the published web service URL the orange text represents one user value. The flow will use the “UserName” from the Parse JSON step: UserSetup('UserName'). In the header of your HTTP action add the below to prevent the likelihood of errors:
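A sketch of the two requests – the UserSetup service name follows the pattern above, and the If-Match header is my assumption for the error-prevention header, since it avoids ETag conflicts:

PATCH https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/Company('CRONUS')/UserSetup('JOSH')
Headers:
  Content-Type: application/json
  If-Match: *

Body 1 – roll the “To” date first:
{ "Allow_Posting_To": "2022-09-30" }

Body 2 – then the “From” date:
{ "Allow_Posting_From": "2022-09-01" }

In the flow the two dates come from the earlier Compose actions rather than being hard-coded.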

General Ledger Setup cannot be altered with ODATA, so a different solution is needed for that page, most likely with a modification – more on that below.

The remaining activity is the recurring journals and the General Ledger Setup. At the time of writing the standard BC connector for Power Automate has a post-journal action, but it does not work with recurring journals. If your scenario is without recurring journals, the standard guide on journal posting will work fine. I will cover what to do about recurring journals, and I will use the batch name instead of the ID for ease of use. Publishing the General Ledger Setup page as a web service does not enable you to modify the data – it errors. As a result these two areas need development to give the final result needed.

To do this I’ve opted for a single codeunit which I will publish as a web service and use as an unbound action to complete the work by passing specific parameters:

Not a lot of code is needed here, and it is more about the HTTP syntax and passing a body in the request.
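As a sketch of what such a codeunit might contain – the object name matches the URLs below, but the procedure internals and parameters are my assumptions (the real code is in the GitHub link at the end):

codeunit 50103 PostReccurJrnl
{
    // Posts every line in the given recurring journal batch
    procedure postJrnl(templateName: Code[10]; batchName: Code[10])
    var
        GenJournalLine: Record "Gen. Journal Line";
    begin
        GenJournalLine.SetRange("Journal Template Name", templateName);
        GenJournalLine.SetRange("Journal Batch Name", batchName);
        if GenJournalLine.FindFirst() then
            Codeunit.Run(Codeunit::"Gen. Jnl.-Post Batch", GenJournalLine);
    end;

    // Rolls the company-wide allowed posting date range forward
    procedure AdjustGLSetup(allowFrom: Date; allowTo: Date)
    var
        GLSetup: Record "General Ledger Setup";
    begin
        GLSetup.Get();
        GLSetup.Validate("Allow Posting From", allowFrom);
        GLSetup.Validate("Allow Posting To", allowTo);
        GLSetup.Modify(true);
    end;
}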

Unbound actions with codeunits published as web services are fairly new and highly useful. Simply publish your new codeunit as a web service and then use this logic within the URL (note the orange part changes depending on the function you need to use):

https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/PostReccurJrnl_postJrnl?company={companyID}

or

https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/PostReccurJrnl_AdjustGLSetup?company={companyID}

Within Power Automate we now have an endpoint to use and we can simply pass in the necessary parameters. Note that in this instance both will be POST actions. The body is just the names of the parameters and the required values:
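For example, bodies along these lines – the parameter names must match the codeunit procedure signatures, and mine are assumptions:

For PostReccurJrnl_postJrnl:
{ "templateName": "RECURRING", "batchName": "DEFAULT" }

For PostReccurJrnl_AdjustGLSetup:
{ "allowFrom": "2022-09-01", "allowTo": "2022-09-30" }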

I’ve tagged mine onto the end of the user date change flow as I already have the dates figured out and I want it to run at that time anyway 😁

Code available here: https://github.com/JAng13sea/Blogs/tree/master/Month%20End

Power Automate with Business Central update via codeunit using SharePoint data

I’m writing this in a quicker style than usual, as the understanding I have around it was gained from another blog so I deserve no credit on that front. Grazie Stefano: https://demiliani.com/2019/06/12/dynamics-365-business-central-using-odata-v4-bound-actions/

Also a mention to my colleague Dan Kinsella (https://dankinsella.blog/). He helped with the codeunit element of this solution. Cheers Dan 🍻 I owe you a beer!

Why have I decided to blog about this? Well, I had a meeting with a prospect recently that sparked it. The idea was that they would use SharePoint to hold website images as a repository. Naturally they wanted to see the images in BC. A modification to store more images could have been explored, but I left that alone. Instead I suggested the use of the “Links” feature. The user then clicks on the SharePoint URL to see the image… no need to store it twice, so better for database size 👍

Getting the data from SharePoint to BC, though, was another thing, as the “Links” area is a system table, so there is no chance of a direct HTTP call with Power Automate. So the goal here is: create a codeunit that is accessed from a web service and stamps a link on an item record.

So we start off with a very basic codeunit.
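Something along these lines – a sketch where the record’s built-in AddLink method does the heavy lifting; names and IDs are my assumptions (the real code is in the GitHub link below):

codeunit 50104 "Add Item Link"
{
    // Stamps a URL onto the item's Links FactBox
    procedure AddItemURL(itemNo: Code[20]; url: Text)
    var
        Item: Record Item;
    begin
        Item.Get(itemNo);
        Item.AddLink(url, 'SharePoint image'); // second parameter is a free-text description
    end;
}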

To go along with that we need a page, as it is the page we publish as a web service. Read Stefano’s blog for the reason why – he discovered it after all.

Add the fields that you need for the circumstance. In my case it is just the item number, system ID and the description – only the item number is really needed in my scenario. A function is then needed on the page, and you need to ensure you have the [ServiceEnabled] part. Again, check out Stefano’s blog to understand why.
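A sketch of that page – object ID, names and the exact signature are my assumptions, so lean on Stefano’s post for the pattern:

page 50105 "Item Links WS"
{
    PageType = List;
    SourceTable = Item;

    layout
    {
        area(Content)
        {
            repeater(Group)
            {
                field(itemNo; Rec."No.") { ApplicationArea = All; }
                field(systemId; Rec.SystemId) { ApplicationArea = All; }
                field(description; Rec.Description) { ApplicationArea = All; }
            }
        }
    }

    // Exposed as a bound action (/NAV.AddItemURL) once the page is published as a web service
    [ServiceEnabled]
    procedure AddItemURL(itemNo: Code[20]; url: Text)
    var
        AddItemLink: Codeunit "Add Item Link";
    begin
        AddItemLink.AddItemURL(itemNo, url);
    end;
}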

Testing it in Postman, the final result looks like this. Note the addition to the URL of /NAV.AddItemURL, which is the name of the page function from the last screenshot.
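Illustratively, the request could be shaped like this – the service name and values are assumptions:

POST https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/Company('CRONUS')/ItemLinksWS/NAV.AddItemURL
Content-Type: application/json

{
  "itemNo": "1000",
  "url": "https://contoso.sharepoint.com/sites/images/item-1000.jpg"
}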

After posting with Power Automate I get the following:

Copy of the AL code is here: https://github.com/JAng13sea/Blogs/tree/master/Add%20Item%20Link%20-%20AL

Business Central Batch post and e-mail with Power Automate – no code

Batch posting has been in the product for some time, but I am not seeing a clear way – happy to be wrong – to batch post and e-mail. Here are my options (below) as a user conducting the posting. Yes, I have “post via the job queue” as well, but from what I’ve tested this isn’t sending my e-mails for me. Even the print option yields no results here. I thought it would at least use the “Document Sending Profile” on the customer record.

Even when you use “Post Batch” you get nothing about e-mailing? 😬 Not forgetting the workflows in BC have no option for “Post & Send” – it’s just “Post”. Usually these gaps would need filling with a code modification, but not in this blog.

Where there is a will there is a way, and my idea here is to use Power Automate to do the heavy lifting. On this occasion I will use a manual trigger, but if this was a production ready scenario I would use a scheduled flow.

The connector for BC in Power Automate is fairly small, but I’m sure it will get better over time. There is a slight restriction though: the action of posting and emailing exists, but it’s at a singular level like it is in the regular UI. So the first thing I need to do is fire in some data I can use in Power Automate from BC. I will of course need the GUID ID reference for the sales invoice, as that’s what the BC Power Automate connector likes. The page that fits the bill here is available as a web service already – I will use the top one in the list, 2811. This page is a mixture of open and posted sales invoices so some filtering is needed:

Example of the payload from this page:

Adding a filter to the ODATA query means I only post invoices that are ready to be posted. In the case of this page the “status” values are different to regular BC; in my case I have a choice between draft aka “Open” or open aka “Released”. So I have added ?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200 to the end of the web service URL I got from BC. The reason being the feature in the Power Automate BC connector restricts me to draft only, and I don’t want to post invoices with a 0 value. I will log something on the Power Automate forum about the connector wanting “Open” invoices rather than “Released” – backwards logic otherwise; what if I’m using approvals, for instance? I think this is possibly wrong so I have posted this to find out more: https://powerusers.microsoft.com/t5/Connecting-To-Data/Is-the-Business-Central-Post-amp-Send-Invoice-action-correct/m-p/637081#M9492
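Decoded, the full request looks something like this – the web service name here is an assumption based on page 2811:

GET https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/Company('CRONUS')/SalesInvoiceEntity?$filter=status eq 'Draft' and totalAmountIncludingTax gt 0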

Anyway….

Our flow starts to take shape like this. Check out my last blog for a more detailed account on the parse JSON part (https://joshanglesea.wordpress.com/2020/07/22/power-automate-purchase-order-alerts-to-vendor/)

Once we add the “id” from the Parse JSON step you will get an “Apply to each” wrapped around the chosen BC action automatically, which is exactly what we want. Remember our goal – batch post and e-mail invoices – so we are better placed than standard.

In my case a small batch of invoices has been handled and e-mails dished out accordingly. I added an extra step for updating the posting date as well, just so it takes something useful from the existing “Batch Posting” feature 👍