D365 Business Central, Power Automate

Business Central CSV export using Power Automate

A CSV export with Business Central (or NAV previously) is nothing new or exciting. Why blog about it then? Well, let's tackle the topic using a different solution: a CSV export of Business Central data using Power Automate. It could be argued that, provided you have the relevant licensing, this method removes the need for third-party tools.

My scenario will be basic but the application is broad. I’ll use a web service (API page or query is fine too) to expose item data showing some fields. I’ll filter the web service to keep my results condensed and only select the fields I need. The goal: retrieve the required data from BC and create a CSV file on a recurring basis. The file should be added to a chosen storage location.

Power Automate has a recurrence (scheduled) flow type which is suitable for this scenario. To allow better error trapping and control over subsequent actions I am making use of the "Scope" feature in my flow, and I will then add the actions I need. Two more scopes follow and will only run if the first scope succeeds or fails – one scope for each possible outcome.

"Get BC Data" is the first scope block. Inside it we have: an HTTP call to a defined web service (details below), a Parse JSON action so we can work with the schema, and finally a Compose action. More on that as you read on.

The HTTP call in my case uses the Item Card, but this is adaptable to other data. The main thing I found useful here is the ability to use OData filtering to limit the results. Here is an example of the URL, with filtering and a $select of specific fields: https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Production/ODataV4/Company(<companyName>)/Item_Card?$filter=Inventory%20gt%200&$select=No,Description,Unit_Price,Item_Category_Code,Inventory

I also liked adding an "Accept" header to the HTTP request with the value application/json; odata.metadata=none. It's not terribly exciting, but when your intention is to read data only (I don't want to insert or modify) it removes some of the data you won't be using, like the etag (typically used for tracking modifications). I'd say the majority of the work in this entire process is with the BC data endpoint. You might find a custom API page is needed, for example to get all the fields you want from each of the tables you need them from.
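Putting those pieces together, the GET request ends up shaped like this (authentication is omitted here – typically basic auth with a web service access key, or OAuth, depending on your environment):

    GET https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Production/ODataV4/Company(<companyName>)/Item_Card?$filter=Inventory%20gt%200&$select=No,Description,Unit_Price,Item_Category_Code,Inventory
    Accept: application/json; odata.metadata=none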

Most of what I've covered thus far is fairly normal in Power Automate. The worthwhile part of this post comes in explaining the use of a Compose action – titled "Array for CSV" in the picture. Why do we need it? The action that follows for creating a CSV table (a standard feature in Power Automate) expects an array, whereas BC will typically output a JSON object. Without going into extensive detail on JSON objects versus arrays, just think of it as needing the part between the [ ] (square brackets). A simple expression takes care of converting the JSON object you get from BC into an array, which can then be converted to a CSV. Create a Compose action and add the expression body('Parse_JSON')?['value']. This extracts just the array part, which is typically titled "value" when we retrieve data from a BC web service. An example can be seen below:

{
    "value": [
        {
            "No": "150A4444A",
            "Description": "ProBook 450",
            "Item_Category_Code": "LAPTOP",
            "Inventory": 22,
            "Unit_Price": 900
        }
   ]
}
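With the expression in place, the output of the "Array for CSV" compose is just the inner array – exactly the shape "Create CSV table" wants:

    [
        {
            "No": "150A4444A",
            "Description": "ProBook 450",
            "Item_Category_Code": "LAPTOP",
            "Inventory": 22,
            "Unit_Price": 900
        }
    ]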

The remainder of the flow is super simple. I've used another scope block and adjusted its settings to run only if the first BC data block was successful. Use the built-in "Create CSV table" action and feed it the output of the earlier Compose action with the array you need – this action only accepts an array. Then you can choose where the file should be output to. I chose to create a SharePoint file – ensure you put .csv at the end of the file name.

This is a flexible solution where you can quickly have automated or on-demand CSV exports working for your solution. It is much more rapid than the full BC method, which would involve job queue tasks. You can of course layer in better notifications as well. This is why I opted for the Scope feature, where you can alter the "Configure run after" setting for a previous action.

Here you can see that I will run the “Send Failure Notification” if any of the steps in the “Get BC Data” scope block fail.
D365 Business Central, Power Automate

Business Central line level approvals

Part 1 – Using one line value

What am I getting at here? Well, think about a document approval scenario where you need, let's say, one shortcut dimension to act as the driver for who approves the line. This could be a cost centre, a specific project, maybe even a department. Standard BC requires you to create multiple workflow records to handle different filtering. Personally, I've found that header-level fields are the main ones that get checked. For line level approvals this is a challenge for performance, as all workflows will be checked. Is there another way? Power Automate provides a fresh view on the topic. You are able to ignore the rigid nature of the BC workflow engine and come up with something new. Couple that with the wider range of notifications on offer and it's an intriguing proposition.

“Department Code” represents the value I want to create approvals for.

Let's work with the business process being that you need approval from the department head on a line if a specific department value appears. In my example I will use a global dimension, but in theory you can adapt this to any line field exposed by an API or OData page. There is a set of document approval templates available with Power Automate for the BC connector, but they won't be used here.

The trigger event "When a document approval is requested" is the starting position of the flow, naturally followed by GETting the details of the record. From this point onwards it's all about the line data and working it so we can cycle through accordingly and get approvals. In my scenario I have a multi-line document which has repeating "Department" values. I want to create a single approval per department, not a single approval per line. The approver is approving the total for their department.

The principal mechanism that makes this flow produce the desired result is counting approval results. If we have a number of unique values – departments in my case – then I need to match that number with approval results to get an approved document. So 3 unique values means 3 approval results are needed to approve the document.

Parse JSON is used after the HTTP call to get the line detail. This enables us to pass those values to other areas. So that I have a simple array to work with, I select only the dimension value from the parsed JSON, creating a new clean array. Given we only want unique values, and our current example has 7 lines with dimension data, an expression is necessary. I am using the "union" expression to compare the array from the Select action to itself. This results, in my case, in 3 unique values.
After this we want two variables, which we will compare later on. Set them as above and use the "length" expression to count the number of values created by the previous "Unique Dimensions" compose action.
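As a rough sketch of those pieces – my action names ("Select_Department" for the Select, "Unique_Dimensions" for the compose) are placeholders, so adjust to match your own flow:

    Unique Dimensions (Compose): union(body('Select_Department'), body('Select_Department'))
    Unique_Count (Initialize variable, Integer): length(outputs('Unique_Dimensions'))
    Approval_Counter (Initialize variable, Integer): 0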
The "Outputs" mentioned is from the Unique Dimensions compose action. We use an "Apply to each" to loop through its results, and the "Switch" then checks each value and performs the necessary action.

With the "Switch" statement I'd draw some comparisons to standard workflow in terms of maintenance. This is where I check for results on my chosen line level field value. If I get a match then I send an approval to the relevant person. 📝 Note that we could look up the approver recipients from a setup table if needed. The way it is currently set up resembles some of the standard BC workflow setup.

Each switch branch, when the approval result comes back approved, adds to the earlier "Approval_Counter" variable. Note that we take no action if the approval request is rejected.
Outside of the “apply to each” create another condition. Compare the “Approval_Counter” with the “Unique_Count” variable. If they match then you can complete the BC Approve action. If they don’t you can complete the BC Reject action.

If you like this pattern please comment. Submitting it as a template for more users to gain access might be worthwhile 👍🏼 If you want to avoid doing all the steps described then access the template for download here: https://github.com/JAng13sea/Blogs/tree/master/Line%20level%20approvals

Want to see it in action? Check out my video from Scottish Summit 2021 where I demo it! https://youtu.be/ldOnzRySzcY

D365 Business Central

Business Central Attachments with standard API – Part 1 Basic Explanation

A nice feature which is explained fairly well on the Microsoft Docs site, but it is especially nice to have a working example (https://docs.microsoft.com/en-us/dynamics-nav/api-reference/v1.0/api/dynamics_attachment_patch). The attachments API actually points to the "Incoming Documents" feature of BC, so terminology might catch you out. The scope of the API therefore covers whatever you can use "Incoming Document" for: Purchase Invoice/Credit, Sales Invoice/Credit and Journal lines.

You need two HTTP calls to make this one work, and this is the pattern:

Firstly you POST a simple line to say what you are attaching to, i.e. our Invoice/Credit record or our journal line. The parentId must be the ID (SystemId) value of that record. How this plays out in a practical sense will depend on how you implement the API with your solution. In the scenario where you create a purchase invoice and then want an attachment latched against it, you will need to perform the creation of the purchase invoice record first – a follow-up post will cover this. Here is how the first action looks:

https://api.businesscentral.dynamics.com/v2.0/<tenantID>/uat/api/v1.0/companies(companyID)/attachments

Body:

{
    "parentId": "5a086bc7-195f-eb11-89f9-0022481ab2d5",
    "fileName": "Example.pdf"
}

Here is how it looks in an HTTP client like Postman. Note that the "fileName" property is just a value you choose, so if this was part of a solution you might set it to the invoice number or document no. of the journal line you created before this step.

Now the PATCH part is where the order of this becomes very important. You need a value from the POST to do the PATCH…..😮

This is the response I get from doing the POST command. It is the ID value which is new to the story and is how I perform the actual attachment process. POST is just to create the Incoming Document record….penny drops 🙈

Once the above is known then the PATCH command shown in Microsoft Docs becomes very easy to follow. Here is the command: PATCH businesscentralPrefix/companies({companyId})/attachments(parentId={parentId},id={attachmentId})/content . Simply add the information we now know. So in my case, with the above screenshot, my “parentID” is 5a086bc7-195f-eb11-89f9-0022481ab2d5 and my “id” is 7a1f7c6c-1e5f-eb11-89f9-0022481ab2d5. Notice that the odata.etag has been provided from the previous POST step so you can add that in as well for the “Headers” of the PATCH command. Here is what I mean:
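Assembled with those values, the full request is shaped like this (If-Match is the standard header for sending the etag; the Content-Type is my assumption for a binary upload):

    PATCH https://api.businesscentral.dynamics.com/v2.0/<tenantID>/uat/api/v1.0/companies(<companyID>)/attachments(parentId=5a086bc7-195f-eb11-89f9-0022481ab2d5,id=7a1f7c6c-1e5f-eb11-89f9-0022481ab2d5)/content
    If-Match: <odata.etag from the POST response>
    Content-Type: application/octet-stream

    <binary file content>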

The etag is there to track changes – to make sure someone else hasn't modified the record since you last retrieved it. It is possible to use a wildcard here if needs must.

In Postman, at least, the body of the PATCH command is something like below, but in a real solution you just need to make sure you pass a binary file:

In Postman, if you are testing this, you can select a file to work with and it will pop open your file explorer.

Once each of these is in place you can go ahead with the PATCH. You should get a 204 status response, but there will be no response body.

When you head over to BC, though, you will see that your record – which started life as just an ID – now has an "Incoming Document" record against it.
Here is the record in the Incoming Documents table. Once posted, a user will be able to view the document as well.
Power Automate

Business Central Month End with Power Automate

A month end procedure is a regular topic to be covered when implementing finance with D365 BC. Some finance systems throw a user into a defined routine, but D365 BC is much less prescriptive. To remove the need for users to be experts, wouldn't a defined routine be useful?

Here is the goal: create a Power Automate flow that will handle the month end procedure in BC with minimal user input. The principal thing here is that month end is routine, so why not have something to cater for it. The caveat is that job knowledge makes up a big portion of this finance process, so not all of it can be catered for. The scope is laid out in the next paragraph.

So what makes up a month end in BC, and what is the scope of this procedure?

Our first pointer is around the User Setup and this table could be home to a number of users that are in and out of scope for the procedure. So without any code the solution here is to define a “User Group” – perfect standard system placeholder. The added benefit is this table has “Company Name” included so if you want a differing solution per company you got it! 😊

Add the users that you want to be part of the Allow Posting To/From date change only. People in the finance team, for example, may need longer to post, so you could define another group for them and create another step in the process to roll their dates on. The page for this is 9831, and you need to know this as a web service is needed – once it's published, move on to the next section.

Once page 9831 is published as a web service, some OData filtering is needed to get what we need from the exposed data. I called my web service "MonthEndUsers", so I just add the filter onto the end of the published web service URL: MonthEndUsers?$filter=User_Group_Code%20eq%20%27MONTH-END%27

Whilst on the web services page publish page 119 – User Setup – as this is needed too.

Head over to Power Automate and create a scheduled flow:

Flows run using UTC so if you need a specific time zone to be used check the advanced options once you have created the flow in the “Recurrence” step.

First thing needed is the dates for the posting from and to which will be dynamic each time. Add a “Compose” action for both of these and use the following formulas to get what we need:

  • Start of the month = startOfMonth(utcNow(),'yyyy-MM-dd')
  • End of the month = formatDateTime(subtractFromTime(startOfMonth(addToTime(utcNow(),1,'month')),1,'day'),'yyyy-MM-dd')

Add an HTTP action to the flow, paste in the OData URL with the filter, and use the GET method. Add a Parse JSON to grab the body from the HTTP action; paste in a schema, which you can get by going to the URL in a browser and authenticating. Add an "Apply to each" control and use the "value" of the Parse JSON. The last step is to add two PATCH requests to the User Setup page. Why two, you ask? The page as standard won't allow you to have a "From" date ahead of the "To" date, and the fields in an HTTP request body are not processed in the order we send them, so to prevent errors two requests are needed: the first sets the "To" date and the second sets the "From" date.
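As a sketch – assuming the User Setup web service exposes the usual OData field names – the two request bodies, in order, would be:

    PATCH 1: { "Allow_Posting_To": "<End of the month output>" }
    PATCH 2: { "Allow_Posting_From": "<Start of the month output>" }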

The PATCH command needs specific syntax for filtering on the user record: the key at the end of the published web service URL identifies one user value. The flow will use the "UserName" from the Parse JSON step: UserSetup('<UserName>'). In the header of your HTTP action add the below to prevent the likelihood of errors:
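The header to add is the standard OData concurrency one, If-Match, with a wildcard so the update isn't rejected over an etag mismatch:

    If-Match: *
    Content-Type: application/json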

General Ledger Setup cannot be altered with OData, so a different solution is needed for that page – most likely a modification, which is covered below.

The remaining activity is the recurring journals and the General Ledger Setup. At the time of writing, the standard BC connector for Power Automate has a post-journal action but it does not work with recurring journals. If your scenario is without recurring journals, the standard guide on journal posting will work fine. I will cover what to do about recurring journals, and I will use the batch name instead of the ID for ease of use. Publishing the General Ledger Setup page as a web service does not enable you to modify the data – it errors. As a result, these two areas need development to give the final result needed.

To do this I’ve opted for a single codeunit which I will publish as a web service and use as an unbound action to complete the work by passing specific parameters:

Not a lot of code is needed here and it is more about the HTTP syntax and passing a body in the request
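As a rough sketch of the shape of that codeunit – the procedure names match the URLs below, but the parameters and internals here are my own assumptions; the actual code is in the GitHub repo linked at the end:

    codeunit 50100 PostReccurJrnl
    {
        procedure postJrnl(templateName: Code[10]; batchName: Code[10])
        var
            GenJnlLine: Record "Gen. Journal Line";
        begin
            // Find the recurring journal lines and hand them to the
            // standard batch posting routine
            GenJnlLine.SetRange("Journal Template Name", templateName);
            GenJnlLine.SetRange("Journal Batch Name", batchName);
            if GenJnlLine.FindFirst() then
                Codeunit.Run(Codeunit::"Gen. Jnl.-Post Batch", GenJnlLine);
        end;

        procedure AdjustGLSetup(allowPostingFrom: Date; allowPostingTo: Date)
        var
            GLSetup: Record "General Ledger Setup";
        begin
            // OData cannot modify General Ledger Setup directly, so do it in code
            GLSetup.Get();
            GLSetup.Validate("Allow Posting From", allowPostingFrom);
            GLSetup.Validate("Allow Posting To", allowPostingTo);
            GLSetup.Modify(true);
        end;
    }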

Unbound actions with codeunits published as web services are fairly new and highly useful. Simply publish your new codeunit as a web service and then use this logic within the URL (note the function name at the end changes depending on the function you need to use):

https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/PostReccurJrnl_postJrnl?company={companyID}

or

https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/PostReccurJrnl_AdjustGLSetup?company={companyID}

Within Power Automate we now have an endpoint to use and we can simply pass in the necessary parameters. Note that in this instance both will be POST actions. The body is just the name of the parameters and the required values:
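For example, the journal-posting call would carry a body like this (parameter names matching the sketch above – yours will be whatever your codeunit declares):

    POST https://api.businesscentral.dynamics.com/v2.0/{tenantID}/Sandbox/ODataV4/PostReccurJrnl_postJrnl?company={companyID}

    {
        "templateName": "RECURRING",
        "batchName": "DEFAULT"
    }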

I’ve tagged mine onto the end of the User date change as I already have the dates figured out and I want it to run for that time anyway 😁

Code available here: https://github.com/JAng13sea/Blogs/tree/master/Month%20End

PowerApps

PowerApps biz card reader to Business Central

This one comes at a premium, as the PowerApps business card reader isn't included in the license for BC. However, it's a really good use of AI and something that could be utilised in the right scenario. Ironic that I'm blogging about something that won't be getting much use right now – I haven't acquired a new business card since March 2020. However, it feels as though the exchange of fancy pieces of card might be behind us, and scanning something is a cleaner option, in more ways than one 😊

A number of other online resources can explain the PowerApps build up so I will highlight the important parts. The goal here will be to take an image of a business card and have the details from the AI model sent to BC so a company or person contact can be created.

Add the business card reader from the AI builder and then add text input boxes which reference the properties of the business card reader. Use text input boxes as the data from the reader won’t always be perfect so some editing might be needed.

As you can see here the Company Name doesn’t quite work out so some manual adjustment is needed.

The next thing is to have the data passed on, and to do this I've used Power Automate. A button will be created in PowerApps that calls the flow that is needed. There are a few steps to the flow, so I'll break it into sections. Use a PowerApps trigger as the starting point:

In PowerApps we will build a JSON, so add a Parse JSON action and request the content from PowerApps. Note that it will most likely change the name, as mine has above. A sample schema is needed:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Address1": { "type": "string" },
            "AddressCity": { "type": "string" },
            "CompanyName": { "type": "string" },
            "Country": { "type": "string" },
            "Email": { "type": "string" },
            "FirstName": { "type": "string" },
            "JobTitle": { "type": "string" },
            "LastName": { "type": "string" },
            "Mobile": { "type": "string" },
            "OfficePhone": { "type": "string" },
            "PostCode": { "type": "string" },
            "Website": { "type": "string" }
        },
        "required": [
            "Address1", "AddressCity", "CompanyName", "Country",
            "Email", "FirstName", "JobTitle", "LastName",
            "Mobile", "OfficePhone", "PostCode", "Website"
        ]
    }
}

Next up, an "Apply to each" section is required where the body of the JSON is used. In our case there will be one value at a time, but a JSON can handle multiples. A series of HTTP actions will follow:

  1. A GET to determine if the "Company Name" value can be found in BC already. For this to happen I have published page 5050 as a web service, and particular OData filtering is needed: ?$filter=Type%20eq%20%27Company%27%20and%20Company_Name%20eq%20%27<CompanyName>%27
  2. A POST for a Company Contact for the occasions where this detail isn’t available yet. The displayed “Headers” will be needed and the “Body” will be made up of a mix of static values and those from the PowerApps JSON:

3. A final POST to handle the creation of a person contact. This will be repeated so use the “Copy to Clipboard” feature to save time having to type it out again.

A "Condition" has been added off the back of using the GET command, and the returned "Body" is checked to see if the Company Name from the PowerApps JSON exists in the JSON of the GET Company HTTP. This will result in either the "Yes" or the "No" branch taking place. Once this is saved, it is ready for hooking up to PowerApps.

Create a new button in PowerApps and use the "Action" part of the ribbon to call Power Automate. Choose the flow that was devised in the earlier steps. This will add one line to the formula area of PowerApps. Use Shift+Enter to add some additional lines above that inserted line, as some earlier steps are needed before the detail for Power Automate is ready. I'll break this into sections too:

A Collection will be built up of all the values from the Biz card reader. Refer to the online documentation to understand the PowerApps formulas further:

ClearCollect(BizCardContact,
    {
        CompanyName: CompanyName.Text,
        FirstName: Concatenate(firstname.Text, " ", Lastname.Text),
        LastName: Lastname.Text,
        Email: Email.Text,
        Address1: BusinessCardReader1.AddressStreet,
        AddressCity: BusinessCardReader1.AddressCity,
        PostCode: BusinessCardReader1.AddressPostalCode,
        Country: BusinessCardReader1.AddressCountry,
        JobTitle: JobTitle.Text,
        OfficePhone: OfficePhone.Text,
        Mobile: MobilePhone.Text,
        Website: Website.Text
    }
);

A JSON is required in Power Automate and it can be built in PowerApps using the collection from the formula above:

Set(ContactJSON, JSON(BizCardContact));

Last thing is to pass the JSON to Power Automate:

'Copyof-BizCardReadertoBC'.Run(ContactJSON)

In full flight you will get one of two results, two contacts or just a person contact. Note the green ticks in the top right of each window to show what has been executed:

Power Automate

Power Automate with Business Central update via codeunit using SharePoint data

I'm writing this in a quicker style than usual, as the understanding I have around it was gained from another blog, so I deserve no credit on that front. Grazie Stefano: https://demiliani.com/2019/06/12/dynamics-365-business-central-using-odata-v4-bound-actions/

Also a mention to my colleague Dan Kinsella (https://dankinsella.blog/). He helped with the codeunit element of this solution. Cheers Dan 🍻 I owe you a beer!

Why have I decided to blog about this? Well, I had a meeting with a prospect recently that sparked it. The idea was that they would use SharePoint as a repository to hold website images. Naturally they wanted to see the images in BC. A modification to store more images could have been explored, but I left that alone. Instead I suggested the use of the "Links" feature: the user clicks on the SharePoint URL to see the image – no need to store it twice, so better for database size 👍

Getting the data from SharePoint to BC, though, was another matter, as the "Links" area is a system table, so there's no chance of a direct HTTP call with Power Automate. So the goal here is: create a codeunit, accessed from a web service, that stamps a link on an item record.

So we start off with a very basic codeunit

To go along with that we need a page, as it is the page we publish as a web service. Read Stefano's blog for the reason why – he discovered it, after all.

Add the fields that you need for the circumstance. In my case it is just the item number, system ID and the description – only the item number is really needed in my scenario. A function is then needed on the page, and you need to ensure you have the [ServiceEnabled] attribute. Again, check out Stefano's blog to understand why.
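The full source is in the GitHub repo linked below; as a minimal sketch of the shape (object numbers, names and the exact [ServiceEnabled] signature here are my assumptions):

    codeunit 50101 "Add Item Link"
    {
        procedure AddLinkToItem(var Item: Record Item; URL: Text; Description: Text)
        begin
            // The "Links" FactBox reads from the Record Link system table,
            // which AddLink writes to
            Item.AddLink(URL, Description);
        end;
    }

    page 50101 "Item Link WS"
    {
        PageType = List;
        SourceTable = Item;

        layout
        {
            area(Content)
            {
                repeater(Items)
                {
                    field("No."; Rec."No.") { }
                    field(SystemId; Rec.SystemId) { }
                    field(Description; Rec.Description) { }
                }
            }
        }

        [ServiceEnabled]
        procedure AddItemURL(itemURL: Text; linkDescription: Text)
        var
            AddItemLink: Codeunit "Add Item Link";
        begin
            // Exposed as an OData bound action on the published page
            AddItemLink.AddLinkToItem(Rec, itemURL, linkDescription);
        end;
    }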

Testing it in Postman, the final result looks like this. Note the addition to the URL of /NAV.AddItemURL, which is the name of the [ServiceEnabled] page function.
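Roughly, the call is shaped like this (the parameter names follow the sketch above, and the key format depends on how the page is keyed – treat this as illustrative):

    POST https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Sandbox/ODataV4/Company('<companyName>')/ItemLinkWS('1000')/NAV.AddItemURL
    Content-Type: application/json

    {
        "itemURL": "https://<yourTenant>.sharepoint.com/sites/Images/item1000.jpg",
        "linkDescription": "Website image"
    }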

After posting with Power Automate I get the following:

Copy of the AL code is here: https://github.com/JAng13sea/Blogs/tree/master/Add%20Item%20Link%20-%20AL

D365 Business Central, Power Automate

Business Central Batch post and e-mail with Power Automate – no code

Batch posting has been in the product for some time, but I am not seeing a clear way (happy to be wrong) to batch post and e-mail. Here are my options (below) as a user conducting the posting. Yes, I have post via the job queue as well, but from what I've tested this isn't sending my e-mail for me. Even the print option yields no results here – I thought it would at least use the "Document Sending Profile" on the customer record.

Even when you use "Post Batch" you get nothing about e-mailing 😬 Not forgetting the workflows in BC have no option for "Post & Send" – it's just "Post". Usually these gaps would need filling with a code modification, but not in this blog.

Where there is a will there is a way and my idea here is to use Power Automate to do the heavy lifting. On this occasion I will use a manual trigger but if this was a production ready scenario I would use a scheduled type flow.

The connector for BC in Power Automate is fairly small, but I'm sure it will get better over time. There is a slight restriction though: the action for posting and emailing exists, but it works at a singular level like it does in the regular UI. So the first thing I need to do is get some data out of BC that I can use in Power Automate. I will of course need the GUID ID reference to the sales invoice, as that's what the BC Power Automate connector likes. A page that fits the bill is already available as a web service – I will use the top one in the list, 2811. This page is a mixture of open and posted sales invoices, so some filtering is needed:

Example of the payload from this page:

Add a filter to the OData query so I only post invoices that are ready to be posted. In the case of this page the "status" values are different to regular BC: I have a choice between draft (aka "Open") or open (aka "Released"). So I have added ?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200 to the end of the web service URL I got from BC, because the feature in the Power Automate BC connector restricts me to draft only and I don't want to post invoices with a 0 value. The connector wanting "Open" invoices rather than "Released" seems like backwards logic – what if I'm using approvals, for instance? I think this is possibly wrong, so I have posted on the Power Automate forum to find out more: https://powerusers.microsoft.com/t5/Connecting-To-Data/Is-the-Business-Central-Post-amp-Send-Invoice-action-correct/m-p/637081#M9492
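The resulting GET ends up shaped like this (substitute whatever name page 2811 is published under in your web services list):

    GET https://api.businesscentral.dynamics.com/v2.0/<tenantID>/Production/ODataV4/Company('<companyName>')/<webServiceName>?$filter=status%20eq%20%27Draft%27%20and%20totalAmountIncludingTax%20gt%200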

Anyway….

Our flow starts to take shape like this. Check out my last blog for a more detailed account on the parse JSON part (https://joshanglesea.wordpress.com/2020/07/22/power-automate-purchase-order-alerts-to-vendor/)

Once we add the "id" from the Parse JSON step, you get an "Apply to each" wrapped around the chosen BC action automatically, which is exactly what we want. Remember our goal – batch post and e-mail invoices – so we are better placed than standard.

In my case a small batch of invoices has been handled and emails dished out accordingly. I added an extra step for updating the posting date as well, just so it takes something useful from the existing "Batch Posting" feature 👍

Power Automate

Business Central Power Automate purchase order alerts to vendor

I recently had a requirement where alerts needed sending to a vendor for orders expected in the next two weeks. A reminder is needed as the lead times for the goods are so long.

The goal is therefore: have an email sent to a vendor's email address for purchase order lines expected in the next two weeks from today's date.

To achieve this Power Automate or Logic Apps are the most appropriate choices. For this blog I have chosen Power Automate.

Given that I will need line data, and I want to minimise the need to make multiple HTTP calls, some small development on the BC side will help. I have produced a query object so that I have all the data I require from all the areas I want data from (a link to my code is at the end). I've chosen header level here for the "Expected Receipt Date", but you could do it at line level if needed:
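The actual query is in the repo linked at the end; a trimmed-down sketch of the idea looks like this (standard BC tables and fields, but the column choice is illustrative):

    query 50102 "PO Alert Lines"
    {
        QueryType = Normal;

        elements
        {
            dataitem(Purchase_Header; "Purchase Header")
            {
                DataItemTableFilter = "Document Type" = const(Order);

                column(No; "No.") { }
                column(Expected_Receipt_Date; "Expected Receipt Date") { }

                dataitem(Purchase_Line; "Purchase Line")
                {
                    DataItemLink = "Document Type" = Purchase_Header."Document Type",
                                   "Document No." = Purchase_Header."No.";

                    column(Description; Description) { }
                    column(Outstanding_Quantity; "Outstanding Quantity") { }

                    dataitem(Vendor; Vendor)
                    {
                        DataItemLink = "No." = Purchase_Line."Buy-from Vendor No.";

                        column(Vendor_Name; Name) { }
                        column(E_Mail; "E-Mail") { }
                    }
                }
            }
        }
    }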

Ensure the query is published as a web service and check you have output by using the web link or from an API test tool like Postman:

To get the exact data we want for the scenario some ODATA query logic is needed. Add the following to the end of the web service URL. Replace <Your Date> with the date you want to filter by. The format must be like this: 2020-07-21T00:00:00Z:

?$filter=Expected_Receipt_Date ge (<Your Date>) and Expected_Receipt_Date le (<Your Date>) and Outstanding_Quantity gt 0

The date values will be replaced with calculated values in the flow. This will be a scheduled flow, so that is our starting point, along with variables for the date/time values which are then used in our OData query above:
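For the two variables, something like this does the job (formatted to match the date format the filter expects):

    Current time: formatDateTime(utcNow(), 'yyyy-MM-ddTHH:mm:ssZ')
    Future time:  formatDateTime(addDays(utcNow(), 14), 'yyyy-MM-ddTHH:mm:ssZ')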

Use the HTTP connector and call the web service with a GET command and use the current time and future time variables.

The HTTP action will produce the JSON we saw earlier from the web service call. To use the data from this we need the Parse JSON action, and we'll then be able to select the content for use in subsequent steps like sending the email. To generate the schema, just paste in an example from calling the web service. This works nicely using an API test tool like Postman, as you get it formatted in a nicer structure.

Once the JSON is read by the Parse JSON step, you are able to create an "Apply to each" step which will iterate through each of the received rows of data until it has read them all. Within that loop we can initiate further actions like our email:

Each reference comes from the Parse JSON step, and you can use format functions on certain data like I have for the date. The result of this flow is an email much like this one:

Code: https://github.com/JAng13sea/Blogs/tree/master/Purchase_Query

D365 Business Central

Business Central Gen. Jrnl. Imports with Data Exchange Definitions

Have you seen this button on the general journal page and thought that’s useful?

Some of you might not have seen it though as the button only appears if you complete some setup in the General Ledger Setup page:

The above field shows a list of Data Exchange Definitions of the type "Payroll Import". So it is fit for purpose, but only usable for payroll – and, more importantly, just one import file structure. For demos I have often mocked up a Data Exchange Definition as a payroll import type and just passed over a CSV file. This is one of many ways a journal can have external data added to it. That's fine for one-off demo sessions, but how can this idea be taken seriously in a project/production scenario? The answer lies in how the current "Payroll Import" button works and whether we can change it for our situation. My end goal here is to have the user pick from a range of different data exchange definitions which work for different CSV files. In the past, customers may have had bespoke import routines passing data through the Excel buffer. If they can compromise on having the file converted to CSV, this could be a worthwhile solution!

Over in the base app the “Payroll Import” action button looks like this:

So it is only ever looking at that setup page to know which data exchange definition to use. It also has its own codeunit to run the process. So the component parts are small and we can adjust them with ease for our end goal.

Very simply, a page extension is needed for the general journal, and you can go with something like below to add the type of page action we want. In my case I have opted to use the "Generic Import" type on the data exchange definition – which I have set as a global variable on the page.
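A sketch of that page extension – the action name, object numbers and my "Gen. Jnl. Import" codeunit (the renamed copy described below) are placeholders; the real code is in the repo at the end:

    pageextension 50103 "Gen. Jnl. Import Ext" extends "General Journal"
    {
        actions
        {
            addlast(processing)
            {
                action(ImportJournalFile)
                {
                    Caption = 'Import Journal File';
                    ApplicationArea = All;
                    Image = Import;

                    trigger OnAction()
                    var
                        DataExchDef: Record "Data Exch. Def";
                        GenJnlImport: Codeunit "Gen. Jnl. Import";
                    begin
                        // Let the user pick any "Generic Import" definition rather
                        // than the single one from General Ledger Setup
                        DataExchDef.SetRange(Type, DataExchDef.Type::"Generic Import");
                        if Page.RunModal(Page::"Data Exch Def List", DataExchDef) = Action::LookupOK then
                            GenJnlImport.ImportFileToJournal(DataExchDef, Rec);
                    end;
                }
            }
        }
    }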

Codeunit-wise, I have done a complete copy of the base app codeunit and then renamed the functions. Very easy indeed. The process itself works fine, so there's no need to change anything – it is more about what you offer the user when they make that button selection.

The end result is a dynamic journal import selection. The user can now maintain and update a data exchange definition, or add new ones as new journal import requirements come along! 👍😁

Don’t know much about data exchange definitions? Check this out: https://docs.microsoft.com/en-gb/dynamics365/business-central/across-how-to-set-up-data-exchange-definitions

Code available here: https://github.com/JAng13sea/DED-Jrnl-Import