Recently, I came across a requirement where a customer performs invoice matching in an external system and wants incremental vendor data, along with PO data, extracted from F&O and fed into that system.
In this blog post we are going to look at how we can use the recurring integrations API and a business event to extract vendor data from all legal entities in F&O to fulfill this scenario.
As you can see in the diagram below, the extraction of the vendor data happens through a data project, with recurring integrations enabled on the data project created in F&O. In this scenario, I have created a custom entity to extract data from all legal entities. Upon completion of the data project execution, a business event is triggered, which is consumed by Power Automate to download the file using an HTTP action (instead of calling the dequeue endpoint) and store it in Azure Blob Storage. The file can then be read from Blob Storage by any external system.

Now let's take a deep dive into each of these steps.
First, I created a custom data entity, ASPVendVendorEntity, which is a duplicate of the standard entity VendVendorV2Entity; all the data sources and fields that are no longer needed were removed from it. The reason for creating a duplicate entity is that I wanted to keep the standard entity for other purposes, use this entity to send vendor data from all legal entities to multiple external systems, and change its primary company context property.
Data entities have a Primary Company Context property, where you can specify the entity field to use for company identification. This field can be any field that extends the DataAreaId extended data type (EDT); it isn't limited to the underlying system dataAreaId field.
So, to send data for all legal entities in a single execution, you can blank out this property on the data entity.
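To make this concrete: with the property blanked out, a single export run returns rows for every company, identified by the DATAAREAID column. Below is a purely illustrative sample of such an export file; the vendor accounts are hypothetical and the companies are standard demo companies:

VENDORACCOUNT,VENDORNAME,DATAAREAID
1001,Contoso Supplies,usmf
2004,Fabrikam Parts,demf
3005,Adventure Works,frsi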

You can read more about the primary company context on this link.
Once the entity creation is done, the next step is enabling change tracking on the data entity for incremental export. This is done from the Data management workspace under Data entities, where the Change tracking menu lets you enable tracking for the primary table, the entire entity, or a custom query.

The next step is creating and configuring the data project and creating the recurring job. Before creating the data project, make sure change tracking is enabled on the data entity, and select Incremental push only as the default refresh type while adding the entity; otherwise the incremental export will not work.

Make sure to select the project category as Integration while creating the project. Once you are done with the data project creation, you can enable the recurring data job and set the recurrence.

Now, as per the scheduled recurrence, a file with incremental vendor data becomes available at the endpoint for download.
To download this file using recurring integrations, the usual procedure is to first call the dequeue API for the export and then call the acknowledgment API to confirm that the package has been downloaded.
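For reference, here is a minimal sketch of that regular dequeue/acknowledge round trip in Python. The environment URL, activity ID, and token are placeholders to replace with your own values; the two endpoint paths are the documented recurring integrations endpoints:

import requests

base_url = "https://yourenvironment.operations.dynamics.com"  # hypothetical environment URL
activity_id = "<activity-id-of-the-recurring-job>"            # taken from the recurring data job in F&O
token = "<azure-ad-access-token>"                             # acquired beforehand via Azure AD
headers = {"Authorization": f"Bearer {token}"}

# 1. Dequeue the next available export package for this job.
msg = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}", headers=headers)
msg.raise_for_status()
payload = msg.json()  # includes a DownloadLocation for the package

# 2. Download the package itself.
package = requests.get(payload["DownloadLocation"], headers=headers)
package.raise_for_status()

# 3. Acknowledge, passing back the dequeue response, so the same message is not served again.
ack = requests.post(f"{base_url}/api/connector/ack/{activity_id}", json=payload, headers=headers)
ack.raise_for_status()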
Instead of following the regular process here and polling the execution status of the message to see whether the export has completed, I have tweaked the process to send a business event as soon as processing is finished. You can download the code for this business event from my GitHub page.
Once you download and build this code on your local VM, make sure to rebuild the business event catalog and activate the following business event.

Below is the JSON payload that this business event sends to Power Automate.
{
  "BusinessEventId": "HSMDMFProjExecFinishedBusinessEvent",
  "ControlNumber": 5637150577,
  "DMFDefinitionGroupName": "VendExportIncremental",
  "DMFExecutionId": "Vendor Export-2022-03-04T09:16:09-B83D00DB2B7E4F8EA9904B031B1C8FB8",
  "DMFExecutionSummaryStatus": "Succeeded",
  "EndDateTime": "/Date(1646414175000)/",
  "EventId": "ADD6E9EC-9AB4-459B-8BFC-30A4E76DFB83",
  "EventTime": "/Date(1646414175000)/",
  "MajorVersion": 0,
  "MinorVersion": 0
}
If you look at the payload above, the business event sends information such as the execution status of the project and the file path for downloading the file (note that the file path field does not appear in the sample above; see the comments at the end of this post). I am using the Power Automate business event trigger to consume this business event and download the file.

Once the business event is received by Power Automate, I simply check whether the execution status is Succeeded and use an HTTP action with the GET method to download the file.
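If you would rather script this step than build it in Power Automate, the equivalent logic is small. The sketch below assumes the event payload carries the package URL in a DMFPackageFilePath field (the field referenced in the comments at the end of this post) and that this URL is a pre-authorized blob URI, so no extra authentication header is attached:

import requests

def handle_business_event(event: dict):
    # Mirror the flow's condition: only act on successful executions.
    if event.get("DMFExecutionSummaryStatus") != "Succeeded":
        return None
    # Download the exported package; DMFPackageFilePath is assumed to be a SAS-style URL.
    resp = requests.get(event["DMFPackageFilePath"])
    resp.raise_for_status()
    return resp.content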

Once the file is downloaded from F&O, it is moved to Azure Blob Storage using the Create blob action.
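The Create blob action also has a direct SDK equivalent if you are scripting the flow instead. Here is a minimal sketch using the azure-storage-blob package, with a hypothetical connection string and container name:

from datetime import datetime, timezone
from azure.storage.blob import BlobServiceClient

def store_package(package_bytes: bytes, conn_str: str):
    # Connect to the storage account and point at a uniquely named blob.
    service = BlobServiceClient.from_connection_string(conn_str)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    blob = service.get_blob_client(container="vendor-exports", blob=f"vendors-{stamp}.zip")
    # Upload the downloaded package; overwrite in the unlikely case of a name clash.
    blob.upload_blob(package_bytes, overwrite=True)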

As you can see below, with each recurrence a new file with incremental vendor data from all the companies is created in the blob location.

The code associated with the business event and the Power Automate flow for this scenario is available on my GitHub page for download.
Thanks for sharing the info. Can you please advise whether it is possible to create business events on a composite data entity?
In my scenario the business event is not created for an entity, but if you are looking for events on an entity, you should look at data events.
Thanks for the article. I found DMFPackageFilePath to be missing from the JSON schema once I downloaded the artifact from GitHub.