Integration tricks
Business Central implementations where Dynamics 365 CE operates at the front end whilst BC covers the back-end core ERP functionality are a frequent scenario, and both systems are well tuned for working together. Integration between BC and CE is quite straightforward thanks to the BC integration framework, and it is well documented, so even the synchronisation of custom entities does not usually pose a big task - the integration framework does all the heavy lifting, hiding the boilerplate functionality behind CDS virtual tables. But of course real life presents many surprises which no amount of documentation can cover. In this post, and a few following ones, I want to write about my integration experience and a few tricks that may not be so obvious, but can save significant effort.
Scenario
For the sake of this demo, let's imagine that we need to build an integration for a company whose products require certification. This could be a pharmaceutical company or a food supplier. The key tables that we want to sync between Customer Engagement and Business Central are Product and a custom table Product Certificate. Business Central out of the box implements the synchronisation of BC items with the CE Product entity, and the other table is a custom one with just three fields, apart from system-created columns (it's a very simple certification solution).
And so, our task is to create corresponding custom tables in BC and extend the integration to cover the product certificates.
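To make the scenario concrete, here is a minimal sketch of what the BC-side counterpart of the certificate table could look like. The object number and the three fields (certificate number, item reference, expiry date) are purely illustrative assumptions - the actual certification solution may define different fields.

    table 50901 "Product Certificate"
    {
        Caption = 'Product Certificate';
        DataClassification = CustomerContent;

        fields
        {
            // Hypothetical fields chosen for illustration only
            field(1; "No."; Code[20])
            {
                Caption = 'No.';
            }
            field(2; "Item No."; Code[20])
            {
                Caption = 'Item No.';
                TableRelation = Item;
            }
            field(3; "Expiry Date"; Date)
            {
                Caption = 'Expiry Date';
            }
        }

        keys
        {
            key(PK; "No.")
            {
                Clustered = true;
            }
        }
    }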
In this post, I will not go deep into the details of declaring virtual tables and writing the extension code - integration of custom entities is described in detail in the BC documentation. Instead, I will focus on nuances that can ruin the integration and cost a lot of time in debugging.
All the code for the integration of the custom entity is published on GitHub.
Customizing CE / Dataverse
So, we have received and understood all the client's requirements and are ready to start the development, and the first step is to prepare the development environment. In the Dataverse Admin Center, we can create a new environment, selecting the Developer type. This will give you a clean new environment for development and testing, but what about the customisations that must be synchronised? If it's just a few entities out of a large and complex solution that require integration, deploying the whole client solution on the dev instance is probably not the ideal approach - the custom entities can simply be created manually in the development environment.
First of all, let's introduce the Dataverse changes that must be pulled into BC. One thing we should remember is that D365CE is natively built on top of Dataverse. Or rather, Dataverse grew out of the Common Data Service (CDS), which in turn was based on the Microsoft CRM architecture. Therefore, CE is closely integrated with Dataverse, and the customisations I will be showing throughout this article can be applied equally in CE or Dataverse with the same outcome. On the other hand, I focus on the CE/BC integration side and won't delve into specific CE development tools such as the XrmToolBox.
If you are not familiar with Dataverse or CE solutions, the New Table action in the Power Apps portal (or the corresponding New Entity action in CE) may look like a compelling way to quickly construct a new entity for development purposes. Let's see why this is not such a good idea after all.
We start our product certification add-on by creating a new table, Product Certificate. And I'm creating the table exactly as described above - clicking the New Table button in the Tables area of the Power Apps portal. This action opens a simple panel that prompts for the table display name and description. The internal table identifiers, in particular the logical name and the schema name, are generated automatically from the display name and are hidden by default under the Advanced options tab. Unfold the advanced options and have a look at the Schema name of the new table.
This is the table identifier which we must use when we reference the table in the Business Central connector. The ExternalName property in the definition of the virtual table in BC must refer to the schema name. It is important to note that the schema name is formed from the display name - the table's "given name" - and a prefix. And whilst the "ProductCertificate" part of the name can be changed by the developer, the prefix is fixed and not editable.
The same applies to field names - whenever a new field is added to the table, its name is prefixed with the same fancy code we see in the table properties. So when it comes to declaring the virtual table in BC, we always include the name prefix in table and field references, as in the following example.
table 50900 "CRM Product Certificate"
{
Caption = 'CRM Product Certificate';
ExternalName = 'cr0e8_productcertificate';
TableType = CRM;
fields
{
field(1; ProductCertificateId; Guid)
{
Caption = 'Certificate ID';
ExternalName = 'cr0e8_productcertificateid';
ExternalAccess = Insert;
ExternalType = 'Uniqueidentifier';
}
This is a part of the virtual table declaration; the full sample with the rest of the synchronisation code is available in my GitHub repository.
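For orientation, the synchronisation code for a custom entity typically follows the standard pattern of subscribing to the OnAfterResetConfiguration event of the "CRM Setup Defaults" codeunit and registering an Integration Table Mapping there. The snippet below is a simplified sketch of that pattern rather than the repository code: the codeunit number, the mapping name, the BC-side "Product Certificate" table, and the ModifiedOn proxy field are assumptions, and field mappings are omitted.

    codeunit 50902 "Prod. Cert. Setup Subscriber"
    {
        // Sketch: register a table mapping for the custom entity whenever
        // the Dataverse integration configuration is reset to defaults.
        [EventSubscriber(ObjectType::Codeunit, Codeunit::"CRM Setup Defaults", 'OnAfterResetConfiguration', '', false, false)]
        local procedure AddProductCertificateMapping(CRMConnectionSetup: Record "CRM Connection Setup")
        var
            IntegrationTableMapping: Record "Integration Table Mapping";
            CRMProductCertificate: Record "CRM Product Certificate";
        begin
            IntegrationTableMapping.Init();
            IntegrationTableMapping.Name := 'PRODCERT';                                    // assumed mapping name
            IntegrationTableMapping."Table ID" := Database::"Product Certificate";         // assumed BC-side table
            IntegrationTableMapping."Integration Table ID" := Database::"CRM Product Certificate";
            IntegrationTableMapping."Integration Table UID Fld. No." := CRMProductCertificate.FieldNo(ProductCertificateId);
            IntegrationTableMapping."Int. Tbl. Modified On Fld. No." := CRMProductCertificate.FieldNo(ModifiedOn); // assumes a ModifiedOn proxy field
            IntegrationTableMapping."Synch. Codeunit ID" := Codeunit::"CRM Integration Table Synch.";
            IntegrationTableMapping.Direction := IntegrationTableMapping.Direction::Bidirectional;
            IntegrationTableMapping."Synch. Only Coupled Records" := true;
            IntegrationTableMapping.Insert(true);
        end;
    }

A real implementation also adds Integration Field Mapping records for every synchronised field and schedules the mapping in the job queue - the kind of boilerplate the full sample takes care of.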
But now let's assume we have coded, deployed, and successfully tested the whole solution in our dev environment, and fast-forward to user acceptance testing, where the integration solution is deployed to a new environment.
And all of a sudden, the first attempt to synchronize the custom entities ends in a failure, with an error message informing us that "Entity with a name = 'cr0e8_productcertificate' with namemapping = 'Logical' was not found in the MetadataCache."
The explanation of this error is simple: the prefix cr0e8 is specific to my development environment. Objects created in another environment have a different prefix, therefore - just as the error message says - the entity from the dev environment does not exist in the test or production instance.
How do we fix this?
Dataverse Solutions
The solution lies in Dataverse Solutions (yes, the pun is intended). The point is that the entity prefix is not something global and immutable, tied to the environment itself. Rather, it is part of the identity of a solution publisher, and a Dataverse environment can contain many publishers. Each environment has at least one, called the Default CDS Publisher, and one solution linked to this publisher - the Common Data Services Default Solution.
When changes are made in Dataverse without explicitly assigning them to a solution (as we did in this example, creating a table with the New Table action), these changes are considered to be a part of the default solution with its respective default publisher.
In the Power Apps portal, if we navigate to Solutions -> Publishers, we can see the list of all publishers registered in this environment. The prefix, as you can see in the following screenshot, is one of the publisher's configurable properties, and it can be changed.
If you want to change it, click on the drop-down menu (three dots in front of the publisher name) and choose the Edit action.
Note: The October 2023 update of the Power Apps portal removed the Publishers view from the Solutions area. It is now available via More > Discover All: https://learn.microsoft.com/en-us/power-apps/maker/canvas-apps/intro-maker-portal
So now we have at least one solution to the naming problem - changing the prefix of the default Dataverse publisher will make all modifications done outside of any explicit solution pick up this prefix, so every schema change in the default solution bears the same name prefix. But what if the production environment you work with has multiple solutions deployed and the integration must cover more than one of them? With a single prefix for the default publisher, we are quite limited in naming options. If we need to integrate entities published by a publisher "Contoso" and a publisher "Cronus", each with its own name prefix, we find ourselves in the same difficult situation - no matter which prefix we assign to the default publisher, we cannot satisfy both Contoso and Cronus.
The best way to handle integration scenarios is to start setting up the development environment by registering a new publisher with its own solution, even if we do not expect more than one solution prefix in production.
Starting a new solution
The initialisation of the new solution begins with deciding on the publisher we want to use for it. If the publisher does not exist yet, we need to create one. In the Power Apps portal, navigate to Solutions -> Publishers, where you will find the New Publisher action, which leads to the publisher registration area.
Give the new publisher a name, assign the object prefix, and save. The Cronus publisher will have its own prefix "cronus", which will mark all objects deployed by this publisher. The Choice value prefix setting, which you can see in the picture above, is another important piece of configuration, but I will touch upon it in one of the following posts.
Once the publisher is registered, it is time to create a new solution to wrap our changes in. Under the Solutions tab, we can find the New Solution action which starts - surprise! - a new solution. Here we assign the name, display name, and the publisher which we have just created.
As you can see in this screenshot, a new solution publisher can also be created straight from this interface.
When the solution is ready, just click on its name to open the solution area and apply your customisations.
The Objects tab of the solution area contains the New Table item, among many other objects that can be created in a solution. Create the Product Certificate table using this action, and the new table, created inside the solution, now carries the Cronus publisher prefix, resulting in the table schema name cronus_productcertificate instead of the old cr0e8_productcertificate. The solution can now be deployed to any environment, retaining the same schema names for all objects within it.
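Accordingly, the virtual table declaration in BC can now reference the stable, publisher-controlled names. Here is the earlier excerpt adjusted to the new prefix (assuming the field was recreated inside the solution, so its logical name picked up the same prefix):

    table 50900 "CRM Product Certificate"
    {
        Caption = 'CRM Product Certificate';
        ExternalName = 'cronus_productcertificate';   // publisher prefix instead of the environment-specific cr0e8
        TableType = CRM;

        fields
        {
            field(1; ProductCertificateId; Guid)
            {
                Caption = 'Certificate ID';
                ExternalName = 'cronus_productcertificateid';
                ExternalAccess = Insert;
                ExternalType = 'Uniqueidentifier';
            }
            // ... remaining fields follow the same naming pattern
        }
    }

Because the prefix now travels with the solution, this ExternalName resolves identically in development, UAT, and production.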