In this chapter, we will discuss the Azure integrations that can be leveraged to implement fast and scalable solutions. We will focus on three Azure technologies that are useful in Dynamics 365 implementations: Azure WebJobs, Azure Functions, and Azure Logic Apps. Azure WebJobs is ideal for batch processing that runs behind the scenes. Since we are talking about Azure technologies in this chapter, we will also take a quick look at some security aspects. Microsoft Dynamics 365 for Customer Engagement leverages the cloud services infrastructure and its built-in security features to safeguard your data. To safeguard access, it depends heavily on Azure Active Directory (AAD) to authenticate users and prevent unauthorized access to sensitive data. When it comes to licensing, you can buy a license for Dynamics 365, or you can provision the required infrastructure with Azure, in which case your cost is determined by usage; you can find detailed information at https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/manage-subscriptions-licenses-user-accounts. When leveraging the power of these technologies, you must take extra care to plan and design your solution.
Logic Apps can be used to design more advanced workflows, and Azure Functions enables you to write functionality that processes data in a serverless architecture. With Logic Apps in particular, you can connect to many external applications, which we will look at in detail later in this chapter.
Azure WebJobs
First let’s look at Azure WebJobs, which is a feature of Azure App Service. You can use the WebJobs feature to run processes behind the scenes, and the best thing about WebJobs is that you can schedule when a process executes. Web jobs can be categorized as continuous or triggered. A continuous web job starts immediately when it is created; to execute continuously, the application runs within an infinite loop, and if it stops, you can restart it. A triggered web job starts only on a schedule or when manually triggered.
The following are the supported file types with Azure WebJobs:
PowerShell (.ps1)
Windows CMD (.cmd, .exe, .bat)
Bash (.sh)
Python (.py)
Node.js (.js)
Java (.jar)
PHP (.php)
Keep in mind that a web app can time out after 20 minutes of inactivity; the timer is reset only by requests to the actual web app. If your web job is continuous or scheduled, enable Always On to make sure the application executes without interruption.
Writing Code for a Web Job
Let’s look at an example with the SBMA membership management solution. The stakeholders want the application to raise the yearly subscription automatically and on time. For instance, if a membership is expiring a month from today, then a subscription and a payment record should be created behind the scenes. One option is to create a Windows service, but to deploy it you would need a server, either physical or virtual, and either way a server incurs additional maintenance costs that SBMA is not willing to pay. So, the remaining option is to create a console application that executes behind the scenes and creates the necessary subscription records. Compared to Windows services, the best thing about the WebJobs feature is the ability to run console applications: debugging and troubleshooting can be done before deploying to the cloud, whereas Windows services require much more complex debugging scenarios.
This section will list the code used to create the application. Listing 6-1 illustrates the Main program that connects to the Dynamics 365 instance and calls the logic to create the subscription. Remember, you should create only the subscription here; the relevant payment record will be created by the custom workflow activity created in Chapter 5.
using System;
using System.Configuration;
using System.Net;
using System.Reflection;
using System.ServiceModel.Description;
using log4net;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Tooling.Connector;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using SBMA.ServiceProcessor.BusinessProcessLayer;

namespace SBMA.ServiceProcessor
{
    public class Program
    {
        public static readonly ILog log =
            LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);

        public static void Main(string[] args)
        {
            ExecuteLogic();
            Console.ReadLine();
        }

        /// <summary>
        /// Connecting to Dynamics 365 instance
        /// </summary>
        public static void ExecuteLogic()
        {
            try
            {
                log.Info("Connecting to Dynamics 365 instance...");
                using (CrmServiceClient crmConnection = new CrmServiceClient(
                    ConfigurationManager.ConnectionStrings["Xrm"].ConnectionString))
                {
                    // Use the web proxy client when connected to Dynamics 365 online,
                    // falling back to the on-premises service proxy otherwise.
                    IOrganizationService organizationService =
                        (IOrganizationService)crmConnection.OrganizationWebProxyClient ??
                        crmConnection.OrganizationServiceProxy;

                    // SubscriptionProcessor is the class in the business process
                    // layer that holds the subscription-creation logic (its listing
                    // is not reproduced here).
                    SubscriptionProcessor.CreateSubscriptions(organizationService);
                }
            }
            catch (Exception ex)
            {
                log.Error("Error while creating subscriptions.", ex);
            }
        }
    }
}

Listing 6-1. Main Program That Calls the Process to Create the Subscriptions
Note
As you can see, there is a reference to log4net, which is handy when testing code. Also, when you publish the application as a web job, you can view the details of execution in the logs; this information is really valuable. This guide will help you set up log4net: https://stackify.com/log4net-guide-dotnet-logging/.
The recommended approach for connecting to a Dynamics 365 instance from an external application like this is to use the Microsoft.Xrm.Tooling.Connector library, which you can download from NuGet. As shown in Listing 6-1, to establish the connection, you pass the connection string read from the configuration file to the CrmServiceClient constructor, which is then used to instantiate the organization service. When you create the connection string, you must make sure that you set the SkipDiscovery property to false, as shown in Listing 6-2. If not, the application will throw an exception like “Object of type ‘Microsoft.Xrm.Sdk.Entity’ cannot be converted to type ‘SBMA.ServiceProcessor.Account’.” This is because the service does not know how to deserialize the results into the proper early-bound object. You can find more information at https://community.dynamics.com/crm/b/bettercrm/archive/2018/11/28/dynamics-365-tooling-object-of-type-microsoft-xrm-sdk-entity-cannot-be-converted-to-type-type.
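As a sketch only (the organization URL and credentials are placeholders, and the actual Listing 6-2 may differ), the connection string in App.config could take the following shape:

```xml
<connectionStrings>
  <!-- Placeholder credentials and URL; replace with your instance details. -->
  <add name="Xrm"
       connectionString="AuthType=Office365;Username=admin@contoso.onmicrosoft.com;Password=passcode;Url=https://contoso.crm.dynamics.com;SkipDiscovery=false" />
</connectionStrings>
```

Note how SkipDiscovery=false is included explicitly, per the requirement described earlier.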
When all the coding is completed, you can build the application. Navigate to the bin folder and zip its contents; you will need to upload this to Azure WebJobs. The zip file must contain all the referenced DLLs and config files. Now let’s create the Azure web job.
Creating a Web Job
To create a web job, you must first create a web app under Azure App Service. Provide the relevant details and click Create. See Figure 6-1.
Once the web app is created successfully, you will see it listed as an app service. See Figure 6-2.
Navigate inside the web app created and select WebJobs in the left pane, as illustrated in Figure 6-3.
Click +Add to add the web job and provide all the details. You must provide a unique name and upload the zip file you created. Set the type as Triggered and the type of trigger as Scheduled. The unique thing about scheduled triggers is that the interval must be provided as a CRON expression. You can learn about CRON expressions at https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer#cron-expressions. Figure 6-4 shows the settings for the web job. For demonstration purposes, the CRON expression is set to every two minutes. Ideally, this type of batch job should be executed daily; for instance, you could enter 0 30 23 * * *, meaning the process will be executed at 11:30 p.m. every night.
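For reference, these CRON expressions use six fields ({second} {minute} {hour} {day} {month} {day-of-week}). A few examples:

```text
0 */2 * * * *    every two minutes (the demonstration schedule)
0 30 23 * * *    every night at 11:30 p.m.
0 0 9 * * 1-5    at 9:00 a.m. on weekdays
```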
After configuring these settings, you can click OK, which will create the web job, as shown in Figure 6-5. You can see that Status is set to Ready.
Select the web job and click Run to start the job, and the status will change to Running. See Figure 6-6.
To view the progress of the web job and troubleshoot any issues, click Logs in the toolbar to see the progress. Also, all the messages entered through log4net will be shown on the console. See Figure 6-7.
As mentioned earlier in this section, to make the application execute without any interruption, you must enable the Always On option in the “General settings” section of the web job. Keep in mind that this action involves an additional cost. See Figure 6-8.
An improved version of this solution would be to use Fetch XML queries to select the data required for processing. The benefit of using Fetch XML queries is that when the business wants simple changes to the query, you can easily alter the Fetch XML code without any development work and releases. The following are the high-level steps to create such a solution:
1.
Create an entity in Dynamics 365 that holds the Fetch XML query. Add a couple of unique fields to identify the query, such as which method and which process will use it.
2.
The console application will use the Fetch XML query to retrieve the batch of records to process from Dynamics 365. You could write .NET classes, as you did in the previous example, to process the data.
3.
A further improvement would be to use workflows. You can create on-demand workflows that hold the data processing logic, which will be invoked by the console application. Using workflows gives you a rich set of execution logs within Dynamics 365. This is similar to creating an Advanced Find view and executing the on-demand workflow against all the records in the results view.
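As an illustration of step 1, the stored query might be a Fetch XML query like the following, which retrieves active memberships expiring within the next month. The entity and attribute names here are assumptions, not the actual SBMA schema:

```xml
<fetch>
  <entity name="sbma_membership">
    <attribute name="sbma_membershipid" />
    <attribute name="sbma_member" />
    <attribute name="sbma_expirydate" />
    <filter type="and">
      <!-- memberships expiring within the next month -->
      <condition attribute="sbma_expirydate" operator="next-x-months" value="1" />
      <!-- active records only -->
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
```

Because the query lives in a Dynamics 365 record, the date range or filter conditions can be adjusted without recompiling or redeploying the console application.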
As you can see, if the business requires any batch processing, you can easily do it with Azure WebJobs by creating a console application that processes data. In the next section, you will be looking at another serverless Azure technology, Azure Functions.
Azure Functions with Dynamics 365
Dynamics 365 and Azure Functions integration in v9.0 enables various integration scenarios for your applications. As mentioned earlier in this chapter, Azure Functions is serverless, and its pay-per-use pricing makes it cost effective. Azure Functions comes with built-in capabilities to integrate with PowerApps, Logic Apps, and Microsoft Flow, giving power users a new set of building blocks to design and develop state-of-the-art solutions. You can do many amazing things in the context of Dynamics 365 integrations:
Export and import data from external applications or web services.
Get notifications from third-party services.
When there are heavy processes that should be performed, you can pass them to Azure Functions without disrupting the user.
You can connect to third-party endpoints such as payment gateways, making it easy to change the endpoint at any time without changing the internal logic.
The primary reason for using Azure Functions is to minimize the need to write plugins and custom workflow activities, which are difficult to debug. Also, using third-party libraries with plugins in Dynamics 365 online is not possible unless you have the source code of the third-party library and compile it into the plugin DLL. Instead, you can easily write an Azure function that references third-party libraries. In this section, we will look at processing credit card payments via Azure Functions. In the future, if the client wants to change the payment gateway, they can easily change the endpoint in the function rather than having to alter plugins or custom workflows.
Creating an Azure Function
First let’s create an Azure function. Click +Create Resource and in the Compute section you will find Function App. Provide all the necessary information and click Create. See Figure 6-9.
It will take a few seconds to create the app. Once the app is created, you will see its details. See Figure 6-10.
Next, let’s create the function that processes the payment. Click the + sign next to Functions in the left pane. Then you will be asked to select the template. For this example, you will trigger the function with HTTP triggers. Select API & Webhooks from the Scenario drop-down box. From the options available, select HTTP Trigger. See Figure 6-11.
When you select the trigger, you will be prompted to enter the name of the function and the authorization level. See Figure 6-12.
The script editor window will open, as shown in Figure 6-13. This is where you write the function. Please note that the example given here is a simple one; we will not go into the details of implementing the payment gateway because that is out of the scope of this book. This example will demonstrate that when the payment status is changed on the Dynamics 365 side, the function is triggered via JavaScript.
You can test the function by navigating to the Test tab in the right pane. After entering the JSON, click the Run button. Figure 6-14 shows the output: you can see the test result in the output window and the logs in the Logs window at the bottom of the screen. This will help you ensure your function is working and accepts the data as JSON.
Consuming the Azure Function
As the next step, you will have to create the JavaScript file, which will be used to call the function. But before that, you need to click the </> Get Function URL link next to the Run button and copy the URL with the key. See Figure 6-15.
Before leaving, you must configure cross-origin resource sharing (CORS) so that you can consume your function from Dynamics 365. Navigate back to the function app and select the Platform Features tab. As shown in Figure 6-16, click the CORS icon, and you will be prompted to enter the URL. Enter the Dynamics 365 instance URL and click Save.
Now it is time to write the JavaScript that calls the function from Dynamics 365.
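A hedged sketch of that script follows. The attribute names (sbma_paymentstatus, sbma_amount), the option value for Paid, and the helper names are illustrative assumptions, and FUNCTION_URL is a placeholder for the URL (including the key) copied from the Get Function URL link:

```javascript
// Placeholder: paste the function URL (including the key) here.
var FUNCTION_URL = "<function-url-with-key>";

// Builds the JSON payload sent to the function; separated out so the
// shape of the request is easy to see and test.
function buildPaymentPayload(paymentId, amount) {
    return JSON.stringify({ paymentId: paymentId, amount: amount });
}

// Intended to be registered on the OnChange event of the Payment Status
// field of the Member Payment form (v9 client API).
function onPaymentStatusChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var status = formContext.getAttribute("sbma_paymentstatus").getValue();
    if (status !== 100000001) {  // assumed option value for "Paid"
        return;
    }
    var req = new XMLHttpRequest();
    req.open("POST", FUNCTION_URL, true);
    req.setRequestHeader("Content-Type", "application/json");
    req.onreadystatechange = function () {
        if (req.readyState === 4 && req.status === 200) {
            // Surface the function's response as a form notification.
            formContext.ui.setFormNotification(
                "Payment sent for processing: " + req.responseText,
                "INFO", "paymentNotification");
        }
    };
    req.send(buildPaymentPayload(
        formContext.data.entity.getId(),
        formContext.getAttribute("sbma_amount").getValue()));
}
```

Upload the script as a web resource and register onPaymentStatusChange on the field’s OnChange event, passing the execution context.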
Now you need to combine the JavaScript code with the OnChange event of the Payment Status field of the Member Payment form. So, when you change the status to Paid, you will see that the form notification appears, meaning that the request has been successfully sent to the function and the function returned a success message. See Figure 6-17.
In terms of best practices, one other option to consume an Azure function is to register it as a webhook using the plugin registration tool. Open the plugin registration tool and connect to your Dynamics 365 instance. In the Register menu, click the Register New Web Hook option, as shown in Figure 6-18.
Specify the name of the webhook. In the Endpoint URL field, paste the URL copied earlier, without the code/key query string. For the authentication, select WebhookKey and paste in the key, as shown in Figure 6-19.
There’s one important thing to remember about the key here. For demonstration purposes, we have used the default key. From the Manage option in the Azure portal, you can have more than one key. Best of all, you can revoke or renew any key if you find that it has been compromised. Just click the “Add new function key” button. See Figure 6-20.
The screen will then display the settings for configuring the key. As instructed, leave the Key field blank to generate it automatically. See Figure 6-21.
Click the Save button, and your new key will be generated, which you can use with the webhook registration. See Figure 6-22.
As shown in Figure 6-22, you can renew and revoke the function keys. Moving back to the plugin registration tool step, when you click Save after entering the key, the webhook gets registered, and now you can register the steps on it.
This is a simple example of how you can communicate with the functions provided by the Azure Functions feature. You can use the concept to implement more complex scenarios. In the next section, we will look at another popular Azure integration, Azure Logic Apps.
Integrating Azure Logic Apps with Dynamics 365
Azure Logic Apps is Microsoft’s offering for enterprise-scale integration and workflow in the cloud. With it you can off-load heavy integrations easily, quickly, and at low cost. This service is part of Microsoft’s platform as a service (PaaS) offering. Even though you can develop custom workflow applications to meet your integration needs, you should consider Logic Apps because it provides an out-of-the-box retry policy that eliminates extra coding. Also, its true power comes from automatic scalability, eliminating the need for additional VMs.
Azure Logic Apps provides a wide variety of integration points, including Dynamics 365. You can learn more about these integration points at https://docs.microsoft.com/en-us/azure/connectors/apis-list. Logic Apps also has built-in triggers that kick off the workflow. For instance, an event such as receiving an e-mail or detecting a change in an Azure Storage account triggers the Logic Apps engine, which creates a new logic app instance that runs the workflow. Execution then moves to the actions, which are all the steps or tasks that should happen after the trigger. These actions can be managed connections, custom API calls via Azure Functions, or custom connections. Logic Apps also includes enterprise integration capabilities through BizTalk Server.
You can create apps in Logic Apps with Visual Studio, Visual Studio Code, and the Azure Portal.
Similar to Logic Apps, Microsoft Flow is becoming popular as the go-to workflow engine for Dynamics 365. We discussed this in Chapter 4. The following is a high-level comparison between Logic Apps and Microsoft Flow:
Logic Apps is more suitable for advanced developer integration scenarios, whereas Microsoft Flow is for self-service workflows that power users can create and use.
Unlike Microsoft Flow, Logic Apps involves a running cost; it uses a pay-per-usage costing model. Microsoft Flow, on the other hand, has a free tier with 750 flow executions per month. Flow Plan 1, a paid plan, increases this to 4,500 executions, and Flow Plan 2, known as the premium plan, increases it to 15,000 executions per month.
You can use the browser and Visual Studio to design Logic Apps workflows, but Microsoft Flow workflows can be designed in the browser only.
Microsoft Flow workflows can be easily designed and tested in a nonproduction environment and then promoted to the production environment. Logic Apps workflows can be source controlled, tested, automated, and managed via Azure Resource Manager.
Since Logic Apps uses the Azure Portal as its backbone, Logic Apps workflows have more management and monitoring options for administrators; troubleshooting is much easier as well.
Since both options have their own pros and cons, you may wonder when to use each technology. When you do not have an Azure subscription but must implement a workflow, the Microsoft Flow free tier is the go-to workflow engine. Microsoft Flow is part of the Power Platform, so you can easily manipulate Common Data Service data with the workflows it creates. Azure Logic Apps comes into action when you need more control over your development in complex, enterprise-scale integration scenarios. As mentioned, if you want automation and source control, you should go for Logic Apps.
Among various integration points, Azure Logic Apps comes with Dynamics 365 connectors, and in this section you will look at a scenario where connectors are used in relation to the SBMA membership application.
The Azure Logic Apps Solution
This scenario is again related to membership payments. In this application, there are two types of payments: credit card and direct debit. For direct debit payments, a list of payments must be sent to the bank on a given day for processing. So, for all the direct debit payments, the payment details, including the account number, bank code, amount, and account name, must be sent to the bank or the direct debit processing service. The bank will process the payments and send back a file containing the successful and failed payments. The process should pick up this file and update the Dynamics 365 solution accordingly.
You could write a custom workflow, but this is cumbersome and time-consuming, and you might end up spending more money just on the implementation. Let’s look at how you can implement this scenario in Azure Logic Apps. We will be using Visual Studio to create and publish the app. Let’s open the SBMA solution and add a new project. See Figure 6-24.
Select the Logic App template from the template list, and click OK, as shown in Figure 6-25.
Visual Studio will create the project; you can see that three files are created under the project. Right-click the LogicApp.json file and click Open with the Logic App Designer (Ctrl+L). You will have to log into your Azure subscription at this point. Once you’re logged in successfully, Visual Studio will open the Logic App properties window; click Create New in the Resource Group drop-down, provide the details of the resource group, and click OK. See Figure 6-26.
Visual Studio will load the LogicApp.json file, as shown in Figure 6-27. You can see that there are many built-in templates that you can use if they match up with your requirements.
Select the blank template from the list since you are going to create this solution from scratch. When the blank page is loaded, click +New Step, and you will see the dialog shown in Figure 6-28 where you can select the steps you want to create.
Since you are going to create a recurring process, search for the Recurrence step in the list, and configure it as shown in Figure 6-29. The step in the example is configured to execute every three minutes, but this can be changed per your requirements.
The next step is to get the pending direct debit payment records from Dynamics 365. There are a few Dynamics 365 actions to choose from. In this instance, you need to list all the records filtered by Payment Status and Payment Method, so select the “List records” step. When you add a Dynamics 365 step, you will have to connect it to your Dynamics 365 instance. Once you enter the credentials and have successfully connected, you will see the organization name and all the entities. We will be using the Member Payments entity. See Figure 6-30.
As shown in Figure 6-31, the filter query has been configured using an OData query. You can learn more about OData queries at https://docs.microsoft.com/en-us/rest/api/searchservice/odata-expression-syntax-for-azure-search. For simplicity, this example leaves the other settings at their defaults. Now that you have retrieved the data, the next step is to format it in a form accepted by the client’s direct debit service. Again, for simplicity, you will be creating a CSV-formatted file, but you can create any file type. To create it, you will use the Create CSV table step. The output from the previous step (the value) is used in the From field. In the Columns drop-down, if you select the Automatic option, all the columns will be added to the CSV table. To avoid that, select the Custom option, enter the column names, and pick each value in the Add Dynamic Content window.
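For reference, the OData filter configured in Figure 6-31 might look something like the following; the field names and option set values here are illustrative assumptions, not the actual SBMA schema:

```text
sbma_paymentstatus eq 100000000 and _sbma_paymentmethod_value eq 100000001
```

In the Dynamics 365 connector, option set fields are compared by their numeric values, and lookup fields are referenced with the _fieldname_value convention.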
When you click the Value column, the Add Dynamic Content window will be displayed on the side, as shown in Figure 6-32. All the columns are listed here, and you can pick only the ones you require.
Finally, you must save the result set to a location. Here, we will be saving the file to a common location in OneDrive. Add the Create File step and configure it as shown in Figure 6-33. Pay close attention to the File Name field where it adds a date and time to avoid duplicates.
For the file content, we have configured the step to use the output from the previous step. To make the file name unique, we have used a function to capture the date and time. When you click the File Name field, the Add Dynamic Content window will pop up; go to the Expression tab, which displays a plethora of functions, nicely categorized, so you can build any expression. See Figure 6-34.
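The file-name expression is along these lines (a sketch; the file-name prefix is an assumption):

```text
concat('DirectDebitPayments-', utcNow('yyyy-MM-dd-HHmmss'), '.csv')
```

Here concat joins the text fragments, and utcNow with a format string captures the current date and time, so each run produces a distinct file name.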
Now that you have developed the logic app, the finished solution will look like Figure 6-35.
Deploying the Project
Let’s publish the workflow. Right-click the logic app project and click the Deploy option; you will be prompted with the window shown in Figure 6-36. Note that in this window you will have to create a resource group, so make sure to log into your Azure account first.
Once the app is deployed successfully, you will be able to see the application under Azure Logic Apps. You can run it by clicking the Run button, and the execution details will be listed at the bottom of the window, which makes it easy to troubleshoot the workflow. See Figure 6-37.
Click one of the executions, and you will see the details of the execution where you can drill deep into each of the steps you added to the workflow. See Figure 6-38.
Figure 6-39 shows the file created in the given OneDrive location.
Summary
In this chapter, you looked at a few simple examples of setting up Azure WebJobs, Functions, and Logic Apps. In Chapter 7, you will learn about the reporting and dashboard capabilities of Dynamics 365.