
6. Azure Integration


In this chapter, we will discuss the Azure integrations that can be leveraged to implement fast and scalable solutions. We will focus on three Azure technologies that are useful in Dynamics 365 implementations: Azure WebJobs, Azure Functions, and Azure Logic Apps. Azure WebJobs is ideal for batch processing that runs behind the scenes. Since we are talking about Azure technologies in this chapter, we will also take a quick look at some security aspects. Microsoft Dynamics 365 for Customer Engagement leverages the cloud services infrastructure and its built-in security features to safeguard data. To safeguard access, it depends heavily on Azure Active Directory (AAD) to authenticate users and prevent unauthorized access to sensitive data. When it comes to licensing, you can buy a license for Dynamics 365, or you can provision the required infrastructure with Azure, in which case your cost is determined by usage. When leveraging the power of these technologies, you must take extra care to plan and design your solution. Extensive documentation is available at https://docs.microsoft.com/en-us/dynamics365/customer-engagement/admin/manage-subscriptions-licenses-user-accounts .

Logic Apps can be used to design more advanced workflows, and Azure Functions enables you to write functionality that processes data in a serverless architecture. With Logic Apps in particular, you can connect to many external applications, which we will look at in detail later in this chapter.

Azure WebJobs

First let’s look at Azure WebJobs, which is a feature of Azure App Service. You can use WebJobs to run processes behind the scenes, and the best thing about them is that you can schedule when they execute. Web jobs can be categorized as continuous or triggered. Continuous web jobs start immediately when the job is created; to keep running, the application typically executes within an infinite loop, and if it stops, you can restart it. Triggered web jobs start only on a schedule or when manually triggered.
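For a continuous job, the console application's entry point typically wraps the work in such a loop. The following is a minimal sketch only; the ProcessPendingWork method and the 60-second polling interval are illustrative and not part of the SBMA solution.
using System;
using System.Threading;
namespace ContinuousJobSample
{
  public class Program
  {
    public static void Main()
    {
      // Poll for work forever; the web job keeps this process alive.
      while (true)
      {
        try
        {
          ProcessPendingWork(); // your batch logic goes here
        }
        catch (Exception ex)
        {
          Console.WriteLine("Error: " + ex.Message);
        }
        Thread.Sleep(TimeSpan.FromSeconds(60)); // wait before the next poll
      }
    }
    private static void ProcessPendingWork()
    {
      Console.WriteLine("Processing pending work at " + DateTime.UtcNow);
    }
  }
}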

The following are the supported file types with Azure WebJobs:
  • PowerShell (.ps1)

  • Windows CMD (.cmd, .exe, .bat)

  • Bash (.sh)

  • Python (.py)

  • Node.js (.js)

  • Java (.jar)

  • PHP (.php)

Keep in mind that a web app can time out after 20 minutes of inactivity; the timer is reset only by requests to the actual web app. If your job is continuous or scheduled, enable Always On to make sure the application executes without interruption.

Writing Code for a Web Job

Let’s look at an example with the SBMA membership management solution. The stakeholders want the application to raise the yearly subscription automatically and on time. For instance, if a member’s membership is expiring a month from today, then a subscription and a payment record should be created behind the scenes. One option is to create a Windows service, but deploying it would require a server, either physical or virtual, and either way a server incurs additional maintenance costs that SBMA is not willing to pay. So, the remaining option is to create a console application that executes behind the scenes and creates the necessary subscription records. Compared to Windows services, the best thing about the WebJobs feature is that you can deploy a plain console application, which means debugging and troubleshooting can be done before deploying to the cloud, whereas Windows services require much more complex debugging.

This section lists the code used to create the application. Listing 6-1 shows the Main program that connects to the Dynamics 365 instance and calls the logic to create the subscription. Remember, you should create only the subscription here; the relevant payment record will be created by the custom workflow activity from Chapter 5.
using System;
using System.Configuration;
using System.Net;
using System.Reflection;
using System.ServiceModel.Description;
using log4net;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Tooling.Connector;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using SBMA.ServiceProcessor.BusinessProcessLayer;
namespace SBMA.ServiceProcessor
{
  public class Program
  {
    public static readonly ILog log = LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);
    public static void Main(string[] args)
    {
      ExecuteLogic();
      Console.ReadLine();
    }
    /// <summary>
    /// Connecting to Dynamics 365 instance
    /// </summary>
    public static void ExecuteLogic()
    {
      try
      {
        log.Info("Connecting to Dynamics 365 instance...");
        using (CrmServiceClient crmConnection = new CrmServiceClient
                  (ConfigurationManager.ConnectionStrings["Xrm"].ConnectionString))
        {
        log.Info(crmConnection.IsReady);
        if (crmConnection.IsReady)
        {
          log.Info("Application connected to server successfully.");
        }
        IOrganizationService service = (IOrganizationService)crmConnection;
        CrmServiceContext context = new CrmServiceContext(service);
         if (service != null)
        {
          Guid orgId = ((WhoAmIResponse)service.Execute
                              (new WhoAmIRequest())).OrganizationId;
          if (orgId != Guid.Empty)
          {
            log.Info("Connection established successfully.");
           }
          // Call the process to create the subscriptions
          SubscriptionProcessor subscriptionProcessor = new SubscriptionProcessor();
          subscriptionProcessor.CreateRenewalSubscriptionProcess(service);
         }
         else
         {
           Console.WriteLine("Connection failed...");
         }
        }
      }
      catch (Exception ex)
      {
            log.Error("EXCEPTION: ", ex);
      }
    }
  }
}
Listing 6-1

Main Program That Calls the Process to Create the Subscriptions

Note

As you can see, there is a reference to log4net, which is handy when testing the code. Also, when you publish the application as a web job, you can view the details of its execution in the logs, and this information is really valuable. The following link will guide you through setting up log4net: https://stackify.com/log4net-guide-dotnet-logging/ .

The recommended approach for connecting to a Dynamics 365 instance from an external application like this is to use the Microsoft.Xrm.Tooling.Connector library, which you can download from NuGet. As shown in Listing 6-1, to establish the connection, you pass the connection string read from the configuration file to the CrmServiceClient constructor, which is then used to instantiate the organization service. When you create the connection string, you must make sure you set the SkipDiscovery property to false, as shown in Listing 6-2. If not, the application will throw an exception similar to “Object of type ‘Microsoft.Xrm.Sdk.Entity’ cannot be converted to type ‘SBMA.ServiceProcessor.Account’.” This is because the service does not know how to deserialize the results into the proper early-bound object. You can find more information at https://community.dynamics.com/crm/b/bettercrm/archive/2018/11/28/dynamics-365-tooling-object-of-type-microsoft-xrm-sdk-entity-cannot-be-converted-to-type-type .

<connectionStrings>
    <add name="Xrm"
         connectionString="AuthType=Office365;
                           Username=user@orgname.onmicrosoft.com;
                           Password=***************;
                           Url=https://orgname.crm6.dynamics.com;
                           SkipDiscovery=false;" />
</connectionStrings>
Listing 6-2

Dynamics 365 Connection String

Note

For demonstration purposes, we have hard-coded the credentials to connect to the Dynamics 365 instance, but the best practice is to use the Azure Key Vault. Please refer to the following URL for more information: https://docs.microsoft.com/en-us/azure/key-vault/key-vault-use-from-web-application .
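As a rough illustration only, the following sketch reads a secret (for example, the Dynamics 365 connection string) from Key Vault using the Microsoft.Azure.KeyVault and Microsoft.Azure.Services.AppAuthentication NuGet packages; the vault URL and secret name are placeholders for your own values.
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.Services.AppAuthentication;
namespace SBMA.ServiceProcessor
{
  public static class KeyVaultHelper
  {
    // Reads a secret from Azure Key Vault using the app's managed identity
    // (or the developer's credentials when running locally).
    public static async Task<string> GetSecretAsync(string secretName)
    {
      var tokenProvider = new AzureServiceTokenProvider();
      var client = new KeyVaultClient(
          new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));
      var secret = await client.GetSecretAsync(
          "https://your-vault-name.vault.azure.net/", secretName);
      return secret.Value;
    }
  }
}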

A separate layer holds all the data access code. Listing 6-3 shows the code that reads from and writes to Dynamics 365.
using Microsoft.Xrm.Sdk;
using System;
using System.Collections.Generic;
using System.Linq;
namespace SBMA.ServiceProcessor.DataAccessLayer
{
  public class SubscriptionDataAccess
  {
    /// <summary>
    /// Get members renewing in 30 days from today
    /// </summary>
    /// <param name="crmServiceContext"></param>
    /// <returns></returns>
    public List<Account> GetMembersExpiringIn30Days
                                    (CrmServiceContext crmServiceContext)
    {
        var accountList = from a in crmServiceContext.AccountSet
                  where a.sbma_MembershipRenewalDate.Value ==
                                              DateTime.Today.AddDays(30.0)
                   select new Account
                   {
                      AccountId = a.AccountId,
                      AccountNumber = a.AccountNumber,
                      sbma_MembershipTypeId = a.sbma_MembershipTypeId
                   };
        return accountList.ToList();
    }
    /// <summary>
    /// Check for unpaid subscriptions
    /// </summary>
    /// <param name="crmServiceContext"></param>
    /// <param name="accountNumber"></param>
    /// <returns></returns>
    public List<sbma_membersubscription>
      GetUnpaidSubscriptions(CrmServiceContext crmServiceContext, Guid accountNumber)
    {
      var subscriptionList = from s in
                              crmServiceContext.sbma_membersubscriptionSet
                            where s.sbma_MemberId.Id == accountNumber &&
                               s.sbma_SubscriptionStatus.Value ==
                       (int)sbma_membersubscription_sbma_SubscriptionStatus.Pending
                            select new sbma_membersubscription
                            {
                               sbma_membersubscriptionId = s.sbma_membersubscriptionId,
                               sbma_MemberId = s.sbma_MemberId
                            };
        return subscriptionList.ToList();
    }
    /// <summary>
    /// Create the member subscription
    /// </summary>
    /// <param name="organizationService"></param>
    /// <param name="membersubscription"></param>
    public void CreateRenewalMemberSubscription(IOrganizationService
         organizationService, sbma_membersubscription membersubscription)
    {
        organizationService.Create(membersubscription);
    }
    /// <summary>
    /// Get membership type details of the new member
    /// </summary>
    /// <param name="crmServiceContext"></param>
    /// <param name="membershipTypeId"></param>
    /// <returns></returns>
    public sbma_membershiptype GetMembershipTypeOfMember(CrmServiceContext
                                                       crmServiceContext,
                                                       Guid membershipTypeId)
    {
        var membershiptype = (from mt in
                              crmServiceContext.sbma_membershiptypeSet.Where
                            (m => m.sbma_membershiptypeId == membershipTypeId)
                             select mt).FirstOrDefault();
        return (sbma_membershiptype)membershiptype;
    }
  }
}
Listing 6-3

Data Access Layer

Listing 6-4 shows the logic that is called at the main program. It will manipulate the data access methods and create the member subscription record.
using System;
using System.Collections.Generic;
using System.Reflection;
using log4net;
using Microsoft.Xrm.Sdk;
using SBMA.ServiceProcessor.DataAccessLayer;
namespace SBMA.ServiceProcessor.BusinessProcessLayer
{
  public class SubscriptionProcessor
  {
    private static readonly ILog log = LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);
    public void CreateRenewalSubscriptionProcess(IOrganizationService organizationService)
    {
      using (CrmServiceContext crmServiceContext = new CrmServiceContext(organizationService))
      {
        SubscriptionDataAccess subscriptionDataAccess = new SubscriptionDataAccess();
        // Get the list of accounts whose subscription is expiring in 30 days
        log.Info("Get members renewing in 30 days...");
        List<Account> accountsList =
            subscriptionDataAccess.GetMembersExpiringIn30Days(crmServiceContext);
        // Execute only if there are any records; otherwise do nothing
        if (accountsList.Count > 0)
        {
          log.Info("Processing " + accountsList.Count + " members...");
          for (int i = 0; i < accountsList.Count; i++)
          {
            // For each account, check whether there are any unpaid subscriptions
            // for the period. Create a subscription only if there are none.
            List<sbma_membersubscription> membersubscriptions =
                subscriptionDataAccess.GetUnpaidSubscriptions(
                    crmServiceContext, accountsList[i].AccountId.Value);
            if (membersubscriptions.Count == 0)
            {
              log.Info("Creating member subscription for member: " +
                  accountsList[i].AccountId.Value);
              sbma_membershiptype membershiptype =
                  subscriptionDataAccess.GetMembershipTypeOfMember(
                      crmServiceContext, accountsList[i].sbma_MembershipTypeId.Id);
              // Set the member subscription properties
              sbma_membersubscription membersubscription = new sbma_membersubscription
              {
                // Set the entity reference to the Member record
                sbma_MemberId = new EntityReference(
                    accountsList[i].LogicalName, accountsList[i].AccountId.Value),
                // Set the entity reference to the Membership Type record
                sbma_MembershipTypeId = new EntityReference(
                    membershiptype.LogicalName, membershiptype.sbma_membershiptypeId.Value),
                // Set the subscription due date
                sbma_SubscriptionDueDate = DateTime.Now.AddDays(7.0),
                // Set the subscription status to Pending
                sbma_SubscriptionStatus = new OptionSetValue(
                    (int)sbma_membersubscription_sbma_SubscriptionStatus.Pending)
              };
              // Create the subscription
              subscriptionDataAccess.CreateRenewalMemberSubscription(
                  organizationService, membersubscription);
              log.Info("Subscription created successfully.");
            }
          }
        }
      }
    }
  }
}
Listing 6-4

Business Logic Implementation

When all the coding is complete, build the application. Navigate to the bin folder and zip the contents of the folder; you will upload this zip to Azure WebJobs. The zip file must contain all the referenced .dlls and config files. Now let’s create the Azure web job.

Creating a Web Job

To create a web job, you must first create a web app under Azure App Service. Provide the relevant details and click Create. See Figure 6-1.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig1_HTML.jpg
Figure 6-1.

Creating a web app

Once the web app is created successfully, you will see it listed as an app service. See Figure 6-2.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig2_HTML.jpg
Figure 6-2.

App service created

Navigate inside the web app created and select WebJobs in the left pane, as illustrated in Figure 6-3.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig3_HTML.jpg
Figure 6-3.

WebJobs in left pane

Click +Add to add the web job and provide the details. You must provide a unique name and upload the zip file you created. Set the type to Triggered and the type of trigger to Scheduled. One unique thing about scheduled triggers is that the interval must be provided as a CRON expression. You can learn about CRON expressions at https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer#cron-expressions . Figure 6-4 shows the settings for the web job. For demonstration purposes, the CRON expression is set to every two minutes. Ideally, this type of batch job should be executed daily. For instance, you could enter 0 30 23 * * *; the six fields represent second, minute, hour, day, month, and day of week, so this expression means the process will be executed at 11:30 p.m. every night.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig4_HTML.jpg
Figure 6-4.

Web job settings

After configuring these settings, you can click OK, which will create the web job, as shown in Figure 6-5. You can see that Status is set to Ready.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig5_HTML.jpg
Figure 6-5.

Web job created and ready to execute

Select the web job and click Run to start the job, and the status will change to Running. See Figure 6-6.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig6_HTML.jpg
Figure 6-6.

Web job running

To view the progress of the web job and troubleshoot any issues, click Logs in the toolbar. All the messages written through log4net are also shown on the console. See Figure 6-7.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig7_HTML.jpg
Figure 6-7.

Web job execution status

As mentioned earlier in this section, to make the application execute without interruption, you must enable the Always On option in the “General settings” section of the web app that hosts the job. Keep in mind that this involves an additional cost. See Figure 6-8.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig8_HTML.jpg
Figure 6-8.

Enabling Always On for the web job

An improved version of this solution would be to use Fetch XML queries to select the data required for processing. The benefit of using Fetch XML queries is that when the business wants simple changes to the query, you can easily alter the Fetch XML without any development work or new releases. The following are the high-level steps to create such a solution:
  1. Create an entity in Dynamics 365 that holds the Fetch XML query. Include a couple of unique fields to identify the query, such as which method and which process will use it.

  2. Have the console application use the Fetch XML query to retrieve the batch of records to process from Dynamics 365. You can write .NET classes, as in the previous example, to process the data (see the sketch at the end of this section).

  3. A further improvement would be to use workflows. You can create on-demand workflows that hold the data-processing logic and invoke them from the console application. Using workflows gives you a rich set of execution logs within Dynamics 365. This is similar to creating an Advanced Find view and executing an on-demand workflow against all the records in the results view.

As you can see, if the business requires any batch processing, you can easily handle it with Azure WebJobs by creating a console application that processes the data; a minimal sketch of the Fetch XML retrieval mentioned in step 2 follows. In the next section, you will look at another serverless Azure technology, Azure Functions.
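The following is a minimal sketch of the retrieval described in step 2. The Fetch XML is hard-coded here for brevity; in the improved design it would be read from the configuration entity, and the attribute names shown are assumptions based on the earlier examples.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
namespace SBMA.ServiceProcessor.DataAccessLayer
{
  public class FetchXmlDataAccess
  {
    // Executes a Fetch XML query against Dynamics 365 and returns the matching records.
    public EntityCollection GetAccountsToProcess(IOrganizationService service)
    {
      string fetchXml = @"
        <fetch>
          <entity name='account'>
            <attribute name='accountid' />
            <attribute name='accountnumber' />
            <filter>
              <condition attribute='sbma_membershiprenewaldate'
                         operator='next-x-days' value='30' />
            </filter>
          </entity>
        </fetch>";
      return service.RetrieveMultiple(new FetchExpression(fetchXml));
    }
  }
}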

Azure Functions with Dynamics 365

Dynamics 365 and Azure Functions integration in v9.0 enables various integration scenarios for your applications. As mentioned earlier in this chapter, Azure Functions is serverless, and its pay-per-use pricing makes it cost effective. Azure Functions comes with built-in capabilities to integrate with PowerApps, Logic Apps, and Microsoft Flow, providing power users with a new set of building blocks for designing and developing state-of-the-art solutions. You can do many things in the context of Dynamics 365 integrations:
  • Export and import data from external applications or web services.

  • Get notifications from third-party services.

  • When there are heavy processes that should be performed, you can pass them to Azure Functions without disrupting the user.

  • You can connect to third-party endpoints such as payment gateways, making it easy to change the endpoint at any time without changing the internal logic.

The primary reason for using Azure Functions is to minimize the need to write plugins and custom workflows, which are difficult to debug. Also, using third-party libraries with plugins in Dynamics 365 online is not possible unless you have the source code of the third-party library and compile it into the plugin DLL, whereas an Azure function can reference third-party libraries easily. In this section, we will look at processing credit card payments via Azure Functions. In the future, if the client wants to change the payment gateway, they can easily change the endpoint in the function rather than having to alter plugins or custom workflows.

Creating an Azure Function

First let’s create an Azure function. Click +Create Resource and in the Compute section you will find Function App. Provide all the necessary information and click Create. See Figure 6-9.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig9_HTML.jpg
Figure 6-9.

Creating an Azure Functions app

It will take a few seconds to create the app. Once the app is created, you will see its details. See Figure 6-10.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig10_HTML.jpg
Figure 6-10.

Azure Functions app details

Next, let’s create the function that processes the payment. Click the + sign next to Functions in the left pane. You will then be asked to select a template. For this example, you will trigger the function with an HTTP trigger. Select API & Webhooks from the Scenario drop-down box and, from the options available, select HTTP Trigger. See Figure 6-11.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig11_HTML.jpg
Figure 6-11.

Selecting the function template

When you select the trigger, you will be prompted to enter the name of the function and the authorization level. See Figure 6-12.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig12_HTML.jpg
Figure 6-12.

Function properties

The script editor window will open, as shown in Figure 6-13. This is where you write the function. Please note that the example given here is a simple one, and we will not go into the details of implementing the payment gateway because that is out of the scope of this book. This example demonstrates that when the payment status is changed on the Dynamics 365 side, the function is triggered via JavaScript.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig13_HTML.jpg
Figure 6-13.

Function editor window

Replace the default code with the following code. Please note that you can create this function using Visual Studio and publish it directly to your Azure subscription. For more information, visit https://docs.microsoft.com/en-us/azure/azure-functions/functions-develop-vs and https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio . To keep things simple, this example will be created using the default browser-based editor.
#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
  log.LogInformation("Credit Card Payment request received...");
  string cardHoldersName, expiryDate, cardNumber, amount;
  cardHoldersName = req.Query["cardHoldersName"];
  expiryDate = req.Query["expiryDate"];
  cardNumber = req.Query["cardNumber"];
  amount = req.Query["amount"];
  string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
  dynamic data = JsonConvert.DeserializeObject(requestBody);
  cardHoldersName = cardHoldersName ?? data?.cardHoldersName;
  expiryDate = expiryDate ?? data?.expiryDate;
  cardNumber = cardNumber ?? data?.cardNumber;
  amount = amount ?? data?.amount;
  // Payment Gateway Implementation
  if(cardHoldersName == null)
  {
      return new BadRequestObjectResult
      ("Please pass a name on the query string or in the request body");
  }
  return (ActionResult)new OkObjectResult
  ($"Payment processed successfully: {cardHoldersName} - {cardNumber}");
}
You can test the function by navigating to the Test tab in the right pane. After entering the JSON, click the Run button. Figure 6-14 shows the output. You can see the test result in the output window and the logs in the Logs window at the bottom of the screen. This will help you ensure your function is working and accepts the data as JSON.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig14_HTML.jpg
Figure 6-14.

Testing the function
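When testing, the request body only needs to contain the properties the function reads; a purely illustrative example is {"cardHoldersName": "Jane Citizen", "expiryDate": "12/22", "cardNumber": "4111111111111111", "amount": "150"}. With such a body, the function should respond with the success message, echoing the card holder's name and card number.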

Consuming the Azure Function

As the next step, you will have to create the JavaScript file, which will be used to call the function. But before that, you need to click the </> Get Function URL link next to the Run button and copy the URL with the key. See Figure 6-15.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig15_HTML.jpg
Figure 6-15.

Copy function URL and the key

Before leaving, you must configure cross-origin resource sharing (CORS) so that you can consume your function from Dynamics 365. Navigate back to the function app and select the Platform Features tab. As shown in Figure 6-16, click the CORS icon, and you will be prompted to enter the URL. Enter your Dynamics 365 instance URL and click Save.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig16_HTML.jpg
Figure 6-16.

Configuring CORS

Now it is time to enter the JavaScript that calls the function. The following JavaScript code invokes the function:
(function (SBMA)
{
 //constants
  var Constants = function () {
      this.CALLING_MODULE = "sbma_MemberPayments.js";
      this.AZURE_BASE_ENDPOINT = "https://sbmacreditcardprocessing.azurewebsites.net/api/";
      this.AZURE_FUNCTION_ENDPOINT = "ProcessCreditCardPayment?code=ISCPrJwBXdpk8bbTI8Y8yBYIbawj9cNPo8Cg/DotanIAnrb9XEepoQ==";
      this.FORM_TYPE_UPDATE = 2;
      this.SUCCESS_MESSAGE = "Payment processed successfully.";
      this.FAILURE_MESSAGE = "Payment failed.";
      return this;
  }();
  var formContext = null;
  SBMA.processCCPayment = function (executionContext) {
      formContext = executionContext.getFormContext();
      var formType = formContext.ui.getFormType();
      var fieldValue = formContext.getAttribute("sbma_paymentstatus").getValue();
      alert(fieldValue);
      if (formType === Constants.FORM_TYPE_UPDATE) {
        if (fieldValue === 646150001) {
          var ccPayment = {
              cardHoldersName:
                formContext.getAttribute("sbma_nameoncard").getValue(),
              expiryDate:
                formContext.getAttribute("sbma_cardexpirydate").getValue(),
              cardNumber:
                formContext.getAttribute("sbma_ccnumber").getValue(),
              amount: formContext.getAttribute("sbma_amountdue").getValue()
        };
        executeAzureFunction(ccPayment, paymentSuccessHandler, paymentFailureHandler);
       }
      }
  };
  var paymentSuccessHandler = function (response) {
      formContext.ui.setFormNotification(Constants.SUCCESS_MESSAGE, "INFO", null);
  };
  var paymentFailureHandler = function (response) {
      formContext.ui.setFormNotification(Constants.FAILURE_MESSAGE, "ERROR", null);
  };
  var executeAzureFunction = function (payment, successHandler, failureHandler) {
      var endpoint = Constants.AZURE_BASE_ENDPOINT + Constants.AZURE_FUNCTION_ENDPOINT;
      var req = new XMLHttpRequest();
      req.open("POST", endpoint, true);
      req.setRequestHeader("Accept", "application/json");
      req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
      req.setRequestHeader("OData-MaxVersion", "4.0");
      req.setRequestHeader("OData-Version", "4.0");
      req.onreadystatechange = function () {
            if (this.readyState === 4) {
                  req.onreadystatechange = null;
                  if (this.status === 200) {
                        successHandler(JSON.parse(this.response));
                  }
                  else {
                        failureHandler(JSON.parse(this.response).error);
                  }
            }
      };
      req.send(window.JSON.stringify(payment));
};
})(window.AzureServicesLib = window.AzureServicesLib || {});
Now you need to attach the JavaScript function to the OnChange event of the Payment Status field on the Member Payment form. When you change the status to Paid, the form notification appears, meaning that the request was successfully sent to the function and the function returned a success message. See Figure 6-17.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig17_HTML.jpg
Figure 6-17.

OnChange event triggered the function and the success message

In terms of best practices, another option for consuming an Azure function is to register it as a webhook using the plugin registration tool. Open the plugin registration tool and connect to your Dynamics 365 instance. In the Register menu, click the Register New Web Hook option, as shown in Figure 6-18.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig18_HTML.jpg
Figure 6-18.

Registering a new webhook with the plugin registration tool

Specify the name of the webhook. In the Endpoint URL field, paste the URL you copied earlier, without the code/key query parameter. For the authentication, select WebhookKey and paste in the key, as shown in Figure 6-19.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig19_HTML.jpg
Figure 6-19.

Webhook settings

There’s one important thing to remember about the key. For demonstration purposes, we have used the default key, but from the Manage option in the Azure portal you can create more than one key. Best of all, you can revoke or renew any key if you find out that it has been compromised. Just click the “Add new function key” button. See Figure 6-20.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig20_HTML.jpg
Figure 6-20.

Adding a new function key option

The screen will then display the settings for configuring the key. As instructed, leave the Key field blank to generate the value automatically. See Figure 6-21.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig21_HTML.jpg
Figure 6-21.

New function key settings

Click the Save button, and your new key will be generated, which you can use with the webhook registration. See Figure 6-22.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig22_HTML.jpg
Figure 6-22.

New function key generated

As shown in Figure 6-22, you can renew and revoke the function keys. Moving back to the plugin registration tool, when you click Save after entering the key, the webhook is registered, and you can then register steps on it, as shown in Figure 6-23.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig23_HTML.jpg
Figure 6-23.

Registering the new step for a webhook
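When a step registered on this webhook fires, Dynamics 365 posts the execution context of the operation (a serialized RemoteExecutionContext) as JSON in the request body. The following is a minimal sketch, separate from the payment example above, of how a function could inspect that payload; MessageName and PrimaryEntityName are standard properties of the execution context, and the deserialization approach shown is just one possible way to handle it.
#r "Newtonsoft.Json"
using Microsoft.AspNetCore.Mvc;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    // The webhook body is the serialized execution context of the triggering step.
    string body = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic context = JsonConvert.DeserializeObject(body);
    // Log which message and entity triggered the call, then acknowledge it.
    log.LogInformation($"Webhook fired by {context?.MessageName} on {context?.PrimaryEntityName}");
    return new OkResult();
}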

This is a simple example of how you can communicate with Azure Functions from Dynamics 365. You can use the same concept to implement more complex scenarios. In the next section, we will look at another popular Azure integration, Azure Logic Apps.

Integrating Azure Logic Apps with Dynamics 365

Azure Logic Apps is Microsoft’s offering for enterprise-scale integration and workflow in the cloud. With it, you can off-load heavy integrations easily, quickly, and at a low cost. The service is part of Microsoft’s platform as a service (PaaS) offering. Even though you can develop custom workflow applications to meet your integration needs, you should consider Logic Apps because it provides an out-of-the-box retry policy that eliminates extra coding. Its true power comes with automatic scalability, eliminating the need to provision additional VMs.

Azure Logic Apps provides a vast range of integration points, including Dynamics 365. You can learn more about these integration points at https://docs.microsoft.com/en-us/azure/connectors/apis-list . Logic Apps has built-in triggers that kick off a workflow. For instance, receiving an e-mail or detecting a change in an Azure Storage account can trigger the Logic Apps engine, which creates a new logic app instance that runs the workflow. Execution then moves to the actions, which are all the steps or tasks that should happen after the trigger. These actions can be managed connectors, custom API calls via Azure Functions, or custom connectors. Logic Apps also includes enterprise integration capabilities through BizTalk Server.

Similar to Logic Apps, Microsoft Flow is becoming popular as the go-to workflow engine for Dynamics 365; we discussed it in Chapter 4. The following is a high-level comparison of Logic Apps and Microsoft Flow:
  • Logic Apps is more suitable for advanced developer integration scenarios, whereas Microsoft Flow is for self-service workflows that power users can create and use.

  • Unlike Microsoft Flow, Logic Apps involves a running cost; it uses a pay-per-usage pricing model. Microsoft Flow, on the other hand, has a free tier with 750 flow executions per month. Flow Plan 1, a paid plan, increases this to 4,500 executions, and Flow Plan 2, known as the premium plan, increases it to 15,000 executions per month.

  • You can use the browser and Visual Studio to design Logic Apps workflows, but Microsoft Flow workflows can be designed in the browser only.

  • Microsoft Flow workflows can be easily designed and tested in a nonproduction environment and then promoted to the production environment. Logic Apps workflows can be source controlled, tested, automated, and managed via Azure Resource Manager.

  • Since Logic Apps uses the Azure Portal as its backbone, Logic Apps workflows have more management and monitoring options for administrators; troubleshooting is much easier as well.

Since both options have their own pros and cons, you might wonder when to use each technology. When you do not have an Azure subscription but must implement a workflow, the Microsoft Flow free tier is the go-to workflow engine. Microsoft Flow is part of the Power Platform, so you can easily manipulate data in the Common Data Service with flows. Azure Logic Apps comes into play when you need more control over your development in more complex, enterprise-scale integration scenarios. As mentioned, if you want automation and source control, you should go for Logic Apps.

Among various integration points, Azure Logic Apps comes with Dynamics 365 connectors, and in this section you will look at a scenario where connectors are used in relation to the SBMA membership application.

The Azure Logic Apps Solution

This scenario is again related to membership payments. In this application, there are two types of payments: credit card and direct debit. For direct debit payments, a list of payments must be sent to the bank on a given day for processing. So, for all the direct debit payments, the payment details, including the account number, bank code, amount, and account name, must be sent to the bank or a direct debit processing service. The bank processes the payments and sends back a file that contains the successful and failed payments. The process should pick up this file and update the Dynamics 365 solution accordingly.

You could write a custom workflow, but this is cumbersome and time-consuming, and you might end up spending more money just on the implementation. Let’s look at how to implement this scenario in Azure Logic Apps. We will use Visual Studio to create and publish the app. Open the SBMA solution and add a new project. See Figure 6-24.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig24_HTML.jpg
Figure 6-24.

Selecting the project type

Select the Logic App template from the template list, and click OK, as shown in Figure 6-25.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig25_HTML.jpg
Figure 6-25.

Selecting the Logic App template

Visual Studio will create the project; you can see that three files are created under the project. Right-click the LogicApp.json file and click Open With Logic App Designer (Ctrl+L). You will have to log into your Azure subscription at this point. Once you’re logged in successfully, Visual Studio will open the Logic App Properties window; click Create New in the Resource Group drop-down, provide the details of the resource group, and click OK. See Figure 6-26.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig26_HTML.jpg
Figure 6-26.

Logic App Properties window

Visual Studio will load the LogicApp.json file, as shown in Figure 6-27. You can see that there are many built-in templates that you can use if they match up with your requirements.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig27_HTML.jpg
Figure 6-27.

Logic Apps, selecting templates

Select the blank template from the list since you are going to create this solution from scratch. When the blank page is loaded, click +New Step, and you will see the dialog shown in Figure 6-28 where you can select the steps you want to create.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig28_HTML.jpg
Figure 6-28.

Selecting steps

Since you are going to create a recurring process, search for the Recurrence step in the list and configure it as shown in Figure 6-29. The step in the example is configured to execute every three minutes, but it can be changed per the requirements.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig29_HTML.jpg
Figure 6-29.

Configuring the recurring step

The next step is to get the pending direct debit payment records from Dynamics 365. There are a few Dynamics 365 actions you can choose from. For this scenario, you need to list all the records filtered by Payment Status and Payment Method, so select the “List records” step. When you add a Dynamics 365 action, you will have to connect the step to your Dynamics 365 instance. Once you enter the credentials and have successfully connected to the instance, you will see the organization name and all the entities. We will be using the Member Payments entity. See Figure 6-30.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig30_HTML.jpg
Figure 6-30.

Configuring Dynamics 365 list step

Note

As with Flow, Logic Apps also has connectors and triggers for the Common Data Service. You can find more information about the Common Data Service at https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/data-platform-intro .

As shown in Figure 6-31, the filter query has been configured using an OData expression. You can learn more about OData queries at https://docs.microsoft.com/en-us/rest/api/searchservice/odata-expression-syntax-for-azure-search . For simplicity, this example leaves the other settings at their defaults. Now that you have retrieved the data, the next step is to format it into a form accepted by the client’s direct debit service. Again, for simplicity, you will create a CSV-formatted file, but you could create any file type. To create the CSV file, use the Create CSV Table step. The output of the previous step, the value, is used in the From field. In the Columns drop-down, if you select the Automatic option, all the columns will be added to the CSV table. To avoid that, select the Custom option, enter the column names, and pick the values in the Add Dynamic Content window.
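For example, assuming Payment Status and Payment Method are option sets, the filter would look something like sbma_paymentmethod eq 100000001 and sbma_paymentstatus eq 100000000. The attribute names and option set values here are purely illustrative; substitute the ones defined in your own solution.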
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig31_HTML.jpg
Figure 6-31.

Creating a CSV table

When you click the Value column, the Add Dynamic Content window is displayed at the side, as shown in Figure 6-32. You can see that all the columns are listed here, and you can pick only the columns you require.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig32_HTML.jpg
Figure 6-32.

“Dynamic content” window

Finally, you must save the result set to a location. Here, we will save the file to a common location in OneDrive. Add the Create File step and configure it as shown in Figure 6-33. Pay close attention to the File Name field, where a date and time is appended to avoid duplicates.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig33_HTML.jpg
Figure 6-33.

Configuring OneDrive, the create file step

For the file content, we have configured the step to use the output from the previous step. To make the file name unique, we have used a function that captures the date and time. When you click the File Name field, the Add Dynamic Content window will pop up. Go to the Expression tab, which lists a wide range of functions, nicely categorized, that you can use to build any expression. See Figure 6-34.
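For example, an expression such as utcNow('yyyyMMddHHmmss') returns the current UTC date and time in a compact format, so a file name built as DirectDebitPayments-@{utcNow('yyyyMMddHHmmss')}.csv (the prefix is illustrative) is unique for each run.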
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig34_HTML.jpg
Figure 6-34.

Dynamic expressions

Now that you have developed the logic app, the finished solution will look like Figure 6-35.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig35_HTML.jpg
Figure 6-35.

Completed workflow

Deploying the Project

Let’s publish the workflow. Right-click the logic app project and click the Deploy option; you will be prompted with the window shown in Figure 6-36. Also note that, in this window, you will have to select or create a resource group, so make sure to log into your Azure account first.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig36_HTML.jpg
Figure 6-36.

Logic app deploy settings

Once the app is deployed successfully, you will be able to see the application under Azure Logic Apps. You can run the application by clicking the Run button, and the application’s execution history will be listed at the bottom of the window, which makes it easy to troubleshoot the workflow. See Figure 6-37.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig37_HTML.jpg
Figure 6-37.

Logic Apps app details on Azure Portal

Click one of the executions, and you will see the details of the execution where you can drill deep into each of the steps you added to the workflow. See Figure 6-38.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig38_HTML.jpg
Figure 6-38.

Execution steps and details

Figure 6-39 shows the file created in the given OneDrive location.
../images/471991_1_En_6_Chapter/471991_1_En_6_Fig39_HTML.jpg
Figure 6-39.

Logic Apps app output created at the provided OneDrive location

Summary

In this chapter, you looked at a few simple examples of setting up Azure WebJobs, Functions, and Logic Apps. In Chapter 7, you will learn about the reporting and dashboard capabilities of Dynamics 365.
