6

API Data Access

So far in this book, we've worked with static data that is hardcoded directly into the TripLog app itself. However, in the real world, it is rare that an app depends purely on local static data—most mobile apps get their data from a remote data source, typically an API. In some cases, an app may communicate with a third-party API—that of a social network, for example. Alternatively, developers sometimes create their own API to make data available for their apps. In this chapter, we will create a simple API in the cloud that we can connect to and retrieve data from in the TripLog app.

The following is a quick look at what we will cover in this chapter:

  • Creating a live, cloud-based, backend API to store and retrieve TripLog data
  • Creating a data access service that handles communication with the API for the app
  • Setting up data caching so that the TripLog app works offline

Let's start by creating an API using Microsoft's Azure Function App service.

Creating an API with Azure Functions

Almost all mobile apps communicate with an API to retrieve and store information. In many cases, as a mobile app developer, you might just have to use an API that already exists. However, if you're building your own product or service, you may need to create your own backend and web API.

There are several ways you can create an API, as well as several places you can host it, and certainly many different languages you can develop it in. For the purposes of this book, we will create a backend service and web API in the cloud using an Azure Function bound to Azure Table storage.

Azure Functions have a lot of capability and serve as a powerful "serverless" compute platform for numerous scenarios. You can create functions in Visual Studio or directly in the Azure portal and you can choose from .NET Core, Node.js, and several other runtime stacks. Since the primary focus of this book is developing a mobile app, I won't go too deep in explaining all the ins and outs of Azure Functions. In this section, we'll just cover the basics needed to create a simple API to which we can connect our app later in this chapter.

For more information about Azure Functions, visit https://azure.microsoft.com/en-us/services/functions/

In order to follow along with the steps in this chapter, you'll need to have an Azure account. If you don't already have an Azure account, you can create one for free at https://azure.microsoft.com/en-us/free/.

Creating an Azure Function App

Once you have an Azure account, you can begin setting up an API with Azure Functions in the Azure portal, as follows:

  1. Go to https://portal.azure.com in a web browser, and log in to the Azure portal using your credentials.
  2. From the Azure portal dashboard or home screen, click on the + Create a resource button in the main portal menu, then type function into the search textbox and select Function App, as shown in the following screenshot:

    Figure 1: Creating a new Function App in the Azure Portal (step 1 of 2)

  3. On the Function App detail page, click on the Create button:

    Figure 2: Creating a new Function App in the Azure Portal (step 2 of 2)

  4. Select your Subscription and Resource Group.
  5. Enter a name for your Function App.
  6. Select .NET Core for your Runtime stack.
  7. Select the Region that is closest to your location and then click on Create.

Now that a Function App has been created, we will add a new function within it.

Creating an Azure Function

Create a new HTTP trigger function as follows:

  1. Navigate to your new Function App from the portal dashboard or home screen.
  2. Select Function Apps on the left side and click on the + to add a new function, as shown in the following screenshot:

    Figure 3: Creating a new Azure Function in the Azure Portal (step 1 of 2)

  3. Click on the HTTP trigger button.
  4. Enter a name for the new function, such as entry and select Anonymous for the authorization level, as shown in the following screenshot:

Figure 4: Creating a new Azure Function in the Azure Portal (step 2 of 2)

By selecting the Anonymous authorization level, we are making the API available without providing any specific authentication headers in the HTTP request. In the next chapter, we will add authentication to both the API and the mobile app, but for now we will simply provide anonymous access.

  5. Click on Create.

Once the new function has been created, it will present you with the function code in a simple in-browser code editor. In addition to the function code file, there is also a file called function.json, which contains the details of the function. Update the function.json file as follows to add Azure Table storage bindings to our function:

{
  "bindings": [
    {
      "authLevel": "anonymous",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    },
    {
      "type": "table",
      "name": "entryTableOutput",
      "tableName": "entry",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    },
    {
      "type": "table",
      "name": "entryTableInput",
      "tableName": "entry",
      "take": 50,
      "connection": "AzureWebJobsStorage",
      "direction": "in"
    }
  ],
  "disabled": false
}

For the purposes of this book, we will write a very simple function that handles both retrieving and storing entries. For incoming GET requests, we will simply return all the objects in the table. For incoming POST requests, we'll read the request body and add it to the table. Update your function as follows:

#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, Newtonsoft.Json.Linq.JArray entryTableInput, IAsyncCollector<Entry> entryTableOutput, ILogger log)
{
    log.LogInformation(req.Method);
    if (req.Method == "GET")
    {
        return (ActionResult) new OkObjectResult(entryTableInput);
    }
    var requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var entry = JsonConvert.DeserializeObject<Entry>(requestBody);
 
    if (entry != null)
    {
        await entryTableOutput.AddAsync(entry);
        return (ActionResult) new OkObjectResult(entry);
    }
    return new BadRequestObjectResult("Invalid entry request.");
}
public class Entry
{
    // Generate the Id once when the entity is constructed; an expression-bodied
    // property here would produce a new GUID on every read, so RowKey would not
    // reliably match Id
    public string Id { get; set; } = Guid.NewGuid().ToString("n");
    public string Title { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public DateTime Date { get; set; }
    public int Rating { get; set; }
    public string Notes { get; set; }
    // Required for Table Storage entities
    public string PartitionKey => "ENTRY";
    public string RowKey => Id;
}
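One subtlety worth noting: the Id value must be generated once per entity and then remain stable. If Id were an expression-bodied property (using =>), a new GUID would be produced on every read, and RowKey would not reliably match Id. The following standalone sketch (using its own minimal copy of the Entry class) shows the stable pattern:

```csharp
using System;

public class Entry
{
    // Initialized once at construction; subsequent reads return the same value
    public string Id { get; set; } = Guid.NewGuid().ToString("n");

    public string PartitionKey => "ENTRY";
    public string RowKey => Id;
}

public static class Program
{
    public static void Main()
    {
        var entry = new Entry();

        // The row key always matches the entity's Id
        Console.WriteLine(entry.RowKey == entry.Id); // prints "True"

        // Reading Id twice yields the same value
        Console.WriteLine(entry.Id == entry.Id); // prints "True"
    }
}
```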

Browsing and adding data

Now that we have created an API in Azure and set up a data table within the service, we can start making calls to the API and getting responses. Before we start making calls to the API from within the TripLog app, we can test the endpoint by making GET and POST HTTP requests to it using either a command line or a REST console.

There are several REST consoles to choose from if you don't already have one installed. I typically use an app named Postman (https://www.getpostman.com).

If you don't want to use a REST console, you can use the command line to issue HTTP requests to the API. To do this, use either curl in Terminal on macOS or Invoke-RestMethod in PowerShell on Windows.

For documentation about curl, visit: https://curl.haxx.se/docs/

For documentation about Invoke-RestMethod, visit: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-restmethod

  1. Using either a REST console or the command line, issue a GET request to the API endpoint for the Azure Function using the following URL:
    https://<your-function-name>.azurewebsites.net/api/entry
    
  2. If everything has been set up properly, we should receive a 200 status code and an empty collection in the response body, as follows:
    []
    
  3. Next, add a new record to the backend service by issuing a POST request to the same API endpoint, with an Entry JSON object included in the body of the request. The service will automatically create the appropriate columns within the Entry table when we insert the first object, and we should get a 200 status code with the new item we added in the response body. For example, using curl:
    curl -X POST 'https://<your-function-name>.azurewebsites.net/api/entry' \
    --header 'Content-Type: application/json' \
    --data '{
        "title": "Space Needle", 
        "latitude": 47.6204,
        "longitude": -122.3491,
        "date": "2019-11-09T00:00:00.000Z",
        "rating": 5,
        "notes": "Wonderful site to see"
    }'
    
  4. Next, issue another GET request to the entry endpoint:
    https://<your-function-name>.azurewebsites.net/api/entry
    

We should receive a 200 status code, but now the response body has a collection containing the new item we added:

[
    {
        "Id": "beb8fcd5e31b48f8a601efc167a0018f",
        "Title": "Space Needle",
        "Latitude": 47.6204,
        "Longitude": -122.3491,
        "Date": "2019-11-09T00:00:00Z",
        "Rating": 5,
        "Notes": "Wonderful site to see",
        "PartitionKey": "ENTRY",
        "RowKey": "beb8fcd5e31b48f8a601efc167a0018f"
    }
]

In the preceding response, notice that after we added the new record to the backend service, it was automatically given an Id property, along with a couple of other properties. We will need to update the TripLogEntry model in our TripLog app to account for this new Id property, as follows:

public class TripLogEntry
{
    public string Id { get; set; } 
    public string Title { get; set; } 
    public double Latitude { get; set; }
    public double Longitude { get; set; } 
    public DateTime Date { get; set; } 
    public int Rating { get; set; } 
    public string Notes { get; set; }
}

Now that we have a live backend service that we can communicate with via HTTP, we will update our TripLog app so that it can send requests to the API to add and retrieve log entries.

Creating a base HTTP service

In order for an app to communicate with an API via HTTP, it needs an HTTP library. Since we are using .NET and C# to build a Xamarin.Forms app, we can leverage the HttpClient class from the System.Net.Http namespace. The .NET HttpClient provides a mechanism to send and receive data via standard HTTP methods, such as GET and POST.

Continuing to keep separation and abstraction key to our app architecture, we want to keep the specific logic related to the HttpClient separate from the rest of the app. In order to do this, we will write a base service class in our core library that will be responsible for handling HTTP communications in a generic way. This provides a building block for any domain-specific data services we might need to write; for example, a service that is responsible for working with log entries in the API. Any class that will inherit from this class will be able to send HTTP request messages using standard HTTP methods (such as GET, POST, PATCH, and DELETE) and get HTTP response messages back without having to deal with HttpClient directly.

As we saw in the previous section, we are able to post data to the API in the form of JSON, and when we receive data from the API, it's also returned in the JSON format. In order for our app to translate its C# models into JSON for use in an HTTP request body, the model will need to be serialized. In contrast, when an HTTP response message is received in JSON, it needs to be deserialized into the appropriate C# model. The most widely used method to do this in .NET software is to use the Json.NET library.
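As a quick illustration of this round trip, the following standalone sketch serializes a TripLogEntry-shaped model into JSON and deserializes it back with Json.NET (the model here simply mirrors the one used by the app):

```csharp
using System;
using Newtonsoft.Json;

public class TripLogEntry
{
    public string Title { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public DateTime Date { get; set; }
    public int Rating { get; set; }
    public string Notes { get; set; }
}

public static class JsonRoundTrip
{
    public static void Main()
    {
        var entry = new TripLogEntry
        {
            Title = "Space Needle",
            Latitude = 47.6204,
            Longitude = -122.3491,
            Rating = 5
        };

        // Serialize the C# model into JSON for use in an HTTP request body
        string json = JsonConvert.SerializeObject(entry);

        // Deserialize a JSON response body back into the C# model
        var copy = JsonConvert.DeserializeObject<TripLogEntry>(json);
        Console.WriteLine(copy.Title); // prints "Space Needle"
    }
}
```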

In order to create a base HTTP service, perform the following steps:

  1. Add the Json.NET NuGet package, Newtonsoft.Json, to the core library project and each of the platform-specific projects.
  2. Create a new abstract class in the Services folder of the core library project named BaseHttpService:
    public abstract class BaseHttpService
    {
    }
    
  3. Add a protected async method to the BaseHttpService class named SendRequestAsync<T>, which takes in a Uri named url, an optional HttpMethod named httpMethod, an optional IDictionary<string, string> named headers, and an optional object named requestData. These four parameters will be used to construct an HTTP request. The url parameter is the full URL of the API endpoint for the request. The httpMethod optional parameter is used to make the request a GET, POST, and so on. The headers optional dictionary parameter is a collection of string key/value pairs used to define the header(s) of the request (such as authentication). Finally, the requestData optional parameter is used to pass in an object that will be serialized into JSON and included in the body of POST and PATCH requests:
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json;

    public abstract class BaseHttpService
    {
        // Reuse a single HttpClient across all requests to avoid
        // exhausting sockets when many requests are made
        static readonly HttpClient _client = new HttpClient();

        protected async Task<T> SendRequestAsync<T>(
            Uri url, 
            HttpMethod httpMethod = null, 
            IDictionary<string, string> headers = null, 
            object requestData = null)
        {
            var result = default(T);
            // Default to GET
            var method = httpMethod ?? HttpMethod.Get;
            // Serialize request data
            var data = requestData == null
                ? null
                : JsonConvert.SerializeObject(requestData);
            using (var request = new HttpRequestMessage(method, url))
            {
                // Add request data to request 
                if (data != null)
                {
                    request.Content = new StringContent(data, Encoding.UTF8, "application/json");
                }
                // Add headers to request 
                if (headers != null)
                {
                    foreach (var h in headers)
                    {
                        request.Headers.Add(h.Key, h.Value);
                    }
                }
                // Get response
                using (var response = await _client.SendAsync(request, HttpCompletionOption.ResponseContentRead))
                {
                    var content = response.Content == null
                        ? null
                        : await response.Content.ReadAsStringAsync();
                    if (response.IsSuccessStatusCode)
                    {
                        result = JsonConvert.DeserializeObject<T>(content);
                    }
                }
            }
            return result;
        }
    }
    

Now that we have a base HTTP service, we can subclass it with classes that are more specific to our data model, which we will do in the next section.

Creating an API data service

Using BaseHttpService as a foundation that abstracts away the HTTP request details, we can now begin to create services that leverage it to get responses back from the API in the form of domain-specific models. Specifically, we will create a data service that can be used by the ViewModels to get the TripLogEntry objects from the backend service.

We will start off by defining an interface for the data service that can be injected into the ViewModels, ensuring that there is no strict dependency on the API, or the logic that communicates with it, continuing the pattern we put in place in Chapter 4, Platform-Specific Services and Dependency Injection. To create a data service for the TripLog API, perform the following steps:

  1. Create a new interface named ITripLogDataService in the Services folder of the core library:
    public interface ITripLogDataService
    {
    }
    
  2. Update the ITripLogDataService interface with methods to get and add new TripLogEntry objects:
    public interface ITripLogDataService
    {
        Task<IList<TripLogEntry>> GetEntriesAsync();
        Task<TripLogEntry> AddEntryAsync(TripLogEntry entry);
    }
    

Next, we will create an implementation of this interface that will also subclass BaseHttpService so that it has access to our HttpClient implementation, as shown in the following steps:

  1. Create a new class in the core library Services folder named TripLogApiDataService, which subclasses BaseHttpService and implements ITripLogDataService:
    public class TripLogApiDataService : BaseHttpService, ITripLogDataService
    {
    }
    
  2. Add two private readonly fields to the TripLogApiDataService class—a Uri and an IDictionary<string, string>—to store the base URL and headers, respectively, to be used for all requests:
    public class TripLogApiDataService : BaseHttpService, ITripLogDataService
    {
        readonly Uri _baseUri;
        readonly IDictionary<string, string> _headers;
    }
    
  3. Update the TripLogApiDataService constructor to take in a Uri parameter, then set the private _baseUri and _headers fields:
    public class TripLogApiDataService : BaseHttpService, ITripLogDataService
    {
        readonly Uri _baseUri;
        readonly IDictionary<string, string> _headers;
        public TripLogApiDataService(Uri baseUri)
        {
            _baseUri = baseUri;
            _headers = new Dictionary<string, string>();
            // TODO: Add header with auth-based token in chapter 7
            
        }
    }
    
  4. Finally, implement the members of ITripLogDataService using the SendRequestAsync<T>() base class method:
    public class TripLogApiDataService : BaseHttpService, ITripLogDataService
    {
        readonly Uri _baseUri;
        readonly IDictionary<string, string> _headers;
        // ...
        public async Task<IList<TripLogEntry>> GetEntriesAsync()
        {
            var url = new Uri(_baseUri, "/api/entry");
            var response = await SendRequestAsync<TripLogEntry[]>(url, HttpMethod.Get, _headers);
            return response;
        }
        public async Task<TripLogEntry> AddEntryAsync(TripLogEntry entry)
        {
            var url = new Uri(_baseUri, "/api/entry");
            var response = await SendRequestAsync<TripLogEntry>(url, HttpMethod.Post, _headers, entry);
            return response;
        }
    }
    

Each method in this TripLog data service calls the SendRequestAsync() method on the base class, passing in the API route and the appropriate HttpMethod. The AddEntryAsync() method also passes in a TripLogEntry object, which will be serialized and added to the HTTP request message content. In the next chapter, we will implement authentication with the API and update this service to pass in an authentication-based token in the header as well.
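Note how the endpoint URL is composed in these methods: the Uri(Uri, string) constructor combines the base address with the relative route. A quick standalone check (the host name here is just a placeholder, not a real Function App):

```csharp
using System;

public static class Program
{
    public static void Main()
    {
        // Placeholder base address standing in for the real Function App host
        var baseUri = new Uri("https://myfunction.azurewebsites.net");

        // The same composition used by GetEntriesAsync() and AddEntryAsync()
        var url = new Uri(baseUri, "/api/entry");
        Console.WriteLine(url); // prints "https://myfunction.azurewebsites.net/api/entry"
    }
}
```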

Updating the TripLog app ViewModels

Using the API and data service we created, we can now update the ViewModels in the app to use live data instead of the local, hardcoded data they currently use. We will continue to leverage the patterns we put in place in previous chapters to ensure that our ViewModels remain testable and do not have any specific dependencies on the Azure API, or even the HTTP communication logic. To update the ViewModels, perform the following steps:

  1. First, update the TripLogCoreModule in the core library to register our ITripLogDataService implementation into the IoC:
    public class TripLogCoreModule : NinjectModule
    {
        public override void Load()
        {
            // ViewModels 
            Bind<MainViewModel>().ToSelf();
            Bind<DetailViewModel>().ToSelf();
            Bind<NewEntryViewModel>().ToSelf();
            // Core Services
            var tripLogService = new TripLogApiDataService(new Uri("https://<your-function-name>.azurewebsites.net"));
            Bind<ITripLogDataService>()
                .ToMethod(x => tripLogService)
                .InSingletonScope();
        }
    }
    
  2. Next, update the MainViewModel constructor to take an ITripLogDataService parameter, which will be provided automatically via dependency injection:
    readonly ITripLogDataService _tripLogService;
    // ...
    public MainViewModel(INavService navService, ITripLogDataService tripLogService)
        : base(navService)
    {
        _tripLogService = tripLogService;
        LogEntries = new ObservableCollection<TripLogEntry>();
    }
    
  3. We will then update the LoadEntries() method in MainViewModel, replacing the 3-second delay and hardcoded data population with a call to the live TripLog API via the current ITripLogDataService implementation that is injected into the ViewModel's constructor:
    async void LoadEntries()
    {
        if (IsBusy)
            return;
        IsBusy = true;
        try
        {
            var entries = await _tripLogService.GetEntriesAsync();
            LogEntries = new ObservableCollection<TripLogEntry>(entries);
        }
        finally
        {
            IsBusy = false;
        }
    }
    

Notice we are using async/await for all calls to our TripLog API since it is a remote call over the internet and we can't expect immediate responses.

No other changes to MainViewModel are required. Now, when the app is launched, instead of the hardcoded data loading, you will see the items stored in the Azure backend service database.

Now, we will update the NewEntryViewModel so that when we add a new entry, it is actually saved to the Azure backend through the data service:

  1. Update the NewEntryViewModel constructor to take an ITripLogDataService parameter:
    readonly ITripLogDataService _tripLogService;
    // ...
    public NewEntryViewModel(INavService navService, ILocationService locService, ITripLogDataService tripLogService)
        : base(navService)
    {
        _locService = locService;
        _tripLogService = tripLogService;
        Date = DateTime.Today;
        Rating = 1;
    }
    
  2. Then, we will update the SaveCommand execution method to call the AddEntryAsync() method of the data service:
    async Task Save()
    {
        if (IsBusy)
            return;
        IsBusy = true;
        try
        {
            var newItem = new TripLogEntry
            { 
                Title = Title,
                Latitude = Latitude,
                Longitude = Longitude,
                Date = Date,
                Rating = Rating,
                Notes = Notes
            };
            await _tripLogService.AddEntryAsync(newItem);
            await NavService.GoBack();
        }
        finally
        {
            IsBusy = false;
        }
    }
    

Now, if we launch the app, navigate to the new entry page, fill out the form, and click on Save, the log entry will be sent to the TripLog backend service and saved in the database.

Offline data caching

Mobile apps have several benefits over web apps, one of which is the ability to operate offline and maintain offline data. There are a couple of reasons why offline data is important to a mobile app. First of all, you cannot guarantee that your app will always have a network connection and the ability to directly connect to live data. Supporting offline data allows users to use the app, even if only for limited use cases when they are operating with limited or no connectivity. Secondly, users expect mobile apps to offer high performance, specifically, quick access to data without having to wait.

By maintaining an offline cache, an app can present a user with data immediately while it's busy retrieving a fresh dataset, providing a perceived level of performance to the user. It is important that when the cache updates, the user receives that updated data automatically so that they are always seeing the latest data possible, depending on specific use cases, of course.
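The cache-then-refresh flow described above can be sketched in a few lines. This is a simplified, hypothetical stand-in for the behavior Akavache gives us later in the chapter—a subscriber is called first with cached data (if any), then called again once fresh data arrives:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class Program
{
    static readonly Dictionary<string, string> _cache = new Dictionary<string, string>();

    // Invokes onData with the cached value first (if one exists), then
    // fetches fresh data, updates the cache, and invokes onData again
    public static async Task GetAndFetchLatestAsync(
        string key, Func<Task<string>> fetch, Action<string> onData)
    {
        if (_cache.TryGetValue(key, out var cached))
        {
            onData(cached); // immediate, possibly stale
        }

        var fresh = await fetch();
        _cache[key] = fresh;
        onData(fresh); // latest data from the source
    }

    public static async Task Main()
    {
        // First run: cache is empty, so the subscriber fires once
        await GetAndFetchLatestAsync("entries", () => Task.FromResult("v1"), Console.WriteLine);

        // Second run: subscriber fires twice—cached "v1", then fresh "v2"
        await GetAndFetchLatestAsync("entries", () => Task.FromResult("v2"), Console.WriteLine);
    }
}
```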

There are several ways of implementing a data cache in a mobile app, all depending on the size and complexity of the data that needs to be stored. In most cases, storing the cache in a local database using SQLite is the best approach.

In this chapter, we will update the TripLog app to maintain a cache of log entries and keep the cache in sync with the live API as data is received from the Azure backend service. The data cache will be stored in an SQLite database, but to ease the implementation, we will use an open source library called Akavache. Akavache provides not only caching capabilities, but also a very easy-to-use API for updating the cache that handles many different scenarios.

For the purposes of this book and the TripLog sample application, we will only be using a small subset of Akavache features. For a closer look at the Akavache library and all of its capabilities, check it out on GitHub at https://github.com/reactiveui/Akavache.

Adding the Akavache library

Like most libraries that we have used throughout this book, the Akavache library can be obtained via NuGet. First, add a reference to the library to the core library project and each of the platform-specific projects.

Next, we will need to add Akavache to our IoC container so that it can be injected into our ViewModels. Akavache comes with some static variables that make it very easy to use.

However, we want to instantiate our own instance and add it to the IoC, to maintain separation. To do this, update the Load method in the TripLogCoreModule Ninject module, setting Akavache's required ApplicationName (which determines where the cache files are stored on disk) before registering the LocalMachine cache, as follows:

Akavache.BlobCache.ApplicationName = "TripLog";
Bind<Akavache.IBlobCache>().ToConstant(Akavache.BlobCache.LocalMachine);

Maintaining an offline data cache

Currently, the TripLog app's MainViewModel calls the TripLogApiDataService to get its data directly from the live API. As mentioned at the beginning of this chapter, in the event of little or no connectivity, the TripLog app will fail to display any log entries. With a few minor modifications to the MainViewModel, we can set it up to use the Akavache library to retrieve log entries from a local cache, and also to refresh that cache with any changes in the dataset once a connection with a live API succeeds.

First, update the MainViewModel constructor to require an instance of Akavache.IBlobCache, which will be injected via our Ninject implementation from Chapter 4, Platform-Specific Services and Dependency Injection:

readonly IBlobCache _cache;
// ...
public MainViewModel(INavService navService, ITripLogDataService tripLogService, IBlobCache cache)
    : base (navService)
{
    _tripLogService = tripLogService;
    _cache = cache;
    LogEntries = new ObservableCollection<TripLogEntry> ();
}

Next, we will need to modify the logic in the LoadEntries() method to tie into the local offline cache. To do this, we will leverage an extension method in Akavache called GetAndFetchLatest. This method actually performs two functions. First, it immediately returns cached data, given a specific key (in our case, entries). Secondly, it makes a call to the API based on the given Func<> and updates the cache for the given key. Since it is performing two functions, it will ultimately return twice. In order to handle this, and because it is returning an IObservable, we can use the Subscribe extension method to handle each return as it occurs. In the Subscribe extension method, we will update the LogEntries ObservableCollection property on the MainViewModel based on what is either returned from the cache or from the subsequent API call, if successful:

void LoadEntries()
{
    if (IsBusy)
    {
        return;
    }
    IsBusy = true;
    // Load from the local cache first, then update the cache (and the UI)
    // with fresh data from the API; the busy flag is cleared inside the
    // subscription, since GetAndFetchLatest returns without blocking
    _cache.GetAndFetchLatest("entries", async () => await _tripLogService.GetEntriesAsync())
        .Subscribe(
            entries =>
            {
                LogEntries = new ObservableCollection<TripLogEntry>(entries);
                IsBusy = false;
            },
            ex =>
            {
                // Clear the busy flag if the cache read or API call fails
                IsBusy = false;
            });
}

The first time the app is launched with this code, the cache will be populated. On any subsequent launches of the app, you will notice that data appears immediately as the view is constructed. If you add an item to the backend service database and then launch the app again, you will notice that the new item falls into place after a couple of seconds.

Summary

In this chapter, we created a live API from scratch using Azure Functions. We then created a data service within our app to handle communication between the app and the API. Then, by adding a reference to this service to our ViewModels, we quickly transformed the app from using static data to using live data from our new API. Finally, we set up offline data caching. In the next chapter, we will add authentication to our API and update the app with sign-in capabilities.
