Chapter 3

Building a Windows Azure–Based Service

What's In This Chapter?

  • Getting started with Windows Azure and SharePoint
  • Understanding Windows Azure table storage
  • Creating a Windows Communication Foundation Service in Windows Azure
  • Connecting to Windows Azure with Business Connectivity Services

Windows Azure is a good platform for running your services and data in the cloud, but how do you leverage these services and data from within SharePoint? In this chapter, you will learn about the Windows Azure Table service, which enables Windows Azure to host very large and scalable flat datasets in the cloud. (Azure storage services also include the Blob service and the Queue service, but these services are outside the scope of this chapter.) Windows Azure can also host your Windows Communication Foundation (WCF) services, and you will learn how to create a WCF service in Windows Azure that reads and writes data in a Windows Azure table. Then you will learn how to create a Business Connectivity Services (BCS) model to consume the Windows Azure service without writing any code.

The scenario this chapter uses leverages 2007 county tax data from all U.S. states. You will see how to move this open government data into your Windows Azure table and consume it in your SharePoint applications. Although the example uses external data, you could apply the same techniques to your own data, including moving some of your SharePoint lists to the cloud.

Before digging into how everything works, it is important to start at the beginning. Getting started can be one of the most challenging aspects of any new technology, and when you put two robust technologies together, such as Windows Azure and SharePoint Server 2010, the challenges often multiply. The next section describes how to set up your development environment to build Windows Azure solutions with SharePoint and, more importantly, how to avoid some common issues.

Getting Started

You need two things to develop Windows Azure and SharePoint solutions: SharePoint 2010, and the Windows Azure SDK and tools. Also, because you will most likely be developing solutions on a single machine using Hyper-V, you will need to work around some technical issues with running the Windows Azure compute emulator and SharePoint on the same machine. You will see one way to solve this problem using an SSH network tunnel.

Installing SharePoint 2010

Getting started with SharePoint is easy because a prebuilt SharePoint 2010 Information Worker Virtual Machine (IW VM) is located at www.microsoft.com/download/en/details.aspx?id=21099. Download and run the IW VM in Hyper-V. You will need a 64-bit Windows Server 2008 R2 machine with at least 8 GB of memory (16 GB is even better). Another option is to run the SharePoint Easy Setup Script, located at http://channel9.msdn.com/Shows/SharePointSideshow/Building-a-SharePoint-Development-Machine-Using-the-Easy-Setup-Script. The Easy Setup Script will automatically build a native install or boot to a Virtual Hard Drive (VHD) SharePoint 2010 developer machine. Both are good options; however, if you want to do Windows Azure development, you need to use Hyper-V or have two machines, due to incompatibilities between SharePoint and the Windows Azure SDK. You will see how to work around this limitation in the “Creating an SSH Tunnel” section.

Installing Windows Azure SDK and Tools

Once you have SharePoint up and running, download and install the Windows Azure Tools for Microsoft Visual Studio 1.4 from http://go.microsoft.com/fwlink/?LinkID=128752. You only need to download and install VSCloudService.exe, which contains both the SDK and the Visual Studio tools. Do not install this SDK on the same machine on which SharePoint is running. If you do, you will most likely break one or more of the following: Business Connectivity Services (BCS), the User Profile Service, or Managed Metadata. This is a known issue that will likely be fixed in the near future. For now, you need to install them on separate machines; in this case, install the Windows Azure SDK on the Hyper-V host machine that is running the SharePoint VHD.

Creating an SSH Tunnel

Installing the Windows Azure SDK on the Hyper-V host presents another problem. The Windows Azure tools use both a local compute emulator and a storage emulator; in this chapter's configuration, the compute emulator exposes the service at 127.0.0.1:8080. The challenge is accessing this endpoint address from your SharePoint VHD. The emulator binds to the loopback address, so it cannot be reached from outside the machine on which it runs. This is a security mechanism that protects your machine from attack, but it makes life difficult for the SharePoint and Windows Azure developer. There are a couple of possible solutions to this problem.

The first approach is to simply publish your Windows Azure services to the cloud before connecting to them with SharePoint, so SharePoint never actually connects to your Windows Azure developer environment. From a connectivity standpoint, this is the easiest approach, but it makes it very difficult to debug and rapidly iterate on your code.

The second approach uses a Secure Shell (SSH) tunnel. Although the SSH tunnel is a little more complicated to understand and set up, it provides the same development experience as having everything installed on a single machine. Credit for documenting this approach goes to Emmanuel Huna, on his blog at http://blog.ehuna.org/2009/10/accessing_the_windows_azure_lo.html. His post is not actually targeted at the SharePoint and Windows Azure developer, but it works very nicely for this purpose. The post is easy to follow, but the following steps summarize the basic concept of the SSH tunnel approach:

1. Install the SSH server called freeSSHd, from www.freesshd.com, on the Hyper-V host, which is Windows Server 2008 R2. This is the machine where you are running the Windows Azure SDK.

2. Configure freeSSHd to open a tunnel on the host machine.

3. Install the SSH client called PuTTY, from www.chiark.greenend.org.uk/~sgtatham/putty, on the SharePoint VHD and configure it to connect to the host machine.

Once you start the SSH server on the host and connect the SSH client to it, you will be able to access the Windows Azure compute emulator at 127.0.0.1:8080 from the SharePoint server running in Hyper-V. This is a little complicated if you have not worked with SSH tunnels before, but Emmanuel's blog post is very clear, and the diagram in Figure 3.1 should help you understand how all the various pieces fit together.
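If you prefer to script the client side of the tunnel rather than configure it in the PuTTY user interface, PuTTY's command-line companion, plink, accepts the same settings. The following one-line sketch assumes a user account named administrator and a host machine named hyperv-host; substitute the values for your own environment:

plink.exe -ssh -L 8080:127.0.0.1:8080 administrator@hyperv-host

This forwards port 8080 on the SharePoint VHD to port 8080 on the Hyper-V host, so requests made to 127.0.0.1:8080 inside the VHD reach the compute emulator running on the host.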

Initializing the Windows Azure Compute and Storage Emulators

After installing the Windows Azure SDK, you need to start the Windows Azure compute and storage emulators. Before starting the storage emulator, however, you need to initialize it. The Windows Azure storage emulator uses SQL Server to emulate Windows Azure storage tables, blobs, and queues. The initialization process creates the SQL database and tables to emulate these. When you start the storage emulator for the first time, the application automatically runs the initialization process. Although this generally works well, depending on the complexity of your SQL Server setup, you may have to run the tool manually from the command line. To do so, open the Windows Azure SDK command prompt as an administrator and run the following command (note the period at the end, which specifies using the default SQL Server instance):

DSInit /sqlInstance:.

After running the DSInit command, you can start the storage emulator from the Windows Azure icon running in the task tray. For more details, see the MSDN help page at http://msdn.microsoft.com/en-us/library/gg433132.aspx.
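If you prefer the command line to the task tray, the SDK's csrun tool can also start the emulators. The switches shown here are from the 1.x SDKs, so verify them against your installed version:

csrun /devstore:start
csrun /devfabric:start

The first command starts the storage emulator; the second starts the compute emulator.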

At this point, you should have everything installed and connected to begin developing Windows Azure and SharePoint solutions.

Using Windows Azure Table Storage

In Chapter 2, "Using SQL Azure for Business Intelligence," you learned about SQL Azure databases. Windows Azure storage adds tables, queues, blobs, and drives. In this chapter, you will learn about Windows Azure table storage. A Windows Azure table is a collection of entities; think of a table entity as being similar to a row in a database. Tables in Windows Azure are not restricted to a schema — that is, each entity (or row) can have a different shape or set of properties. An entity can have up to 255 properties, including the three reserved properties: PartitionKey, RowKey, and Timestamp. These three properties are automatically added to every entity. Tables can be split, or partitioned, to help Windows Azure scale them to very large sizes. The PartitionKey is the unique ID of a given partition, and the RowKey is the unique ID of a given row within a partition. This means that the PartitionKey plus the RowKey uniquely identifies any row in a table. The Timestamp is the last-modified time of the entity.

Entities support only a small number of data types:

  • byte[]
  • bool
  • DateTime
  • double
  • Guid
  • Int32 or int
  • Int64 or long
  • String

String is the default data type for a property.
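To make the addressing model concrete, the following minimal sketch (the entity and table names here are illustrative, not part of this chapter's solution) defines an entity and retrieves it by its two keys. Because both the PartitionKey and the RowKey are specified, the Table service can locate the entity directly instead of scanning the table:

using Microsoft.WindowsAzure.StorageClient;

// TableServiceEntity supplies PartitionKey, RowKey, and Timestamp.
public class SampleEntity : TableServiceEntity
{
  public string Value { get; set; }
}

// 'context' is a TableServiceContext created from a CloudTableClient,
// as shown later in this chapter.
SampleEntity entity =
  (from e in context.CreateQuery<SampleEntity>("SampleTable")
   where e.PartitionKey == "WA" && e.RowKey == "42"
   select e).Single();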

This chapter focuses on the Windows Azure service side of the solution. You will learn more about Windows Azure table storage in Chapter 8, “Financial Modeling with Excel Services and Windows Azure,” including how to create and populate Windows Azure tables. You will also leverage some of the samples located in the Windows Azure SDK, at http://msdn.microsoft.com/en-us/library/gg432966.aspx.

Creating the Windows Azure Web Role

In this project, you will create a WCF service that runs in a Windows Azure web role. You will add to this Visual Studio solution throughout this chapter. The purpose of this service is to read data from and write data to a Windows Azure table, which you will create later in this chapter; the service will then be called from SharePoint's BCS.

1. Open Visual Studio and create a new Windows Azure project called CountyTax, as shown in Figure 3.2. Remember that this example uses county tax data from across the U.S.A.

2. Add a WCF service web role called CountyTaxService, as shown in Figure 3.3. The WCF service web role is just a special version of the standard ASP.NET web role, with some added code to jump-start your WCF service development.

3. As shown in Figure 3.4, your Windows Azure solution starts out empty. Press F5 to ensure that everything is set up correctly and functioning. You can also delete the IService1.cs and Service1.svc service files, as you will not need them. You will also need to add references to the System.Data.Services.Client and Microsoft.VisualBasic assemblies.

As mentioned, you will add to this Visual Studio solution throughout the chapter. You'll start by adding the county tax data. The next section describes how to create the Entity Data Model (EDM), and how to create a Windows Azure table and bulk import some data.

Creating the Entity Data Model

Azure tables store entities, which the storage client accesses through the WCF Data Services client library (the System.Data.Services.Client assembly you referenced earlier). You define an entity based on the shape of the data, that is, the properties the entity contains. Although there are many data sets to explore and consume, this example uses county tax data from the Data.gov site. The county tax data is located at http://explore.data.gov/Population/Tax-Year-2007-County-Income-Data/wvps-imhx, as shown in Figure 3.5.

The site allows you to view, filter, visualize, export, discuss, and embed the data. In this case, you want to export the data as a comma-delimited .csv file. Click the Export button and choose CSV from the list of supported export formats.

After downloading the .csv file, you need to do a little cleanup of the headers. By default, the export uses the same header text shown on the site. While normally this is fine, it is better to change the headers to match the entity property names that you will be using. Open the .csv file using Notepad and change the first line to the following:

StateCode,CountyCode,State,CountyName,TotalReturns,TotalExemptions,
AdjustedGrossIncome,Wages,DividendIncomes,InterestIncome

Now that you have the data that you want to use as the basis for your table, it is time to create an entity to represent this data schema. Add a class to the CountyTaxService project called CountyTaxEntity. Replace the generated code with the following code:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Microsoft.WindowsAzure.StorageClient;

namespace CountyTaxService
{
  // Entity for Windows Azure table
  public class CountyTaxEntity : TableServiceEntity
  {
    public CountyTaxEntity()
    {
    }

    // State Code
    public string StateCode { get; set; }
    // County Code
    public string CountyCode { get; set; }
    // State Abbreviation
    public string State { get; set; }
    // County Name
    public string CountyName { get; set; }
    // Total Number of Tax Returns
    public string TotalReturns { get; set; }
    // Total Number of Exemptions
    public string TotalExemptions { get; set; }
    // Adjusted Gross Income (In Thousands)
    public string AdjustedGrossIncome { get; set; }
    // Wages and Salaries Incomes (In Thousands)
    public string Wages { get; set; }
    // Dividend Incomes (In Thousands)
    public string DividendIncomes { get; set; }
    // Interest Income (In Thousands)
    public string InterestIncome { get; set; }

  }
}

code snippet 076576 Ch03_Code.zip/CountyTaxEntity.cs

The preceding code defines an entity called CountyTaxEntity with string properties for the same fields contained in the comma-delimited dataset you downloaded from Data.gov. Note that this entity derives from TableServiceEntity, which supplies the other required properties for a Windows Azure table: PartitionKey, RowKey, and Timestamp.

Populating the Windows Azure Table

Before you can use the application, you need a way to create and populate the Windows Azure table with the .csv data file you downloaded. In this section, you will learn how to create an upload page and code to bulk add the data.

1. Start by adding a new web form to the CountyTaxService project called Import.aspx. Replace the generated code in the Import.aspx file with the following code:

<%@ Page Title="Import" Language="C#"
    MasterPageFile="~/Site.Master"
    AutoEventWireup="true"
    CodeBehind="Import.aspx.cs"
    Inherits="CountyTaxService.Import" %>

<asp:Content ID="HeaderContent"
             ContentPlaceHolderID="HeadContent"
             runat="server">
</asp:Content>
<asp:Content ID="BodyContent"
             ContentPlaceHolderID="MainContent"
             runat="server">
  <p>
    Upload a comma-delimited .csv file containing the county tax data.
  </p>
  <br />
  <br />
  <table border="0" cellpadding="0" cellspacing="0">
    <tr>
      <td valign="top">
        <asp:FileUpload ID="Uploader" runat="server"
                        Style="margin-top: 0px"
                        Height="24px"
                        Width="472px" />&nbsp;
      </td>
      <td valign="top">
        <asp:Button ID="cmdUpload" runat="server"
                    Text="Import" OnClick="cmdUpload_Click" />
      </td>
    </tr>
  </table>
  <br />
  <asp:Label ID="lblInfo" runat="server"
             EnableViewState="False"
             Font-Bold="True">
  </asp:Label>
  <br />
  <br />
</asp:Content>

code snippet 076576 Ch03_Code.zip/Import.aspx

This code will create a page from which you can browse for the .csv data file and import it into your Windows Azure table (see Figure 3.6).

To get your page to look exactly like the figure, you need to add the Site.Master and Site.css files to your site. You can find them in the included source code, or you can copy them from a standard web role project created with Visual Studio.

The important code happens on the click event of the Import button.

2. Add the following code to the Import.aspx.cs code-behind file. This code will read the file stream and start the import process.

 protected void cmdUpload_Click(object sender, EventArgs e)
    {
      // Check for file.
      if (this.Uploader.PostedFile.FileName == string.Empty)
      {
        this.lblInfo.Text = "No file specified.";
        this.lblInfo.ForeColor = System.Drawing.Color.Red;
      }
      else
      {
        try
        {
          // Import csv data into a table.
          this.ImportData(this.Uploader.PostedFile.InputStream);
        }
        catch (Exception err)
        {
          this.lblInfo.Text = err.Message;
          this.lblInfo.ForeColor = System.Drawing.Color.Red;
        }
      }
    }

code snippet 076576 Ch03_Code.zip/Import.aspx.cs

3. The Import button's click event handler calls the ImportData method, which reads the comma-delimited file stream and parses each row and value into a System.Data.DataTable object. The code relies on the System.IO, System.Data, System.Reflection, and Microsoft.VisualBasic.FileIO namespaces, so add the corresponding using statements to the top of Import.aspx.cs. Then add the following code to the class:

    // Process csv file and import data into the application.
    private void ImportData(Stream csvStream)
    {
      this.lblInfo.Text = "";

      // Create data table to hold the data in memory.
      DataTable dt = new DataTable();

      // Parse the csv file and add the data to the data table.
      using (var csvFile = new TextFieldParser(csvStream))
      {
        csvFile.TextFieldType = FieldType.Delimited;
        csvFile.SetDelimiters(",");
        csvFile.HasFieldsEnclosedInQuotes = true;

        // Read the first row of data
        // (which should contain column names).
        string[] fields = csvFile.ReadFields();

        // Add columns to data table, using first (header) row.
        DataColumn col = null;
        List<int> fieldIndices = new List<int>();

        if (!csvFile.EndOfData)
        {
          // The StateCode field is required, since it's used
          // to batch rows by partition during the bulk insert.
          if (!fields.Contains("StateCode"))
          {
            this.lblInfo.Text = 
              "The .csv file must contain a StateCode field, " +
              "named in the first row of data.";
            this.lblInfo.ForeColor = System.Drawing.Color.Red;
            return;
          }

          // Create array of property names from CountyTaxEntity.
          List<string> propertyNames = new List<string>();
          foreach (PropertyInfo info in
              typeof(CountyTaxEntity).GetProperties())
          {
            propertyNames.Add(info.Name);
          }

          // Add a field to the data table if it
          // matches one defined by CountyTaxEntity.
          for (int i = 0; i < fields.Length; i++)
          {
            if (propertyNames.Contains(fields[i]))
            {
              col = new DataColumn(fields[i]);
              dt.Columns.Add(col);

              // Track the field's index,
              // so we know which ones to add data for below.
              // This way any fields other than those named
              // by CountyTaxEntity will be ignored. 
              fieldIndices.Add(i);
            }
          }
        }

        // Add data from each row to data table
        // where it matches column name.
        DataRow row = null;
        while (!csvFile.EndOfData)
        {
          // Get the current row from the csv file.
          string[] currentRow = csvFile.ReadFields();

          // Create a new row in the data table.
          row = dt.NewRow();

          // Copy the data from the csv to the data table,
          // indexing the data table column by name so that
          // skipped csv fields cannot shift the ordinals.
          foreach (var index in fieldIndices)
          {
            row[fields[index]] = currentRow[index];
          }

          // Add the row.
          dt.Rows.Add(row);
        }
      }

      // Insert values from the data table
      // into a Windows Azure table.
      try
      {
        DataLayer.BulkInsertCountyTax(dt);

        // Redirect to main page.
        Response.Redirect("Default.aspx");
      }
      catch (ApplicationException e)
      {
        this.lblInfo.Text = 
          "Error importing csv file: " + e.Message;
        this.lblInfo.ForeColor = 
          System.Drawing.Color.Red;
      }
    }

code snippet 076576 Ch03_Code.zip/Import.aspx.cs

There is nothing specific to Windows Azure in the ImportData code. It iterates through each row and creates a corresponding DataRow object in the DataTable. See the code comments to understand exactly what is going on in each code block. Note that after it builds a DataTable object in memory, it passes it to the DataLayer class, which you will add next. The DataLayer class contains a static method called BulkInsertCountyTax, which has one parameter for the DataTable.

To recap, before adding the BulkInsertCountyTax method, at this point you have converted the comma-delimited data file to an in-memory DataTable. Next, you will convert the in-memory DataTable to a Windows Azure table.

4. Add a new class project item to the CountyTaxService project called DataLayer.cs. The DataLayer class will contain all the code to create, read, write, update, delete, and list data from the Windows Azure table. You need to add a number of references to support the Windows Azure tables. Add references to Microsoft.WindowsAzure.StorageClient, System.Data.Services.Client, and System.Data.DataSetExtensions. Add the following using statements to the top of the class:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;
using System.Data;                  // for DataTable and DataRow
using System.Data.Services.Client;

code snippet 076576 Ch03_Code.zip/DataLayer.cs

The next step is to create the actual bulk import code. The first couple of lines delete and re-create the table if it exists; this is done only for the demo, so that it can be rerun repeatedly. Next, the DataTable is sorted by StateCode, because rows must be written one state at a time. Partitioning is one way Windows Azure scales tables, splitting them based on the PartitionKey, and a batch update can write to only one partition at a time, so you need to ensure that all the items in a batch share the same partition. (The code batches on StateCode, which corresponds one-to-one with the State abbreviation used as the PartitionKey.) The technique used here is to track the StateCode and, when it changes, submit the batch before moving on to the next state. Note also that a batch is limited to 100 items, so a state with more than 100 counties (Texas, for example, has 254) requires multiple batches. The code therefore checks for either condition, the StateCode changing or the item count reaching 100, before it submits an update.

5. The BulkImportCountyTax method creates a TableServiceContext object called context. The context object holds the updates in memory until the SavesChanges method is called, as previously explained. The code iterates through the DataTable and calls the InsertCountyTaxInternal method, which you will implement next. The code passes the context object and each field in the DataRow to the method, which creates a CountyTaxEntity and adds it to the context. Add the following code to the DataLayer class to create the BulkInsertCountyTax method:

// Bulk insert county tax entities from a DataTable object.
    public static void BulkInsertCountyTax(DataTable dt)
    {
      //DEMO: Delete the table first on Bulk Import
      tableClient.DeleteTableIfExist(tableName);
      //DEMO: Create the table if it does not exist.
      tableClient.CreateTableIfNotExist(tableName);

      // Ensure that the data table will be filtered case-insensitively.
      dt.CaseSensitive = false;

      // Bulk saves can only be made on same partition
      // Need to sort by state code, which is the partition key
      dt.DefaultView.Sort = "StateCode ASC";
      string lastStateCode = "";

      // Get data context.
      TableServiceContext context = tableClient.GetDataServiceContext();

      // counter to track batch items
      int i = 0;

      // Create and add each entity, iterating the sorted view;
      // iterating dt.Rows would ignore the sort applied above.
      foreach (DataRowView rowView in dt.DefaultView)
      {
        DataRow row = rowView.Row;

        // Initialize the partition check.
        if (lastStateCode == "") lastStateCode = 
                      row.Field<string>("StateCode");

        // Batch supports only 100 transactions at a time,
        // so if we hit 100 records for this partition,
        // submit the transaction and keep going.
        // or submit if the State code changes
        // (which is the Table Partition Key)
        if (i == 100 || lastStateCode !=
                row.Field<string>("StateCode"))
        {
          // Save changes, using the Batch option.
          context.SaveChanges(
            System.Data.Services.Client.SaveChangesOptions.Batch);

          // Reset the counter.
          i = 0;
        }

        // Insert the new entity for this row.
        InsertCountyTaxInternal(
            context,
            row.Field<string>("StateCode"),
            dt.Columns.Contains("CountyCode") ?
              row.Field<string>("CountyCode") : string.Empty,
            dt.Columns.Contains("State") ?
              row.Field<string>("State") : string.Empty,
            dt.Columns.Contains("CountyName") ?
              row.Field<string>("CountyName") : string.Empty,
            dt.Columns.Contains("TotalReturns") ?
              row.Field<string>("TotalReturns") : string.Empty,
            dt.Columns.Contains("TotalExemptions") ?
              row.Field<string>("TotalExemptions") : string.Empty,
            dt.Columns.Contains("AdjustedGrossIncome") ?
              row.Field<string>("AdjustedGrossIncome") : string.Empty,
            dt.Columns.Contains("Wages") ?
              row.Field<string>("Wages") : string.Empty,
            dt.Columns.Contains("DividendIncomes") ?
              row.Field<string>("DividendIncomes") : string.Empty,
            dt.Columns.Contains("InterestIncome") ?
              row.Field<string>("InterestIncome") : string.Empty);

        // Increment the counter.
        i++;

         lastStateCode = row.Field<string>("StateCode");
      }

      // Save changes, using the Batch option.
      context.SaveChanges(SaveChangesOptions.Batch);

    }

code snippet 076576 Ch03_Code.zip/DataLayer.cs

6. The InsertCountyTaxInternal method is a private method that adds a CountyTaxEntity object to a TableServiceContext. This is the method that creates the entity in the Windows Azure table. The code first creates a new CountyTaxEntity object and sets the state abbreviation (the State parameter, uppercased) as the PartitionKey. The other required field is the RowKey, which needs to be unique, so it is set to a GUID to keep things simple for this example. After that, it is just a matter of setting each field of the CountyTaxEntity object from the matching parameter passed into the method. To add the entity to the context, call the AddObject method of the context object, passing the table name and the CountyTaxEntity object. Add the following code to the DataLayer class:

 // Insert a new county tax.
    private static string InsertCountyTaxInternal(TableServiceContext context,
      string StateCode,
      string CountyCode,
      string State,
      string CountyName,
      string TotalReturns,
      string TotalExemptions,
      string AdjustedGrossIncome,
      string Wages,
      string DividendIncomes,
      string InterestIncome)
    {
      // Create the new entity.
      CountyTaxEntity entity = new CountyTaxEntity();

      // Partition key is the State Abbreviation.
      entity.PartitionKey = State.ToUpper();

      // Row key is a GUID
      entity.RowKey = Guid.NewGuid().ToString();

      // Populate the other properties.
      entity.StateCode = StateCode;
      entity.CountyCode = CountyCode;
      entity.State = State;
      entity.CountyName = CountyName;
      entity.TotalReturns = TotalReturns;
      entity.TotalExemptions = TotalExemptions;
      entity.AdjustedGrossIncome = AdjustedGrossIncome;
      entity.Wages = Wages;
      entity.DividendIncomes = DividendIncomes;
      entity.InterestIncome = InterestIncome;
      // Add the entity.
      context.AddObject(tableName, entity);

      return entity.RowKey;
    }

code snippet 076576 Ch03_Code.zip/DataLayer.cs

7. There is one last thing to do before you run this code: you need a connection to Windows Azure storage, using your storage account information. Because you are running against the local storage emulator, you don't need a real Windows Azure storage account just yet. You want this connection code to run once, when the service starts. The WebRole.cs file was created for you when you created the project, and the WebRole class contains an OnStart method that runs when the service starts. Replace the code in the WebRole.cs file with the following:

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;
namespace CountyTaxService
{
  public class WebRole : RoleEntryPoint
  {
    public override bool OnStart()
    {
      // To enable the AzureLocalStorageTraceListner,
      // uncomment relevant section in the web.config 
      DiagnosticMonitorConfiguration diagnosticConfig = 
        DiagnosticMonitor.GetDefaultInitialConfiguration();
      diagnosticConfig.Directories.ScheduledTransferPeriod = 
        TimeSpan.FromMinutes(1);
      diagnosticConfig.Directories.DataSources.Add(
        AzureLocalStorageTraceListener.GetLogDirectory());

      // Start the diagnostic monitor with this configuration.
      DiagnosticMonitor.Start(
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
        diagnosticConfig);

      // For information on handling configuration changes
      // see the MSDN topic at
      // http://go.microsoft.com/fwlink/?LinkId=166357.

      // Get connection string and table name
      // from the role's configuration settings.
      string connectionString = 
        RoleEnvironment.GetConfigurationSettingValue(
                            "StorageConnectionString");
      string tableName = 
        RoleEnvironment.GetConfigurationSettingValue(
                            "TableName");

      CloudStorageAccount storageAccount = 
        CloudStorageAccount.Parse(connectionString);

      CloudTableClient tableClient = 
        storageAccount.CreateCloudTableClient();

      // Create the table if it does not exist.
      tableClient.CreateTableIfNotExist(tableName);

      return base.OnStart();
    }
  }
}

code snippet 076576 Ch03_Code.zip/WebRole.cs

8. The table connection information is stored in two Azure service configuration files, ServiceConfiguration.cscfg and ServiceDefinition.csdef. In the ServiceConfiguration.cscfg file, set the connection string to UseDevelopmentStorage=true. Replace the contents of the ServiceConfiguration.cscfg file in the CountyTax project with the following:

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="CountyTax"
xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
                     osFamily="1" osVersion="*">
 <Role name="CountyTaxService">
   <Instances count="1" />
   <ConfigurationSettings>

     <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
              value="UseDevelopmentStorage=true" />

     <Setting name="StorageConnectionString"
              value="UseDevelopmentStorage=true" />

     <Setting name="TableName"
              value="CountyTax" />
   </ConfigurationSettings>
 </Role>
</ServiceConfiguration>

code snippet 076576 Ch03_Code.zip/ServiceConfiguration.cscfg

9. Before you can use custom settings in the ServiceConfiguration.cscfg file, you need to define them in the ServiceDefinition.csdef file. Add the following code to define the StorageConnectionString and TableName settings:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="CountyTax"
xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="CountyTaxService">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="8080" />
    </Endpoints>
    <Imports>
      <Import moduleName="Diagnostics" />
    </Imports>
    <ConfigurationSettings>
      <Setting name="StorageConnectionString" />
      <Setting name="TableName" />
    </ConfigurationSettings>
    <LocalResources>
      <LocalStorage name="CountyTaxService.svclog"
                    sizeInMB="1000" cleanOnRoleRecycle="false" />
    </LocalResources>
  </WebRole>
</ServiceDefinition>

code snippet 076576 Ch03_Code.zip/ServiceDefinition.csdef

10. Set the CountyTax project as the startup project and Import.aspx as the start page, and then press F5 to run the Windows Azure web role. When the Import page loads, click the Browse button, select the .csv file you downloaded and edited earlier in the chapter, and then click the Import button to load the data into the Windows Azure table.

You can view the storage emulator using Visual Studio's Server Explorer, as shown in Figure 3.7.

You now have your data imported into a newly created Windows Azure table. The next step is to create a Windows Azure WCF service endpoint for the SharePoint BCS to connect to.

Creating a WCF Service Endpoint

The second part of this example creates a Windows Azure service endpoint for the SharePoint BCS client to interact with. You have already created a Windows Azure WCF service project called CountyTaxService; up to this point, you have used it to host the Import.aspx web page and the DataLayer code to import the table data from the .csv file. Now it is time to add the actual service.

Building a Windows Azure Service

To add the service, add a new WCF service project item called CountyTaxService to the CountyTaxService project. This will create the following three files:

  • CountyTaxService.svc — The actual endpoint for the service, which in this case is accessed at http://127.0.0.1:8080/CountyTaxService.svc
  • CountyTaxService.svc.cs — The code-behind file for the CountyTaxService.svc file. This is where the actual service code is placed.
  • ICountyTaxService.cs — The service-contract interface.

You'll implement the service-contract interface first. You know that the service is going to be used to connect SharePoint BCS services with your Windows Azure table, so you can create the interface with this in mind. Add the following code to the ICountyTaxService.cs file:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.Text;

namespace CountyTaxService
{
  // NOTE: You can use the "Rename" command on the "Refactor" menu to change
  // the interface name "ICountyTaxService" in both code and config file
  // together.
  [ServiceContract]
  public interface ICountyTaxService
  {
    [OperationContract]
    CountyTaxEntity ReadItem(string RowKey);

    [OperationContract]
    List<CountyTaxEntity> ReadList();

    [OperationContract]
    string Create(CountyTaxEntity countyTaxEntity);

    [OperationContract]
    string Update(CountyTaxEntity countyTaxEntity);

    [OperationContract]
    bool Delete(string RowKey);
  }
}

code snippet 076576 Ch03_Code.zip/ICountyTaxService.cs

For the BCS connection, SharePoint needs just a few methods to handle the CRUD operations. The ReadItem, ReadList, Create, Update, and Delete methods are defined in the interface; implement them in the CountyTaxService.svc.cs file, whose class implements the ICountyTaxService interface. The code in this class is minimal as well, because all the actual work is done in the DataLayer class.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.Text;

namespace CountyTaxService
{
  public class CountyTaxService : ICountyTaxService
  {

    public CountyTaxEntity ReadItem(string RowKey)
    {
      CountyTaxEntity ct = 
        DataLayer.GetCountyTaxByRowKey(RowKey);

      return ct;
    }

    public List<CountyTaxEntity> ReadList()
    {
      List<CountyTaxEntity> CountyTaxes =
        DataLayer.GetAllCountyTax();

      return CountyTaxes;
    }

    public string Create(CountyTaxEntity countyTaxEntity)
    {
      CountyTaxEntity ct = countyTaxEntity;

      string rowkey = DataLayer.InsertCountyTax(
          ct.StateCode,
          ct.CountyCode,
          ct.State,
          ct.CountyName,
          ct.TotalReturns,
          ct.TotalExemptions,
          ct.AdjustedGrossIncome,
          ct.Wages,
          ct.DividendIncomes,
          ct.InterestIncome);

      return rowkey;
    }

    public string Update(CountyTaxEntity countyTaxEntity)
    {
      CountyTaxEntity ct = countyTaxEntity;

      string rowkey = DataLayer.UpdateCountyTax(
          ct.RowKey,
          ct.StateCode,
          ct.CountyCode,
          ct.State,
          ct.CountyName,
          ct.TotalReturns,
          ct.TotalExemptions,
          ct.AdjustedGrossIncome,
          ct.Wages,
          ct.DividendIncomes,
          ct.InterestIncome);

      return rowkey;
    }

    public bool Delete(string RowKey)
    {
      bool succeeded = 
        DataLayer.DeleteCountyTaxByRowKey(RowKey);

      return succeeded;
    }
  }
}

code snippet 076576 Ch03_Code.zip/CountyTaxService.svc.cs

Next, you will implement the code in the DataLayer class, which uses the Windows Azure storage client to read, write, update, and delete the table data.

Accessing Table Storage Data

Although the Windows Azure Table service exposes a REST interface for accessing the data, the Windows Azure team has created a client proxy that handles all the REST details for you. This makes writing the CRUD code very easy. As you can see in the following code, most of the work is handled by the CreateQuery method of the context object, which you query using LINQ. Note that the following code shows two different LINQ styles, method syntax and query syntax; both are valid, so it is really just a matter of preference. Add the following code to the DataLayer class (note that InsertCountyTaxInternal repeats the method you added earlier in the chapter; keep only one copy):

private static string connectionString;
    private static string tableName;
    private static CloudStorageAccount storageAccount;
    private static CloudTableClient tableClient;

    static DataLayer()
    {
      // Get connection string and table name from settings.
      connectionString = 
        RoleEnvironment.GetConfigurationSettingValue(
        "StorageConnectionString");
      tableName = 
        RoleEnvironment.GetConfigurationSettingValue(
        "TableName");

      // Reference storage account from connection string.
      storageAccount = CloudStorageAccount.Parse(connectionString);

      // Create Table service client.
      tableClient = storageAccount.CreateCloudTableClient();
    }

    // Get county tax filtered by state.
    public static List<CountyTaxEntity> GetCountyTaxByState(string state)
    {
      IQueryable<CountyTaxEntity> query = 
        tableClient.GetDataServiceContext().
        CreateQuery<CountyTaxEntity>(tableName);
      if (state != null)
      {
        query = query.Where(c => c.PartitionKey == state.ToUpper());
      }
      return query.AsTableServiceQuery().ToList();
    }

    // Get all county tax entities.
    public static List<CountyTaxEntity> GetAllCountyTax()
    {
      IQueryable<CountyTaxEntity> query = 
        tableClient.GetDataServiceContext().
        CreateQuery<CountyTaxEntity>(tableName);

      return query.AsTableServiceQuery().ToList();
    }

    // Create new context and get CountyTax using RowKey.
    public static CountyTaxEntity GetCountyTaxByRowKey(string RowKey)
    {
        // Get data context.
        TableServiceContext context = tableClient.GetDataServiceContext();

        return GetCountyTaxByRowKey(RowKey, context);
    }

    // Get CountyTax using RowKey and an existing context.
    public static CountyTaxEntity GetCountyTaxByRowKey(
      string RowKey, TableServiceContext context)
    {

        CountyTaxEntity countyTax =
          (from ct in context.CreateQuery<CountyTaxEntity>(tableName)
           where
             ct.RowKey == RowKey
           select ct).Single();

        return countyTax;
    }

    // Create new context and get CountyTax.
    public static CountyTaxEntity GetCountyTax(
      string State, string CountyCode)
    {
      // Get data context.
      TableServiceContext context = tableClient.GetDataServiceContext();

      return GetCountyTax(State, CountyCode, context);
    }

    // Get CountyTax using existing context.
    public static CountyTaxEntity GetCountyTax(
      string State, string CountyCode, TableServiceContext context)
    {
      CountyTaxEntity countyTax = 
        (from ct in context.CreateQuery<CountyTaxEntity>(tableName)
        where
          ct.PartitionKey == State &&
          ct.CountyCode == CountyCode
        select ct).Single();

      return countyTax;
    }

    // Update county tax data.
    public static string UpdateCountyTax(
      string RowKey,
      string StateCode,
      string CountyCode,
      string State,
      string CountyName,
      string TotalReturns,
      string TotalExemptions,
      string AdjustedGrossIncome,
      string Wages,
      string DividendIncomes,
      string InterestIncome)
    {

      // Get data context.
      TableServiceContext context = tableClient.GetDataServiceContext();

      // Set updated values
      CountyTaxEntity entity = GetCountyTax(State, CountyCode, context);

      // Populate the other properties.
      entity.StateCode = StateCode;
      entity.CountyCode = CountyCode;
      entity.State = State;
      entity.CountyName = CountyName;
      entity.TotalReturns = TotalReturns;
      entity.TotalExemptions = TotalExemptions;
      entity.AdjustedGrossIncome = AdjustedGrossIncome;
      entity.Wages = Wages;
      entity.DividendIncomes = DividendIncomes;
      entity.InterestIncome = InterestIncome;

      // Update the object.
      context.UpdateObject(entity);

      // Write changes to the Table service.
      context.SaveChanges();

      return entity.RowKey;
    }

    // Insert a new county tax.
    public static string InsertCountyTax(
      string StateCode,
      string CountyCode,
      string State,
      string CountyName,
      string TotalReturns,
      string TotalExemptions,
      string AdjustedGrossIncome,
      string Wages,
      string DividendIncomes,
      string InterestIncome)
    {
      // Get data context.
      TableServiceContext context = tableClient.GetDataServiceContext();

      // Insert the new entity.
      string rowkey = InsertCountyTaxInternal(context,
          StateCode,
          CountyCode,
          State,
          CountyName,
          TotalReturns,
          TotalExemptions,
          AdjustedGrossIncome,
          Wages,
          DividendIncomes,
          InterestIncome);

      // Save changes to the service.
      context.SaveChanges();

      return rowkey;
    }

    // Insert a new county tax.
    private static string InsertCountyTaxInternal(
      TableServiceContext context,
      string StateCode,
      string CountyCode,
      string State,
      string CountyName,
      string TotalReturns,
      string TotalExemptions,
      string AdjustedGrossIncome,
      string Wages,
      string DividendIncomes,
      string InterestIncome)
    {
      // Create the new entity.
      CountyTaxEntity entity = new CountyTaxEntity();

      // Partition key is the State Abbreviation.
      entity.PartitionKey = State.ToUpper();

      // Row key is a GUID
      entity.RowKey = Guid.NewGuid().ToString();

      // Populate the other properties.
      entity.StateCode = StateCode;
      entity.CountyCode = CountyCode;
      entity.State = State;
      entity.CountyName = CountyName;
      entity.TotalReturns = TotalReturns;
      entity.TotalExemptions = TotalExemptions;
      entity.AdjustedGrossIncome = AdjustedGrossIncome;
      entity.Wages = Wages;
      entity.DividendIncomes = DividendIncomes;
      entity.InterestIncome = InterestIncome;
      // Add the entity.
      context.AddObject(tableName, entity);

      return entity.RowKey;
    }

    // Delete a county tax.
    public static bool DeleteCountyTaxByRowKey(string RowKey)
    {
      // Get data context.
      TableServiceContext context = 
        tableClient.GetDataServiceContext();

      // Retrieve the county tax entity using this context,
      // so that it is tracked and can be deleted.
      CountyTaxEntity entity = 
        GetCountyTaxByRowKey(RowKey, context);

      // Delete the entity.
      context.DeleteObject(entity);

      // Save changes to the service.
      context.SaveChanges();

      return true;
    }

code snippet 076576 Ch03_Code.zip/DataLayer.cs

In Visual Studio, set the start page to CountyTaxService.svc. Press F5 to start your Windows Azure service running on the local compute emulator. This will launch CountyTaxService.svc in the browser, as shown in Figure 3.8, and attach the Visual Studio debugger.
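If you also want to exercise the service from code before connecting SharePoint to it, a small console client works well. The following sketch is not part of the chapter's solution; it assumes the service is reachable at http://127.0.0.1:8080/CountyTaxService.svc over the project's default basicHttpBinding, and it references the ICountyTaxService contract and entity types from the CountyTaxService project:

using System;
using System.ServiceModel;
using CountyTaxService;

class TestClient
{
  static void Main()
  {
    // Create a channel using the same contract the BCS model will consume.
    var factory = new ChannelFactory<ICountyTaxService>(
      new BasicHttpBinding(),
      new EndpointAddress("http://127.0.0.1:8080/CountyTaxService.svc"));

    ICountyTaxService client = factory.CreateChannel();

    // Read all rows and print a count plus the first county name.
    var taxes = client.ReadList();
    Console.WriteLine("Rows: " + taxes.Count);
    if (taxes.Count > 0)
      Console.WriteLine("First county: " + taxes[0].CountyName);

    ((IClientChannel)client).Close();
    factory.Close();
  }
}

Run the console application while the compute emulator is running; if ReadList returns the rows you imported, the service is ready for BCS.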

You have now created all the Windows Azure parts of the solution: a Windows Azure table to store the data, and a Windows Azure WCF service to access it. The last step is to create a SharePoint BCS external content type using SharePoint Designer 2010. This will enable you to connect to the Windows Azure table as though it were a native SharePoint list.

Creating a SharePoint BCS External Content Type

SharePoint has the ability to connect to external data in systems such as SQL Server using BCS. BCS creates a mapping to the external data, called an external content type (ECT). You can create an ECT in one of two ways: programmatically in Visual Studio or declaratively in SharePoint Designer (SPD). In this example, you will use SPD to create the external content type declaratively without writing any code.

Consuming a Windows Azure Service Using BCS

Once you have the Windows Azure service created, consuming that service using SPD is easy. In fact, at this point there is nothing special about connecting to the Windows Azure service that is different from connecting to any other service. So, if you are familiar with BCS and SPD, this should be straightforward.

1. Open SPD and connect to the SharePoint site in which you want to create the ECT.

2. Click the External Content Types item in the left navigation pane of SPD. In the right pane, you will see a list of existing ECTs. Create a new ECT by clicking the External Content Type button in the New ribbon group.

3. Set the Name and Display Name properties to AzureCountyTax. You can leave all the other properties at their default values, as shown in Figure 3.9.

4. Now you need to map the CRUD operations of the ECT to the WCF service you created in the previous section. Click the hyperlink labeled Click Here to discover external data sources and define operations. The Operations Designer page will open. The Operations Designer is where you manage the ECT mapping.

5. Click the Add Connection button and choose WCF Service from the External Data Source Type Selection dialog.

6. In the WCF Connection dialog, set the Service Metadata URL to http://demo2010a:8080/CountyTaxService.svc?wsdl. Remember, this is the endpoint of your Windows Azure service; demo2010a is the local server name where SharePoint is running, and the request is forwarded to the host machine at http://127.0.0.1:8080 through the SSH tunnel.

7. Set the Service Endpoint URL to http://demo2010a:8080/CountyTaxService.svc, and set the Metadata Connection Mode to WSDL.

8. Set the Proxy Namespace to BCSServiceProxy. Your settings should look like Figure 3.10.

9. Click OK to connect to the Windows Azure service. You can verify that the connection was created when you see the Web Method operations in the Data Source Explorer tab, similar to Figure 3.11.

10. Map each BCS operation to the corresponding service operation by right-clicking the service operation and choosing from its context menu. As shown in Figure 3.12, the following BCS operations are available: Read Item, Read List, Create, Update, and Delete. These match the names of the operations you created in the WCF service, making it easy to pick the correct one.

The following steps describe mapping the Update operation.

11. Right-click on Update and select New Update Operation from the context menu. The Operation Mapping wizard opens. The first page is the Operation Properties page (see Figure 3.13). You can click Next without making any changes.

12. The second step in the wizard is the Input Parameters Configuration page. This is where you map each input and output parameter between the BCS operations and the WCF service operations. This example is for the update operation, so select all the data source elements. The only change you need to make is to select the RowKey element and check the Map to Identifier checkbox. Set the Identifier drop-down to RowKey. Ensure your settings match those in Figure 3.14, and then click Finish to create the operation.

Once you have finished mapping all the operations, you will see a page in SPD similar to Figure 3.15.

13. The last step is to create an external list from the ECT you just created. SPD makes this very easy to do with a single button click. Click the Create Lists & Forms button on the ribbon. In the Create List and Form for AzureCountyTax dialog, choose Create New External List and name the new list AzureCountyTax. This will create a list like the one shown in Figure 3.16.

All the operations that you expect to work will work in this external list. For example, creating, updating, and deleting records works just like it does for any native SharePoint list.

Summary

You have seen how to leverage Windows Azure services in your SharePoint solutions using BCS. This pattern enables SharePoint developers to create hybrid applications that offload work to Windows Azure. Using BCS as the bridge keeps SharePoint and Windows Azure loosely coupled through BCS's declarative operation mapping, so if either side of the solution changes, you do not need to write, test, and deploy code to handle the change. This makes for a very flexible solution. Finally, using BCS means the Windows Azure data appears in SharePoint as a first-class citizen, making it very easy for SharePoint and Office power users and developers to take advantage of the Windows Azure platform from an environment they are already familiar with.
