Part IV

Working with Azure

  • Chapter 8: Setting Up Azure
  • Chapter 9: Identity in Azure
  • Chapter 10: Leveraging Blob Storage

Chapter 8

Setting Up Azure

What’s in This Chapter?

  • Setting up your Windows Azure account
  • Getting your development environment set up for Windows Azure
  • Making your first Windows Azure application
  • Deploying your first Windows Azure application
  • Controlling your service programmatically

To host a working application on Windows Azure, you need to have a Windows Azure subscription and set it up for your application. In this chapter you learn how to do this, and you learn about the management portal through which you can manage subscriptions and hosted applications. But before you can even think about deploying an application, you need to develop it, and that means you need a development environment. You don’t just need a development tool; you need an environment that emulates Windows Azure as it works in the cloud. Apart from setting up the environments, in this chapter you learn about the components that make up the Windows Azure hosting environment and the development environment. You also learn about the role of each component, and how to work with the core components, by creating a simple application and deploying it to the Windows Azure production environment.

Getting Windows Azure

Getting Windows Azure isn’t about downloading and installing software. It’s an online service, so you need to register to gain access; after this, you can view a management portal through which you can control the different components of Windows Azure. You also gain access to a subscription portal to manage billing.

The following sections walk you through everything you need to start working with Windows Azure.

Registering for a Windows Azure Account

To register for Windows Azure, go to www.windowsazure.com. There you can find information about Windows Azure, and how you can “buy” it. At its core, Windows Azure uses a Pay-As-You-Go model; you pay for what you consume of the services it offers. If you don’t use Azure at all, you pay nothing, so you can safely set up an account without being charged. Although Microsoft does offer invoiced use of Windows Azure, by default you pay for it through a credit card. The following purchase options are available for Azure:

  • Free trial: You can sign up for a free trial, which gives you free use up to a certain quota for a fixed period of time, after which you’re charged the going rate.
  • Plan: If you know upfront how much you will consume, you can also purchase a Plan, which gives you a monthly prepaid quota at a discount instead of the Pay-As-You-Go model. Anything you consume beyond the quota is billed afterwards.

NOTE  MSDN Subscribers, Microsoft Partners, and BizSpark program members can use Windows Azure at no charge up to a certain quota, so before you register, it makes sense to determine whether you qualify for free service under one of these memberships.

To start you must set up a subscription. A subscription is basically a container for services you want to consume under a single invoice. You can set up a subscription in a few simple steps:

1. Select one of the purchase options.
2. Sign up with your Windows Live ID.
3. Verify your account with your mobile phone.
4. Enter your credit card and billing information and accept the Subscription Agreement and Rate Plan.

After you enter all the information, your subscription is provisioned. When provisioning completes, the subscription details are available in the Microsoft Online Services Customer Portal, and you can manage the environment through the Windows Azure Platform Management Portal. You receive e-mails that guide you to both portals, and the browser can automatically switch to the Management Portal when everything is set up.

A Tour of the Azure Portal

You use the Windows Azure Platform Management Portal to manage everything for your Windows Azure environment. The Management Portal is a Silverlight application, so you must have a browser capable of running Silverlight installed. You can access the Management Portal at https://windows.azure.com. After you log in with your Windows Live ID, you see the screen in Figure 8-1.

The Management Portal screen consists of several elements (refer to Figure 8-1).

1. Navigation bar: Located at the top, this is where you can change the portal language, navigate to the Microsoft Online Services Customer Portal to see your billing information, and sign out of the portal.
2. Main menu: Located at the bottom left, you use this menu to manage the different components of the platform.
3. Submenus: Above the main menu is the submenu, where you can access information and tasks related to the chosen menu item.
4. Taskbar: You can also start common tasks from the taskbar. These tasks are applicable to the current context (that is, the chosen menu and submenu items). The tasks shown here differ as you navigate through the portal.
5. Main screen: This is where you get an overview of what you’ve selected in the menu and where you can take action. In some situations there is a properties window on the right of the main screen.

What’s in the Main Menu?

Besides Home, the main menu contains items for the key components of the Windows Azure Platform:
  • Hosted Services, Storage Accounts & CDN: In this chapter you mainly deal with this section.
  • Database: This section manages SQL Azure and is covered in Chapter 11.
  • Data Sync and Reporting: These sections are related to the Database section. With Data Sync, you can manage data synchronization between on-premises SQL Server databases and SQL Azure databases or between SQL Azure databases. In the Reporting section, you can manage SQL Azure Reporting Services.
  • Service Bus, Access Control & Caching: In this section, you can manage the AppFabric Service Bus covered in Chapter 13, the AppFabric Access Control Service covered in Chapter 14, and the AppFabric Cache.
  • Virtual Network: This section manages AppFabric Connect, which is covered in Chapter 15.

Some of the features available in the Windows Azure platform may still be offered in beta or Community Technology Preview (CTP). In that case you can sign up for the beta or CTP under Beta Programs in the submenu of Home.

NOTE  System administrators may prefer to manage applications using the Windows Azure Platform PowerShell Cmdlets available at http://wappowershell.codeplex.com/. Another option is the Windows Azure Platform Management Tool, a Microsoft Management Console Snap-in available at http://wapmmc.codeplex.com/.

Managing the Windows Azure Environment

The opening screen of the Hosted Services, Storage Accounts & CDN section shows the Deployment Health overview. This is basically a dashboard that shows you the health of everything you run. Although this is not clear from the overview, there is a hierarchy you should be aware of. The top level is the subscription, of which you can have multiple. Under a subscription you can have multiple hosted services, which in turn can contain different deployments.

A hosted service is basically an application that you host in Windows Azure. You can have a production deployment and/or a staging deployment of a hosted service. These are two identical deployments of your application. However, the production deployment is meant to run your live application, whereas the staging deployment is meant for final testing before you go into production. A staging environment basically gives you the opportunity to install and test your application as if it were in production. The next step is then to make the staging environment the production environment and vice versa. This way, you can deploy an application with minimum risk. If the staged application fails in production, you can switch back to the original production deployment.

In a deployment you can have multiple roles. For now, you can think of a role as a single part of your application, running in its own environment. As an example, you can have a website talking to web services. The website would be in one role, and the web services in another. Roles are discussed in more detail in the section “Understanding Azure Roles” later in this chapter.

A role can be run on multiple instances. An instance is comparable to a (virtual) machine. To facilitate load balancing and failover, you typically run your application on at least two instances. Microsoft gives uptime guarantees only if you run at least two instances per role. If your uptime requirements aren’t that stringent, you can, of course, run on a single instance. You can also add more instances if the load on your application increases.
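
To see how roles and instances surface at run time, consider the following sketch. It is not part of the chapter’s sample application; it simply uses the RoleEnvironment class from the Microsoft.WindowsAzure.ServiceRuntime assembly that ships with the SDK to report how many instances each role in the deployment is running.

using System.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class InstanceReporter
{
    // Writes the current instance ID and the instance count of every role in the
    // deployment to the trace log. Only meaningful when running under the compute
    // emulator or in Windows Azure itself.
    public static void ReportInstances()
    {
        if (!RoleEnvironment.IsAvailable)
        {
            return; // Not running under Windows Azure.
        }

        Trace.WriteLine("Current instance: " + RoleEnvironment.CurrentRoleInstance.Id);

        foreach (var role in RoleEnvironment.Roles.Values)
        {
            Trace.WriteLine(role.Name + " runs on " + role.Instances.Count + " instance(s)");
        }
    }
}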

Affinity Groups

Windows Azure is hosted from quite a few datacenters around the world. When you deploy an application, you can tell Windows Azure in which region or subregion you want your application hosted. Examples of regions are Anywhere US and Anywhere Europe. A subregion is more specific, for instance South Central US or West Europe. This location can be significant because the farther users are from the datacenter in which an application is hosted, the higher the latency. This may result in significant performance degradation.

Now, a subregion can consist of multiple datacenters, and datacenters are so huge that there can be several switches and routers between two different applications in the same datacenter. So, if you have two applications that communicate with each other, latency may be a factor. This is where affinity groups come in. You can view an affinity group as a directive to Windows Azure to host two applications as close to each other within the same datacenter as possible. You can also include Windows Azure Storage in an affinity group. This not only improves performance, but it can also lower cost, because network traffic within a datacenter is free, whereas network traffic between datacenters is charged.

Management Certificates

You can manage most of your applications using the Management Portal. However, there is also a Management API through which Windows Azure subscriptions can be managed. When you deploy applications through Visual Studio, this API is used. You can imagine that the security of this Management API is critical. After all, you don’t want some unauthorized person to play with your application settings, or worse, deploy another application that does harmful things to your users. To prevent this, the Management API is secured with certificates, and you need to upload at least one certificate before you can use the Management API. You don’t have to use the API at all, but it makes some operations much simpler. This is discussed in more detail in the section “Deploying from Visual Studio.”
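
To give you an idea of how such a certificate is used, the following sketch calls the Service Management REST API to list the hosted services in a subscription, authenticating with a management certificate taken from the certificate store. Treat it as an illustration rather than production code: the subscription ID and certificate thumbprint are placeholders, and the value of the x-ms-version header may differ depending on the API version you target.

using System.IO;
using System.Net;
using System.Security.Cryptography.X509Certificates;

public static class ServiceManagementSample
{
    public static string ListHostedServices(string subscriptionId, string certThumbprint)
    {
        // Look up the management certificate in the current user's store.
        // This assumes the certificate with the given thumbprint is present.
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var certificate = store.Certificates
            .Find(X509FindType.FindByThumbprint, certThumbprint, false)[0];
        store.Close();

        // The Service Management API is reached over HTTPS and requires the
        // certificate plus a version header on every request.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://management.core.windows.net/" + subscriptionId + "/services/hostedservices");
        request.ClientCertificates.Add(certificate);
        request.Headers.Add("x-ms-version", "2011-10-01");

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd(); // XML listing of the hosted services.
        }
    }
}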

User Management

If you plan to have administrators manage applications in the Management Portal, having just a single Windows Live ID to share isn’t practical. The owner of the subscription, which is the Windows Live ID to which the subscription is tied, can create one or more Co-Admin accounts under User Management. A Co-Admin can do the same as the owner, except create new subscriptions.

Hosted Services

Under Hosted Services you can find detailed information about all the hosted services you have and the active deployments of these hosted services. As you can see in Figure 8-2, this reflects the subscription, hosted service, deployment, and role hierarchy discussed earlier.

In Figure 8-2 you also see a Certificates folder under each hosted service. Here you configure certificates used by your application, either for a secure SSL connection or for other purposes, such as signing and encryption.

Under a hosted service you can have one production deployment and one staging deployment. This is reflected in the taskbar, which does not enable you to add a production deployment to a hosted service already containing a production deployment. If you have a running deployment, you can fold it open to see all the roles and the instances running those roles. Clicking any of the nodes in the hierarchy displays detailed information in the Properties window on the right. In Figure 8-2 a deployment is selected, and among other things the URL to access the application is shown, as well as information about the public IP address and when the application was deployed.

Storage Accounts

Under Storage Accounts you manage your accounts on Azure Storage. These accounts are separate from hosted services because Azure Storage is in a sense one big data store managed by Microsoft. It isn’t hosted on distinguishable instances but on a huge farm. You can create multiple storage accounts. These don’t need to correspond with hosted services, but if there is a relationship, you may want to consider placing them in the same affinity group as explained earlier. You can have multiple applications share the same storage account, or create a separate storage account for each application requiring storage. You can also use storage without using hosted services. Having multiple accounts makes sense to keep applications separated, especially if some of your storage accounts are also used by applications outside your control, such as a partner or client.

To access a storage account you need a key. Each storage account has a unique key, so you can ensure that only applications that have the key can access the storage account. You can regenerate a key in case it is compromised in some way. This takes time, however, so to avoid downtime, each storage account has two keys generated: a primary key and a secondary key. It doesn’t matter which key you use, so if the primary key gets compromised, you can switch to the secondary key and then regenerate the primary key. To maintain a high level of security, you should refresh keys every once in a while. You can do this round-robin, so that refreshing keys does not cause downtime.
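
In code, the account name and key typically travel in a connection string. The following sketch (the account name and key values are placeholders) shows that switching from the primary key to the secondary key is just a matter of handing a different connection string to the CloudStorageAccount class from the SDK’s storage client library, which is what makes round-robin key rotation possible without downtime.

using Microsoft.WindowsAzure;

public static class StorageKeyRotation
{
    // Both connection strings point at the same account; only the key differs.
    const string PrimaryConnection =
        "DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=<primary-key>";
    const string SecondaryConnection =
        "DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=<secondary-key>";

    public static CloudStorageAccount GetAccount(bool useSecondary)
    {
        // While the primary key is being regenerated, run on the secondary key.
        return CloudStorageAccount.Parse(
            useSecondary ? SecondaryConnection : PrimaryConnection);
    }
}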

Content Delivery Network

The Windows Azure Content Delivery Network (CDN) enables you to make content in your hosted service or in Azure Storage available in regions other than the one where it is hosted. From the Management Portal you can set up one CDN endpoint for each hosted service or storage account. An endpoint exposes the content of the particular hosted service or storage account it is attached to by caching the content around the world. Clients accessing content through the CDN are routed to a copy geographically close to their location to get the best response time. Because the CDN caches content, it is only suitable for static content. You should not expose content that dynamically changes due to user interaction. Content that differs by querystring can be made available through the CDN, but you should be aware that it takes a while for changes to propagate, so users can get outdated copies.

Figure 8-3 shows the management interface for CDN endpoints. An endpoint is shown under the hosted service or storage account to which it applies, as is the case for the CDN under the aspnl storage account. When you add a new endpoint, you need to select what the endpoint applies to. You can tell Windows Azure to enforce a secure connection and whether it should accept querystring parameters to differentiate between content. You can also add a domain (refer to Figure 8-3). By default, a URL is assigned to your CDN. For storage accounts this is some random identifier followed by the domain of the CDN. If you don’t like this, you can use the Add Domain function to add a custom domain name. This domain name needs to be verified, so you do need control over the DNS of that domain. When you add a domain, you will be instructed to add a record to the DNS.

A Tour of the Customer Portal

If you want to see how much you’ve used of the Windows Azure platform, you must go to the Microsoft Online Services Customer Portal. This is where you can manage subscriptions and several services Microsoft offers online, not just Windows Azure. You can go there via the Billing link at the top of the Management Portal, or navigate directly to https://mocp.microsoftonline.com/site/default.aspx. Even if you come from the Management Portal, you are not automatically signed in, so before you can do anything, you need to make sure you are signed in.

The Customer Portal serves two purposes: managing your existing subscriptions and buying new subscriptions. The most interesting part of managing your subscriptions is viewing your usage, and consequently what you’re being billed. You can view all that by clicking View My Bills on the homepage. This shows you a list with one item for each subscription. If you click one of those items, you see a screen like the one shown in Figure 8-4. Each item with an Arrow icon is expandable to show the detailed charges. You can view any invoice by switching the billing period, but by default you see the current period up until today. Be aware that this is not real time, so it may not be entirely accurate.

Understanding Azure Roles

Earlier you were briefly introduced to the concept of a role in Windows Azure. The best way to understand roles in more depth is to look at a large-scale multitier application, as shown in Figure 8-5, which is divided into four major blocks of servers, each with a different function.

A typical action by a user comes in at the web front end. The web front end takes care of only the user interface, so it forwards the real work to the application servers. The application servers talk with the data servers to retrieve and store data for the business operations they perform. A separate set of servers is responsible for batch processing not directly related to user requests. Each block of servers, also known as a farm, looks like a single server to the other blocks because it sits behind a load balancer. The load balancer is responsible for routing a request to the server that has the least load on it at that time.

The reason you would want to run applications in such an environment is twofold (refer to Figure 8-5): availability and scaling. Should a server in the farm fail, the other server(s) can take over. This redundancy ensures that an application stays available to users. With multiple servers active at the same time, user requests can be sent to the different servers to balance the load and ensure quick response times. When more users use the application, more servers can be added to the farm, as long as the server runs the same software and is configured in the same way as the other servers in the farm. Also, dividing the different functions over different servers prevents the functions from working against each other.

Of course, there is one major drawback to the setup in Figure 8-5: It’s expensive, especially if you need all the capacity only at peak hours. In addition, this environment is difficult and thus expensive to maintain. Managing a server farm is an order of magnitude more complex than managing a single server. If you’ve ever worked on an application that runs on more than a single server, you probably know that it takes a complex hardware and network configuration. It is also hard to ensure that the software runs properly on all servers. Installing a new version of the application without downtime complicates matters even further, and the same is true when you want to add additional servers to meet the demand.

When you use Windows Azure, management of the hardware and network configuration is taken care of for you. Management is highly automated, so adding additional instances (or removing them) is something you can do without the need for a systems engineer. In a sense, you can think of Windows Azure as a giant warehouse of servers waiting to become part of a farm running (part of) an application. Basically the only difference between an instance in your application and an instance in another application is the software components running on the instance and its configuration. The operating system, server software, and runtime framework are the same on each instance. If you must add a new instance to the farm running your application because you require additional capacity or because a running instance fails, all that you need is the software you built and some configuration to join the farm. This brings you back to what a role is. You can think of a role as a functional unit of your application, but you can also think of it as a unit of configuration for a farm of instances. For a role, you determine which software will run on the instances, how many instances the software will run on, and so on.

For the most part, the one-size-fits-all approach works fine. However, software performing one function may not run as well on a server configured for another function. For instance, running a batch process on a web server is not a particularly good idea. This is why Windows Azure offers different types of roles: the Web Role, the Worker Role, and the VM Role. Each of these roles is discussed in more detail in the following sections.

Web Role

You’re likely to use the Web Role the most. It has the setup of a web server and is mainly intended to host web applications and web services. It has Internet Information Services (IIS) preconfigured, so that all you need to do is deploy the web application and it’s good to go. This is also the main benefit of using a Web Role. It requires the least amount of setup, so when a new instance is provisioned, it is up and running in no time.

In Visual Studio, you can find several types of Web Role projects. The common denominator between them is that they are hosted in IIS. Three of those are ASP.NET Web Role projects, each for a slightly different technology stack (that is, WebForms, MVC 2, and MVC 3). These only differ in how you create the user interface. The other Web Role project is for Windows Communication Foundation (WCF), so you can host web services. There is nothing preventing you from using WCF in one of the ASP.NET Web Roles, so if your application consists of both a web interface and web services, you can choose one of the ASP.NET Web Roles and add WCF services to it.

Although a Web Role is hosted in IIS, you can still have some custom work done before the role becomes active using a startup task. This enables you to install and configure components you need. You can upload these as part of the setup package, but you can also acquire them from an Internet-based source. Using NuGet (see http://NuGet.org), which is integrated into Visual Studio, is a good way to get packages into your project and keep them up to date in your Web Role. Maarten Balliauw explains how to do this in a blog post you can find at http://bit.ly/azurenuget.

Worker Role

A Worker Role is similar to a Web Role but it doesn’t come with a preconfigured instance of IIS, although you could run IIS on it if you need it. The Worker Role is essentially a clean Windows Server with no running services. This means your application can benefit from the machine’s resources as much as possible. The primary reasons to use a Worker Role are as follows:

  • Running background processes: Some processes need to run periodically to perform some task. This can be done efficiently on a Worker Role.
  • Running long-running processes: Long-running processes should run outside of a web server so they aren’t limited by timeouts imposed by the web server and don’t rely on an active connection with the client. Web applications can offload requests that take a long time to a Worker Role process.

In addition to the preceding reasons, you can use a startup task to install and configure application components and services your application may need, just like you can in a Web Role. This makes the Worker Role suitable for a few more scenarios:

  • Use a web server other than IIS: For instance, if you have an application that runs on Apache, rather than IIS, you can install Apache in the Worker Role and use that to serve the application instead.
  • Use a framework other than .NET: Windows Azure applications aren’t restricted to the .NET Framework, but it is the only framework available by default.

The Worker Role works by virtue of the RoleEntryPoint class, which a Web Role and Worker Role both implement. The RoleEntryPoint class contains three methods, which are fired at appropriate points in the life cycle of a role. These methods are as follows:

  • OnStart: Used to do anything needed before the application can be used, such as installing components.
  • Run: Used to run the application. By default the implementation never returns.
  • OnStop: Runs when the role instance is about to be shut down (for example, after the Run-method returns), so you can clean up.

The Run-method is the key to the Worker Role. You can see it as the Main-method. As long as it doesn’t return, the application is running.
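
The following bare-bones Worker Role sketch, which is not one of the chapter’s listings, shows how the three methods fit together; the Run-method loops forever, doing a unit of work and then sleeping.

using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // One-time preparation, such as installing or configuring components.
        return base.OnStart();
    }

    public override void Run()
    {
        // As long as this method doesn't return, the role instance is considered running.
        while (true)
        {
            // Do a unit of background work here, then wait before the next pass.
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }

    public override void OnStop()
    {
        // Clean up before the instance is taken down.
        base.OnStop();
    }
}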

VM Role

The Web Role and the Worker Role provide a predefined environment on which you can deploy applications specifically created for those environments. So what must you do when you need something that’s not available in either role? This is what the VM Role is for. A VM Role enables you to create a complete custom environment and upload it to Windows Azure. It is called a VM Role because the custom environment is a Virtual Machine image based on Windows Server 2008 R2. When you use a VM Role, you must create a disk image and upload it to Windows Azure Blob Storage. That disk image, known as the base image, is used as a template for the instances that are deployed; the stored image itself is immutable. This is logical, because multiple instances can’t run from one disk concurrently. Therefore each instance is imaged from the base image. An obvious side effect is that each instance runs on its own disk image, so changes to the disk of one instance are not available to other instances. And to make matters worse, if an instance fails, regardless of whether this is due to software or hardware failure, the disk image is thrown away, and a new instance is spawned with a fresh image. This is also the case if the Fabric Controller decides to move an instance to another physical server within the datacenter. This is by design because Windows Azure assumes that failure is inevitable.

If you are familiar with virtual machines, you may have thought that you could install just about anything and run it on Windows Azure. By now you probably realize that this is not the case. Although you can install any software that runs on Windows Server 2008 R2, there is no guarantee that it will work as expected. If the software writes data to the disk other than temporary data needed only while the instance runs, you’re out of luck. The software essentially needs to be stateless. This is twofold: it can’t maintain state on the instance’s disk because instances can be recycled at any time, and it can’t maintain state across client requests because there is no guarantee that the next request from the same client will be handled by the same instance, due to the load-balancing infrastructure.

So with the preceding information in mind, when does it make sense to use a VM Role? Following are two main scenarios in which this makes sense:

  • Setup takes a long time: If the startup task of the role takes a long time, adding new instances may take too long for it to be effective.
  • Automated installation is not possible: This mainly happens when you have a setup package that requires manual interaction or configuration, or when the installation has a high probability of running into problems you have to correct.

A final thing you need to be aware of if you decide to use a VM Role is that you are responsible for keeping it updated. Windows Azure automatically updates and patches Web Roles and Worker Roles, but if a VM Role needs an update, you must upload a new base image.

Getting Your Development Environment Ready

Developing for Windows Azure is somewhat different from developing other types of applications. Your local machine is not the same as a Windows Azure instance, let alone multiple instances in different roles. So to properly develop Windows Azure applications, you need to emulate the Windows Azure environment locally. Therefore getting your development environment ready involves more than just installing your code editor of choice. The following sections walk you through which components you need and how to install them.

System Requirements

Windows Azure essentially runs on top of Windows Server 2008 or Windows Server 2008 R2, and you can set up a good development environment on either. If you want to develop on a desktop machine, Windows 7 and Windows Vista SP2 are also capable of running the needed components for Windows Azure.

What Language Should You Choose?

If you’re already a .NET developer, you’re used to choosing between different programming languages. Windows Azure is no different because it is built on top of Windows and has the .NET Runtime installed. This means that you can develop applications in C#, F#, and VB.NET. These are all first-class citizens that can run out of the box, as you would expect. But the story doesn’t end there. Windows Azure is a Windows Server under the hood, so it’s a fair assumption that anything that can run on Windows Server could run on Windows Azure. This isn’t entirely true because Windows Azure has some restrictions to ensure performance, scalability, security, and so on. But a lot of things can run on Windows Azure, including application platforms not natively supported on Windows Azure. This means you can also host Java, PHP, Python, and Ruby applications on Windows Azure although you must install and kick-start the runtime required for those languages. This isn’t incredibly hard, and Microsoft contributes to several (open source) initiatives that help you leverage Windows Azure with other platforms.

Windows Azure works best with Microsoft’s languages and tools. Microsoft is definitely committed to making other platforms work on Windows Azure, but most effort goes into tuning the .NET Framework for Windows Azure and making development easy for developers preferring Visual Studio.

Regardless of the language you choose, running an application on Windows Azure isn’t quite the same as running it on a server (farm) under your control. You need to take the restrictions mentioned earlier into account. For example, writing to the Windows Azure file system is recommended only for temporary data because files are not persisted across restarts of your application.
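
If you do need scratch space on an instance, the supported route is a local storage resource declared in the role’s service definition. The sketch below assumes a local resource named TempFiles has been declared for the role; anything written there must be treated as disposable because it is not preserved when the instance is restarted or moved.

using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class TempFileHelper
{
    // Writes a scratch file into the role's local storage resource.
    // "TempFiles" is an assumed resource name declared in the service definition.
    public static string WriteTempFile(string fileName, string contents)
    {
        LocalResource resource = RoleEnvironment.GetLocalResource("TempFiles");
        string path = Path.Combine(resource.RootPath, fileName);
        File.WriteAllText(path, contents);
        return path; // Don't rely on this file surviving a restart of the instance.
    }
}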

Getting the Developer Tools

To develop Windows Azure applications, you at least need the Windows Azure SDK. What else you need depends on the language you intend to use. If you’re going to use C#, F#, or VB.NET, using Visual Studio is a no-brainer. You could develop without Visual Studio, but there’s no good reason not to use it. If you don’t have Visual Studio, don’t worry. If you intend to use C# or VB.NET, you can use Visual Web Developer Express, which is available for free.

You can install the needed developer tools in several ways, depending on what you already have, what you need, and how much you want to do manually. You can find information about several ways to do the installation in the following section.

Installing from Visual Studio

If you already have Visual Studio 2010 Professional or better installed, you can start the installation from there. All you need to do is open the dialog to create a new project, and select the Cloud section, as shown in Figure 8-6.

When you double-click Enable Windows Azure Tools, another screen opens where you need to click Install Now. That initiates the download of the Web Platform Installer. From here the installation is almost the same as when you don’t have Visual Studio, so read on.

Installation with the Web Platform Installer

If you don’t have Visual Studio 2010, you can use a development environment known as Visual Web Developer Express (VWDE), which is a free entry-level version of Visual Studio just for web development. You could install that separately, but if you go to http://bit.ly/windowsazuresdk you can install it together with the Windows Azure tools you need at once, using the Web Platform Installer. This is a tool that checks what’s already on your machine, downloads and installs all the necessary updates and components, and configures them. This is even the case for needed Windows components such as Internet Information Services 7.x (IIS). For Windows Azure, it also installs SQL Server Express (if not installed already) so the Windows Azure emulator can also emulate Windows Azure Storage. This makes installation with the Web Platform Installer a no-brainer. When it is done, you’re ready to go.

The Web Platform Installer you download is preconfigured with a scenario to install the Windows Azure SDK, the Windows Azure AppFabric SDK, Visual Studio (if not installed already), and the Windows Azure Tools for Visual Studio, so after it’s installed it automatically shows a screen like in Figure 8-7, indicating you are going to install both Visual Studio and Windows Azure tooling.

When you click Install, you’re asked to accept the license agreement. Because you’re installing a whole bunch of software, you’re accepting the license agreement of all the software. You can scroll through the list to see what’s being installed and configured on your system. After you accept the license agreement, you still need to determine the security settings for SQL Server Express. You can choose between the following:

  • Windows Integrated Security: Enables access to SQL Server only through a Windows account. This is more secure, but is also somewhat harder to get working properly because an application runs under the account configured in IIS.
  • Mixed Mode Authentication: Enables access to SQL Server through a Windows account as above, but also through a username and password only known to SQL Server. This makes it easier to set up, especially with a SQL Server not running in the same Windows domain. This option is the best choice for Windows Azure development.

Now you can sit back and relax. You can easily have a cup of coffee because not only does the download take a while, but the installation also takes its fair share of time. Also, because the Web Platform Installer installs everything you need, it may need to restart Windows. After the restart the Web Platform Installer continues automatically. When installation finishes, you are notified of everything that installed. When you click Finish, the Web Platform Installer returns to an overview screen, showing other software you can install. This includes the latest Visual Studio service pack, which you should also install for good measure. In Chapter 9, “Identity in Azure,” you also work with Visual C# 2010 Express (VCSE). It makes sense to install that before installing the service pack because the service pack applies to both products. You can install VCSE from http://bit.ly/vcse2010. If you need to run the Web Platform Installer again, you can find it in All Programs.

Both VWDE and VCSE are free, but if you want to use them for longer than 30 days, you need to register. If you are not prompted to do so automatically, you can start it from the menu under Help ⇒ Register Product.

Installing Windows Identity Foundation

In the next chapter you work with Windows Identity Foundation (WIF) to secure applications. This means you need WIF and the WIF SDK. These are not installed by the Web Platform Installer when you install Windows Azure Tools because you don’t need them for Windows Azure itself. However, as you learn in the next chapter, WIF is a key component for modern user authentication, so it is a good idea to install it. You can download WIF from the accompanying Knowledgebase article at http://bit.ly/wifinstall. This installs the WIF runtime components as a Windows Update rather than as a separate installation, so WIF won’t show up in the installed programs list by default. You can find the WIF SDK download in the Related Resources at the bottom of the WIF download page. Be sure to install the 4.0 version of the SDK. The SDK contains some tools, samples, and documentation, and several Visual Studio templates. You learn about these in the next chapter when WIF is discussed in detail.

If you use VWDE, the Visual Studio templates are not installed properly because the WIF SDK doesn’t recognize VWDE as Visual Studio. To ensure the templates show up in VWDE, you need to add them to VWDE manually by doing the following:

1. Open Windows Explorer.
2. Navigate to C:\Program Files\Windows Identity Foundation SDK\v4.0\Visual Studio Extensions\10.0.
3. Copy all files in the folder.
4. Navigate to C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\VWDExpress\ProjectTemplates\Web\CSharp\1033.
5. Paste the copied files.
6. Open the command prompt as Administrator.
7. Change directory to C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE.
8. Type VWDExpress /InstallVSTemplates.

Installing the Windows SDK

If you don’t have Visual Studio, then you are missing the makecert tool you use in the next chapter. This is a tool to create certificates used for encryption and signing. Fortunately, this tool is also available in the Microsoft Windows SDK for Windows 7 and .NET Framework 4, which you can install from http://bit.ly/windows7sdk. You need only a small portion of the SDK, and because this is a web installer, only the selected bits are downloaded. What you need to select is shown in Figure 8-8.

Installing the SDK Only

If you plan to develop applications using a tool other than Visual Studio, you can just install the SDK. At http://bit.ly/windowsazuresdk you can perform a manual installation.

You should also install the Windows Azure Libraries for .NET. You use these libraries to work with services in the Azure platform for access control, caching, and communication between applications.

Installing Other Language Tools

As explained earlier, you can develop for Windows Azure with languages not supported natively by the Windows Azure platform. Tools for the following languages are available through http://bit.ly/windowsazuresdk:

  • Java: Consists of client libraries to work with various Azure APIs and tools for the Eclipse development environment. These tools include a Project Creation Wizard and project templates, utility scripts, and an Ant-based builder.
  • Node.js: Consists of client libraries and PowerShell tools. These are also available from github at http://bit.ly/gitazurenodejs.
  • PHP: Consists of client libraries, command line tools, and scaffolding templates.

If you want to develop in another language, you can still download the Windows Azure SDK to develop applications with, but you need to do everything that the client libraries provide yourself. In the next section you learn more about what is in the Windows Azure SDK, and why you need it.

Windows Azure SDK

The Windows Azure SDK is the only requirement when you want to make applications for Windows Azure. Apart from documentation and samples, the SDK contains two sets of tools: the Windows Azure SDK Tools and the Windows Azure Tools for Visual Studio.

The documentation and samples that go with the SDK are all online and linked to from the SDK. The advantage of that is that the documentation and samples are up to date. The disadvantage, however, is that developing without an Internet connection is not a good idea.

Windows Azure SDK Tools

There are several important tools in the SDK. Most important are the Windows Azure Compute Emulator and the Windows Azure Storage Emulator, together also referred to as the Development Fabric or DevFabric. Without these, developing applications for Windows Azure would be almost impossible because the Azure environment has some unique characteristics your local computer does not. Most obvious, of course, is that Windows Azure theoretically scales out infinitely. The Compute Emulator is a virtualized environment capable of running multiple roles simultaneously and multiple instances within a single role. That said, it can become excruciatingly slow if you fire up too many roles and instances, so you should do so wisely. Only test with multiple instances if you intend to test whether your application runs well on multiple instances. You can deploy and start applications using the command-line tools in the SDK. When you use Visual Studio or some of the other development environments that support Windows Azure development, these command-line tools are run under the covers, so you are not crippled in any way if you decide just to use the command line. The Windows Azure SDK comes with the following command-line tools:

  • CSEncrypt: Encrypts a password for use with a Remote Desktop Connection to a running instance
  • CSPack: Builds and packages applications for deployment to Windows Azure or the DevFabric
  • CSRun: Runs a package on the compute emulator
  • CSUpload: Uploads VHD images for a VM Role and certificates to connect to an instance via a Remote Desktop Connection
  • DSInit: Initializes the local storage environment

NOTE  It is beyond the scope of this book to demonstrate how to use the command-line tools without a development environment, but at http://bit.ly/azurecmdline you can find a great blog post by Steve Marx of the Windows Azure team with details.

Of the preceding tools, DSInit is the only one you need to remember. You need to run it when the local storage environment has not been set up (correctly) or if the local storage environment is corrupt. If you want to use the command-line tools, you can go to Start ⇒ All Programs ⇒ Windows Azure SDK vX.X and open the Windows Azure SDK Command Prompt.

Windows Azure Tools for Visual Studio

The Windows Azure Tools for Visual Studio enable you to develop and deploy applications from within a single environment. After you install the tools, you have a template for a Windows Azure Project in the Cloud section that creates a solution with all the necessary content. The tools also come with several context menus applicable to Windows Azure projects in the Solution Explorer. With these, you can package or publish the application, add a new Role to the project, or go directly to the Management Portal. In addition there’s a context menu for existing projects that aids you in turning a project into a Windows Azure application.

A key aspect for most developers is that the Windows Azure Tools for Visual Studio enable you to debug applications by setting breakpoints in code, just as in any other C# or VB.NET application. This works fine in Visual Web Developer Express, so there is no reason why you shouldn’t use Visual Studio to develop .NET-based Windows Azure applications. Visual Studio can save you an enormous amount of time with just the debugging support.

Another benefit of the tools is the configuration editor that is installed with them. This prevents you from making mistakes in the XML file governing the Role configurations in a Windows Azure application.

Developing a Windows Azure Application

The best way to get a feel for the Windows Azure development environment and the Visual Studio tools supporting it is by creating a simple application. Assuming you already have some experience with developing .NET applications, the focus is on what is specific to Windows Azure.

Starting with Hello World

One of the advantages of Visual Studio is that several templates are available that provide you with a fully functional application. This means you can get up and running quickly with a working demo application. Follow these steps:

1. Run Visual Studio as Administrator.
2. Create a new project by clicking File ⇒ New Project.
3. In the left box in the New Project dialog, go to Installed Templates ⇒ Visual C# ⇒ Cloud.
4. There is only one project type: Windows Azure Project. Under Name, enter HelloAzure and click OK.
5. In the New Windows Azure Project dialog, add an ASP.NET Web Role to the Windows Azure solution.
6. Right-click the added Web Role, and click Rename.
7. Rename the Web Role to HelloWebRole and click OK.

After Visual Studio sets up the solution, the Solution Explorer should show a solution with two projects in it, as shown in Figure 8-9. The HelloWebRole project is just like a regular ASP.NET Web Application, except that it contains packages.config with the NuGet configuration and WebRole.cs implementing the RoleEntryPoint class discussed earlier. The HelloAzure project contains the configuration needed to run the ASP.NET application in HelloWebRole on Windows Azure (or on the DevFabric).

You can now build and run the application by pressing F5 (with debugging) or Ctrl+F5 (without debugging). This results in the Compute Emulator starting and HelloWebRole deploying to the DevFabric. The browser opens automatically to show the home page of the web application running at http://127.0.0.1:81/.

You can change the web application you created just like any other ASP.NET application, although the configured Membership, Role, and Profile providers all point at a local SQL Server Express database. This can definitely cause problems when you deploy to Windows Azure. If you need these providers, the following are three ways to solve this:

  • Use SQL Azure with the existing providers: To do this, you need to run the scripts that create the needed tables in SQL Azure. The scripts installed with the .NET Framework do not work, but you can find correct scripts and instructions at http://bit.ly/aspnetsqlazurescript. When you finish this, you need to change the connection string to point to SQL Azure and configure SQL Azure to allow the application to connect. In Chapter 11 you learn more about using SQL Azure.
  • Use providers that store the data in table storage: You can download code for providers using table storage from http://bit.ly/tablestorageproviders.
  • Use Windows Identity Foundation instead of the providers: You learn about this in Chapter 9.

There’s a similar problem with session state, which is configured to use the DefaultSessionStateProvider; this provider stores session state in process. As soon as you run your application on multiple instances, this can cause problems because the session state isn’t shared between instances. Again, if the application requires session state, you must store it in some shared source. This could be SQL Azure, but officially that isn’t supported by Microsoft. The best solution here is to use table storage for session state. You can find code for that as well at http://bit.ly/tablestorageproviders.

Using Azure Table Storage

The issues described in the previous paragraphs amply demonstrate the problems you can face because of the load balancing and statelessness of the Windows Azure platform. The solution in almost all cases is the same: use Azure Table Storage. It provides persistent storage available across instances with a single, shared configuration. Also, using affinity groups you can ensure that the data is stored close to your instances to benefit from optimal performance.

Azure Table Storage is a data store for structured data. Because of its name, most people are quick to associate Azure Table Storage with relational databases. But although the name suggests otherwise, Azure Table Storage is different from relational databases. If you need a relational database, you should use SQL Azure (discussed in Chapter 11). However, in many cases another data storage mechanism can work just as well.

Understanding Azure Tables

The data model of Azure Table Storage is a fairly simple hierarchy. At the top level is the storage account. A storage account can contain an unlimited number of tables, and there is also no limit to the size of a table. Although it is called a table, it is not the same as a table in a relational database. A table is a container for entities. An entity is a collection of typed name-value pairs, referred to as properties. Because a table has no fixed schema, two different entities in a table can have different properties. Also, properties are typed per entity. This means that you can have the same property name in another entity, but with a different data type. Table 8-1 shows all the supported data types.

Table 8-1: Supported Data Types for Azure Table Storage

Data Type   Description
Binary      Array of bytes up to 64 KB in size
Bool        Boolean value
DateTime    UTC time value in the range 1/1/1601 to 12/31/9999 (64 bit)
Double      64-bit floating point number
GUID        Globally Unique Identifier (128-bit)
Int         32-bit integer
Int64       64-bit integer
String      UTF-16 string of up to 64 KB in size

An entity can be no larger than 1 MB. Beyond 1 MB, you should consider using Blob Storage instead (see Chapter 10). An entity can also have at most 255 properties, including some mandatory properties. These properties are as follows:

  • PartitionKey: A string of at most 1 KB, identifying the partition the entity belongs in
  • RowKey: A string of at most 1 KB, a unique identifier of an entity within a partition
  • Timestamp: A read-only value maintained by the system for versioning

The PartitionKey and RowKey together uniquely identify an entity within a table. What you choose as a PartitionKey is important. Entities with the same PartitionKey are stored close together, and hence have good query performance when queried together. Entities with a different PartitionKey are stored separately, so they don’t benefit from that query performance. However, entities with a different PartitionKey can be handled by different servers, which can increase scalability. This is a trade-off that requires you to carefully determine how you want to partition your data, provided you have a large enough data set that partitioning makes sense.

The Azure Tables Storage REST API

Azure Table Storage uses a REST API. This means that you can access and manipulate the data store using HTTP requests. The URL determines what data structure you work with, and the HTTP verb (GET, POST, and so on) determines the operation you perform. Because Azure Table Storage uses a REST API, any platform that supports the HTTP protocol and understands XML can talk to Azure Table Storage. You don’t need a special library to work with Azure Table Storage. In .NET applications you can use ADO.NET Data Services, which abstracts the REST operations away behind Language Integrated Query (LINQ).

Before diving into using ADO.NET Data Services with Azure Table Storage, it is insightful to look at the REST API because that gives you a better understanding of what works (well) and what doesn’t when you work with Azure Table Storage. The API consists of two sets of operations: Table operations and Entity operations. With the Table operations, as shown in Table 8-2, you can manage the tables in your storage account. With the Entity operations in Table 8-3, you can manipulate data in the tables.

Table 8-2: REST API Table Operations

HTTP Verb   Operation
GET         Lists the tables in the storage account, or a subset if a filter is specified
POST        Creates a new table in the storage account
DELETE      Deletes a table in the storage account

Table 8-3: REST API Entity Operations

HTTP Verb   Operation
GET         Returns all the entities in the specified table, or a subset if a filter was specified
PUT         Updates the given entity by replacing the entire entity
MERGE       Updates values of an entity
POST        Inserts a new entity into the specified table
DELETE      Deletes the specified entity from the table

The Table operations are fairly simple. Creating a table is easy because there is no fixed schema. That means you have to create only the table. After you do that you can use the Entity operations to manipulate data. A simple example is getting a single record by querying on PartitionKey and RowKey, as shown in the following URL:

http://yourstorage.table.core.windows.net/Cars(PartitionKey='BMW',RowKey='320i')

The path part of the URL (after the last slash) can’t exceed 260 characters, which goes for any operation. You can get around this limitation by using a so-called Entity Group Transaction, which is not listed in Table 8-3. This operation also uses POST, but it can carry multiple operations of different types. With an Entity Group Transaction, you can save multiple entities in a single table, all with the same partition key, within a single transaction. You can also use it to query entities in a table.
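
With the .NET storage client introduced later in this chapter, an Entity Group Transaction comes down to adding several entities that share a PartitionKey and saving them as a single batch. The following sketch assumes the HelloEntity class from Listing 8-1 and a TableServiceContext like the one built in Listing 8-2; the BatchSample class itself is hypothetical.

using System;
using System.Data.Services.Client;
using Microsoft.WindowsAzure.StorageClient;

namespace HelloWebRole
{
    public static class BatchSample
    {
        // Saves several entities with the same PartitionKey in one Entity Group Transaction.
        public static void AddBatch(TableServiceContext context, string tableName)
        {
            string partition = DateTime.Today.ToString("yyyyMMdd");
            for (int i = 0; i < 3; i++)
            {
                context.AddObject(tableName, new HelloEntity
                {
                    PartitionKey = partition,
                    RowKey = Guid.NewGuid().ToString(),
                    Name = "Batch entry " + i,
                    Message = "Saved together",
                    PostDate = DateTime.Now
                });
            }

            // SaveChangesOptions.Batch sends all pending changes as one Entity Group
            // Transaction; the entities are stored or rejected as a single unit.
            context.SaveChanges(SaveChangesOptions.Batch);
        }
    }
}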

As you can see from the URL, you always operate within the context of a table. As a consequence, you can’t combine data from multiple tables as you would with a SQL join-statement, for instance. To do something like that, you must query both tables and process the data in your application. Because tables don’t have a fixed schema, you can also solve this by adding entities with different properties to a table and using the partition key to relate them, so you only have to do one query. That still leaves you with some processing in the application, but chances are that retrieving the data takes more time than processing it. Whichever way you solve this problem, the key takeaway is that Azure Table Storage uses a different paradigm from relational databases. You must adapt your data access strategy and data model to work well with this paradigm.
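
As an illustration of the query-both-and-combine approach, the following sketch joins two result sets in memory with LINQ to Objects, using the .NET storage client discussed in the next section. The Cars and Owners tables and their entity classes are assumptions made for this example; they are not part of the chapter’s sample.

using System.Linq;
using Microsoft.WindowsAzure.StorageClient;

namespace HelloWebRole
{
    // Hypothetical entities; PartitionKey, RowKey, and Timestamp are inherited.
    public class CarEntity : TableServiceEntity
    {
        public string Model { get; set; }
        public string OwnerId { get; set; }
    }

    public class OwnerEntity : TableServiceEntity
    {
        public string Name { get; set; }
    }

    public static class JoinSample
    {
        // Table storage has no joins, so fetch both sets and combine them in memory.
        // In a real application you would filter both queries to limit the data transferred.
        public static void ListCarsWithOwners(TableServiceContext context)
        {
            var cars = context.CreateQuery<CarEntity>("Cars").AsTableServiceQuery().ToList();
            var owners = context.CreateQuery<OwnerEntity>("Owners").AsTableServiceQuery().ToList();

            var joined = from car in cars
                         join owner in owners on car.OwnerId equals owner.RowKey
                         select new { car.Model, Owner = owner.Name };

            foreach (var item in joined)
            {
                System.Diagnostics.Trace.WriteLine(item.Model + " is owned by " + item.Owner);
            }
        }
    }
}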

Working with Azure Storage Tables

When you want to use Azure Storage Tables in .NET applications, the REST API just discussed is abstracted away. Instead, you can use the classes in the Microsoft.WindowsAzure.StorageClient namespace to create objects that hook into LINQ. This makes using tables a lot easier because it feels similar to technologies such as LINQ-to-SQL and LINQ-to-Entities, although the capabilities of Azure Storage Tables are much more limited.

The first thing you need to do is create entities to work with. An entity is a class that inherits from the TableServiceEntity class, as shown in Listing 8-1. As you can see in Listing 8-1, an entity class is nothing more than a data container. Because it inherits from the TableServiceEntity class, it already has properties for the PartitionKey, RowKey, and Timestamp.

Listing 8-1: HelloEntity

using System;
using Microsoft.WindowsAzure.StorageClient;

namespace HelloWebRole
{
    public class HelloEntity : TableServiceEntity
    {
        public HelloEntity()
        {
        }

        public string Name { get; set; }
        public string Message { get; set; }
        public DateTime PostDate { get; set; }
    }
}

To work with Azure Table Storage, you need a TableServiceContext object. The TableServiceContext object is basically an in-memory cache between the application and the storage table. You can add objects to the context, query for objects, and manipulate these, and save the changes when you finish. You can get a TableServiceContext object from a CloudTableClient object, and to get it you need to have a CloudStorageAccount object that works against the table storage you want to use. The constructor in Listing 8-2 goes through the motions to create the needed objects to get a TableServiceContext object.

Listing 8-2: HelloTableManager

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace HelloWebRole
{
    public class HelloTableManager
    {
        public const string HelloTableName = "HelloTable";

        TableServiceContext _context;

        public HelloTableManager() : this(RoleEnvironment.GetConfigurationSettingValue(
            "StorageConnectionString"))
        {
        }

        public HelloTableManager(string connectionString)
        {
            var account = CloudStorageAccount.Parse(connectionString);
            var tableClient = account.CreateCloudTableClient();
            tableClient.CreateTableIfNotExist(HelloTableName);
            _context = tableClient.GetDataServiceContext();
        }

        public void AddHello(string name, string message)
        {
            _context.AddObject(HelloTableName, new HelloEntity()
            {
                Name = name,
                Message = message,
                PostDate = DateTime.Now,
                PartitionKey = DateTime.Today.ToString("yyyyMMdd"),
                RowKey = Guid.NewGuid().ToString()
            });
            _context.SaveChanges();
        }

        public List<HelloEntity> GetMessagesByDay(DateTime day)
        {
            // The partition key is the message date in yyyyMMdd format (see AddHello).
            var query = _context.CreateQuery<HelloEntity>(HelloTableName)
                                .Where(c => c.PartitionKey == day.ToString("yyyyMMdd"));
            var list = query.AsTableServiceQuery().ToList();
            return list.OrderByDescending(e => e.PostDate).ToList();
        }
    }
}

The AddHello method in Listing 8-2 just adds a HelloEntity object to the table, by adding it to the context and then having the context save the changes. To update an entity, you would get the entity from table storage first, manipulate it, mark it as changed by calling UpdateObject on the context, and then call SaveChanges.
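
A minimal sketch of that update pattern, written as a method you could add to the HelloTableManager class in Listing 8-2 (the method is hypothetical and not part of the book’s code download):

// Retrieves a single entity by its keys, changes it, and saves the change.
public void UpdateMessage(string partitionKey, string rowKey, string newMessage)
{
    var entity = _context.CreateQuery<HelloEntity>(HelloTableName)
                         .Where(e => e.PartitionKey == partitionKey && e.RowKey == rowKey)
                         .AsTableServiceQuery()
                         .FirstOrDefault();
    if (entity == null)
    {
        return; // Nothing to update.
    }

    entity.Message = newMessage;
    _context.UpdateObject(entity); // Mark the entity as modified in the context.
    _context.SaveChanges();        // Send the update to table storage.
}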

The GetMessagesByDay method demonstrates how to get entities from the table. It first creates a query object for the table. Used as is, that would be an unfiltered query yielding all entities in the table, which could cause a huge data transfer if the table contains many entities. Fortunately, Azure Storage Tables never returns more than 1,000 entities per request. If more entities satisfy the query, a continuation token is provided, so you can request the next set of up to 1,000 entities. In the GetMessagesByDay method, a where clause filters the query by day, using the PartitionKey. Filtering on the PartitionKey that way forgoes the need to filter the queried data afterward.

The call to AsTableServiceQuery converts the DataServiceQuery to a CloudTableQuery. At that point the query is not yet executed, so you can still set properties on the query, such as the RetryPolicy. The ToList method triggers the query and gets the data. You can post-process the results if needed, such as the sorting done in the GetMessagesByDay method.
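The following variation on GetMessagesByDay sketches both ideas; it assumes the same _context field and HelloTableName constant from Listing 8-2, and the method name is only illustrative. It sets a retry policy before execution and returns the results through Execute, which follows continuation tokens for you while you enumerate.

// Hypothetical variation on GetMessagesByDay: set a retry policy and let
// CloudTableQuery.Execute() follow continuation tokens during enumeration.
public IEnumerable<HelloEntity> GetMessagesByDayWithRetry(DateTime day)
{
    var query = _context.CreateQuery<HelloEntity>(HelloTableName)
                        .Where(c => c.PartitionKey == day.ToString("yyyyMMdd"))
                        .AsTableServiceQuery();

    // Retry up to three times with a one-second delay between attempts.
    query.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));

    // Execute() lazily pages through result sets larger than 1,000 entities.
    return query.Execute();
}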

The code in Listing 8-2 is enough to implement a guestbook with Azure Table Storage. Listing 8-3 demonstrates using the HelloTableManager through an ObjectDataSource, combined with a FormView control for adding new messages and a DataList control for listing them.


Listing 8-3: ASP.NET Azure Table Storage Guestbook

<asp:ObjectDataSource ID="ObjectDataSource1" runat="server"
                      InsertMethod="AddHello"
                      SelectMethod="GetMessagesByDay"
                      TypeName="HelloWebRole.HelloTableManager"
                      OnSelecting="ObjectDataSource1_Selecting">
    <InsertParameters>
        <asp:Parameter Name="name" Type="String" />
        <asp:Parameter Name="message" Type="String" />
    </InsertParameters>
    <SelectParameters>
        <asp:Parameter Name="day" Type="DateTime" />
    </SelectParameters>
</asp:ObjectDataSource>

<asp:FormView ID="FormView1" runat="server" DefaultMode="Insert"
              DataSourceID="ObjectDataSource1">
    <InsertItemTemplate>
        Name:
        <asp:TextBox ID="NameTextBox" runat="server"
                     Text='<%# Bind("Name") %>' />
        <br />
        Message:
        <asp:TextBox ID="MessageTextBox" runat="server"
                     Text='<%# Bind("Message") %>' />
        <br />
        <asp:LinkButton ID="InsertButton" runat="server" Text="Insert"
                        CommandName="Insert" CausesValidation="True" />
    </InsertItemTemplate>
</asp:FormView>

<asp:DataList ID="DataList1" runat="server"
              DataSourceID="ObjectDataSource1">
    <ItemTemplate>
        Name:
        <asp:Label ID="NameLabel" runat="server"
                   Text='<%# Eval("Name") %>' />
        <br />
        Message:
       <asp:Label ID="MessageLabel" runat="server"
                  Text='<%# Eval("Message") %>' />
       <br />
       PostDate:
       <asp:Label ID="PostDateLabel" runat="server"
                  Text='<%# Eval("PostDate") %>' />
            <hr />
    </ItemTemplate>
</asp:DataList>

To get a working guestbook based on the code shown in the previous listings, you need to add the code to your project and configure table storage, as follows:

1. Add HelloEntity to the HelloWebRole project.
2. Add HelloTableManager to the HelloWebRole project.
3. Replace the main content in Default.aspx of the HelloWebRole project with Listing 8-3.
4. Add the following code to Default.aspx.cs:
protected void ObjectDataSource1_Selecting(object sender,
    ObjectDataSourceSelectingEventArgs e)
{
    e.InputParameters["day"] = DateTime.Today;
}

code snippet 01_ObjectDataSource1_Selecting.txt

5. In the Roles folder of the HelloAzure project, double-click the HelloWebRole item so that the configuration is shown.
6. In the configuration navigate to the Settings tab.
7. Click Add Setting.
8. Name the new setting StorageConnectionString.
9. Choose Connection String as type.
10. In the Value column, type UseDevelopmentStorage=true, so all entities are saved in the local development store instead of in an actual storage account. Alternatively, click the Ellipsis button in the Value column and click OK to accept the default, Use the Windows Azure Storage Emulator.
11. Save all files and run the project.

Configuring Your Application

The configuration of your application consists of two parts:

  • Service Definition: You can think of this as the configuration of your overall infrastructure. It contains information about the roles in your application, the endpoints at which these are available, and the IIS configuration for virtual directories.
  • Service Configuration: By contrast, this contains a configuration more specific to the roles within the application, such as the number of instances and application settings.

Service Definition

The Service Definition is an XML file that you can edit with any text editor. You can also configure portions of it through the Visual Studio dialogs that come with the Windows Azure SDK, but those don’t cover everything. You can find the entire schema for the service definition at http://bit.ly/servicedefinition.

At the highest level the service definition consists of the following sections:

  • WebRole: Settings for the Web Roles in the application.
  • WorkerRole: Settings for the Worker Roles in the application.
  • VirtualMachineRole: Settings for the VM Roles in the application.
  • NetworkTrafficRules: Rules to determine which roles have access to which internal endpoints on other roles. With these routing rules you can tighten the security of the infrastructure.

Each section is optional, but the service definition needs at least one of the first three role sections to be valid. The first three are also similar. One of the most important settings on the role element is the vmsize attribute, which determines the size of the instances used for the role in terms of CPU power and memory. Table 8-4 describes the sections you can find within each of the role elements.

Table 8-4: Service Definition Role Configuration

  • Certificates: The definition of the certificates available in the role. These can be used for SSL, encryption, and signing. The actual certificate references are stored in the service configuration. You learn more about using certificates in Chapter 9.
  • ConfigurationSettings: The definition of configuration settings available to a role. The values of these settings are stored in the service configuration.
  • Endpoints: The endpoints through which the role can be accessed. Input Endpoints are available to clients; Internal Endpoints are available only to other roles.
  • Imports: The Windows Azure modules made available to the role.
  • LocalResources: Defines folders on disk that can be used to store temporary data; see the sketch after this table.
  • Runtime: Settings for the Windows Azure runtime environment. Does not apply to the VM Role.
  • Sites: The collection of websites and web applications hosted in a Web Role. This enables you to host multiple sites and applications in a single role. Only applies to the Web Role.
  • Startup: The tasks that need to be run when a role starts. Does not apply to the VM Role.
  • Contents: Defines locations for content in the role and external locations to copy the content from. Does not apply to the VM Role.
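To see how a role consumes such a local resource at run time, here is a brief sketch. It assumes a local storage resource named TempFiles has been defined for the role; the resource name and file name are illustrative. RoleEnvironment.GetLocalResource returns the folder Windows Azure reserved for that resource.

using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class TempFileHelper
{
    // Writes scratch data to the local resource defined in the service definition.
    // "TempFiles" is an assumed resource name; it must match your LocalStorage entry.
    public static string WriteScratchFile(string contents)
    {
        LocalResource resource = RoleEnvironment.GetLocalResource("TempFiles");
        string path = Path.Combine(resource.RootPath, "scratch.txt");
        File.WriteAllText(path, contents);
        return path;
    }
}

Keep in mind that local resources are scratch space only; anything that must survive an instance failure belongs in Azure Storage.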

Service Configuration

Like the Service Definition, the Service Configuration is an XML file. Its schema is much less elaborate than that of the Service Definition. You can find the entire schema at http://bit.ly/serviceconfiguration. At the top level the Service Configuration defines attributes for the family and version of the operating system running a role. The family can be either Windows Server 2008 SP2 or Windows Server 2008 R2; the default is Windows Server 2008 SP2. The version relates to the Windows Azure Guest OS, which is based on the chosen OS family. By default, the latest version is used, and instances are automatically upgraded when a new version is released. You can, however, pin a specific version if you run into compatibility issues.

The Service Configuration contains a role configuration for each role in the Service Definition. This configuration can contain the following:

  • The number of instances used to run the role
  • The values for the configuration settings defined in the service definition
  • The certificate references corresponding to the certificate definitions in the service definition

For a VM Role the configuration can also contain a reference to the VM Role image.

The configuration settings in the service configuration are comparable to the application settings in app.config or web.config. If you currently have values in there that are subject to change, consider moving them to the service configuration. You can then read them with the following line of code:

string setting = RoleEnvironment.GetConfigurationSettingValue("mySetting");
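If the same code must also run outside Windows Azure, for example in unit tests, a small helper like the following sketch can fall back to the appSettings in web.config. The class name and the setting name mySetting are just examples.

using System.Configuration;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class SettingReader
{
    // Reads a setting from the service configuration when running under Windows Azure
    // (or the compute emulator), and from appSettings otherwise.
    public static string GetSetting(string name)
    {
        if (RoleEnvironment.IsAvailable)
        {
            return RoleEnvironment.GetConfigurationSettingValue(name);
        }
        return ConfigurationManager.AppSettings[name];
    }
}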

Using the Configuration Dialogs

A full discussion of everything in the service definition and the service configuration is beyond the scope of this book. The main settings you need to know about are configured through the Visual Studio dialogs installed with the Windows Azure SDK. These dialogs actually operate on both the service definition and the service configuration simultaneously. In Figure 8-10, for instance, you can see settings for the number of instances and the size of those instances. Although edited together, the size is stored in the service definition, whereas the number of instances is part of the service configuration. You can open the Visual Studio configuration dialogs by clicking the role you want to configure in the Roles folder of the cloud project. Figure 8-10 shows this for the HelloAzure project you created earlier.

A few more tabs exist (refer to Figure 8-10) through which you can configure a role. The Settings tab corresponds to the ConfigurationSettings sections in the service definition and service configuration. The other tabs correspond to the sections with the same names, except Virtual Network. The latter is used for Windows Azure Connect, which is discussed in detail in Chapter 15.

In the configuration dialogs you can differentiate between the local environment and the cloud (production) environment. If you do nothing, all settings are used in both environments. But at the top of the dialog, you can change the settings for a specific environment by selecting it from the Service Configuration drop-down list. In the Azure project you can see service configurations corresponding to the configurations you have defined. By default these are Local and Cloud.

Running Multiple Instances

As discussed earlier, Microsoft gives uptime guarantees if you run a role on at least two instances. This is easy to configure (refer to Figure 8-10). Just increase the instance count and redeploy. In your local development environment, press Ctrl+F5 again, and the application redeploys with the new instance count. You can see it running on multiple instances with the Compute Emulator UI. You can find a Windows Azure icon in the notification area of the taskbar. When you right-click the icon, you can select Show Compute Emulator UI to show the console. Figure 8-11 shows the Compute Emulator UI with two running instances for the HelloWebRole.

Setting Up Endpoints

When you run a role in the local development environment, the Compute Emulator automatically assigns it a port number so that it doesn’t conflict with the default website already running on port 80. If you run a website only on one role, this isn’t a problem. In a more elaborate environment, you may want to have full control over the port numbers, so you can statically configure the roles to communicate with one another. You also want to have full control when you use nonstandard port numbers to communicate between roles in the production environment. One reason to do this would be security because you can restrict access by port number.

If you go to the Endpoints tab, you can see the active endpoints. By default this is a single endpoint that runs on port 80 in the production environment but gets reassigned by the compute emulator. If you want no surprises in the compute emulator, you can pick another port, such as port 8080. You can set both a public port and a private port. The latter is optional, and if you set it, traffic on the public port is rerouted to the private port. The private port is also where other roles access the role. Endpoints that should not be publicly available must be marked as Internal with the drop-down in the Type column.

For a secure connection with SSL, you must select a certificate. Before you can do this, you need to add the certificate under the Certificates tab. You learn how to do this in Chapter 9. To enable a secure connection, you must change the protocol from http to https. TCP is also an accepted protocol that can be used for other forms of communication.
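When roles communicate with each other over internal endpoints, the Services Runtime tells you at run time which IP address and port an endpoint resolved to. The following sketch assumes an internal endpoint named InternalHttp has been defined on the Endpoints tab; the endpoint name is illustrative.

using System.Net;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class EndpointHelper
{
    // Resolves the address of an internal endpoint on the current instance.
    // "InternalHttp" is an assumed endpoint name from the Endpoints tab.
    public static IPEndPoint GetInternalEndpoint()
    {
        RoleInstanceEndpoint endpoint =
            RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["InternalHttp"];
        return endpoint.IPEndpoint;
    }
}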

Deploying Your Applications

You can deploy an application in two ways. You can deploy it directly from Visual Studio, or you can package it and then upload and deploy it through the Management Portal. Both options are discussed next.

Packaging and Uploading

Deployment through packaging and uploading is the most likely scenario if the production environment is managed by a system administrator. This is common in larger organizations, where application development and deployment must follow a predefined process with different responsibilities assigned to different roles within the organization.

To deploy using a package, take the following steps:

1. In Visual Studio, right-click the Windows Azure project in the Solution Explorer and select Package.
2. In the dialog presented select the cloud service configuration, and set the build type; then click Package.
3. When the application is packaged, Windows Explorer opens and shows the package folder.
4. Open a browser and sign in to the Management Portal.
5. Go to the Hosted Services, Storage Accounts & CDN section.
6. On the taskbar click New Hosted Service.
7. Enter the details of the hosted service to create, as shown in Figure 8-12. Under Deployment options, the default is Deploy to Stage Environment, which has been changed in Figure 8-12.
8. Browse for the package and configuration in the location opened in Step 3.
9. Click OK. This starts the deployment process, which can take a while.

After you deploy your application, it is available through the URL for which you entered the prefix. In Figure 8-12 this would be http://PMC-HelloAzure.cloudapp.net/. If you deploy to staging, your application gets an automatically assigned GUID as the URL prefix, so it is available under a URL such as http://4ab5ac2001324585ba5a902f4242a98c.cloudapp.net/. This URL can change any time you deploy to staging.

Deploying from Visual Studio

In smaller organizations and in test environments, deploying an application directly from Visual Studio is a good option. It’s the easiest way to deploy, so if you don’t need a strict separation of roles between development and operations, it’s the best option.

To deploy directly from Visual Studio, you first need to ensure that Visual Studio is allowed to deploy. This is managed with a certificate, which is safer than a username and password. On your local machine you need the certificate with both the public and private key. In Windows Azure you need the certificate with only the public key. You can create a certificate the first time you publish, using the following steps.

1. In Visual Studio right-click the Windows Azure project in the Solution Explorer, and select Publish.
2. Click the Credentials drop-down list, and select <Add…>, opening a new dialog.
3. Click the drop-down list under 1, and select <Create…>.
4. Enter a name for the certificate, and click OK. Use a name that helps you remember that it’s a publishing certificate for a particular subscription.
5. Follow the instructions in the dialog under 2 and 3 to tie the certificate to the subscription. You can manage the certificates under Management Certificates in the Hosted Services, Storage Accounts & CDN section.
6. Give the credentials a meaningful name, and click OK.
7. If you have not done so yet, you are prompted to go to the Management Portal and create a Hosted Service and a Storage Account. Creating the hosted service works the same as when you upload a package (refer to Figure 8-12), except that you need to indicate that you don’t want to deploy yet.
8. Indicate whether you want to deploy to staging or production, and the storage account to use for publication.
9. As with packaging, indicate the environment and the build configuration.
10. In the Publish label textbox, enter the name of the deployment in the Management Portal.
11. Click Publish to start the publication process, which will take a while.

When you publish the next time, all values are prefilled and Visual Studio detects that there is an existing deployment of the same project. If you confirm, Visual Studio removes the existing deployment and deploys the new version.

Deploying from Staging to Production

If you’ve done a deployment to Windows Azure staging, you can deploy to production easily using a virtual IP (VIP) swap, which switches the IP addresses of the staging and production environments. You can see this option on the Management Portal taskbar if you have a staging deployment and a production deployment for the same hosted service.

Handling Changes

As with all software, it is likely that over time you will make configuration changes and minor changes to the software. If you have a new build, you can do a new deployment using one of the techniques previously described. To do this with the least disruption, you can deploy in staging and then do a virtual IP-swap, so the redeployment of instances doesn’t disrupt service. Alternatively, you can do an in-place upgrade by clicking the Upgrade icon on the Management Portal taskbar. This enables you to upload a new package and configuration.

If you just have to change the configuration, you have two options: upload a new service configuration or edit the existing configuration in the portal. The latter is only a good option if all you need is to increase or decrease the number of instances. Beyond that, tinkering with the configuration in the portal is not a good idea. Make changes to configuration settings used by the application in a file first, so you can easily review them and, if needed, test them before uploading.

Programmatically Controlling Your Service

By now, you may realize that Windows Azure is more than just a hosting environment. It’s a dynamic environment in which you can make runtime changes to adapt to the needs of users. The Management Portal gives you the facilities to modify the configuration, but you can also do this programmatically.

Using the Services Runtime

The Services Runtime is the set of classes available to your application to retrieve information about the role and instance configuration. Through the Services Runtime you can also request a restart of an instance, in case something has gone wrong with the instance. You’ve already seen the RoleEnvironment class when you learned about the service configuration. However, the RoleEnvironment class is much more than your gateway to the service configuration. It is the starting point for all information about the roles and instances in your application. It contains several static properties:

  • CurrentRoleInstance: The instance in which the code runs.
  • DeploymentId: The unique identifier of the deployment. For the staging environment this corresponds with the URL at which the application is made available.
  • IsAvailable: Indicates whether the instance runs in Windows Azure.
  • IsEmulated: Indicates whether the instance runs in the compute emulator.
  • Roles: The roles in the deployment.
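As a quick illustration of these properties, the following sketch writes some of them to the trace log. It is only a diagnostic snippet and not something the HelloAzure sample requires.

using System.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class EnvironmentInfo
{
    // Dumps basic role environment information to the trace log.
    public static void TraceEnvironment()
    {
        if (!RoleEnvironment.IsAvailable)
        {
            Trace.WriteLine("Not running under Windows Azure or the compute emulator.");
            return;
        }

        Trace.WriteLine("Deployment: " + RoleEnvironment.DeploymentId);
        Trace.WriteLine("Emulated: " + RoleEnvironment.IsEmulated);
        Trace.WriteLine("Current instance: " + RoleEnvironment.CurrentRoleInstance.Id);

        foreach (var role in RoleEnvironment.Roles.Values)
        {
            Trace.WriteLine(role.Name + " runs on " + role.Instances.Count + " instance(s)");
        }
    }
}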

The information you can get through these properties is mostly static, in the sense that you can’t change it through this API. However, you can change it from the Management Portal and with the Service Management API discussed next. When that happens, you can use the RoleEnvironmentChanging and RoleEnvironmentChanged events to take action based on those changes. Most likely, you want to update configuration settings you’ve read using the GetConfigurationSettingValue method. The other method that’s important is RequestRecycle, with which you can restart an instance. This is handy when an unrecoverable exception occurs within your application. You can find the complete Services Runtime namespace reference at http://bit.ly/serviceruntime.
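A minimal sketch of wiring this up in the WebRole class that the Visual Studio template adds to a Web Role project is shown next. It assumes you simply want configuration changes applied by restarting the instance; that policy is a choice, not something the SDK mandates.

using System.Linq;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Fired before a configuration change is applied to this instance.
        RoleEnvironment.Changing += (sender, e) =>
        {
            // Setting Cancel to true recycles the instance so it starts
            // with the new configuration instead of applying it in place.
            if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
            {
                e.Cancel = true;
            }
        };

        return base.OnStart();
    }
}

If the application itself later detects an unrecoverable state, a call to RoleEnvironment.RequestRecycle() restarts the instance in the same way.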

Understanding the Service Management API

The Service Management API is a REST API that enables you to do most of what you can do with the Management Portal. You’ve already had a taste of this when you learned about the REST API for Storage Tables, but you can do more. At the basic level you can create, update, and delete most entities in Windows Azure, such as affinity groups, storage accounts, and certificates. The API for hosted services is more elaborate and enables you to perform operations such as deploy, reboot role instances, and change a role configuration. The latter is particularly interesting considering that you can change the number of instances that way. This means that if you can detect the load of your application, you can increase or decrease the number of instances automatically.
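To give an idea of what calling this API looks like, the following sketch lists the hosted services in a subscription. The subscription ID and certificate thumbprint are placeholders, and the x-ms-version value is one of the API versions in use at the time of writing, so check the current documentation before relying on it.

using System;
using System.IO;
using System.Net;
using System.Security.Cryptography.X509Certificates;

public static class ServiceManagementSample
{
    // Lists hosted services via the Service Management REST API.
    // Replace the subscription ID and certificate thumbprint with your own values.
    public static string ListHostedServices(string subscriptionId, string certThumbprint)
    {
        // Look up the management certificate in the current user's certificate store.
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var certificate = store.Certificates
            .Find(X509FindType.FindByThumbprint, certThumbprint, false)[0];
        store.Close();

        var uri = string.Format(
            "https://management.core.windows.net/{0}/services/hostedservices",
            subscriptionId);

        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.ClientCertificates.Add(certificate);
        request.Headers.Add("x-ms-version", "2011-10-01"); // API version header

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd(); // XML describing the hosted services
        }
    }
}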

Summary

In this chapter you learned how to take the first steps in developing applications for Windows Azure. Assuming you already know how to develop .NET applications, you can now fire up Visual Studio and begin. That said, Visual Studio is by no means required. As long as you have the development environment and SDK for your language of choice, which can be Java or PHP, too, you’re good to go.

Applications, also known as Hosted Services, can consist of one or more roles, with each role running on one or more instances. To ensure availability, it’s advisable to run a role on at least two instances. Because instances can fail, you should not keep data that needs to be persisted on the instances themselves; that data belongs in Azure Storage, for example in Azure Storage Tables.

After you develop an application, you can test it in the local development fabric and then deploy it to Windows Azure. You can deploy using Visual Studio or through the Management Portal, which also enables you to make configuration changes afterward, such as increasing or decreasing the number of instances your application runs on. You can also make these changes programmatically through the Service Management API.
