Shaun Xu

The Sheep-Pen of the Shaun


Shaun, the author of this blog, is a semi-geek, clumsy developer, passionate speaker and incapable architect with about 10 years' experience in .NET and JavaScript. He hopes to prove that software development is art rather than manufacturing. He is into cloud computing platforms and technologies (Windows Azure, Amazon and Aliyun), and right now he is drawn to JavaScript (Angular.js and Node.js) and likes it.

Shaun works at Worktile Inc. as the chief architect, responsible for the overall design and development of worktile, a web-based collaboration and task management tool, and lesschat, a real-time communication aggregation tool.


Although the Azure application will be running in the sky, we still need to develop it on the ground, since we cannot fly. To make it easy to work with Azure on a local machine, the first thing we need is the Azure SDK. The Azure SDK extends Visual Studio 2008 and Visual Studio 2010 RC to enable the creation, configuration, building, debugging, running and packaging of scalable web applications and services on Windows Azure, and it includes all the assemblies we need to build an Azure application.

Once the SDK is installed, we can find a new category named Cloud Service in the New Project dialog of Visual Studio.

[Screenshot 1: the New Project dialog with the Cloud Service category]

Once we select the Windows Azure Cloud Service template, we can choose and add the roles we want. As I mentioned in the last post, a role = a project, so here let's say we want to add an ASP.NET web application. Choose the ASP.NET Web Role and rename it to Web_Role.

[Screenshot 2: adding an ASP.NET Web Role named Web_Role]

Now we have created a new Azure application with one web role. Let's take a quick look at the projects and files it created for us.

[Screenshot 3: the generated solution structure]

There are 2 projects. CloudService2 is an Azure project which only contains the configuration for deploying the roles to Windows Azure. Web_Role is very similar to a traditional web application project, except for a new file named WebRole.cs. This file contains the necessary hooks for events such as application start, configuration changes, etc.

Let's add some test content to the default page, then just press F5 to run it and see what happens.

[Screenshot 4: the default page with the test content]

After Visual Studio compiled the solution, the website appeared in the browser with the content I had modified, just as it would when developing a normal website locally. But notice the taskbar: a small (ugly) blue Windows logo has appeared. Its context menu contains some items like the image below.

[Screenshot 5: the Development Fabric icon and its context menu in the taskbar]

The Development Fabric is a simulator running on the local machine. When we run the Azure application locally, it is deployed on the Development Fabric and goes through exactly the same procedures as it would on Azure, so we can code and debug it without touching the remote Azure service at all. And if we click the Show Development Fabric UI item, we can see a console-like window with some logs.

[Screenshot 6: the Development Fabric UI]

Let's go to the default page and add some logging to the Page_Load method.

    protected void Page_Load(object sender, EventArgs e)
    {
        System.Diagnostics.Trace
            .TraceInformation("The Default.aspx page had been opened.");
    }

Then press F5, and we can see our log appear in the Development Fabric UI.

[Screenshot 7: the trace message shown in the Development Fabric UI]

Next let's work a little bit with the storage services that Windows Azure provides. As we know, Windows Azure provides 3 storage services for us: table storage, blob storage and queue storage. I don't want to dig too deeply into all of them here, just show an example of how to work with them locally rather than on the cloud. (I will explain the storage services in more detail in later posts.) Before starting to play with them, we need to set up the local storage simulation. The Azure SDK provides a simulation for us, but we need to make sure there's an instance of SQL Server 2005 (Express) or higher installed on our computer. Navigate to the installation folder of the Windows Azure SDK, normally C:\Program Files\Windows Azure SDK\v1.1, and execute \bin\devstore\DSInit.exe. It will create the tables for the simulation on your local SQL Server instance. (You need to add a login on the SQL Server database for your current user.)

If you have the full SQL Server installed (rather than the Express edition), you need to add the “/sqlInstance” parameter, as the DSInit tool uses .\SQLEXPRESS by default. If your SQL Server instance has no instance name, just use “.” (dot) as its name, so the full command would be “dsinit /sqlInstance .”.

[Screenshot 8: running DSInit against the local SQL Server instance]

If we go back to the database through SSMS, we can see the tables created by the simulation. While working with the storage services locally, our application will use these tables as the Windows Azure storage services. The Development Storage exposes them through the ADO.NET Data Services protocol, the same as on Azure, so we can use them exactly as we would on the cloud.

Now let's do something to test the storage. As I said, I don't want to dig too deeply here, so let's just create a very simple web application called 'Whiteboard' that allows anyone to write something and save it. We want to save the comments and display them on the default page, so we need to create a class for the data entity. Just create a new class named CommentEntity and paste in the code below.

    public class CommentEntity : TableServiceEntity
    {
        public string Content { get; set; }
        public string Author { get; set; }
        public DateTime CreateOn { get; set; }

        // The partition key is the current UTC date; the row key is built from the
        // reversed tick count and a GUID so that it is unique.
        public CommentEntity()
            : base(DateTime.UtcNow.ToString("yyyyMMdd"),
                   string.Format("{0:10}_{1}", DateTime.MaxValue.Ticks - DateTime.Now.Ticks, Guid.NewGuid()))
        {
        }
    }

All entities stored in table storage must have 2 public string properties: PartitionKey and RowKey. Together they uniquely identify an entity. The Azure SDK provides a base class for us named TableServiceEntity which we can use; it defines the PartitionKey and RowKey properties and a default constructor. That is why the CommentEntity we created derives from it.
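
For reference, the base class looks roughly like the sketch below. This is a simplified illustration of Microsoft.WindowsAzure.StorageClient.TableServiceEntity, not its exact source.

    // Simplified sketch of the SDK's base entity class (illustration only).
    public abstract class TableServiceEntity
    {
        public string PartitionKey { get; set; }
        public string RowKey { get; set; }
        public DateTime Timestamp { get; set; }   // maintained by the table storage service

        protected TableServiceEntity()
        {
        }

        protected TableServiceEntity(string partitionKey, string rowKey)
        {
            PartitionKey = partitionKey;
            RowKey = rowKey;
        }
    }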

Since the storage services are exposed through ADO.NET Data Services, we need to create the relevant class to connect to and consume them. We add another class named WhiteboardDataContext as the data context and data source.

    public class WhiteboardDataContext : TableServiceContext
    {
        private CloudStorageAccount _account;

        public IQueryable<CommentEntity> Comments
        {
            get
            {
                return CreateQuery<CommentEntity>("Comments");
            }
        }

        public WhiteboardDataContext(CloudStorageAccount account)
            : base(account.TableEndpoint.AbsoluteUri, account.Credentials)
        {
            _account = account;

            // Make sure the "Comments" table exists the first time the context is used.
            var tableStorage = new CloudTableClient(_account.TableEndpoint.AbsoluteUri, _account.Credentials);
            if (!tableStorage.DoesTableExist("Comments"))
            {
                CloudTableClient.CreateTablesFromModel(typeof(WhiteboardDataContext), _account.TableEndpoint.AbsoluteUri, _account.Credentials);
            }
        }

        public void AddCommentEntity(CommentEntity comment)
        {
            AddObject("Comments", comment);
            SaveChanges();
        }
    }

WhiteboardDataContext derives from TableServiceContext, which is included in the Azure SDK for connecting to table storage. All we did was add a property for retrieving the comments and a method for inserting a new comment into the storage. In the constructor, we first check whether the table for the comments has been created; if not, we create it from the context type using the credentials passed in through the parameter.
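
To see how this context might be consumed, here is a small sketch. It assumes the DataConnectionString setting we will configure later in this post and the configuration setting publisher registered in OnStart; the author name and content are made up.

    // Requires Microsoft.WindowsAzure, Microsoft.WindowsAzure.StorageClient and System.Linq.
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var context = new WhiteboardDataContext(account);

    // Insert a comment.
    context.AddCommentEntity(new CommentEntity { Author = "Shaun", Content = "Hello, whiteboard!" });

    // Read back today's partition only (the partition key is the current UTC date).
    var partition = DateTime.UtcNow.ToString("yyyyMMdd");
    var todaysComments = context.Comments
                                .Where(c => c.PartitionKey == partition)
                                .ToList();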

Next we go to the frontend file, Default.aspx, and add a textbox and a button for adding a new comment, plus a GridView for displaying all comments in the storage. In the code-behind, we create the methods below.

    public partial class _Default : System.Web.UI.Page
    {
        WhiteboardDataContext _context;

        protected void Page_Load(object sender, EventArgs e)
        {
            _context = new WhiteboardDataContext(CloudStorageAccount.FromConfigurationSetting("DataConnectionString"));

            if (!IsPostBack)
            {
                BindComments();
            }
        }

        protected void btnPostComment_Click(object sender, EventArgs e)
        {
            var comment = new CommentEntity()
            {
                Content = txtNewComment.Text,
                CreateOn = DateTime.Now
            };
            _context.AddCommentEntity(comment);

            BindComments();
        }

        private void BindComments()
        {
            var comments = _context.Comments.ToList();
            gvComments.DataSource = comments;
            gvComments.DataBind();
        }
    }

First we define a field for the data context we just created and initialize it when the page loads. Then we invoke the BindComments method to retrieve all comments and display them in the GridView; in that method we simply query the Comments property of the WhiteboardDataContext.
When adding a new comment, we create the comment entity from what the user typed in the textbox and use the AddCommentEntity method to insert it and save the changes.

Before pressing F5 we need to do a couple of things that are specific to Azure development. First, we need to define some configuration that tells our application which storage it should use. Double-click the Web_Role node under the CloudService2 project, navigate to the Settings panel and create another connection string named DataConnectionString. Then in the Value cell open the editor and select the first radio button, which indicates using the development storage locally.
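
Behind the scenes that option simply points the setting at the local development storage (the stored value is, if I remember correctly, "UseDevelopmentStorage=true"). As a side note, the same local account can also be referenced directly in code, as in the tiny sketch below; our sample doesn't need this, it's just for illustration.

    // The development storage account exposed by the StorageClient library;
    // it targets the same local table, blob and queue endpoints the setting points to.
    var devAccount = CloudStorageAccount.DevelopmentStorageAccount;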

The next one is to define the configuration setting publisher for the Azure application. This tells the application how to retrieve a configuration value from its name. Just modify the OnStart method in WebRole.cs as in the code below.

    public override bool OnStart()
    {
        DiagnosticMonitor.Start("DiagnosticsConnectionString");

        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        RoleEnvironment.Changing += RoleEnvironmentChanging;

        Microsoft.WindowsAzure.CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            configSetter(Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.GetConfigurationSettingValue(configName));
        });

        return base.OnStart();
    }

Now we can run the application and see how it goes. The first time, there's nothing in the comment table, so we get a blank textbox with a button. We can enter some sentences and add them to the system.

[Screenshot 12: the whiteboard page with a comment added]

Don't laugh at my UI skills, I know I'm an amateur. The key point is that the comments are saved in the table storage. If you open up the local SQL Server database, you can see some records inserted into the TableRow table.

[Screenshot 13: the comment records in the local TableRow table]

Notice that on the Azure platform the data is not stored in the same structure as in the local simulation. On the local machine the simulation just creates one table for the tables' schema and another for the data, and exposes them through ADO.NET Data Services via the Development Storage.

This is how we develop and debug an Azure application on the local machine without touching the cloud at all (I didn't show the debugging screenshots, but trust me, we can set a breakpoint and debug just as we do every day). But we do need to upload it to Azure once we think it's good enough to go live. In the next post I would like to deploy this application onto Azure, the real Azure.

 

Hope this helps.

Shaun

 

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


 

To me Azure is a new technology, but in fact it's not that new. At PDC08 Microsoft announced their cloud computing platform under the name Azure, which stands for the color of the sky. I began to play with Azure recently and will post a series of blogs about what it is and how to use it. But first of all, we'd better understand what Azure is.

If we go to the official website of Azure, we can find the official definition:

The Windows Azure platform offers a flexible, familiar environment for developers to create cloud applications and services. With Windows Azure, you can shorten your time to market and adapt as demand for your service grows.

It says Azure provides a flexible and familiar environment where we can build our applications and services. 'Flexible' means we can choose just the services Azure provides that we need. 'Familiar' means we don't need to learn much new knowledge to build our application on it: what we already know about web application and Windows service (or NT service) development can be used to build an Azure application. What we need to learn is how to deploy and configure it.
For now Azure provides 3 services: Windows Azure, SQL Azure and AppFabric. I would like to quote the simple definitions from the official website.

Windows Azure: operating system as a service

SQL Azure: fully relational database in the cloud

AppFabric: makes it simpler to connect cloud and on-premises applications

Here I will talk a bit about the first 2 of them, since I haven't had time to work with AppFabric yet.

 

Windows Azure

When I first heard about Windows Azure, I wondered what the difference was between it and a traditional web hosting provider. As we know, we can rent either a virtual dedicated server or web hosting, upload our website files and do some configuration through IIS or the web configuration panel offered by the provider, such as GoDaddy. Is there any difference between them and Windows Azure? I would like to explain it from 'the technology side' and 'the business side'.
What Windows Azure provides is not just a server or a virtual machine; it provides a platform built on thousands of servers running the Windows Azure OS (based on Windows Server 2008 R2, I guess) with virtualization technologies. When you deploy your website on Windows Azure, you cannot know which server your application is hosted on. That means you don't need to touch IIS, and you don't need to care about where your pages are physically located.

[Figure 1]

Windows Azure provides the hosting platform with the latest web technologies from Microsoft, such as ASP.NET 3.5 SP1, ASP.NET 4.0 Beta and ASP.NET MVC, etc. It also supports some other platforms such as PHP. So you rarely need to check whether your application can run on Azure or not.

Besides the web hosting, Windows Azure also provides storage services for simple usage: Table Storage, Blob Storage and Queue Storage. You can use them for lightweight data storage.
Table storage stores structured data: you define the relevant class for your data and save/retrieve instances with the table storage API. If you want to save binary data such as images, XML, etc., you can use blob storage. And if you need something like a Windows service (NT service), you can use queue storage as a bridge for communication between your website and your own service running in the background. All 3 storage services are exposed over HTTP (table storage through the ADO.NET Data Services protocol) and can be consumed by your web application and your background service.
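
As a small taste of the blob API, here is a sketch that writes a text blob to the development storage; the container and blob names are made up for illustration.

    // Assumes the Microsoft.WindowsAzure.StorageClient library from the SDK.
    var account = CloudStorageAccount.DevelopmentStorageAccount;
    var blobClient = account.CreateCloudBlobClient();

    var container = blobClient.GetContainerReference("images");   // hypothetical container name
    container.CreateIfNotExist();

    var blob = container.GetBlobReference("hello.txt");           // hypothetical blob name
    blob.UploadText("Hello from blob storage.");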

In Windows Azure an application is called a 'role'; you can treat 1 role = 1 application. A web application is a web role in the Azure world. You can create an ASP.NET Web Role, an ASP.NET MVC Web Role, a WCF Service Web Role, etc. You can also create a Worker Role, which is 'the background service' I mentioned before, something like an NT service. The worker role runs in the background on Azure, doing background processing such as image archiving, sending emails, etc. Since in Windows Azure you can't see the hard disk where your application is located, data should be stored in the 3 storage services, and background jobs should be handled by worker roles.
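
For example, a web role and a worker role might communicate through a queue roughly as in the sketch below; the queue name and message format are made up for illustration.

    // Web role side: drop a work item onto a queue.
    var account = CloudStorageAccount.DevelopmentStorageAccount;
    var queueClient = account.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference("tasks");           // hypothetical queue name
    queue.CreateIfNotExist();
    queue.AddMessage(new CloudQueueMessage("archive-image:42"));

    // Worker role side (inside its processing loop): pick the work item up and handle it.
    var message = queue.GetMessage();
    if (message != null)
    {
        // ... do the background work here ...
        queue.DeleteMessage(message);
    }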

[Figure 2]

 

SQL Azure

Compared with Windows Azure, SQL Azure is simpler and easier to understand. SQL Azure is like the database offering of a traditional web hosting provider: essentially SQL Server 2008 instances running on the Azure OS. Once you sign up for SQL Azure you are given a database address, a username, a password and connection strings (ADO.NET and ODBC), and you can manage it through SQL Server Management Studio or any tool you prefer. (For now there are unfortunately some problems when connecting to SQL Azure through SSMS.) But I'm not sure whether SQL Azure provides the other services such as Reporting Services, SQL Agent, etc.
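
From code it looks like any other SQL Server connection. Here is a sketch with made-up server, database and credential placeholders (the general shape of a SQL Azure connection string, not a real account):

    // Plain ADO.NET (System.Data.SqlClient); server, database and user values are placeholders.
    var connectionString =
        "Server=tcp:myserver.database.windows.net;Database=mydb;" +
        "User ID=myuser@myserver;Password=<password>;Encrypt=True;";

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var command = new SqlCommand("SELECT GETDATE()", connection))
        {
            var serverTime = (DateTime)command.ExecuteScalar();
        }
    }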

So much for my simple, personal explanation of Azure on the technical side. We can see that there are not many differences from the traditional web application development we are familiar with. We can use 3-tier or N-tier architecture in our application, and server controls and MVC helpers work well under Azure too. The difference is just some configuration to deploy it on Azure, and that's all. That is what Microsoft means by a 'familiar environment'.
On the business side, when deploying on Azure we don't need to think about the hardware configuration of the server(s). Azure just needs to know how many CPU cores we want and how much bandwidth we need. And as our business grows, we just increase them, without swapping servers or migrating data; the Azure infrastructure takes care of that.

In the future there will be more services on Azure, such as SharePoint, Dynamics CRM (xRM), etc. Then we could build our applications on top of them without installing anything locally. But the question is: do we have to code and debug our application on Azure itself while developing? Is there any way to code and debug locally? In the coming posts I would like to explain how to build our application with Visual Studio and test it using the local simulation.

 

Hope this helps.

Shaun


 

This is my first post on this blog site. I just want to say hello to all of you.