Shaun Xu

The Sheep-Pen of the Shaun


Shaun, the author of this blog, is a semi-geek: a clumsy developer, passionate speaker and incapable architect with about 10 years' experience in .NET and JavaScript. He hopes to prove that software development is an art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Amazon and Aliyun), and right now Shaun is attracted by JavaScript (Angular.js and Node.js) and he likes it.

Shaun works at Worktile Inc. as the chief architect, responsible for the overall design and development of worktile, a web-based collaboration and task management tool, and lesschat, a real-time communication aggregation tool.

Azure – Part 6 – Blob Storage Service



When migrating your application to Azure, one of the biggest concerns is the external files. In the original way we knew exactly which machine and folder our application (website or web service) was located in, so we could use MapPath or some other method to read and write external files, for example images, text files or XML files. But things change when we deploy to Azure. Azure is not a server or a single machine; it's a set of virtual machines running under the Azure OS. Even worse, your application might be moved between these machines, so it's not safe to read or write external files on the local disk in Azure. To resolve this issue, Windows Azure provides another storage service for us: the Blob service.

Different from the table service, the blob service is used to store text and binary data rather than structured data. It provides two types of blobs: block blobs and page blobs.

  • Block Blobs are optimized for streaming. They are made up of blocks, each of which is identified by a block ID, and each block can be a maximum of 4 MB in size.
  • Page Blobs are optimized for random read/write operations and provide the ability to write to a range of bytes in a blob. They are a collection of pages. The maximum size for a page blob is 1 TB.
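The difference shows up in how the two types are written. Below is a small sketch of my own (not from the SDK documentation) assuming an existing CloudBlobContainer named container; note that a page blob must be created with a fixed size and written in 512-byte-aligned pages.

```csharp
// Block blob: upload the content as a whole (the SDK splits it into blocks for us).
CloudBlockBlob blockBlob = container.GetBlockBlobReference("stream.dat");
blockBlob.UploadFile(@"C:\temp\stream.dat");    // optimized for sequential streaming

// Page blob: create it with a fixed size (a multiple of 512 bytes),
// then write pages at arbitrary 512-byte-aligned offsets.
CloudPageBlob pageBlob = container.GetPageBlobReference("random.vhd");
pageBlob.Create(1024 * 1024);                   // a 1 MB page blob
using (var ms = new MemoryStream(new byte[512]))
{
    pageBlob.WritePages(ms, 0);                 // write one page at offset 0
}
```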

 

In the managed library, the Azure SDK allows us to communicate with blobs through the classes CloudBlobClient, CloudBlobContainer, CloudBlockBlob and CloudPageBlob.

Similar to the table service managed library, the CloudBlobClient allows us to reach the blob service by passing our storage account information, and it is also responsible for creating the blob container if it does not exist. Then from the CloudBlobContainer we can save or load block blobs and page blobs through the CloudBlockBlob and CloudPageBlob classes.
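To make that object chain concrete, here is a minimal sketch, assuming the "DataConnectionString" setting from the earlier posts; reading a blob back is simply the mirror of uploading one.

```csharp
// account -> client -> container -> blob
var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("account-logo");
container.CreateIfNotExist();

// loading content back is the counterpart of UploadByteArray
CloudBlockBlob blob = container.GetBlockBlobReference("some-logo.jpg");
byte[] content = blob.DownloadByteArray();
```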

 

Let’s improve our example from the previous posts by adding a service method that allows the user to upload a logo image.

On the server side I created a method named UploadLogo with two parameters: email and image. Then I created the storage account from the config file. I also added validation to ensure that the email passed in is valid.

var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
var accountContext = new DynamicDataContext<Account>(storageAccount);

// validation
var accountNumber = accountContext.Load()
    .Where(a => a.Email == email)
    .ToList()
    .Count;
if (accountNumber <= 0)
{
    throw new ApplicationException(string.Format("Cannot find the account with the email {0}.", email));
}

Then there are three steps for saving the image into the blob service. First, just like with the table service, I get a reference to a container with a unique name and create it if it does not exist.

// create the blob container for account logos if not exist
CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobStorage.GetContainerReference("account-logo");
container.CreateIfNotExist();

Then, since in this example I will just send the blob access URL back to the client, I need to enable public read access on that container.

// configure blob container for public access
BlobContainerPermissions permissions = container.GetPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Container;
container.SetPermissions(permissions);

And at the end I combine the blob resource name from the email and a Guid, then save the image to a block blob by using the UploadByteArray method. Finally I return the URL of this blob to the client side.

// save the blob into the blob service
string uniqueBlobName = string.Format("{0}_{1}.jpg", email, Guid.NewGuid().ToString());
CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
blob.UploadByteArray(image);

return blob.Uri.ToString();

Let’s update the client-side application a bit and see the result. Here I just use my simple console application to let the user input the email and the file name of the image. If everything is OK, it shows the URL of the blob on the server side so that we can view it in a web browser.
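The console client described above is not shown in full, but it might look roughly like this sketch; ServiceClient and its UploadLogo method stand in for the generated service proxy, which is an assumption on my part.

```csharp
static void Main(string[] args)
{
    Console.Write("Email: ");
    var email = Console.ReadLine();
    Console.Write("Image file: ");
    var fileName = Console.ReadLine();

    // read the image into a byte array and send it to the service
    var image = File.ReadAllBytes(fileName);
    using (var client = new ServiceClient())
    {
        var url = client.UploadLogo(email, image);
        Console.WriteLine("Logo uploaded to: {0}", url);
    }
}
```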

[Screenshot: the console application showing the URL of the uploaded blob.]

Then we can see the logo I’ve just uploaded through the URL here.

[Screenshot: the uploaded logo displayed in the web browser through that URL.]

You may notice that the blob URL is based on the container name and the blob's unique name. The Azure SDK documentation has a page on the rules for naming them, but I think the simple rule is: they must be valid as part of a URL address. So you cannot name the container with a dot or slash, as it will break the ADO.NET Data Services routing rule. For example, if you name the blob container Account.Logo, it will throw an exception saying 400 Bad Request.
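Based on the naming rules in the Azure documentation (lowercase letters, digits and dashes only, 3 to 63 characters, starting and ending with a letter or digit, no consecutive dashes), a quick sanity-check helper of my own could look like this; it is a sketch, not part of the SDK.

```csharp
static bool IsValidContainerName(string name)
{
    if (string.IsNullOrEmpty(name) || name.Length < 3 || name.Length > 63)
        return false;
    // must start and end with a letter or digit
    if (!char.IsLetterOrDigit(name[0]) || !char.IsLetterOrDigit(name[name.Length - 1]))
        return false;
    // no consecutive dashes
    if (name.Contains("--"))
        return false;
    // only lowercase letters, digits and dashes
    foreach (var c in name)
    {
        if (!(char.IsLower(c) || char.IsDigit(c) || c == '-'))
            return false;
    }
    return true;
}

// IsValidContainerName("account-logo") -> true
// IsValidContainerName("Account.Logo") -> false
```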

 

Summary

In this short entry I covered the simple usage of the blob service to save images on Azure. Since the Azure platform does not support the local file system, we have to migrate our file reading/writing code to the blob service before deploying to Azure.

In order to reduce this effort, Microsoft provides a new approach named Drive, which allows us to read and write NTFS files just like we did before. It is built on top of the blob service but is better suited for file access. I will discuss more about it in the next post.

 

Hope this helps,

Shaun

All documents, related graphics and code are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

Comments

# re: Azure – Part 6 – Blob Storage Service
Posted by jlezard on 9/15/2010 6:30 PM
Thanks for this helpful post
# re: Azure – Part 6 – Blob Storage Service
Posted by Naveen on 12/14/2010 2:37 PM
Hi,

I uploaded an XML file into an Azure blob. Now I want to read the XML file from the blob, modify some of its values, and then upload the same XML file back to the blob. Could you please help me with this?

Also, how can I do this with Azure tables?

naveen
# re: Azure – Part 6 – Blob Storage Service
Posted by Shaun on 12/14/2010 3:48 PM

@Naveen
I think using blob storage would be the best choice in your case. But anyway, if you want to use tables, I would suggest you create an entity with a string property to store your XML value. The table storage doesn't support XmlDocument or XDocument types as properties.
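For the blob approach, the read-modify-write cycle can be sketched like this, using the text helpers on the blob class; the container reference and the blob name here are assumptions for illustration.

```csharp
CloudBlockBlob blob = container.GetBlockBlobReference("data.xml");
var doc = XDocument.Parse(blob.DownloadText());   // read the XML from the blob
doc.Root.SetElementValue("Status", "Updated");    // modify a value
blob.UploadText(doc.ToString());                  // write it back to the same blob
```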