Shaun Xu

The Sheep-Pen of the Shaun



Shaun, the author of this blog, is a semi-geek, clumsy developer, passionate speaker and incapable architect with about 10 years of experience in .NET. He hopes to prove that software development is art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Aliyun) as well as WCF and ASP.NET MVC. Recently he has fallen in love with JavaScript and Node.js.

Currently Shaun is working at IGT Technology Development (Beijing) Co., Ltd. as the architect responsible for product framework design and development.


In the next generation of SQL Server, codenamed “Denali”, there is a new feature named the Data-tier Application Framework v2.0 Feature Pack CTP that enhances data import and export with SQL Server, and it is currently available in SQL Azure Labs.

 

Run Data-tier Application Locally

From the portal we know that Data-tier Application CTP2 can be executed from a development machine through an EXE utility. So what we need to do is download the required components. One thing to note: SQLSysClrTypes.msi should be installed before SharedManagementObjects.msi.

Then just download DacImportExportCli.zip, the client-side utility, and perform the import and export commands on the local machine against SQL Server or SQL Azure. For example, suppose we have a SQL Azure database available here:

  • Server Name: uivfb7flu1.database.windows.net
  • Database: aurorasys
  • Login: aurora@uivfb7flu1
  • Password: MyPassword

Then we can run a command like the one below to export the database schema and data into a single file on the local disk named “aurora.dacpac”.

    DacImportExportCli.exe -s uivfb7flu1.database.windows.net -d aurorasys -f aurora.dacpac -x -u aurora@uivfb7flu1 -p MyPassword

And if needed, we can import it back into another SQL Azure database.

    DacImportExportCli.exe -s uivfb7flu1.database.windows.net -d aurorasys_dabak -f aurora.dacpac -i -edition web -size 1 -u aurora@uivfb7flu1 -p MyPassword

But the SQL Server backup and restore feature is not currently supported in SQL Azure, and many projects I was working on wanted to back up their SQL Azure databases from the website running on Windows Azure. So in this post I would like to demonstrate how to perform the export from the cloud instead of from a local command window.

 

Run Data-tier Application in the Cloud

From this wiki we know that the Data-tier Application can be invoked through the EXE or directly from the managed library. We can leverage the DacStore class in Microsoft.SqlServer.Management.Dac.dll to import and export from code. So in this case we will build a web role that exports the SQL Azure database via the DacStore.

The first thing we need to do is add the Data-tier Application references to the cloud project. But be aware that on Windows Azure the role instance virtual machine only has the .NET Framework installed, so we need to add all the DLLs the Data-tier Application needs to our project and set their Copy Local = True (a project-file sketch follows the list below). Here is the list we need to add:

  • Microsoft.SqlServer.ConnectionInfo.dll
  • Microsoft.SqlServer.Diagnostics.STrace.dll
  • Microsoft.SqlServer.Dmf.dll
  • Microsoft.SqlServer.Management.DacEnum.dll
  • Microsoft.SqlServer.Management.DacSerialization.dll
  • Microsoft.SqlServer.Management.SmoMetadataProvider.dll
  • Microsoft.SqlServer.Management.SqlParser.dll
  • Microsoft.SqlServer.Management.SystemMetadataProvider.dll
  • Microsoft.SqlServer.SqlClrProvider.dll
  • Microsoft.SqlServer.SqlEnum.dll
If you have installed the components mentioned above you can find these assemblies in your GAC directory. Copy them from the GAC and add them to the project. Regarding how to copy files from the GAC please refer to this article. Otherwise you can download the source code from the link at the end of this post, which has the assemblies attached.
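
In the project file, Copy Local = True is serialized as <Private>True</Private> on the reference. Here is a minimal sketch of what two of the entries might look like, assuming the copied assemblies sit in a libs folder next to the project (the folder name is my assumption, not from the post):

    <!-- sketch: DAC references with Copy Local = True; paths are assumed -->
    <ItemGroup>
      <Reference Include="Microsoft.SqlServer.ConnectionInfo">
        <HintPath>..\libs\Microsoft.SqlServer.ConnectionInfo.dll</HintPath>
        <Private>True</Private> <!-- Copy Local = True -->
      </Reference>
      <Reference Include="Microsoft.SqlServer.Dmf">
        <HintPath>..\libs\Microsoft.SqlServer.Dmf.dll</HintPath>
        <Private>True</Private>
      </Reference>
      <!-- ...and so on for the rest of the assemblies in the list above -->
    </ItemGroup>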

And next we move back to the source code and create a page that lets the user input the information of the SQL Azure database he wants to export. In this example I created an ASP.NET MVC web role and let the user enter the information in Home/Index. After the user clicks the button we begin to export the database. The snippets below bind to a simple view model that is not shown in this post; here is a minimal sketch of what it might look like, with the property names taken from the code that follows:
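
    // minimal sketch of the assumed input model (not shown in the original post)
    public class ExportModel
    {
        public string Server { get; set; }    // e.g. uivfb7flu1.database.windows.net
        public string Database { get; set; }  // e.g. aurorasys
        public string Login { get; set; }     // e.g. aurora@uivfb7flu1
        public string Password { get; set; }
    }

First we build the connection string from the user input and then create an instance of ServerConnection, defined in Microsoft.SqlServer.ConnectionInfo.dll, which will be used to connect to the database when exporting.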

    // establish the connection to the sql azure
    var csb = new SqlConnectionStringBuilder()
    {
        DataSource = model.Server,
        InitialCatalog = model.Database,
        IntegratedSecurity = false,
        UserID = model.Login,
        Password = model.Password,
        Encrypt = true,
        TrustServerCertificate = true
    };
    var connection = new ServerConnection()
    {
        ConnectionString = csb.ConnectionString,
        StatementTimeout = int.MaxValue
    };

And then we create the DacStore from the ServerConnection and handle some of its events, such as action initialized, started and finished. We record the event information in a StringBuilder and show it after the export finishes.

    // create the dac store for exporting
    var dac = new DacStore(connection);

    // handle events of the dac
    dac.DacActionInitialized += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description, e.Error);
    };
    dac.DacActionStarted += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description, e.Error);
    };
    dac.DacActionFinished += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description,
            e.ActionState == ActionState.Warning ? string.Empty : (e.Error == null ? string.Empty : e.Error.ToString()));
    };

The next thing we are going to do is prepare the BLOB storage. We will export the database to a local file on the Windows Azure virtual machine first, and then upload it to BLOB storage for future download.

    // prepare the blob storage
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnection");
    var client = account.CreateCloudBlobClient();
    var container = client.GetContainerReference("dacpacs");
    container.CreateIfNotExist();
    container.SetPermissions(new BlobContainerPermissions() { PublicAccess = BlobContainerPublicAccessType.Container });
    var blobName = string.Format("{0}.dacpac", DateTime.Now.ToString("yyyyMMddHHmmss"));
    var blob = container.GetBlobReference(blobName);

Then the final job is to invoke the Export method of the DacStore after ServerConnection.Connect() has been invoked. The Data-tier Application library will connect to the database and perform the actions to export the schema and data to the specified file.

    // export to local file system
    connection.Connect();
    var filename = Server.MapPath(string.Format("~/Content/{0}", blobName));
    dac.Export(model.Database, filename);

And after that we upload the file to BLOB storage and remove the local copy.

    // upload the local file to blob
    blob.UploadFile(filename);

    // delete the local file
    System.IO.File.Delete(filename);

    // "sw" is a System.Diagnostics.Stopwatch started before the export (its setup is not shown in this post)
    output.AppendFormat("Finished exporting to {0}, elapsed {1} seconds.<br />", blob.Uri, sw.Elapsed.TotalSeconds.ToString("0.###"));

Finally, we can download the DACPAC file from BLOB storage to the local machine.
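
For example, from a local machine with the Windows Azure storage client library, something like this minimal sketch would do (the account name, key and blob name are placeholders, not values from the post):

    // minimal sketch: download the exported DACPAC from blob storage
    // (StorageClient v1.x; account name/key and blob name are placeholders)
    var account = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");
    var client = account.CreateCloudBlobClient();
    var blob = client.GetContainerReference("dacpacs").GetBlobReference("20110801120000.dacpac");
    blob.DownloadToFile(@"C:\backup\aurora.dacpac");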

 

Summary

The Data-tier Application enables us to export the database schema and data from SQL Azure to wherever we want, such as BLOB storage or a local disk. This means we don’t need to pay for a backup database, only for the storage usage if we put the export into BLOB storage, plus the bandwidth cost if we need to download it from BLOB. But I’m not sure whether we need to pay for the bandwidth between the web role and SQL Azure if they are in the same data center (sub-region).

One more thing must be highlighted here: the Data-tier Application is NOT a full backup solution. If you need a transactionally consistent backup you should look at Database Copy or Data Sync. But if you just need the schema and data, this might be the best choice.
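
As a side note, Database Copy is just a T-SQL command issued against the server’s master database. A minimal sketch in C# (the server, credentials and database names reuse the examples above):

    // minimal sketch of SQL Azure Database Copy (transactionally consistent)
    // requires System.Data.SqlClient; names reuse the examples above
    using (var conn = new SqlConnection(
        "Server=uivfb7flu1.database.windows.net;Database=master;User ID=aurora@uivfb7flu1;Password=MyPassword;Encrypt=True;"))
    {
        conn.Open();
        using (var cmd = conn.CreateCommand())
        {
            // the copy runs asynchronously; progress can be watched in sys.dm_database_copies
            cmd.CommandText = "CREATE DATABASE aurorasys_copy AS COPY OF aurorasys";
            cmd.ExecuteNonQuery();
        }
    }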

PS: You can download the source code and the assemblies here.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

When we build an application that will be deployed on Windows Azure, one thing we should keep in mind is that the set of assemblies installed on the Windows Azure virtual machines is very limited. It contains only the default assemblies of .NET 3.5 SP1 and .NET 4.0. For references not included on Windows Azure, we should set Copy Local = True.

[Image: the Unity reference with Copy Local = True in the Visual Studio properties window]

As shown in the image above, I set the Unity reference to Copy Local = True because I know that the Enterprise Library assemblies will not be installed on Windows Azure by default. But the problem is: which references are not available on Windows Azure?

I had to mark all suspicious references as Copy Local until I found this website: http://gacviewer.cloudapp.net.

[Image: the default page of gacviewer.cloudapp.net listing the assemblies installed on Windows Azure]

Its default page lists all assemblies installed on the Windows Azure machines. Additionally, it allows us to upload a project file (web role or worker role) and verifies which references should be Copy Local. Currently it supports C# and VB.NET projects.

[Image: the verification result after uploading the project file]

After I uploaded the project file, the website identified the references that are not present on Windows Azure and need to be set to Copy Local = True. It also lists the ones that are already installed.

 

If we forget to set Copy Local on a reference that is not installed on Windows Azure, our application will be in trouble when deployed and initialized: the role status loops between start, initializing, busy, stop and start again, which is very confusing. To figure out what’s wrong we have to enable IntelliTrace, set Copy Local on the reference we forgot, and try again, which is very time-consuming. But if we check our project through this website, things become much simpler.
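
Another quick way to surface a missing assembly, without a full IntelliTrace session, is to probe the referenced assemblies at role startup and trace the ones that fail to load. This is my own diagnostic sketch, not from the original post; note that GetReferencedAssemblies only returns assemblies the compiled code actually uses:

    // diagnostic sketch: log referenced assemblies that cannot be resolved,
    // which usually points at a reference missing Copy Local = True
    foreach (var name in System.Reflection.Assembly.GetExecutingAssembly().GetReferencedAssemblies())
    {
        try
        {
            System.Reflection.Assembly.Load(name);
        }
        catch (Exception ex)
        {
            System.Diagnostics.Trace.TraceError("Cannot load {0}: {1}", name.FullName, ex.Message);
        }
    }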

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.