Shaun Xu

The Sheep-Pen of the Shaun



Shaun, the author of this blog, is a semi-geek, clumsy developer, passionate speaker and incapable architect with about 10 years' experience in .NET and JavaScript. He hopes to prove that software development is art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Amazon and Aliyun), and right now Shaun is attracted by JavaScript (Angular.js and Node.js) and likes it.

Shaun works at Worktile Inc. as the chief architect, responsible for the overall design and development of Worktile, a web-based collaboration and task management tool, and Lesschat, a real-time communication aggregation tool.



In the next generation of SQL Server, codename “Denali”, there is a new feature named Data-tier Application Framework v2.0 Feature Pack CTP that enhances data import and export with SQL Server, and it is currently available in SQL Azure Labs.


Run Data-tier Application Locally

From the portal we know that the Data-tier Application CTP2 can be executed from the development machine through an EXE utility. So what we need to do is download the components listed below. One note: SQLSysClrTypes.msi must be installed before SharedManagementObjects.msi.

Then just download the client-side utility and perform the import/export command on the local machine against SQL Server and SQL Azure. For example, suppose we have a SQL Azure database available here:

  • Server Name:
  • Database: aurorasys
  • Login: aurora@uivfb7flu1
  • Password: MyPassword

Then we can run the command like this below to export the database schema and data into a single file on local disk named “aurora.dacpac”.

    DacImportExportCli.exe -s -d aurorasys -f aurora.dacpac -x -u aurora@uivfb7flu1 -p MyPassword

And if needed, we can import it back into another SQL Azure database.

    DacImportExportCli.exe -s -d aurorasys_dabak -f aurora.dacpac -i -edition web -size 1 -u aurora@uivfb7flu1 -p MyPassword

But the SQL Server backup and restore feature is not currently supported in SQL Azure, and many projects I was working on wanted to back up their SQL Azure databases from websites running on Windows Azure. So in this post I would like to demonstrate how to perform the export from the cloud instead of from a local command window.


Run Data-tier Application on Cloud

From this wiki we know that the Data-tier Application can be invoked through the EXE or directly from the managed library. We can leverage the DacStore class in Microsoft.SqlServer.Management.Dac.dll to import and export from code. So in this case we will build a web role that exports the SQL Azure database via the DacStore.

The first thing we need to do is add references to the Data-tier Application assemblies in the cloud project. Be aware that in Windows Azure the role instance virtual machine only has the .NET Framework installed, so we need to add all the DLLs the Data-tier Application needs to our project and set their Copy Local = True. Here is the list we need to add:

  • Microsoft.SqlServer.ConnectionInfo.dll
  • Microsoft.SqlServer.Diagnostics.STrace.dll
  • Microsoft.SqlServer.Dmf.dll
  • Microsoft.SqlServer.Management.DacEnum.dll
  • Microsoft.SqlServer.Management.DacSerialization.dll
  • Microsoft.SqlServer.Management.SmoMetadataProvider.dll
  • Microsoft.SqlServer.Management.SqlParser.dll
  • Microsoft.SqlServer.Management.SystemMetadataProvider.dll
  • Microsoft.SqlServer.SqlClrProvider.dll
  • Microsoft.SqlServer.SqlEnum.dll
If you have installed the components mentioned above you can find these assemblies in your GAC directory. Copy them from the GAC directory and then add them to the project. Regarding how to copy files from the GAC please refer to this article. Otherwise you can download the source code from the link at the end of this post, which has the assemblies attached.
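As a sketch of the Copy Local = True setting in project-file form (assuming the assemblies were copied into a hypothetical `lib` folder inside the project), the references in the .csproj would look roughly like this; `<Private>True</Private>` is the MSBuild equivalent of Copy Local = True in Visual Studio:

```xml
<ItemGroup>
  <!-- Reference a Data-tier Application assembly from a local folder and
       copy it to the output so it is deployed with the role instance. -->
  <Reference Include="Microsoft.SqlServer.Dmf">
    <HintPath>lib\Microsoft.SqlServer.Dmf.dll</HintPath>
    <Private>True</Private>
  </Reference>
  <!-- repeat for the other assemblies in the list above -->
</ItemGroup>
```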

Next we move back to the source code and create a page that lets the user input the information of the SQL Azure database he wants to export. In this example I created an ASP.NET MVC web role and let the user enter the information in Home/Index. After the user clicks the button we begin to export the database. First we build the connection string from the user input, and then create an instance of ServerConnection, defined in Microsoft.SqlServer.ConnectionInfo.dll, which will be used to connect to the database when exporting.

    // establish the connection to SQL Azure
    var csb = new SqlConnectionStringBuilder()
    {
        DataSource = model.Server,
        InitialCatalog = model.Database,
        IntegratedSecurity = false,
        UserID = model.Login,
        Password = model.Password,
        Encrypt = true,
        TrustServerCertificate = true
    };
    var connection = new ServerConnection()
    {
        ConnectionString = csb.ConnectionString,
        StatementTimeout = int.MaxValue
    };

Then we create the DacStore from the ServerConnection and handle some of its events, such as action initialized, started and finished. We store the event messages in a StringBuilder and display them after the export finishes.

    // create the dac store for exporting
    var dac = new DacStore(connection);
    
    // handle events of the dac
    dac.DacActionInitialized += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description, e.Error);
    };
    dac.DacActionStarted += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description, e.Error);
    };
    dac.DacActionFinished += (sender, e) =>
    {
        output.AppendFormat("{0}: {1} {2} {3}<br />", e.ActionName, e.ActionState, e.Description, e.ActionState == ActionState.Warning ? string.Empty : (e.Error == null ? string.Empty : e.Error.ToString()));
    };

The next thing we are going to do is prepare the BLOB storage. We will export the database to a local file in the Windows Azure virtual machine first, and then upload it to BLOB storage for future download.

    // prepare the blob storage
    var account = CloudStorageAccount.FromConfigurationSetting("DataConnection");
    var client = account.CreateCloudBlobClient();
    var container = client.GetContainerReference("dacpacs");
    container.CreateIfNotExist();
    container.SetPermissions(new BlobContainerPermissions() { PublicAccess = BlobContainerPublicAccessType.Container });
    var blobName = string.Format("{0}.dacpac", DateTime.Now.ToString("yyyyMMddHHmmss"));
    var blob = container.GetBlobReference(blobName);

Then the final job is to invoke the Export method of the DacStore, after ServerConnection.Connect() has been invoked. The Data-tier Application library will connect to the database and perform the actions to export the schema and data to the specified file.

    // export to local file system
    connection.Connect();
    var filename = Server.MapPath(string.Format("~/Content/{0}", blobName));
    dac.Export(model.Database, filename);

After that we upload the file to BLOB storage and remove the local one.

    // upload the local file to blob
    blob.UploadFile(filename);
    
    // delete the local file
    System.IO.File.Delete(filename);
    
    // 'sw' is a Stopwatch started before the export began
    output.AppendFormat("Finished exporting to {0}, elapsed {1} seconds.<br />", blob.Uri, sw.Elapsed.TotalSeconds.ToString("0.###"));

Finally we can download the DACPAC file from BLOB storage to the local machine.
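As a minimal sketch of that download step (assuming the same "DataConnection" setting and "dacpacs" container used above, the older Microsoft.WindowsAzure.StorageClient library this post relies on, and a hypothetical blob name and local path), the code could look like this:

```csharp
// download the exported dacpac from blob storage to a local file
var account = CloudStorageAccount.FromConfigurationSetting("DataConnection");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("dacpacs");

// the blob name would be the timestamped name produced during the export
var blob = container.GetBlobReference("20110101000000.dacpac"); // hypothetical name
blob.DownloadToFile(@"C:\backup\20110101000000.dacpac");        // hypothetical path
```

Since the container was created with public access, the blob could also be fetched directly from its URI with any HTTP client or a browser.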



The Data-tier Application enables us to export the database schema and data from SQL Azure to anywhere we want, such as BLOB storage or a local disk. This means we don’t need to pay for a backup database, only the storage usage if we put it into BLOB, plus the bandwidth cost if we need to download it from BLOB. (I’m not sure whether we need to pay for the bandwidth between the web role and SQL Azure if they are in the same data center (sub-region).)

One more thing must be highlighted here: the Data-tier Application is NOT a full backup solution. If you need a transactionally consistent backup you should consider Database Copy or Data Sync. But if you just need the schema and data, this might be the best choice.

PS: You can download the source code and the assemblies here.


Hope this helps,


All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

