Shaun Xu

The Sheep-Pen of the Shaun



Shaun, the author of this blog, is a semi-geek, clumsy developer, passionate speaker and incapable architect with about 10 years of experience in .NET. He hopes to prove that software development is art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Aliyun) as well as WCF and ASP.NET MVC. Recently he has fallen in love with JavaScript and Node.js.

Currently Shaun is working at IGT Technology Development (Beijing) Co., Ltd. as the architect responsible for product framework design and development.


In my project I'm using SignalR and Angular.js to make the page real-time. But I found a slight problem: when there are a lot of fields on the page, it's hard for the user to notice when something has changed. Hence I decided to create a small directive which applies a highlight style when the model value changes.

 

Below is how it looks, and you can play with it on my Plunker.

(animated GIF: the highlight effect in action)

Step 1

Download and include the source code of this directive from GitHub.

 

Step 2

Add the dependency to your Angular.js application, just like the code in the Plunker.

    var app = angular.module("Demo", ['sx.changeHighlight']);

 

Step 3

Apply this directive to any element you want; it will apply the "text-shadow" style to that element. You also need to specify which value in $scope it should watch, so that when the value changes, the highlight style is applied to this element.

In the code below my directive is monitoring $scope.name, and when it changes, the "p" element will get the highlight style.

    <p sx-change-highlight ng-model="name">
      Hello, my name is {{name}}.
    </p>

It's worth mentioning that you can monitor anything in $scope, regardless of whether it is displayed inside the element the directive is attached to, or anywhere else on your page. Besides, you can apply this directive to any element.

In the code below, I applied this directive to a table cell.

    <td sx-change-highlight ng-model="target.status" ng-class="getStatusTextClass(target.status)">{{target.status | jobStatus}}</td>

 

Step 4 (Optional)

There are a few options you can set for the highlight style, listed below.

  • timeout — How long before the highlight style fades out. Default: 1 second.
  • interval — The interval at which the style fades from the highlight style back to the normal style. Default: 0.1 second.
  • skip — How many times the highlight style will NOT be applied when the model changes. This is useful if the value changes when the page loads. Default: 1.
  • blurRadius — The radius value of the "text-shadow" style. Default: 10.
  • color — The highlight color. Default: #337ab7.
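For example, a hypothetical usage could look like the snippet below. The attribute names come from the options table above, but the values (and their units) are only illustrative, so please check the directive's README on GitHub for the exact semantics.

    <!-- hypothetical values; attribute names taken from the options table above -->
    <p sx-change-highlight ng-model="name"
       timeout="2" interval="0.2" skip="0" blur-radius="5" color="#d9534f">
      Hello, my name is {{name}}.
    </p>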

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


Last week, after watching the ngConf 2015 videos, I decided to have a try with Angular 2.0, and I thought the best starting point would be its new router. The application I'm going to change is "My DocumentDB", a web-based Microsoft Azure DocumentDB management tool I created last year. Although Google said it would be very simple and smooth to migrate from Angular 1.x to this new router and then move forward to Angular 2.0, I found a lot of problems since I was using Angular-UI-Router.

In this post I would like to introduce what I did, what I suffered and the resolutions. But please keep in mind that the Angular New Router is still in the development phase. The code and API change very often, so Google and the community recommend NOT using it in any production application.

 

Upgrade to Angular.js 1.3.15 or Higher

I didn't find any documentation about the dependencies on the new router's wiki (and the wiki is outdated right now, I think). But in fact we need to upgrade the Angular.js reference to at least 1.3.15, since the new router needs the template module that's not available in previous versions of Angular.

 

Download & Install

There are several ways to get the code of the new router. The wiki tells us to download it through NPM, but I recommend just downloading it directly from its GitHub repository. The source code is under the "dist" folder, and we need "router.es5.js" for an Angular 1.x application.

The full path of this file is https://raw.githubusercontent.com/angular/router/master/dist/router.es5.js. Then include it in "index.html".

(screenshot: the script reference added to index.html)

 

Add to Dependency

As I mentioned, my application was using Angular-UI-Router. In order to use the Angular New Router I need to switch the dependency. The module name is 'ngNewRouter'.

    var app = angular.module('MyDocDB', [
        'ngNewRouter',
        'ui.bootstrap'
    ]);

 

Define the Routes

The new router uses a factory named "$router" for route definition, URL generation and page navigation. It is very much like the previous router and Angular UI Router. The only difference is that "$router" is a factory instead of a provider. This means we cannot define our routes in the Angular.js "configuration" routine, but have to do it in a controller or in the "run" routine.

Below is the route definition for the new router in my application. I put the code in the "app.run" function.

    app.run(['$router', function ($router) {
        $router.config([
            {
                path: '/dashboard',
                component: 'dashboard'
            },
            {
                path: '/console',
                component: 'console'
            },
            {
                path: '/credits',
                component: 'credits'
            },
            {
                path: '/database',
                component: 'database'
            },
            {
                path: '/collections',
                component: 'collection'
            },
            {
                path: '/documents',
                component: 'document'
            },
            {
                path: '/',
                redirectTo: '/dashboard'
            }
        ]);
    }]);

 

Re-Organize Source Files to Support Component

The Angular new router introduces a new concept called "component". A component is a view, plus a controller, and an optional router. This sounds like a "state" in UI Router. (In UI Router, a route is defined as a state, with options such as the URL and views.) This encourages us to put everything related to a feature in a dedicated folder. For example, let's say I have a contact feature; then I need a folder named "contact" that contains everything related to contacts, such as "contact.js" for controllers, "contact.html" for the view, and some other files such as "contact-import-dialog.js", "contact-import-dialog.html" and "contact-avatar.js" for directives, factories, filters etc. If the application had been organized like this, this step would not take much effort.

But since I had organized my application by Angular categories, this was painful. For example, I have a folder named "views" that contains "contact.html", "account.html" and "order.html", while a folder named "controllers" contains all Angular controllers and a "services" folder contains all factories and services.

And even worse, there's a project named "angular-seed" by the Angular team, demonstrating a best-practice application skeleton for a typical Angular.js web app, which was organized in exactly this way. The current version of angular-seed is organized by features to fit the component concept in Angular 2, but if your application followed that project before last year, you might run into the same problem that I had, as below.

(screenshot: my original folder structure, organized by Angular categories)

So what I need is to re-organize them in the proper way. And as you can see, the file names of my default views are all "index.html". This means the Angular new router cannot find them when a route is active, so I also need to change the file name from "index.html" to the related component name.

(screenshot: the re-organized "components" folder)
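For reference, the resulting layout looks roughly like the sketch below. The folder names are taken from the routes defined earlier; whether the controller files also live here is optional, as discussed next.

    components/
        dashboard/
            dashboard.html
        console/
            console.html
        credits/
            credits.html
        database/
            database.html
        collection/
            collection.html
        document/
            document.html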

You may notice that I didn't put my controllers into the components folder. I personally suggest you move them as well, to keep the folder structure clean, but this is actually optional. The Angular new router utilizes the "controller as" syntax and derives the controller name from the component name. The default convention is: convert the first character of the component name to upper case, then append "Controller". For example, if I have a component named "database", the controller name should be "DatabaseController".

The good news is that this can be configured. There's a provider named "$componentLoaderProvider" through which we can define how Angular finds the controller from the component name. In my case all my controllers are named with a "Ctrl" suffix, so this is the code for my application.

    app.config(['$componentLoaderProvider', function ($componentLoaderProvider) {
        $componentLoaderProvider.setCtrlNameMapping(function (name) {
            return name[0].toUpperCase() + name.substr(1) + 'Ctrl';
        });
    }]);

"$componentLoaderProvider.componentToTemplate" can be used to configure how new router find view template path by component name. The default logic is go to "/components" folder, find the folder with the component name, then find file in dash mode. For example, the default view path of "database" component would be "/components/database/database.html".

 

Change $scope in Controllers

In Angular 1.x we use "$scope" to communicate between the view and the controller. But with the Angular new router we cannot do this. The Angular new router uses the component name to bind data to the controller. This means if we have a database controller and want to show the database name in the view, the code below would not work.

    // Controller
    app.controller('DatabaseController', function ($scope) {
        $scope.name = 'db1';
    });

    // View
    <p>{{name}}</p>

Instead we need to change the code as below.

    // Controller
    app.controller('DatabaseController', function () {
        this.name = 'db1';
    });

    // View
    <p>{{database.name}}</p>

I don't know whether this change is good or not, but in fact it introduces a lot more migration effort. And more confusingly, "$scope" turns out to be available if the controller was not activated by the new router. For example, if I specify a controller through "ng-controller" in HTML, "$scope" is still available.

In order to minimize the migration effort, what I did is to declare a local variable named "$scope" and assign "this" to it in each controller. Then we don't need to change any further code in JavaScript; we only need to change the view content.

    app.controller('DocumentCtrl', function ($rootScope, $router, $location, $alert, $modal, api) {
        var $scope = this;

        var refresh = function () {
            if ($scope.col.collectionLink) {
                ... ...
            }
        };

        $scope.delete = function (doc) {
            ... ...
        };

        $scope.createOrUpdate = function (doc) {
            ... ...
        };

        var query = $location.search();
        $scope.col = {
            databaseId: decodeURIComponent(query.did),
            databaseLink: decodeURIComponent(query.dl),
            collectionId: decodeURIComponent(query.cid),
            collectionLink: decodeURIComponent(query.cl)
        };

        ... ...

        refresh();
    });

 

Change View and Link Directives

The Angular new router uses the "ng-viewport" directive for view definition, and "ng-link" to generate a link to a component. This is very easy to migrate.

The official wiki says you need to use "router-view-port" and "router-link", but this should be obsolete. The latest documents on GitHub have been updated based on the latest commits and say to use "ng-viewport" and "ng-link".

Below are some of the related code changes in my application.

(screenshot: replacing the UI Router view and link directives with ng-viewport and ng-link)
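As a rough sketch (element content and component names are illustrative), the change looks like this:

    <!-- Angular UI Router -->
    <div ui-view></div>
    <a ui-sref="database">Databases</a>

    <!-- Angular new router -->
    <div ng-viewport></div>
    <a ng-link="database()">Databases</a>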

 

Define & Pass Parameters in Route

Sometimes we need to pass parameters through a route. For example, when the user tries to view the details of a contact, we might want to pass the contact ID to the route through the URL, so that the details controller can retrieve the ID and fire an $http request to the backend service. This can be done easily in Angular UI Router, but not that easily in the new router at this stage, especially if we want to pass parameters through the query string with some characters that need URL encoding. Let's have a look at my application.

In "My DocumentDB" when user click into a collection or a document, I need to pass some parameters, such as the collection ID, link, document ID and link. It can be done in Angular UI Router in the way below.

    $stateProvider.state('document', {
        url: '/documents/?did&dl&cid&cl',
        templateUrl: '/views/document/index.html',
        controller: 'DocumentIndexCtrl'
    });

Then I can generate the actual URL by using the "ui-sref" directive. The URL will be generated with the necessary URL encoding.

    <a data-ui-sref="document({ did: db.id, dl: db.link, cid: collection.id, cl: collection._self })">{{collection.id}}</a>

Unfortunately the Angular new router doesn't support passing parameters through the query string at this stage. After digging into its GitHub repository I found an issue that someone had already raised. timkindberg said that since the Angular new router utilizes route-recognizer, which is definitely solid in its support for query parameters, it should be OK to use the query string. Then SamVerschueren posted a solution and a commit showing how the Angular new router must be changed to support it.

Since this commit was not included in "router.es5.js" in the latest version (0.5.1), we need to apply the change manually.

(screenshot: the manual change applied to router.es5.js)

Now when we need to pass parameters in the query string we can use "ng-link" as below, and we can retrieve them through "$location.search()".

    <a ng-link="home({queryParams: {foo: 5, bar:10}})">Go Home With Query</a>

This solves most of the cases, but not mine. In "My DocumentDB" I need to pass database, collection and document links through the query string, and the links contain some characters which need to be URL encoded. This was taken care of by UI Router, but when I migrated to the Angular new router I had to do it by myself.

(screenshot: encoding the link values before putting them into the query string)

And when I retrieve the values from the query string I also need to decode them.

(screenshot: decoding the values read from the query string)
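A minimal sketch of the idea (variable and property names below are illustrative): encode the link values before putting them into "queryParams", and decode them again after reading "$location.search()".

    // In the controller that builds the link (illustrative names):
    $scope.dl = encodeURIComponent(db.link);
    $scope.cl = encodeURIComponent(collection._self);

    // View (illustrative bindings):
    // <a ng-link="document({queryParams: {dl: dl, cl: cl}})">{{collection.id}}</a>

    // In the controller that receives the parameters:
    var query = $location.search();
    var databaseLink = decodeURIComponent(query.dl);
    var collectionLink = decodeURIComponent(query.cl);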

 

Two Other Issues

Basically, after applying the steps listed above, we should be able to make our application run under the new Angular router. But when I migrated I also found some other issues. They may or may not impact your application, but I think I'd better point them out.

The first one is "Controller Executed Twice". I have a Plunker to demonstrate it and have already filed an issue with the Angular team. After launching my Plunker and navigating between the three routes, you will see in the browser console that the controller was executed twice. The Angular team has assigned a fix for this issue to the next version (0.5.2).

The second is related to the first. When I passed parameters through the query string, since the controller was executed twice, some parameters were missing in its first execution. This made my application crash due to missing mandatory information. So what I did is to perform some additional validation.

    app.controller('DocumentCtrl', function ($rootScope, $router, $location, $alert, $modal, api) {
        var $scope = this;

        var refresh = function () {
            // this validation should not be here
            // it's only because angular new router missed some query strings in its first execution
            if ($scope.col.collectionLink) {
                ... ...
            }
        };

        ... ...
    });

Summary

In this post I introduced my experience of migrating my application from Angular UI Router to the Angular New Router. I listed all the steps, as well as the issues and resolutions I found. But there are some features I didn't cover, such as URL generation, multiple views, controller lazy load, etc.

Since the Angular new router is still in the development phase, there are many problems and migration takes effort. Based on this experience I have to say it's very hard and time consuming to migrate an Angular application from 1.x to the new router, and it might cost even more to move to 2.0. Hence I strongly suggest NOT trying to do this for your Angular application until it becomes more stable. And I also have to say, this might be a big problem for users who want to adopt Angular, especially when 2.0 comes out. Google should think more carefully about backward compatibility and migration effort.

 

Hope this helps,

Shaun



During the Chinese New Year holiday, Microsoft announced a new feature in V12 SQL Database named Dynamic Data Masking. This feature limits sensitive data exposure by masking it for non-privileged users.

We often have a similar requirement in our projects: some users should not view certain data and should see a masked value instead. For example, an email address may need to be displayed as j******@gmail.com on the user profile page for a normal visitor. In this case what we normally do is implement the masking logic in our code. But this is not very secure, and it adds effort to our application layer.

SQL Database Dynamic Data Masking helps us prevent unauthorized access to sensitive data. Since it's inside SQL Database, there is almost no impact on the application layer.

 

Enable Dynamic Data Masking

To enable this feature, just open the SQL Database in the new Azure preview portal, open the Dynamic Data Masking blade and enable it.

(screenshot: the Dynamic Data Masking blade)

Please ensure your SQL Database supports V12 and the latest updates. This is generally available in some regions, but may still be in the public preview stage in others. For example, it is in the preview stage in East Asia, so there you have to check the item below.

(screenshot: the V12 preview option for the region)

And make sure the pricing tier you selected supports this feature.

(screenshot: pricing tier selection)

Now everything is OK. We can create our tables and insert data records into this new SQL Database. Assuming we have a table named Contacts with several columns:

1, ID: Integer, no need to protect.

2, Name: String, user name, no need to protect.

3, Email: String, need to be masked for normal user.

4, Credit Card Number: String, need to be masked for normal user.

5, Password Hint: String, need to be masked for normal user.

 

Configure Masking Policy

Even though we already have data in the tables and columns, we can add masking policies without modifying the data. Just configure the policy in the Azure portal by opening the Dynamic Data Masking blade.

First we need to define which SQL Server logins have permission to view unmasked data; these are called Privileged Logins. In this case I already have two logins in my SQL Database server: super_user and normal_user. I added super_user to the privileged logins.

(screenshot: privileged logins configuration)

Then specify the table and column name as well as the masking policy. For example, for the Email column I used the built-in email masking policy.

(screenshot: the email masking policy on the Email column)

I can add more masking policies for columns I'd like to protect as below.

(screenshot: masking policies for the other protected columns)

 

View Data from ADO.NET Client

Below I created a simple console application in C# and connected to the database I just created. In order to make the dynamic data masking feature work, I need to use the security enabled connection string rather than the original one.

(screenshot: the security enabled connection string)

The console application source code is very simple. Note that I'm using the security enabled connection string with the super_user login.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;

    namespace shx_maskingdatademo
    {
        class Program
        {
            static void Main(string[] args)
            {
                var connectionString = ""
                    + "Server=tcp:insider.database.secure.windows.net,1433;"
                    + "Database=shx-maskingdatademo;"
                    + "User ID=superuser@insider;"
                    + "Password={xxxxxxxxx};"
                    + "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
                var builder = new SqlConnectionStringBuilder(connectionString);
                using (var conn = new SqlConnection(connectionString))
                {
                    using (var cmd = conn.CreateCommand())
                    {
                        cmd.CommandText = "SELECT * FROM Contacts";
                        conn.Open();
                        Console.WriteLine("Server: '{0}'", builder.DataSource);
                        Console.WriteLine("Login : '{0}'", builder.UserID);
                        using (var reader = cmd.ExecuteReader())
                        {
                            while (reader.Read())
                            {
                                Console.WriteLine("{0}\t{1}\t{2}\t{3}\t{4}", reader[0], reader[1], reader[2], reader[3], reader[4]);
                            }
                        }
                    }
                }

                Console.WriteLine("Press any key to exit.");
                Console.ReadKey();
            }
        }
    }

I can view all data without masking.

(screenshot: query results with unmasked data)

But then I switched to normal_user.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;

    namespace shx_maskingdatademo
    {
        class Program
        {
            static void Main(string[] args)
            {
                var connectionString = ""
                    + "Server=tcp:insider.database.secure.windows.net,1433;"
                    + "Database=shx-maskingdatademo;"
                    + "User ID=normaluser@insider;"
                    + "Password={xxxxxxxxx};"
                    + "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
                var builder = new SqlConnectionStringBuilder(connectionString);
                using (var conn = new SqlConnection(connectionString))
                {
                    ... ...
                }

                Console.WriteLine("Press any key to exit.");
                Console.ReadKey();
            }
        }
    }

All sensitive data were masked automatically.

(screenshot: query results with masked data)

 

Security Connection String Only

In order to make my masking policy effective I need to connect to my database through the security enabled connection string. If I use the original connection string, all sensitive data are returned as-is, even though I'm using the normal_user login.

(screenshot: unmasked results returned over the original connection string)

In order to protect my data in all cases, I go back to the Azure portal and switch Security Enabled Access from "optional" to "required". This means my database only allows connections through the security enabled connection string.

(screenshot: switching Security Enabled Access to "required")

Now if I try to connect to my database through the original connection string, I receive an exception.

(screenshot: the exception thrown when using the original connection string)

 

Summary

SQL Database Dynamic Data Masking limits sensitive data exposure by masking it for non-privileged users. Dynamic data masking is in preview for the Basic, Standard, and Premium service tiers in the V12 version of Azure SQL Database. It's a policy-based security feature that hides the sensitive data in the result set of a query over designated database fields, while the data in the database is not changed. This means we can have this kind of data protected by upgrading the pricing tier and enabling V12, without migrating it to another database and almost without any code changes.

 

Hope this helps,

Shaun



Below are some gulp plugins I'm using in my Angular.js website for build and deployment. Basically, what I need to do is:

1, Generate <script> and <link> elements in the "index.html" page based on packages installed through Bower.

2, Generate <script> elements for all the Angular.js JavaScript files we wrote.

3, Generate a configuration file based on environment variables.

4, Combine and minify JavaScript and CSS files (except those that are already minified) in release mode, but NOT in debug mode.

Now let's go through the gulp plugins I'm using, one by one.

 

main-bower-files

This plugin loads the "bower.json" file of my application and retrieves the files for each package based on the "main" property defined in that package's own "bower.json", for later usage. So if I have packages installed through the command "bower install [package-name] --save", I can retrieve the files they need in my gulp task and pipe them to the next step, for example to generate <script> and <link> elements.

I can specify where the "bower.json" for my project is located through "{ paths: 'app' }", and tell the plugin not to read the file content by using "{ read: false }" if I don't need to deal with the files' content.

    var gulp = require('gulp');
    var bower = require('main-bower-files');

    gulp.task('TASKNAME', function() {
        return gulp.src(bower({ paths: 'app' }), { read: false })
            .pipe(/* next step */);
    });

In some cases we need to specify which files should be referenced in a package. For example, by default only "jquery.js" is necessary for the jQuery package. But if we want to use "jquery.min.js" as well as "jquery.min.map", we can override it in our project-level "bower.json" through its "overrides" property, as below.

    {
      "name": "app",
      "main": "app.js",
      "version": "0.0.0",
      "ignore": [
        "**/.*",
        "node_modules",
        "bower_components",
        "test",
        "tests"
      ],
      "dependencies": {
        "jquery": "~2.1.3",
        "bootstrap": "~3.3.1",
        "node-uuid": "~1.4.2",
        "signalr": "~2.2.0",
        "angular": "~1.3.9",
        "angular-ui-router": "~0.2.13",
        "angular-growl": "~0.4.0",
        "moment": "~2.9.0",
        "fontawesome": "~4.3.0"
      },
      "overrides": {
        "jquery": {
          "main": [
            "dist/jquery.min.js",
            "dist/jquery.min.map"
          ]
        }
      }
    }

 

gulp-inject

This plugin reads the source files, transforms each of them to a string and injects them into placeholders in the target stream's files, such as an HTML file. I used it to generate <script> and <link> elements in the "index.html" file based on the files detected by "main-bower-files".

    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');

    gulp.task('TASKNAME', function () {
        return gulp.src('index.tpl.html')
            .pipe(inject(
                gulp.src(bower({ paths: 'app' }), { read: false }),
                { name: 'bower', relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(inject(
                gulp.src(javaScriptFiles, { read: false }),
                { relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(inject(
                gulp.src(cssFiles, { read: false }),
                { relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(/* next step */);
    });

By default, gulp-inject will generate <script> elements in the target file between the comments

    <!-- inject:js -->
    <!-- endinject -->

and <link> elements between the comments

    <!-- inject:css -->
    <!-- endinject -->

But we can specify additional target placeholders through gulp-inject's "name" option. In the code above, the elements for files detected by Bower will be generated into the placeholders named "bower", while the others go to the default placeholders. The "index.tpl.html" would then look like this.

    <head lang="en">
        <meta charset="UTF-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge">
        <title></title>
        <base href="/">

        <!-- bower:css -->
        <!-- <link> elements detected by bower will be here. -->
        <!-- endinject -->

        <!-- inject:css -->
        <!-- <link> elements specified in gulp will be here. -->
        <!-- endinject -->

        <!-- bower:js -->
        <!-- <script> elements detected by bower will be here. -->
        <!-- endinject -->

        <!-- inject:js -->
        <!-- <script> elements specified in gulp will be here. -->
        <!-- endinject -->
    </head>

I also specified "relevant: true" means the <script> and <link> elements will use relevant path.

And in order to append a timestamp suffix to each element, I specified the "transform" function of the inject plugin. The function is very simple.

    var path = require('path');
    var inject = require('gulp-inject');
    // "version" is a timestamp string (e.g. 20150216161421) defined elsewhere in the gulpfile
    var gulpInjectVersioningTranform = function (filepath, i, length, sourceFile, targetFile) {
        var extname = path.extname(filepath);
        if (extname === '.js' || extname === '.css') {
            filepath += '?v=' + version;
            return inject.transform.apply(inject.transform, [filepath, i, length, sourceFile, targetFile]);
        }
        else {
            return inject.transform.apply(inject.transform, arguments);
        }
    };

With these settings the output file content would be like this.

    <head lang="en">
        <meta charset="UTF-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge">
        <title></title>
        <base href="/">

        <!-- bower:css -->
        <link rel="stylesheet" href="bower_components/bootstrap/dist/css/bootstrap.css?v=20150216161421">
        <link rel="stylesheet" href="bower_components/fontawesome/css/font-awesome.css?v=20150216161421">
        <!-- endinject -->

        <!-- inject:css -->
        <link rel="stylesheet" href="styles/kendo.common-bootstrap.min.css?v=20150216161421">
        <link rel="stylesheet" href="styles/kendo.bootstrap.min.css?v=20150216161421">
        <link rel="stylesheet" href="styles/app.css?v=20150216161421">
        <link rel="stylesheet" href="modules/module_1/k1.css?v=20150216161421">
        <link rel="stylesheet" href="modules/shared/style.css?v=20150216161421">
        <link rel="stylesheet" href="modules/shared/login/login.css?v=20150216161421">
        <link rel="stylesheet" href="modules/shared/validation/validation.css?v=20150216161421">
        <!-- endinject -->

        <!-- bower:js -->
        <script src="bower_components/jquery/dist/jquery.js?v=20150216161421"></script>
        <script src="bower_components/bootstrap/dist/js/bootstrap.js?v=20150216161421"></script>
        <script src="bower_components/node-uuid/uuid.js?v=20150216161421"></script>
        <script src="bower_components/signalr/jquery.signalR.js?v=20150216161421"></script>
        <script src="bower_components/angular/angular.js?v=20150216161421"></script>
        <script src="bower_components/angular-ui-router/release/angular-ui-router.js?v=20150216161421"></script>
        <script src="bower_components/angular-cookies/angular-cookies.js?v=20150216161421"></script>
        <script src="bower_components/angular-local-storage/dist/angular-local-storage.js?v=20150216161421"></script>
        <script src="bower_components/angular-growl/build/angular-growl.js?v=20150216161421"></script>
        <script src="bower_components/moment/moment.js?v=20150216161421"></script>
        <!-- endinject -->

        <!-- inject:js -->
        <script src="app.conf.js?v=20150216161421"></script>
        <script src="modules/module_1/module.conf.js?v=20150216161421"></script>
        <script src="modules/module_2/module.conf.js?v=20150216161421"></script>
        <script src="modules/shared/module.conf.js?v=20150216161421"></script>
        <script src="modules/module_1/controllers.js?v=20150216161421"></script>
        <script src="modules/module_2/controllers.js?v=20150216161421"></script>
        <script src="modules/shared/authorization.js?v=20150216161421"></script>
        <script src="modules/shared/loadingIndicator.js?v=20150216161421"></script>
        <script src="modules/shared/logger.js?v=20150216161421"></script>
        <script src="modules/shared/message.js?v=20150216161421"></script>
        <script src="modules/shared/security.js?v=20150216161421"></script>
        <script src="modules/shared/signalr.js?v=20150216161421"></script>
        <script src="modules/shared/utilities.js?v=20150216161421"></script>
        <script src="modules/shared/wix.js?v=20150216161421"></script>
        <script src="modules/shared/home/controller_home.js?v=20150216161421"></script>
        <script src="modules/shared/login/controller_login.js?v=20150216161421"></script>
        <script src="modules/shared/login/controller_session.js?v=20150216161421"></script>
        <script src="modules/shared/validation/validation.js?v=20150216161421"></script>
        <script src="modules/shared/task_status/task_status.js?v=20150216161421"></script>
        <script src="modules/shared/view1/controller_view1.js?v=20150216161421"></script>
        <script src="modules/shared/view2/controller_view2.js?v=20150216161421"></script>
        <script src="modules/shared/widgets/serverTimeWidget.js?v=20150216161421"></script>
        <script src="app.env.js?v=20150216161421"></script>
        <script src="app.js?v=20150216161421"></script>
        <script src="js/search.js?v=20150216161421"></script>
        <script src="js/layout.js?v=20150216161421"></script>
        <!-- endinject -->
    </head>

 

gulp-rename

This plugin is very simple; it is used to rename a file. In my project I have a template of "index.html" named "index.tpl.html", and the <script> and <link> elements are generated into this file stream in the previous step. Then I need to save the file content, renaming it to "index.html", which is done by this plugin.

    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');
    var rename = require('gulp-rename');

    gulp.task('TASKNAME', function () {
        return gulp.src('index.tpl.html')
            .pipe(inject(
                gulp.src(bower({ paths: 'app' }), { read: false }),
                { name: 'bower', relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(inject(
                gulp.src(javaScriptFiles, { read: false }),
                { relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(inject(
                gulp.src(cssFiles, { read: false }),
                { relative: true, transform: gulpInjectVersioningTranform }))
            .pipe(rename(target))
            .pipe(gulp.dest('app'));
    });

 

gulp-chmod

When we are working with some version control systems, for example Team Foundation Server, if the workspace is set to Server mode, all local files will be read-only. Then, when using gulp-rename and the dest function to write the output file, the file keeps its read-only mode. This makes the second run of the gulp task fail, since you cannot overwrite a read-only file.

In this case we need to use this plugin to change the file mode. It uses the Linux "chmod" argument syntax. So if I want to remove the read-only flag, I need the code below.

    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');
    var rename = require('gulp-rename');
    var chmod = require('gulp-chmod');

    gulp.task('TASKNAME', function () {
        return gulp.src('index.tpl.html')
            .pipe(/* load script and css files then inject */)
            .pipe(rename(target))
            .pipe(chmod(666))
            .pipe(gulp.dest('app'));
    });

 

gulp-concat, gulp-uglify and gulp-minify-css

In a release build, I need to combine all JavaScript and CSS files, minify them and inject them into "index.html". I used these three plugins for combination and minification.

    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');
    var rename = require('gulp-rename');
    var chmod = require('gulp-chmod');
    var concat = require('gulp-concat');
    var uglify = require('gulp-uglify');
    var minifyCSS = require('gulp-minify-css');

    gulp.task('TASKNAME1', function () {
        return gulp.src(javaScriptFiles)
            .pipe(uglify())
            .pipe(concat('app.min.js'))
            .pipe(chmod(666))
            .pipe(gulp.dest(build + '/js'));
    });

    gulp.task('TASKNAME2', function () {
        return gulp.src(cssFiles)
            .pipe(minifyCSS())
            .pipe(concat('app.min.css'))
            .pipe(chmod(666))
            .pipe(gulp.dest(build + '/css'));
    });

 

gulp-filter

The code above works well for the JavaScript and CSS files we created ourselves, but not for the files installed through Bower. Since the files detected by "main-bower-files" include both JavaScript and CSS files, we need to filter them somehow before running "gulp-uglify" and "gulp-minify-css".

"gulp-filter" enables us to work on a subset of the original files by filtering them with globbing. Now we can get all JavaScript files from "main-bower-files" by passing "**/*.js" to "gulp-filter" and pipe them to "gulp-uglify", and likewise filter "**/*.css" and pipe the result to "gulp-minify-css".

    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');
    var rename = require('gulp-rename');
    var chmod = require('gulp-chmod');
    var concat = require('gulp-concat');
    var uglify = require('gulp-uglify');
    var minifyCSS = require('gulp-minify-css');
    var filter = require('gulp-filter');

    gulp.task('TASKNAME1', function () {
        return gulp.src(bower({ paths: 'app' }))
            .pipe(filter('**/*.js'))
            .pipe(uglify())
            .pipe(concat('bower.min.js'))
            .pipe(chmod(666))
            .pipe(gulp.dest('.build/js'));
    });

    gulp.task('TASKNAME2', function () {
        return gulp.src(bower({ paths: 'app' }))
            .pipe(filter('**/*.css'))
            .pipe(minifyCSS())
            .pipe(concat('bower.min.css'))
            .pipe(chmod(666))
            .pipe(gulp.dest('.build/css'));
    });

 

gulp-if

Some Bower packages specify the original JavaScript and CSS files while others specify the minified versions. I don't want to re-minify the already-minified files in my gulp task, so I use "gulp-if" to filter them out.

"gulp-if" allows me to use a function to check the input files and pipe them into a plugin only if they pass the condition check. In this case I test the file names and perform "gulp-uglify" or "gulp-minify-css" only if their extensions are not ".min.js" or ".min.css".

    var path = require('path');
    var gulp = require('gulp');
    var bower = require('main-bower-files');
    var inject = require('gulp-inject');
    var rename = require('gulp-rename');
    var chmod = require('gulp-chmod');
    var concat = require('gulp-concat');
    var uglify = require('gulp-uglify');
    var minifyCSS = require('gulp-minify-css');
    var filter = require('gulp-filter');
    var gulpif = require('gulp-if');

    var isNotMinified = function (file) {
        var extname = path.extname(file.path);
        if (extname === '.js' || extname === '.css') {
            return path.extname(file.path.substr(0, file.path.length - extname.length)) !== '.min';
        }
        else {
            return false;
        }
    };

    gulp.task('TASKNAME1', function () {
        return gulp.src(bower({ paths: 'app' }))
            .pipe(filter('**/*.js'))
            .pipe(gulpif(isNotMinified, uglify()))
            .pipe(concat('bower.min.js'))
            .pipe(chmod(666))
            .pipe(gulp.dest('.build/js'));
    });

    gulp.task('TASKNAME2', function () {
        return gulp.src(bower({ paths: 'app' }))
            .pipe(filter('**/*.css'))
            .pipe(gulpif(isNotMinified, minifyCSS()))
            .pipe(concat('bower.min.css'))
            .pipe(chmod(666))
            .pipe(gulp.dest('.build/css'));
    });

 

gulp-preprocess

In order to generate some configuration files based on system environment variables, such as the WebAPI endpoint, protocol and debug flag, I use "gulp-preprocess".

    var gulp = require('gulp');
    var preprocess = require('gulp-preprocess');
    var rename = require('gulp-rename');
    var chmod = require('gulp-chmod');

    gulp.task('app.env.js', function () {
        return gulp.src('app/app.env.tpl.js')
            .pipe(preprocess())
            .pipe(rename('app.env.js'))
            .pipe(chmod(666))
            .pipe(gulp.dest('app'));
    });

The content of the template file "app.env.tpl.js" specifies which environment variables should be replaced.

    (function (window) {
        angular.module('environment', [])
            /* @ifdef DEBUG */
            .value('debug', true)
            /* @endif */
            .factory('wixEndpoint', [ function () {
                var scheme = '/* @echo WIX_ENDPOINT_SCHEME */';
                var address = '/* @echo WIX_ENDPOINT_ADDRESS */';
                var port = '/* @echo WIX_ENDPOINT_PORT */';
                return scheme + '://' + address + ':' + port;
            }])
            .factory('apiEndpoint', [ 'wixEndpoint', function (wixEndpoint) {
                var api = '/* @echo WIX_ENDPOINT_API */';
                return wixEndpoint + api;
            }]);
    })(window);

It will output the debug value if DEBUG is specified as an environment variable. It will also load the values of WIX_ENDPOINT_SCHEME, WIX_ENDPOINT_ADDRESS, WIX_ENDPOINT_PORT and WIX_ENDPOINT_API from environment variables and write them into this file. So the result in one of my development labs would look like this.

    (function (window) {
        angular.module('environment', [])
            .value('debug', true)
            .factory('wixEndpoint', [ function () {
                var scheme = 'http';
                var address = '10.222.115.220';
                var port = '8080';
                return scheme + '://' + address + ':' + port;
            }])
            .factory('apiEndpoint', [ 'wixEndpoint', function (wixEndpoint) {
                var api = '/api';
                return wixEndpoint + api;
            }]);
    })(window);
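To tie these together, below is a minimal sketch of how such a gulpfile might compose these plugins. The task names and the debug/release switch are my own assumptions (the real gulpfile uses the tasks shown above), so adjust them to your project; it uses gulp 3.x task-dependency syntax.

    // a hypothetical composition of the tasks described above (gulp 3.x syntax)
    var gulp = require('gulp');

    // pick debug or release from an environment variable (assumption)
    var isRelease = process.env.NODE_ENV === 'release';

    // debug build: regenerate app.env.js and inject the original files into index.html
    gulp.task('debug', ['app.env.js', 'index.html']);

    // release build: minify bower and app files, then inject the minified bundles
    // note: gulp 3 runs these dependencies in parallel; real ordering would need
    // task dependencies between them or a helper such as run-sequence
    gulp.task('release', ['app.env.js', 'bower.min.js', 'bower.min.css', 'app.min.js', 'app.min.css', 'index.html']);

    gulp.task('default', [isRelease ? 'release' : 'debug']);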

 

Hope this helps,

Shaun



One of my projects needs a C++ assembly for data encryption and decryption. We built that assembly with Visual Studio 2013 and tested it on a local machine. Everything ran well. But when I published it to a Microsoft Azure Website, it failed.

We spent half a day getting it resolved, and I think it's good to write down what we tried for future reference.

 

Bad Image Format Exception

The first exception we met was BadImageFormatException (Exception from HRESULT: 0x8007000B). This is a common exception when an Azure application tries to load a C++ assembly. In Azure, our application is deployed on an x64 Windows system, so if your C++ assembly was built for x86 you will see this exception.

One resolution: if you are building a web application deployed under IIS, you can turn on 'Enable 32-Bit Applications' in the advanced settings of your application pool.

(screenshot: Enable 32-Bit Applications in the application pool's advanced settings)

If you are deploying your application as an Azure Website, you can log in to the management portal and switch your Website to 32-bit mode if its web hosting plan is Standard. This is the same as setting Enable 32-Bit Applications to True.

(screenshot: switching the Azure Website to 32-bit mode in the portal)

Unfortunately we cannot change it since we also need some other assemblies in x64 mode. So we need to make sure the C++ assembly we built was x64.

 

Check Assembly x86 or x64

There are a lot of questions on StackOverflow asking how to tell whether a DLL was compiled for x86 or x64. You can use DUMPBIN with the /headers or /all flag; it will print "machine (x86)" or "machine (x64)".

If you have Cygwin installed, or have a Linux or Mac system available, you can use the "file" command to test the assembly more quickly.

Below I'm using the x86 and x64 versions of Internet Explorer as an example. As you can see, for the x86 assembly it returns "PE32" while for x64 it returns "PE32+".

(screenshot: "file" command output for the x86 and x64 iexplore.exe)

 

File Not Found Exception

After we ensured our C++ assembly was built for x64 and published it to Azure, we got another exception: "System.IO.FileNotFoundException (Exception from HRESULT: 0x8007007E)". Some articles say this is because your application cannot find the assembly and you'd better put it into %windir%\system32. You can try that, but if it still says "FileNotFoundException", it is mostly because the assembly depends on something that is missing on your machine.

In order to check what was missing, we ran Dependency Walker on the Azure machine and it reported that MSVCP120D.DLL and MSVCR120D.DLL were missing.

(photo: Dependency Walker reporting the missing MSVCP120D.DLL and MSVCR120D.DLL)

These DLLs are included in the Visual C++ Redistributable Package. But note that both of them have a "D" at the end of the name, which means our assembly needs the debug-mode VC++ runtimes. These should not be necessary in a production environment; the only reason our assembly needs them is that we built it in debug mode.

Now the resolution is clear: build the C++ assembly in x64 release mode and publish it, and everything works smoothly.

 

Summary

Loading a C++ assembly from a .NET project is very common, but it often introduces problems once published to Azure even though it worked well locally. In this post I talked about what I met and how I solved this kind of problem. Basically, when working with C++ in Azure, we need to keep in mind:

1, Is it built for x86 or x64?

2, Is it built in release or debug mode?

3, Does the hosting environment support x86?

And in order to find such problems as early as possible, we'd better have a dedicated local test machine with:

1, Windows Server x64 (English), 2008 R2 or 2012 based on what we need.

2, .NET Framework 4 or 4.5.

3, DO NOT INSTALL Visual Studio or any other development packages.

 

PS: Happy Chinese New Year!

 

Hope this helps,

Shaun
