Posted by Sandeep Chanda on July 28, 2014

Express is a powerful, yet lightweight and flexible, web application framework for Node.js. In this post we will explore how you can create and deploy an Express application in Microsoft Azure.

Prerequisites

First and foremost you need Node.js. Once you have installed Node.js, use the command prompt to install Express.

npm install express

You can also use the -g switch to install Express globally rather than in a specific directory.

In addition, you will need to create a web site in Microsoft Azure to host the application. If you have the Azure SDK for Node.js, you already have the command line tools; if not, use the following command to install the Azure command line tool:

npm install azure-cli

Create an Express App

Once you have installed Express, use the command prompt to create the Express scaffolding using the express command. This will install the scaffolding templates for views, controllers and other relevant resources, with Jade and Stylus support:

express --css stylus [Your App Name] 

Next, run the install command to install the dependencies:

npm install

This command will install the additional dependencies that are required by Express. The express command creates a package.json file that will be used by the Azure command tool to deploy the application and the dependencies in Azure.
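As a point of reference, the package.json generated by the express tool for a Stylus-enabled scaffold resembles the following (the application name and version numbers here are illustrative, not from the original post):

```json
{
  "name": "my-express-app",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "express": "~4.2.0",
    "jade": "~1.3.0",
    "stylus": "0.42.3"
  }
}
```

Azure reads the dependencies section of this file during deployment to restore the packages your application needs.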

The express command creates a folder structure for views, controllers and models, to which you can add your own. To modify the default view, edit the index.jade file under the views folder and add your own markup. The app.js file under the application folder contains an instance of Express:

var express = require('express');
var app = express();

You can now use the HTTP verb methods, such as app.get() and app.post(), to start defining routes.

Deploy an Express App

To deploy the Express app in Azure, first install the Azure command line tools for Node.js if you don't have the SDK installed. Next, download the publish settings from Azure and import them into your Node.js application using the following commands:

azure account download
azure account import <publish settings file path>

Next, you need to create a web site in Azure, along with a local git repository inside your application folder:

azure site create [your site name] --git

You can now commit your files to your local git repository and then use it to push for deployment to Azure using the following command:

git push azure master

You are now all set. Your Express Node.js application is deployed in Azure.


Posted by Sandeep Chanda on July 8, 2014

The Microsoft Open Technologies team announced support for Cordova tools in Visual Studio 2013 Update 2 during TechEd North America. If you have been building cross-platform apps using Apache Cordova, you know it is a great tool for leveraging your existing skills and assets in HTML5 and JavaScript, and packaging those assets into native apps for Android, iOS, and Windows Store/Phone.

You can download the Cordova tools for Visual Studio from Microsoft. An important prerequisite to note is that the tools run only on Windows 8.1. The multi-device hybrid apps that you create using the tools support the following platforms:

  1. Android version 4+
  2. iOS 6 and iOS 7
  3. Windows 8+ Store
  4. Windows 8+ Phone

Once you install the tool, the additional dependencies will automatically be downloaded to support debugging and running the apps from within Visual Studio for various devices. Some of the significant dependencies are:

  1. Node.js
  2. Android SDK
  3. Apache Ant
  4. Java SDK
  5. SQLite
  6. Apple iTunes

For Android apps, you can use the Ripple emulator to run the app on devices of various form factors. For iOS apps, you need a Mac with support for Xcode 5.1 to build and debug; the tools provide a remote agent that can be used to configure a remote Mac for remote debugging. To set up the remote agent, first install the ios-sim node module and then start the build server with the allowEmulate option set to true. In Visual Studio, configure the port and address of the remote build server in the Multi-Device Hybrid Apps general settings under Options.

You are now all set to create your first Cordova project. The blank Cordova project template appears under JavaScript. The default solution structure is organized into folders for CSS files, scripts and images. In addition, there is a folder called Res to store platform-specific assets and certificates, and a folder called Merges for platform-specific code.


Posted by Sandeep Chanda on June 30, 2014

In line with its Cloud First strategy, the Microsoft Azure team is aggressively working towards creating a connected systems and devices platform on the Azure Service Bus, called the Microsoft Azure Intelligent Systems Service. This will enable enterprises to aggregate data from a variety of devices, sensors, and other line-of-business applications. The data can then be used as a source for big data analytics with Azure HDInsight.

With all the scalability and performance attributes of Microsoft Azure in play, the platform could address the needs of industries such as transportation, manufacturing, supply chain, retail and healthcare, and enable the creation of smart cities and buildings. The idea of the Internet of Everything is to support more efficient use of resources by providing insight into the data aggregated from them. As Todd Holmquist-Sutherland and Paolo Salvatori noted in their Build 2014 conference session, the significance of the Internet of Things (IoT) lies in building the business of data-driven insight. That significance is not limited to gaining business insights, but extends to taking appropriate action through the platform. For example, in a smart grid, the system could analyze household power consumption trends and take action to control the inflow.

The Azure IoT stack provides a Cloud Gateway that acts as a secure boundary for communicating with the backend services. The gateway interprets messages from proprietary protocols and sends them to the ingress messaging layer. The Cloud Gateway supports two communication patterns: a Device Direct pattern, where IP-capable devices communicate directly with the Azure Service Bus for storage and notification, and a brokered Custom Gateway pattern, where a set of cloud services broker the communication. The gateway allows partitioning of the resources assigned to specific populations of devices, providing the ability to scale on demand.

The Cloud Gateway interacts with the Azure Service Bus using a topic-based publisher-subscriber pattern. Messages from the telemetry pump are split into alerts and data, which are sent to the alert processor and the storage pre-processor respectively. A session video from Build provides a quick preview of what's in store for the future of IoT with Microsoft Azure.


Posted by Sandeep Chanda on June 24, 2014

With mobile services in the fray, an ecosystem of publishers is constantly generating data and business processes that are licensed to consumers. This has resulted in significant investment in outsourcing API management to providers that can scale the infrastructure to support the growing demands of consumers. Microsoft Azure is not behind in this race, and its major May 12, 2014 release announced support for API Management.

In today's post (and future posts as well), we will explore some of the API Management capabilities that Microsoft Azure brings to the table.

Before you read further, note that this feature, like many others announced in the release, is still an early preview not meant to be used for applications in production.

Azure API Management

Microsoft Azure API Management addresses some key issues concerning the sharing of cloud assets by creating a façade on top of them and isolating the developer and publisher personas to control access. It also allows publishers to meter usage by developers and provides rich analytics on that usage.

In Azure, APIs are published as Products, containing one or more APIs wrapping the core backend operations.

Groups

The visibility of Products is controlled using Groups. There are three built-in groups, namely Administrators, Developers and Guests. While administrators are primarily responsible for authoring, developers are consumers of the authored APIs.

Policies

Azure API Management also provides a feature called Policies that is used to control the behaviour of APIs. A policy contains a series of instructions that are executed on each request to an API operation.

Developer Portal

APIs can be explored through a developer portal, where developers can read the documentation about API operations and ultimately consume them in their application. In a future blog post, we will look at creating and publishing a Cloud service as API and then consuming it in an application using the developer portal.


Posted by Sandeep Chanda on June 16, 2014

Redis is an extremely popular and powerful in-memory key-value store supporting a wide variety of data structures. Microsoft Azure has recently announced support for Redis Cache and the applications in Azure can leverage it as a high performance caching platform. Currently in preview mode, Azure Redis Cache is limited to a maximum size of 1GB. The platform can also be configured for replication providing high availability.

Configuring Redis Cache is very simple. First, log in to the Microsoft Azure Preview Portal and click "New" in the menu options.

Select the Redis Cache (Preview) option from the list to configure the cache. In the next dialog, specify a DNS name for the cache service. If required, specify the resource group and location, and select the pricing tier (at present the options are 250 MB and 1 GB).

You are now ready to access the cache from your application. Redis supports clients in several languages, including C# and Node.js. StackExchange.Redis is a popular C# client. Use the StackExchange.Redis NuGet package to install the client components required to connect to Azure Redis Cache.

The following code snippet illustrates the usage of the Redis cache provider.

// using StackExchange.Redis;
var connection = ConnectionMultiplexer.Connect("[specify the connection string to Azure Redis Cache Service]");
var cacheProvider = connection.GetDatabase();
cacheProvider.StringSet("key", "value");

You can now use it as you would any other cache, but with the addition of the powerful features of Redis.


Posted by Sandeep Chanda on June 6, 2014

Configuring Breeze Controller

Continuing from where we left off in the previous post, let us explore how you can use Breeze to create powerful HTTP services with Web API. When you add the Breeze Server NuGet package, you will notice that along with the relevant assemblies, a WebApiConfig class is added under the App_Start folder with a Register method, as shown below:

public static void Register(HttpConfiguration config)
{
    // Web API configuration and services

    // Web API routes
    config.MapHttpAttributeRoutes();

    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );
}

This registers the appropriate routes for Breeze. The Register method is called in the Application_Start method under Global.asax.

protected void Application_Start()
{
    GlobalConfiguration.Configure(WebApiConfig.Register);
}

Now let us create a Web API controller for an Entity Framework model. The first step is to decorate the controller class with the BreezeControllerAttribute. Note that you also need the Breeze Server for Web API 2 package, installed via NuGet, to get the artefacts necessary for augmenting the Web API services with Breeze.

[BreezeController]
public class SalesController : ApiController
{
}

Next, create an instance of the Breeze EFContextProvider using the EF DbContext, as shown in the code below:

[BreezeController]
public class SalesController : ApiController
{
    readonly EFContextProvider<AdventureWorksEntities> contextProvider =
        new EFContextProvider<AdventureWorksEntities>();
}

The EFContextProvider instance now enables you to write methods that query the AdventureWorksEntities DbContext. What's more, it translates your expressions into SQL queries and executes them on the database, providing optimal performance.

Methods for Querying Data

To query your entities, create methods that return them as IQueryable results from the EFContextProvider's Context. This allows clients to write OData expressions to filter the results.

// ~/breeze/sales/SalesOrders
// ~/breeze/sales/SalesOrders?$filter=[expression]
[HttpGet]
public IQueryable<SalesOrderDetail> SalesOrders()
{
    return contextProvider.Context.SalesOrderDetails;
}

Save Changes

Breeze also provides a mechanism that allows you to perform multiple insert/update operations and then persist them to the repository using the SaveChanges method exposed by the EFContextProvider.

// ~/breeze/sales/SaveChanges
[HttpPost]
public SaveResult SaveChanges(JObject saveBundle)
{
    return contextProvider.SaveChanges(saveBundle);
}

Metadata

One of the challenges with an RPC-style Web API design is that there is no metadata available to expose to the consumers of your service; the contract signatures have to be shared in an out-of-band fashion, which is a problem. Breeze allows you to expose the service metadata using the EFContextProvider's Metadata method:

// ~/breeze/sales/Metadata
[HttpGet]
public string Metadata()
{
    return contextProvider.Metadata();
}

You are all set now, having configured a Web API service with Breeze. If you run the application and browse to the metadata URL, you will see the method definitions listed.


Posted by Sandeep Chanda on May 26, 2014

Introduction

Breeze.js is a fairly popular JavaScript library (in quite a long list) that helps you manage business data objects in rich JavaScript client applications. The library's origins date back to the Silverlight days, when it gained popularity for its seamless integration with LINQ. Since its creators come from a .NET background, Breeze also provides a server side that enhances the capabilities of HTTP services built with Web API.

The ASP.NET Web API is very powerful for creating HTTP services, and it offers two approaches. The first is to create a controller for each entity with four methods representing the CRUD operations via the HTTP verbs; this follows the RESTful approach. The other option is to use OData services, which allow clients to create their own queries and use Web API to execute them on the data store.

If the services you are building with Web API are idiosyncratic in nature, you need to provide out-of-band metadata to your consumers about your API design. This is where Breeze Server can be very useful: you can use Breeze to design robust APIs, and Breeze will create and expose metadata for your controller.

Install Breeze

To use Breeze Server, you need to install a couple of NuGet packages.

This assumes that you will use Entity Framework as your data access layer. Alternatively, you can use Breeze with NHibernate.

Configure Breeze

Now that you have installed Breeze, you can start configuring your controller and API methods. For every controller, use the BreezeController attribute to indicate a Breeze controller class. Breeze also provides the EFContextProvider class, which wraps the EF DbContext to control the behaviour of CRUD operations and provide metadata for them.

In the next blog post we will see how the Web API controllers are manipulated by a Breeze Server implementation and explore its true power.


Posted by Sandeep Chanda on May 13, 2014

The Open Web Interface for .NET (OWIN) is a specification created by the .NET community to build an abstraction between web servers and the .NET Framework. Until .NET 4.0, ASP.NET was extremely monolithic, with a series of pipeline components tightly coupled to IIS. Whether you needed them or not, there was no way to get rid of them, and you paid a performance cost with every request.

Project Katana and OWIN attempt to solve some of the performance challenges with System.Web.dll, give you greater control in porting and hosting applications between different platforms, and make applications very easy to develop and extend.

The number of dependencies you need to write an OWIN-compatible web application is greatly reduced compared to traditional ASP.NET. When you create an OWIN project you will see the difference immediately.

Note that the OWIN specification is owned by the community, while project Katana is run by Microsoft (hence the credibility). The OWIN components are part of Owin.dll and are available as a NuGet package.

It is very easy to get started with OWIN. Create an empty Web Application project and add the OWIN NuGet package using the following command:

 install-package Microsoft.Owin.Host.SystemWeb 

Once it is installed, you can host the OWIN server in IIS using the IAppBuilder interface. Create a class named "Startup" (it is important to use this name for OWIN discovery) and add the following code to create the host:

public void Configuration(IAppBuilder app)
{
    app.Run(context =>
    {
        context.Response.ContentType = "text/html";
        return context.Response.WriteAsync("OWIN Started!");
    });
}

You can now run the application just like any other ASP.NET application and you will see OWIN running. Note that in this case OWIN runs in the ASP.NET pipeline; however, it is not difficult to switch to a different host.

To create a console host, install the OwinHost NuGet package using the command:

 Install-package OwinHost 

You can now launch OwinHost.exe from the installed packages folder. Make sure to run it from the project root so the host can find an assembly marked with the OwinStartup attribute. You are now all set to run OWIN from a console host. You will see the same output as earlier, except that it no longer uses the ASP.NET pipeline. This is very powerful. In a future post we will explore some of the very interesting capabilities the Katana project is throwing at us!


Posted by Sandeep Chanda on May 7, 2014

Callbacks have long been the lifeline of the Ajax world. However, with the increasing use of JavaScript in the enterprise, developers are looking beyond simple callbacks, primarily because as the code grows, regular Ajax callbacks become clumsy and hard to maintain and extend.

Promises (the Promise/A+ specification) and jQuery deferred objects show you the better side of JavaScript programming.

jQuery deferred objects and the Promise/A+ specification are slightly different, with Promise/A+ being more canonical. A jQuery deferred is a mutable proxy for an asynchronous future event, whereas a promise allows only state inspection: a promise, per the specification, is completely immutable.

A typical Ajax call passes a callback function (and often an error callback) as an argument, to be invoked when the response arrives.
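A sketch of that pattern, where getUser is an illustrative stand-in for $.ajax that invokes its callbacks synchronously so the flow is easy to follow:

```javascript
// A callback-style helper: the caller hands in success and error callbacks.
function getUser(id, success, error) {
  var users = { 1: 'Ada', 2: 'Grace' };
  if (users[id]) {
    success(users[id]);
  } else {
    error('user ' + id + ' not found');
  }
}

var name;
getUser(1, function (result) {
  name = result;          // runs when the "call" succeeds
}, function (err) {
  console.error(err);     // runs when it fails
});
```

As more dependent calls are added, each one nests another level of callbacks, which is exactly what makes this style hard to maintain.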

The most frequent issues with callbacks are conflicts with the DOM ready state and rendering. The same code written with promises is more elegant: a jQuery promise provides the done(), fail() and always() event handlers in an orderly fashion.
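jQuery attaches these handlers to the promise returned by $.ajax; the native Promise equivalents are then(), catch() and finally(). A minimal sketch with a native Promise, where fetchData is an illustrative stand-in for an Ajax call:

```javascript
function fetchData() {
  return Promise.resolve({ status: 'ok' });
}

var request = fetchData();

request
  .then(function (data) {    // jQuery: done()
    console.log('succeeded:', data.status);
  })
  .catch(function (err) {    // jQuery: fail()
    console.error('failed:', err);
  })
  .finally(function () {     // jQuery: always()
    console.log('request settled');
  });
```

Each handler attaches without disturbing the others, so success, failure, and cleanup logic stay cleanly separated.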

You can also use a promise to chain callbacks with the help of the then() method.
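For instance, each then() receives the previous step's result, so dependent asynchronous steps read top to bottom instead of nesting (the functions here are illustrative, shown with native Promises, which expose the same then() that jQuery promises do):

```javascript
function getUserId() {
  return Promise.resolve(7);
}

function getOrders(userId) {
  return Promise.resolve(['order-' + userId]);
}

// Each then() returns a new promise, so the steps chain cleanly.
var chain = getUserId()
  .then(function (id) { return getOrders(id); })
  .then(function (orders) { return orders.length; });
```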

Another significant feature is the ability to combine multiple service calls with the when() function.
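The native counterpart of jQuery's $.when() is Promise.all(), which waits for several calls to finish and yields their results together (the functions below are illustrative):

```javascript
function getProfile() {
  return Promise.resolve('profile');
}

function getSettings() {
  return Promise.resolve('settings');
}

// Promise.all resolves once both calls have resolved;
// results arrive in the same order as the input promises.
var combined = Promise.all([getProfile(), getSettings()])
  .then(function (results) {
    return results.join(',');
  });
```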

The Promise specification is the recommended way of writing JavaScript in enterprise scale applications, allowing you to easily write clean and maintainable code.


Posted by Sandeep Chanda on April 30, 2014

AppHarbor is a hosted .NET Platform as a Service that can scale your ASP.NET and ASP.NET MVC applications in the cloud. It also provides a marketplace of add-ons that extend the platform to support features such as database connectivity, application monitoring and scheduled services. An interesting aspect of AppHarbor is its support for connectivity with GitHub, allowing it to be used in conjunction with your Git source to create a continuous build environment. AppHarbor can then automatically deploy a successful build into the preconfigured environment to host the application.

To create a build and deployment environment, you need to first create an application in AppHarbor.

Log in to your AppHarbor account and create an application.

Notice that you can also specify the host country; currently AppHarbor is supported in only three countries. Once the application is created, you will get options to deploy an application to AppHarbor using BitBucket, CodePlex, or GitHub. If you are not using any of these services, you can deploy using a built-in Git repository.

After creating the application, select the "Deploy using GitHub" option. It will prompt you to log in to your GitHub account. Once logged in, it will take you to the New Bridge page, where you select the application source in GitHub that you want to deploy to AppHarbor.

Next, if your application has a SQL database to be hosted, click the Add-ons option and search for the database you need to connect. In this example, we are adding an instance of SQL Server.

The installed add-ons are available on the application home page; in our example you can find the SQL Server add-on installed there.

Click the installed add-on, and then, on the add-on details page, click "Go to SQL Server" to land on the database configuration page. You will find a few configuration variables that are empty. The variable of interest is the "Connectionstring alias". Edit this variable and specify the name of the connection string from your app. This tells AppHarbor to find the connection string in your application and replace it with the connection information of the database hosted by the add-on.

You are now all set up for build and deployment using AppHarbor. You can make changes to the application using Visual Studio 2013 and check in your changes using the TFS Git client.

Once your Git changes are synced, AppHarbor will automatically pick them up for a build and subsequent deployment. All builds are also stored for historical purposes, and if you need to deploy an older version you can do so by selecting the build to deploy.

AppHarbor is easy to configure and can get you up and running very quickly with a build and deployment environment to host .NET applications.

