Posted by Sandeep Chanda on August 14, 2014

In Visual Studio 2013, the team unified the performance and diagnostics experience (memory profiling, etc.) under one umbrella and named it the Performance and Diagnostics Hub. Available under the Debug menu, this option reduces a lot of the clutter involved in profiling client- and server-side code during a debug operation. There was a lot of visual noise in the IDE in the 2012 version, and the hub is a significant addition for improving developer productivity.

In the Performance and Diagnostics Hub, you can select the target and specify the performance tools with which you want to run diagnostics. There are various tools you can use to start capturing performance metrics like CPU usage and memory allocation. For example, you can collect CPU utilization metrics on a Windows Forms or WPF application.

The latest release of Update 3 brings with it some key enhancements to the CPU and memory usage tools. In the CPU usage tool, you can now right-click on a function name that was captured as part of the diagnostics and click View Source. This will allow you to easily navigate to the code that is consuming CPU in your application. The memory usage tool now allows you to capture memory usage for Win32 and WPF applications.

The hub will also allow you to identify hot paths in the application code that might be consuming more CPU cycles and may need refactoring.

You can also look for the functions that are doing the most work, as illustrated in the figure below.

Overall, the Performance and Diagnostics Hub has become a useful addition to the developer's arsenal, improving productivity and helping address the non-functional aspects of an application.


Posted by Sandeep Chanda on August 5, 2014

Microsoft Azure Service Bus Event Hubs provide a topic-based publish/subscribe messaging platform that allows for high-throughput, low-latency message processing. A preview version was recently released by the Microsoft Azure team.

Event Hubs is a component of the Service Bus and works alongside Service Bus topics and queues. Event Hubs provide a perfect platform for collecting event streams from multiple devices and sending them to an analytics engine for processing. This makes them ideal for an Internet of Things (IoT) scenario, where you can capture events from various connected devices and make meaningful decisions based on the ingested event stream.

You can also run analytics on the ingress stream to perform tenant billing and performance monitoring, among many other possibilities. Event Hubs not only provide a reliable message processing platform, but also support durability for a predefined retention period, allowing consumers to reconnect in case of a failure.

Getting Started

An Event Hub is part of a Service Bus namespace, and typically consists of a Publisher Policy, Consumer Groups, and Partition Keys. A publisher is a logical concept for publishing a message into an Event Hub, and a consumer is a logical concept for receiving messages. Partitions allow Event Hubs to scale, and subscribers connect to a partition. Events within a partition are also ordered for delivery.

Currently the supported protocols for pub/sub are HTTP and AMQP. Note that for receiving data, only AMQP is currently supported.

The Azure Service Bus NuGet package provides the EventProcessorHost and EventHubClient APIs to process messages and send messages to the hub, respectively. To start a host that can listen for incoming messages, you can create a new instance of the EventProcessorHost as shown below:

host = new EventProcessorHost(
    hostName,
    eventHubName,
    consumerGroupName,
    eventHubConnectionString,
    storageConnectionString,
    eventHubName.ToLowerInvariant());

Note that it is a good practice to share the hub name in lowercase to avoid any case conflicts in the names that subscribers may present. You need to provide the connection string for the Event Hub on the Service Bus namespace, the storage connection string for the checkpoint store, the name of the consumer group, and a host name. You can then implement the IEventProcessorFactory interface to provide a factory for processing the incoming messages, and the host instance can register that factory to listen for ingress using the RegisterEventProcessorFactoryAsync method. Similarly, from the client, you can create an instance of the Event Hub client using the EventHubClient.CreateFromConnectionString method, and then start sending messages using the SendAsync method that the client exposes.
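As a rough sketch of how these pieces fit together (the class names below are illustrative, not part of the API), a minimal event processor, its factory, and a client that sends an event might look like this:

using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// A minimal IEventProcessor implementation.
class SimpleEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context)
    {
        return Task.FromResult<object>(null);
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            // Read the event payload as a UTF-8 string.
            var payload = Encoding.UTF8.GetString(eventData.GetBytes());
            Console.WriteLine("Partition {0}: {1}", context.Lease.PartitionId, payload);
        }
        // Checkpoint so that a restarted host resumes from this point.
        await context.CheckpointAsync();
    }

    public Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        return Task.FromResult<object>(null);
    }
}

// The factory hands out one processor instance per partition.
class SimpleEventProcessorFactory : IEventProcessorFactory
{
    public IEventProcessor CreateEventProcessor(PartitionContext context)
    {
        return new SimpleEventProcessor();
    }
}

// Inside an async method: register the factory with the host,
// then send an event from the client side.
await host.RegisterEventProcessorFactoryAsync(new SimpleEventProcessorFactory());

var client = EventHubClient.CreateFromConnectionString(eventHubConnectionString, eventHubName);
await client.SendAsync(new EventData(Encoding.UTF8.GetBytes("sample telemetry")));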


Posted by Sandeep Chanda on July 31, 2014

In the previous post you learned how to set up an Express Node.js application in Microsoft Azure and make it a unit of continuous deployment using Git. An Express Node.js application without a data store to back it is not very useful, however. In this post you will explore setting up a MongoDB database using the Microsoft Azure marketplace that can then act as a repository for your Express Node.js web application to store large-scale unstructured data. Hosted in Azure, it is limited only by the platform's ability to scale, which is virtually infinite.

Getting Started

The first thing you need to do is subscribe to the MongoLab service from the Microsoft Azure store. MongoLab is a fully hosted MongoDB cloud database available with all the major cloud providers, including Azure.

To add MongoLab service to your subscription, click New in your management portal, and select the Store (preview) option.

Note that, depending on your subscription, the store may or may not be available to you. Reach out to Azure support if you need more details.

Find MongoLab under the App Services category in the store and select it to add it to your subscription.

The 500 MB tier is free to use. Enter your subscription details in the form that is presented and then click Purchase to complete adding it to your subscription. You can now use the Mongoose Node.js driver to connect to the MongoLab database and start storing your model data.

Installation

To install the Mongoose driver, run the following command in your console:

npm install mongoose --save

You are now all set to connect to the MongoDB database hosted in Azure. Get the connection string for the hosted instance and then use it in your Express Node.js application controller code:

var mongoose = require('mongoose');
mongoose.connect('[connection string]');

You can use the model function to associate a model with a Mongoose schema and then perform your operations on the model data.
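Here is a minimal sketch of that flow, assuming a hypothetical Article model (the schema fields and values are illustrative):

// Define a schema and compile it into a model.
var Schema = mongoose.Schema;
var articleSchema = new Schema({ title: String, body: String });
var Article = mongoose.model('Article', articleSchema);

// Persist a document and query it back.
var article = new Article({ title: 'Hello', body: 'Stored in MongoLab' });
article.save(function (err) {
    if (err) return console.error(err);
    Article.find({ title: 'Hello' }, function (err, docs) {
        console.log(docs);
    });
});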


Posted by Sandeep Chanda on July 28, 2014

Express is a powerful, yet lightweight and flexible web application framework for Node.js. In this post we will explore how you can create and deploy an Express application in Microsoft Azure.

Prerequisites

First and foremost you need Node.js. Once you have installed Node.js, use the command prompt to install Express.

npm install express

You can also use the -g switch to install Express globally rather than to a specific directory.

In addition, you will need to create a web site in Microsoft Azure that will host the application. If you have the Azure SDK for Node.js, you already have the command line tools; if not, use the following command to install the Azure command line tool:

npm install azure-cli

Create an Express App

Once you have installed Express, use the command prompt to create the Express scaffolding using the express command. This will install the scaffolding templates for views, controllers and other relevant resources, with Jade and Stylus support:

express --css stylus [Your App Name] 

Next, run the install command to install the dependencies:

npm install

This command will install the additional dependencies required by Express. The express command also creates a package.json file that will be used by the Azure command tool to deploy the application and its dependencies in Azure.
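For reference, the generated package.json looks roughly like the following sketch; the exact contents and versions depend on the Express release you scaffolded with:

{
  "name": "application-name",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "express": "3.x",
    "jade": "*",
    "stylus": "*"
  }
}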

The express command creates a folder structure for views, controllers and models to which you can add your own. To modify the default view, you can edit the index.jade file under the views folder and add your own markup. The app.js file under the application folder will contain an instance of Express:

var express = require('express');
var app = express();

You can now use the HTTP verb methods to start defining routes.
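A minimal route definition might look like the following sketch (the path and response text are illustrative):

// Respond to GET requests on the root path.
app.get('/', function (req, res) {
    res.send('Hello from Express on Azure!');
});

// Azure supplies the listening port through an environment variable.
app.listen(process.env.PORT || 3000);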

Deploy an Express App

In order to deploy the Express app in Azure, first install the Azure command line tools for Node.js if you don't have the SDK installed. The next thing you need to do is get the publish settings from Azure and import them into your Node.js application using the following commands:

azure account download
azure account import <publish settings file path>

Next, you need to create a web site in Azure, and also create a local git repository inside your application folder.

azure site create [your site name] --git

You can now commit your files to your local git repository and then push them to Azure for deployment using the following command:

git push azure master

You are now all set. The Express Node.js application is deployed in Azure.


Posted by Sandeep Chanda on July 8, 2014

The Microsoft Open Technologies team announced support for Cordova tools in Visual Studio 2013 Update 2 during Tech-Ed US. If you have been building cross-platform apps using Apache Cordova, you know that it is a great tool for leveraging your existing skills and assets in HTML5 and JavaScript, and then packaging those assets into native apps for Android, iOS, and Windows Store/Phone.

You can download the Cordova tools for Visual Studio here. An important prerequisite to note is that the tools run only on Windows 8.1. The multi-device hybrid apps that you create using the tools support the following platforms:

  1. Android version 4+
  2. iOS 6 and iOS 7
  3. Windows 8+ Store
  4. Windows 8+ Phone

Once you install the tool, the additional dependencies will automatically be downloaded to support debugging and running the apps from within Visual Studio for various devices. Some of the significant dependencies are:

  1. Node.js
  2. Android SDK
  3. Apache Ant
  4. Java SDK
  5. SQLite
  6. Apple iTunes

For Android apps, you can use the Ripple emulator to run the app on devices of various form factors. For iOS apps, you need a Mac with Xcode 5.1 support. The tools provide a remote agent that can be used to configure the Mac so you can debug remotely from Visual Studio. To set up the remote agent, you need to first install the ios-sim node module and then start the build server with the allowEmulate option set to true. In Visual Studio, you then configure the port and address of the remote build server under the Multi-Device Hybrid Apps general settings in Options.

You are now all set to create your first Cordova project. You will find the project template under JavaScript for creating a new blank Cordova project. The default solution structure is organized into folders for the CSS files, scripts and images. In addition, there is a folder called Res to store platform-specific assets and certificates, and a folder called Merges to add platform-specific code.


Posted by Sandeep Chanda on June 30, 2014

In line with the Cloud First strategy, the Microsoft Azure team is aggressively working towards creating a connected systems and devices platform using the Azure Service Bus, called the Microsoft Azure Intelligent Systems Service. This will enable enterprises to aggregate data from a variety of devices, sensors, and other line-of-business applications. The data can then serve as a source for big data analytics using Azure HDInsight.

With all the scalability and performance attributes of Microsoft Azure in play, the platform could address the needs of industries such as transportation, manufacturing, supply chain, retail and healthcare, and enable the creation of smart cities and buildings. The idea of the Internet of Everything is to support more efficient use of resources by providing insight into the data aggregated from them. As Todd Holmquist-Sutherland and Paolo Salvatori discussed in their Build 2014 conference session, the significance of the Internet of Things (IoT) lies in building the business of data-driven insight. The significance is not limited to gaining business insights, but also extends to taking appropriate action through the platform. For example, in a smart grid, the system could analyze the power consumption trends of households and take action to control the load.

The Azure IoT stack provides a Cloud Gateway that acts as a secure boundary for communicating with the backend services. The gateway interprets messages from proprietary protocols and sends them to the ingress messaging layer. The Cloud Gateway supports two communication patterns: a Device Direct pattern, where IP-capable devices communicate directly with the Azure Service Bus for storage and notification, and a brokered Custom Gateway pattern, where a set of cloud services broker the communication. The gateway allows partitioning of resources assigned to specific populations of devices, providing the ability to scale on demand.

The Cloud Gateway interacts with the Azure Service Bus using a topic-based publish/subscribe pattern. Messages from the telemetry pump are split into alerts and data, which are sent to the alert processor and the storage pre-processor respectively. This video from Build provides a quick preview into what's in store for the future of IoT with Microsoft Azure.


Posted by Sandeep Chanda on June 24, 2014

With mobile services in the fray, an ecosystem of publishers is constantly producing data and business processes that are licensed to consumers. This has resulted in significant investment in outsourcing API management to providers that can scale the infrastructure to support the growing demands of consumers. Microsoft Azure is not behind in this race, and as part of its major May 12, 2014 release it announced support for API Management.

In today's post (and future posts as well), we will explore some of the API Management capabilities that Microsoft Azure brings to the table.

Before you read further, note that this, like many other features announced in the release, is still an early preview and not meant to be used for applications in production.

Azure API Management

Microsoft Azure API Management addresses some key issues concerning the sharing of cloud assets by creating a façade on top of them and isolating the developer and publisher personas to control access. It also allows metering of usage by developers and provides rich analytics on that usage.

In Azure, APIs are published as Products, containing one or more APIs that wrap the core backend operations.

Groups

The visibility of Products is controlled using Groups. There are three built-in groups, namely Administrators, Developers and Guests. While administrators are primarily responsible for authoring, developers are consumers of the authored APIs.

Policies

Azure API Management also provides a feature called Policies that is used to control the behaviour of APIs. A policy contains a series of instructions that are executed on the request or response of an API operation.
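As an illustrative sketch (the element values shown are hypothetical, and the exact policy vocabulary may differ in the preview), a policy that throttles calls to an operation is expressed as an XML document along these lines:

<policies>
    <inbound>
        <base />
        <!-- Allow at most 10 calls per 60 seconds. -->
        <rate-limit calls="10" renewal-period="60" />
    </inbound>
    <outbound>
        <base />
    </outbound>
</policies>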

Developer Portal

APIs can be explored through a developer portal, where developers can read the documentation about API operations and ultimately consume them in their application. In a future blog post, we will look at creating and publishing a Cloud service as API and then consuming it in an application using the developer portal.


Posted by Sandeep Chanda on June 16, 2014

Redis is an extremely popular and powerful in-memory key-value store supporting a wide variety of data structures. Microsoft Azure recently announced support for Redis Cache, and applications in Azure can leverage it as a high-performance caching platform. Currently in preview, Azure Redis Cache is limited to a maximum size of 1 GB. The platform can also be configured for replication, providing high availability.

Configuring Redis Cache is very simple. First, log in to your Microsoft Azure Preview Portal and click "New" from the menu options.

Select the Redis Cache (Preview) option from the list to configure the cache. In the dialog that follows, specify a DNS name for the cache service. If required, specify the resource group and location, and select the pricing tier (at present the options are 250 MB and 1 GB).

You are now ready to access the cache from your application. Redis supports clients in several languages, including C# and Node.js. StackExchange.Redis is a popular client in C#. Use the StackExchange.Redis NuGet package to install the necessary client components required to connect to Azure Redis Cache.

The following code snippet illustrates the usage of the Redis cache provider.

var connection = ConnectionMultiplexer.Connect("[specify the connection string to Azure Redis Cache Service]");
var cacheProvider = connection.GetDatabase();
cacheProvider.StringSet("key", "value");

You can now use it as you would any other cache, but with the addition of the powerful features of Redis.
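Continuing from the snippet above, reading the value back is a one-liner; StringGet returns a RedisValue that converts implicitly to string:

// Retrieve the value cached under "key".
string value = cacheProvider.StringGet("key");
Console.WriteLine(value); // prints "value"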


Posted by Sandeep Chanda on June 6, 2014

Configuring Breeze Controller

Continuing from where we left off in the previous post, let us explore how you can use Breeze syntax to create powerful HTTP services with Web API. When you add the Breeze Server NuGet package, you will notice that along with the relevant assemblies, a WebApiConfig class is added under the App_Start folder with a Register method, as shown below:

public static void Register(HttpConfiguration config)
{
    // Web API configuration and services

    // Web API routes
    config.MapHttpAttributeRoutes();

    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );
}

This registers the appropriate routes for Breeze. The Register method is called in the Application_Start method under Global.asax.

protected void Application_Start()
{
    GlobalConfiguration.Configure(WebApiConfig.Register);
}

Now let us create a Web API controller for an Entity Framework model. The first step is to decorate the controller class with the BreezeControllerAttribute. Please note that you need to have the Breeze Server for Web API 2 package installed via NuGet to get the necessary artefacts for augmenting the Web API services with Breeze.

[BreezeController]
public class SalesController : ApiController
{
}

Next, create an instance of Breeze EFContextProvider using the EF DbContext as shown in the code below:

[BreezeController]
public class SalesController : ApiController
{
    readonly EFContextProvider<AdventureWorksEntities> contextProvider =
        new EFContextProvider<AdventureWorksEntities>();
}

The instance of the Breeze EFContextProvider will now enable you to write methods that query the AdventureWorksEntities DbContext. What is more, it will translate your LINQ expressions into SQL queries and execute them on the database, providing optimal performance.

Methods for Querying Data

To query your entities, you can create methods that return those entities from the EFContextProvider's Context property. This will allow clients to write OData expressions to filter results.

// ~/breeze/sales/SalesOrders
// ~/breeze/sales/SalesOrders?$filter=[expression]
[HttpGet]
public IQueryable<SalesOrderDetail> SalesOrders()
{
    return contextProvider.Context.SalesOrderDetails;
}

Save Changes

Breeze also provides a mechanism that allows you to perform multiple insert/update operations and then persist them in the repository using the SaveChanges method exposed by the EFContextProvider.

// ~/breeze/sales/SaveChanges
[HttpPost]
public SaveResult SaveChanges(JObject saveBundle)
{
    return contextProvider.SaveChanges(saveBundle);
}

Metadata

One of the challenges with RPC-style Web API design is that there is no metadata available to expose to the consumers of your service. The contract signatures have to be shared in an out-of-band fashion, which is a problem. Breeze allows you to expose the service metadata using the EFContextProvider's Metadata method:

// ~/breeze/sales/Metadata
[HttpGet]
public string Metadata()
{
    return contextProvider.Metadata();
}

You are all set now, having configured a Web API service with Breeze. If you run the application and browse to the metadata URL, you will see the method definitions listed.
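On the client side, the Breeze JavaScript library can consume this service directly. Here is a minimal sketch (the service name follows the route prefix above; the filter values are illustrative):

// Query the Breeze-enabled Web API from JavaScript.
var manager = new breeze.EntityManager('breeze/sales');
var query = breeze.EntityQuery.from('SalesOrders')
    .where('OrderQty', '>', 5)
    .take(10);

manager.executeQuery(query).then(function (data) {
    console.log(data.results);
}).fail(function (error) {
    console.error(error);
});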


Posted by Sandeep Chanda on May 26, 2014

Introduction

Breeze.js is a fairly popular JavaScript library (in quite a long list) that helps you manage business data objects from rich JavaScript client applications. The library first surfaced during the Silverlight days, where it gained popularity for its seamless integration with LINQ. Since its creators come from a .NET background, Breeze.js also provides a server side that enhances the capabilities of HTTP services built with Web API.

The ASP.NET Web API is very powerful for creating HTTP services. With Web API you have two approaches for creating HTTP services. The first approach is creating controllers for each entity, with four methods representing the CRUD operations using the HTTP verbs. This follows the RESTful approach. Another option is to use OData services that allow clients to create their own queries and use Web API to execute them on the data store.

If you are building services using Web API that are idiosyncratic in nature, you need to provide out-of-band metadata to your consumers about your API design. This is where Breeze Server can be very useful. You can use Breeze to design robust APIs, and Breeze will create and expose the metadata for your controller.

Install Breeze

To use Breeze Server, you need to install a couple of NuGet packages, as shown in the screen print below:

This assumes that you will use Entity Framework as your data access layer. Alternatively, you can also use it with NHibernate.

Configure Breeze

Now that you have installed Breeze, you can start configuring your controller and API methods with Breeze. For every controller, you can use the BreezeController attribute to indicate a Breeze controller class. Breeze also provides the EFContextProvider class that encapsulates the EF DbContext, controlling the CRUD operation behaviour and providing the metadata for the operations.

In the next blog post we will see how the Web API controllers are manipulated by a Breeze Server implementation and explore its true power.

