Posted by Sandeep Chanda on October 15, 2014

In one of the previous blog posts, I introduced DocumentDB - Microsoft's debut into the world of NoSQL databases. You learned how it differs from other offerings as a JSON-document-only database, and how to create an instance of DocumentDB in Azure.

In the previous post, you used NuGet to install the required packages to program against DocumentDB in a .NET application. Today, let's explore some of the programming constructs for operating on an instance of DocumentDB.

The first step is to create a repository that lets you connect to your instance of DocumentDB. Create a repository class and reference the Microsoft.Azure.Documents.Client namespace in it. The Database object can be used to create a database, as the following code illustrates:

Database db = DbClient.CreateDatabaseAsync(new Database { Id = DbId }).Result;

Here, DbClient is a property of type DocumentClient, exposed by the Microsoft.Azure.Documents.Client API in your repository class. It provides the CreateDatabaseAsync method to create a database in your DocumentDB instance. You need the following key values from your instance of DocumentDB in Azure:

  1. End point URL from Azure Management Portal
  2. Authentication Key
  3. Database Id
  4. Collection name
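
These values typically live in configuration. As an illustration, the app.config entries backing the property shown below might look like this (a sketch; the key names match the code that follows, while the values are placeholders):

<appSettings>
  <add key="endpoint" value="https://[your account].documents.azure.com:443/" />
  <add key="authKey" value="[your authorization key]" />
</appSettings>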

You can create an instance of DocumentClient using the following construct:

private static DocumentClient DbClient
{
    get
    {
        Uri endpointUri = new Uri(ConfigurationManager.AppSettings["endpoint"]);
        return new DocumentClient(endpointUri, ConfigurationManager.AppSettings["authKey"]);
    }
}

Next, you need to create a document collection using the CreateDocumentCollectionAsync method.

DocumentCollection collection = DbClient.CreateDocumentCollectionAsync(db.SelfLink, new DocumentCollection { Id = CollectionId }).Result;

You are now all set to perform DocumentDB operations using the repository. Note that you need to reference Microsoft.Azure.Documents.Linq to use LINQ constructs for querying. Here is an example:

var results = DbClient.CreateDocumentQuery<T>(collection.DocumentsLink); 

Note that whatever entity replaces type T, the properties of that entity must be decorated with the JsonProperty attribute to enable JSON serialization.
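
As an illustration, a minimal entity might look like the following sketch (the Employee class and its properties are hypothetical placeholders):

using Newtonsoft.Json;

public class Employee
{
    [JsonProperty(PropertyName = "id")]
    public string Id { get; set; }

    [JsonProperty(PropertyName = "name")]
    public string Name { get; set; }
}

With such an entity in place, a query can then be filtered with LINQ, for example:

var employees = DbClient.CreateDocumentQuery<Employee>(collection.DocumentsLink)
    .Where(e => e.Name == "John").ToList();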

To create an entry, you can use the CreateDocumentAsync method, passing an instance of your entity, as shown here:

DbClient.CreateDocumentAsync(collection.SelfLink, entity); // entity is an instance of type T

In a similar fashion, you can also use the equivalent ReplaceDocumentAsync method to update the data in your instance of DocumentDB.

Beyond .NET, DocumentDB also provides client libraries for JavaScript and Node.js. The interesting aspect is that it allows T-SQL-style operations such as the creation of stored procedures, triggers, and user-defined functions using JavaScript. You can write procedural logic in JavaScript, with atomic transactions. Performance is typically very good, with JSON mapped all the way from the client side to DocumentDB as the unit of storage.
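
To give a flavor of the server-side JavaScript model, here is a minimal stored procedure sketch (the procedure body is a hypothetical example built on DocumentDB's server-side getContext API):

function helloWorld() {
    // Fetch the response object from the server-side execution context
    var response = getContext().getResponse();
    response.setBody('Hello from a DocumentDB stored procedure!');
}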


Posted by Sandeep Chanda on October 10, 2014

The ongoing Xamarin Evolve conference is generating a lot of enthusiasm amongst cross-platform developers across the globe.

Xamarin has so far showcased an Android player, a simulator with hardware acceleration that claims to be much faster than the emulator that ships with the Android SDK. It is based on OpenGL and utilizes hardware-accelerated virtualization with VT-x and AMD-V. The player relies on VirtualBox 4.3 or higher to run, and runs equally well on Windows (7 or later) and OS X (10.7 or higher). After installing the player, you can select the emulator image to run, choosing the device to simulate from the Device Manager. The player will then run exactly like the Android SDK emulator, and you can perform various actions (typical of a hardware operation) by clicking the buttons provided on the right-hand side. You can also simulate operations like multi-touch, battery operations, and location controls. To install your apps for testing, you can drag and drop the APK file into the player.

Another cool release is the profiler, which can be leveraged to analyze C# code and profile it for potential performance bottlenecks and memory leaks. The profiler performs two important tasks: it samples to track memory allocation, and it inspects the call tree to determine the order in which functions are called. It also provides a snapshot of memory usage on a timeline, allowing developers to gain valuable insights into memory usage patterns.

My favourite feature so far, however, is the preview of Sketches. Sketches provides an environment in which to quickly evaluate code and analyse the outcome. It offers immediate results without the need to compile or deploy, and you can use it from Xamarin Studio. More on Sketches in the next post, after I install it and give it a try myself.


Posted by Sandeep Chanda on September 29, 2014

Azure is increasingly becoming a scalable CMS platform, with support for a host of popular CMS providers via the marketplace. The list already includes some of the big names in the CMS industry, like Umbraco, Kentico, Joomla, and DNN.

The most recent addition to this list is WordPress. It is very simple to create a WordPress website: go to the Azure Preview Portal and click New to go to the Gallery. Select Web from the navigation pane and you will see Scalable WordPress listed as one of the options (along with other options such as Umbraco and Joomla).

Scalable WordPress uses Azure Storage by default to store site content. This automatically allows you to use Azure CDN for the media content that you want to use in your WordPress website.

Once you select Scalable WordPress, you will be redirected to the website configuration pane, where you can specify the name of the website, the database and the storage configuration settings. You are all set!

Log in to your WordPress site dashboard to configure plug-ins like Jetpack. Jetpack, formerly available only with WordPress.com, is now also available with Scalable WordPress. Your WordPress CMS site hosted in Azure can now support millions of visits and scale on demand. The Azure WordPress CMS website supports auto-scale out of the box. You can also enable the backup and restore features available with Azure websites for your CMS site. It also supports publishing content from stage to production.


Posted by Sandeep Chanda on September 15, 2014

NuGet has been a fairly popular mechanism for publishing and distributing packaged components to be consumed by Visual Studio projects and solutions. Releases from the Microsoft product teams are increasingly being distributed as NuGet packages, and it is officially the package manager for the Microsoft development platform, including .NET.

NuGet.org is the central package repository used by authors and consumers for global open distribution. One limitation of the central repository is that, in large-scale enterprise teams, it often results in package version mismatches across teams, solutions, and projects. If not managed early, this spirals into a significant application versioning problem for release managers during deployment.

One approach to solving this problem is to use a local NuGet server that you provision for your enterprise. It mimics the central repository, but it remains under the control of your release managers, who can now decide which package versions to release to consumers. The idea is that your Visual Studio users will point to your local NuGet server instead of the central repository, and the release management team will control which versions of packages the teams use, for consistency.

It is very easy to create a NuGet server, and you can use the nuget command-line tool to publish packages to it. You will need an API key and the host URL.
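
For example, publishing a package to your internal server might look like the following (a sketch; the package name, URL, and key are placeholders):

nuget push MyCompany.Logging.1.0.0.nupkg -Source http://nuget.internal.example.com/ -ApiKey [your API key]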

Developers using Visual Studio can go to Tools → Options → NuGet Package Manager → Package Sources and add the internal package server as a source.

While local NuGet servers are used today as a mechanism for distributing internal packages, they can also be extended to become a gated process for distributing global packages, bringing consistency to the versions used across teams.


Posted by Sandeep Chanda on September 3, 2014

Microsoft's recent addition to the world of NoSQL databases has been greeted with quite a fanfare, and with mixed reviews from competing products. What is interesting is that Microsoft chose to build DocumentDB as a new Azure-only feature rather than enhance its already existing table storage capabilities.

DocumentDB is a JSON-document-only database as a service. A significant feature of DocumentDB that is missing in its traditional rivals is support for rich queries (including LINQ support) and transactions. What is also interesting is that the new SQL syntax for querying JSON documents automatically recognizes native JavaScript constructs. It also supports programmability features such as user-defined functions, stored procedures, and triggers. Given that it is backed by Azure with high availability and scalability, the offering seems to hold an extremely promising future.

To start, first create a new instance of DocumentDB in your Microsoft Azure Preview portal.

Click New in the preview portal and select DocumentDB. Specify a name and additional details like the capacity configuration and resource group. Go ahead and click the Create button to create an instance of DocumentDB. After creating the instance you can get the URI and keys by clicking on the Keys tile.

Done! You are now good to start using DocumentDB to store and query JSON documents. In your instance of Visual Studio, run the following NuGet command in the Package Manager Console to install the prerequisites for programming with DocumentDB.

PM> Install-Package Microsoft.Azure.Documents.Client -Pre

If you want to program against it using JavaScript, you can also install the JavaScript SDK from https://github.com/Azure/azure-documentdb-js and then leverage the REST interface to access DocumentDB using permission-based authorization. In a future post, we will look at some of the language constructs in programming with DocumentDB.


Posted by Sandeep Chanda on August 25, 2014

Enterprise monitoring needs have, over the years, been addressed to a large extent by Microsoft System Center Operations Manager (SCOM). The problem, however, is that SCOM produces a lot of noise, and the data can very quickly become irrelevant for producing any actionable information. IT teams easily fall into the trap of configuring SCOM for every possible scheme of alerts, but do not put effective mechanisms in place to improve the alert-to-noise ratio by creating a usable knowledge base out of the alerts that SCOM generates. Splunk and Hunk, its counterpart for Hadoop, could be very useful in the following aspects:

  1. Providing actionable analytics using the alert log in the form of self-service dashboards
  2. Isolation of vertical and horizontal monitoring needs
  3. Generating context around alerts or a group of alerts
  4. Collaboration between IT administrators and business analysts
  5. Creating a consistent alerting scale for participating systems
  6. Providing a governance model for iteratively fine-tuning the system

In your enterprise, Splunk could be positioned in a layer above SCOM, where it gets the alert log as input for processing and analysis. This pair can be used to address the following enterprise monitoring needs of an organization:

  1. Global Service Monitoring - Provides information on the overall health of the infrastructure, which includes surfacing actionable information on disk and CPU usage. It could also be extended to include network performance and the impact specific software applications are having on the health of the system. Splunk will augment SCOM in creating dashboards from the collected data that can help drive decisions. For example, looking at CPU usage trends on a timeline, IT owners can decide whether to increase or decrease the core fabric.
  2. Application Performance Monitoring - Splunk can be extremely useful in deriving business decisions from the instrumentation you do in code and the trace log it generates. You can, for instance, identify the purchase patterns of your customers. The application logs and alerts generated by custom applications and commercial off-the-shelf (COTS) software can be routed to Splunk via SCOM using the management packs. Splunk can then help you create management dashboards that, in turn, will help the executive team decide the future course of business.

Using Splunk in conjunction with SCOM gives you a very robust enterprise monitoring infrastructure. That said, the true benefit of this stack can be realized only with an appropriate architecture for alert design, process guidance on thresholds, and identification of key performance indicators to improve the signal-to-noise ratio.


Posted by Sandeep Chanda on August 14, 2014

In Visual Studio 2013, the team unified the performance and diagnostics experience (memory profiling, etc.) under one umbrella and named it the Performance and Diagnostics Hub. Available under the Debug menu, this option reduces a lot of clutter in profiling client- and server-side code during a debug operation. There was a lot of visual noise in the IDE in the 2012 version, and the hub is a significant addition for improving developer productivity.

In the Performance and Diagnostics Hub, you can select the target and specify the performance tools with which you want to run diagnostics. There are various tools that you can use to start capturing performance metrics like CPU usage and memory allocation. You can collect CPU utilization metrics on a Windows Forms-based or WPF application.

The latest release, Update 3, brings with it some key enhancements to the CPU and memory usage tools. In the CPU usage tool, you can now right-click a function name that was captured as part of the diagnostics and click View Source. This allows you to easily navigate to the code that is consuming CPU in your application. The memory usage tool now allows you to capture memory usage for Win32 and WPF applications.

The hub will also allow you to identify hot paths in the application code that might be consuming more CPU cycles and may need refactoring.

You can also look for the functions that are doing the most work.

Overall, the Performance and Diagnostics Hub has become a useful addition to the developer's arsenal, improving productivity and helping address the non-functional aspects of an application.


Posted by Sandeep Chanda on August 5, 2014

Microsoft Azure Service Bus Event Hubs provide a topic-based publish/subscribe messaging platform that allows for high-throughput, low-latency message processing. A preview version was recently released by the Microsoft Azure team.

Event Hubs is a component of Service Bus and works alongside Service Bus topics and queues. It provides a perfect platform for collecting event streams from multiple devices and sending them to an analytics engine for processing. This makes it ideal for an Internet of Things (IoT) scenario, where you can capture events from various connected devices and make meaningful decisions based on the ingested event stream.

You can also run analytics on the ingress stream for tenant billing and performance monitoring, among many other possibilities. Event Hubs not only provides a reliable message-processing platform, but also supports durability for a predefined retention period, allowing consumers to reconnect in case of a failure.

Getting Started

An Event Hub is part of a Service Bus namespace and typically consists of a Publisher Policy, Consumer Groups, and Partition Keys. A publisher is a logical concept for publishing a message into an Event Hub, and a consumer is a logical concept for receiving messages. Partitions allow Event Hubs to scale, and subscribers connect to a partition. Events within a partition are also ordered for delivery.

Currently, the supported protocols for pub-sub are HTTP and AMQP. Note that for receiving data, only AMQP is supported.

The Azure Service Bus NuGet package provides the EventProcessorHost and EventHubClient APIs to process messages and send messages to the hub, respectively. To start a host that can listen for incoming messages, you can create a new instance of the EventProcessorHost as shown below:

host = new EventProcessorHost(
    hostName,
    eventHubName,
    consumerGroupName,
    eventHubConnectionString,
    storageConnectionString,
    eventHubName.ToLowerInvariant());

Note that it is a good practice to share the hub name in lowercase to avoid any case conflicts in the names that subscribers may present. You need to provide the connection string for the Event Hub on the Service Bus namespace, the storage connection string for the queue, the name of the consumer group, and a host name. You can then create a processor factory implementation using the IEventProcessorFactory interface to process the incoming messages, and the host instance can register that factory to listen for ingress using the RegisterEventProcessorFactoryAsync method. Similarly, from the client, you can create an instance of the Event Hub client using the EventHubClient.CreateFromConnectionString method and then start sending messages using the SendAsync method that the client exposes.
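
As an illustration, here is a minimal sketch of the sending side, assuming the Microsoft.ServiceBus.Messaging namespace from the Service Bus NuGet package (the connection string, hub name, and payload are placeholders):

// Create a client for the hub and send one event as a UTF-8 encoded JSON payload
var client = EventHubClient.CreateFromConnectionString(
    eventHubConnectionString, eventHubName);
var payload = Encoding.UTF8.GetBytes("{ \"deviceId\": \"42\", \"temperature\": 21.5 }");
await client.SendAsync(new EventData(payload));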


Posted by Sandeep Chanda on July 31, 2014

In the previous post you learned how to set up an Express Node.js application in Microsoft Azure and make it a unit of continuous deployment using Git. An Express Node.js application sitting in a silo, without a data store to back it, is not very useful. In this post you will explore setting up a MongoDB database using the Microsoft Azure marketplace, which can then act as a repository for your Express Node.js web application to store large-scale unstructured data. Hosted in Azure, it is limited only by the ability of the platform to scale, which is virtually infinite.

Getting Started

The first thing you need to do is subscribe to the MongoLab service from the Microsoft Azure store. MongoLab is a fully hosted MongoDB cloud database that is available with all the major cloud providers, including Azure.

To add MongoLab service to your subscription, click New in your management portal, and select the Store (preview) option.

Note that, depending on your subscription, the store may or may not be available to you. Reach out to Azure support if you need more details.

Find MongoLab under the App Services category in the store and select it to add it to your subscription.

500 MB is free to use. Enter your subscription details in the form that is presented, and then click Purchase to complete adding it to your subscription. You can now use the Mongoose Node.js driver to connect to the MongoLab service database and start storing your model data.

Installation

To install the Mongoose driver, run the following command in your console:

npm install mongoose --save

You are now all set to connect to the MongoDB database hosted in Azure. Get the connection string for the hosted instance and then use it in your Express Node.js application controller code:

var mongoose = require('mongoose');
mongoose.connect('[connection string]');

You can use the model function to associate a model with a Mongoose schema and then perform your operations on the model data, as the sketch below illustrates.
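
Here is a minimal sketch of defining a schema and model and saving a document (the Employee model and its fields are hypothetical placeholders):

// Define a schema and associate a model with it
var employeeSchema = new mongoose.Schema({
    name: String,
    department: String
});
var Employee = mongoose.model('Employee', employeeSchema);

// Save a document to the MongoLab-hosted database
new Employee({ name: 'John', department: 'Sales' }).save(function (err) {
    if (err) console.log(err);
});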


Posted by Sandeep Chanda on July 28, 2014

Express is a powerful, yet lightweight and flexible, web application framework for Node.js. In this post we will explore how you can create and deploy an Express application in Microsoft Azure.

Prerequisites

First and foremost you need Node.js. Once you have installed Node.js, use the command prompt to install Express.

npm install express

You can also use the -g switch to install Express globally rather than to a specific directory.

In addition, you will need to create a web site in Microsoft Azure that will host the application. If you have the Azure SDK for Node.js, you already have the command-line tools; if not, use the following command to install the Azure command-line tool:

npm install azure-cli

Create an Express App

Once you have installed Express, use the command prompt to create the Express scaffolding using the express command. This will install the scaffolding templates for views, controllers, and other relevant resources, with Jade and Stylus support:

express --css stylus [Your App Name] 

Next, run the install command to install the dependencies:

npm install

This command will install the additional dependencies that are required by Express. The express command creates a package.json file that will be used by the Azure command tool to deploy the application and the dependencies in Azure.

The express command creates a folder structure for views, controllers, and models, to which you can add your own. To modify the default view, you can edit the index.jade file under the views folder and add your own markup. The app.js file under the application folder will contain an instance of Express:

var express = require('express');
var app = express();

You can now use HTTP verbs to start defining routes, as shown in the sketch below.
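
For instance, a minimal route definition might look like this (the path and response text are placeholders):

// Respond to GET requests on the root path
app.get('/', function (req, res) {
    res.send('Hello from Azure!');
});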

Deploy an Express App

In order to deploy the Express app in Azure, first install the Azure command-line tools for Node.js if you don’t have the SDK installed. Next, get the publish settings from Azure and import them into your Node.js application using the following commands:

azure account download
azure account import <publish settings file path>

Next, you need to create a web site in Azure, and also create a local git repository inside your application folder.

azure site create [your site name] --git

You can now commit your files to your local git repository and then push them to Azure for deployment using the following command:

git push azure master

You are now all set. The Express Node.js application is deployed in Azure.

