Posted by Sandeep Chanda on June 26, 2015

Yesterday, the Microsoft Azure team announced the availability of the Azure Resource Usage and Rate Card APIs, which developers on the Azure platform can now leverage to programmatically retrieve usage and billing information. This in turn allows enterprises to charge their customers based on usage. The feature was long overdue for multi-tenant systems hosted on Azure: it allows accurate tracking of cloud spend and makes the cost of your cloud operations more predictable to manage. Specifically, using the Billing APIs, there are two areas you can query at the subscription level:

  1. Resource usage: The resource usage REST API lets you retrieve data consumption at the subscription level. The API acts as a resource provider under Azure Resource Manager, so you can use role-based access control features to allow or deny access to the data. The URI you call is the Usage Aggregates resource: https://management.azure.com/subscriptions/{subscription-Id}/providers/Microsoft.Commerce/UsageAggregates. You need to pass the API version, the report start and end date-times, and a granularity value of daily or hourly. What you get back, among other things, is the usage start and end times representing the timestamp of the actual recorded usage, the meter category (storage or otherwise), the meter subcategory indicating, for example, whether the storage is geo-redundant, and finally the quantity in units (typically GB). You can also set the showDetails flag to true, in which case the response also shows the region and the project using the resource.
  2. Rate card: The rate card REST API lets you fetch pricing information by locale, currency, and region. The URI you call is the Rate Card resource: https://management.azure.com/subscriptions/{subscription-Id}/providers/Microsoft.Commerce/RateCard. The response gives you the meter rates (based on the currency specified in the input) for all the available meter categories, such as cloud services, networking, and virtual machines.

The APIs can be used in various scenarios, such as finding your monthly spend, setting up alerts when usage crosses a specific threshold, and metering tenants based on usage.
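
As a quick illustration, the following C# sketch calls the Usage Aggregates endpoint with HttpClient. It assumes you have already acquired an Azure AD access token (for example, via ADAL); the token and subscription id are placeholders and the date range is illustrative.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class UsageAggregatesSample
{
    static async Task Main()
    {
        var client = new HttpClient();
        // Placeholder token; obtain a real one from Azure AD.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Daily granularity with per-resource details for May 2015.
        var url = "https://management.azure.com/subscriptions/<subscription-id>" +
            "/providers/Microsoft.Commerce/UsageAggregates" +
            "?api-version=2015-06-01-preview" +
            "&reportedStartTime=" + Uri.EscapeDataString("2015-05-01T00:00:00+00:00") +
            "&reportedEndTime=" + Uri.EscapeDataString("2015-06-01T00:00:00+00:00") +
            "&aggregationGranularity=Daily" +
            "&showDetails=true";

        var response = await client.GetAsync(url);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}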


Posted by Sandeep Chanda on June 17, 2015

The Command Query Responsibility Segregation (CQRS) pattern isolates the data querying aspects of an application from the insert, update, and delete operations. There are limited use cases where you should apply this pattern, and it doesn't suit request-response style scenarios in which updated results must be displayed to the user immediately after an insert, update, or delete. You must carefully evaluate your requirements to determine whether CQRS addresses your architectural needs. Typically, requirements more sophisticated than CRUD-driven information systems, such as information moving through different transient states of representation, are good candidates for CQRS.

Event Sourcing is a useful scenario in which the CQRS pattern can be leveraged. In an event sourcing scenario, the application stores state transitions as events in an event store. The read and write models may be in different states at any given moment, but the application eventually arrives at a consistent current state by playing the events in sequence. A good example is a highly scalable hotel reservation system in which certain attributes of the reservation can be modified until midnight of the day before arrival. The query and command operations in this scenario can be dealt with separately using the CQRS pattern, where the states may not be in sync but will eventually become consistent to determine the current state of the reservation.
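
To make the replay idea concrete, here is a minimal C# sketch of event sourcing for the reservation example. The types are illustrative and not tied to CQRSlite or any other framework; the point is only that the current state is rebuilt by applying the stored events in order.

using System;
using System.Collections.Generic;

// Illustrative state-transition events for a reservation.
abstract class Event { }
class ReservationCreated : Event { public DateTime Arrival; }
class ArrivalDateChanged : Event { public DateTime NewArrival; }

class Reservation
{
    public DateTime Arrival { get; private set; }

    // Rebuild the current state by playing the event history in sequence.
    public static Reservation FromHistory(IEnumerable<Event> history)
    {
        var reservation = new Reservation();
        foreach (var e in history)
            reservation.Apply(e);
        return reservation;
    }

    private void Apply(Event e)
    {
        switch (e)
        {
            case ReservationCreated created:
                Arrival = created.Arrival;
                break;
            case ArrivalDateChanged changed:
                Arrival = changed.NewArrival;
                break;
        }
    }
}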

CQRSlite is a useful and lightweight CQRS and Event Sourcing framework for starting to wrap your head around the pattern. To build a more robust architecture around the CQRS pattern, it is often useful to add a complex event processing (CEP) tool or a bus. Event Store is an open source, high performance, scalable and highly available event database that supports complex event processing in JavaScript. Client interfaces are provided for .NET in addition to the native HTTP API.


Posted by Sandeep Chanda on June 1, 2015

Last week's Google I/O 2015 event saw a slew of announcements, most notably Android M. A number of new features were also announced in the upcoming 7.5 version of Google Play Services, which brings quite a few interesting features and optimizations to the entire Android ecosystem. The integration of Google Smart Lock with Android apps is a cool new addition to this version. Chrome allows you to save your OpenID and password based credentials using the Chrome Password Manager. You can now retrieve the stored credentials in your Android apps using the newly added Credential API. The credentials can be retrieved as part of the login process in apps running on any device.

The API automatically provides the necessary UI to prompt users to store and retrieve credentials for future authentication. To store credentials, use the Credential API's Auth.CredentialsApi.save() method; to retrieve stored credentials, use the Auth.CredentialsApi.request() method. Beyond sign-in, you can also use the API to rapidly on-board users by partially filling in the app's sign-up form.

Another interesting addition was the release of App Invites (beta). The feature allows you to share your app with people you know. You can create actionable invite cards and send them via email, enabling you to market your app better. The invitations can be sent via SMS as well, providing a wider reach. You can also personalize access for certain invitees, such as adding discount codes. You send an invitation by creating an Intent using the AppInviteInvitation.IntentBuilder class. The Intent contains a title, a message, and deep-link data.

If the app is already installed on the user's device, the user can follow the invitation workflow generated by the deep-link data. If the app is not installed, they can choose to install it from the Play Store. The service is also available on iOS.


Posted by Sandeep Chanda on May 26, 2015

GitColony provides an easy-to-use, one-stop collaborative environment for your code reviews and QA processes. It integrates directly with your GitHub repository and provides an intuitive, gamified dashboard that helps you review code as it is written, instead of letting a large delta pile up for review over time. With GitColony, you don't have to wait for pull requests to review tons of code in one go. You can review code as it is written, in the form of partial reviews, making your reviews more actionable. It also remembers the last review, so that the same code need not be reviewed twice.

More interestingly, you can set up business rules around code quality, and as code gets built you receive notifications on the critical paths you have identified. If there are any changes to files in a critical path, you will certainly know about it. Using the rules, you can enforce your code review policies as well. The process to request a review is also very simple: tag the user you want to review your code in your commit message and a review request is automatically created.

Setting up GitColony is pretty easy. After registering your company and pointing to your GitHub account, you can set up the repositories to sync as shown in the following figure.

You can also configure the people in your team who will collaborate and participate in the review process.

This is, however, an optional step during configuration, and you can always come back to add people later. Note that the pricing option you select limits the number of people you can add to your GitColony account for collaboration. You can also set up the profile of each person collaborating on the platform.

You are now all set to use GitColony. The GitColony dashboard will display everything you need to know for monitoring the quality of your code. In addition, the dashboard will also display the incidents that are assigned to you.

GitColony not only allows you to establish a code review process, it also supports a robust QA process. It provides a QA plugin that lets you automate tests by recording actions from the browser. It also provides an integrated dev/QA environment by allowing the QA team to vote to approve or reject a live branch or a pull request. This ensures that no code makes its way to production unless it is certified by the QA team.


Posted by Sandeep Chanda on May 11, 2015

At the recent Build 2015 event, Microsoft announced the launch of Vorlon.js, an open source, extensible tool for debugging JavaScript. Vorlon.js is powered by Node.js, and you can remotely connect to a maximum of 50 devices simultaneously and run your JavaScript code on them with a single click, then see the results in your Vorlon.js dashboard. This powerful remote debugging tool was created by the same team behind Babylon.js, the popular WebGL framework for JavaScript.

The idea behind Vorlon.js is to allow developers to collaborate better on JavaScript code and debug together. The code written by one person is visible to all, and the experience is browser agnostic. There are no dependencies: just pure JavaScript, HTML, and CSS running on the target devices.

The tool itself is a lightweight web server that can be installed locally or run from a server, giving the team access to the dashboard that acts as the central command center and communicates with all the remote devices. The tool is also extensible, with a plug-in framework that developers can use to create their own plug-ins. It already comes with a few out-of-the-box plug-ins to view console logs, inspect the DOM, and display the JavaScript variable tree using the Object Explorer.

It is very easy to start using Vorlon.js. First install the NPM package using the Node.js console:

$ npm i -g vorlon

To start running the tool, type the vorlon command:

$ vorlon

Vorlon.js now runs on localhost port 1337 by default. To start monitoring your application, add a reference to the following script:

<script src="http://localhost:1337/vorlon.js/SESSIONID"></script>

Here, SESSIONID is any string that uniquely identifies the application. You can also omit it, in which case a default session id is used. You can start seeing the output, DOM, and console log in the dashboard by navigating to the following URL:

http://localhost:1337/dashboard/SESSIONID

You are now all set to use Vorlon.js.


Posted by Sandeep Chanda on April 28, 2015

While the last few decades were dominated by client-server technologies, this one will be the decade of connected systems and devices operating on cloud platforms. Service orientation has paved the way for hosted APIs and Software as a Service (SaaS). Communications between publishers and subscribers of such services are getting orchestrated through the cloud -- with a hosted and managed foundation, giving rise to a new world of software-defined infrastructure with programmable compute, networking, security, and storage. This means that development teams can worry less about hosting the infrastructure and instead focus on optimizing the core and non-functional aspects of the system under development. The following figure illustrates the reorganized technology stack you can expect to take shape in the near future:

At the infrastructure tier, virtual machines are now a thing of the past. The majority of the application stack will be managed by container technologies such as Docker, which are lightweight and can be built and deployed using scripts, making the GUI redundant. Microsoft Windows is already hitching a ride on container tech, making forays with the announcement of Nano Server. Container technologies will make it very easy for DevOps teams to automate release processes. Platforms such as Chef will be leveraged to turn your infrastructure into code and to automate deployments and testing. Microsoft is also working very closely with Chef to extend its capabilities on Nano Server.

Sitting a layer above will be APIs delivering Software as a Service. Platforms like Azure App Service and Apiary are already making it easier for developers to host their APIs and make them accessible via a marketplace. In addition, a variety of UI technologies are evolving that target devices of multiple form factors and allow the data from these APIs to be consumed.


Posted by Sandeep Chanda on April 20, 2015

The Microsoft Azure team had been treating the web and mobile app platforms independently up until now, supporting a different hosting model for each one. While Azure Mobile Services catered to the needs of the mobile-first approach, Azure Websites supported hosting of .NET, Node.js, Java, PHP, and Python based web applications with built-in scale. Underneath, these services were not that different from each other, and it was time for the Azure team to unify them under one platform. They did just that last month in the form of Azure App Service. Azure App Service brings together the Web App, Mobile App, Logic App (workflow), and API App services, which can easily connect with SaaS-based and on-premises applications — all with unified, low-cost pricing.

The Logic App is a new feature that you can use to create a workflow and automate the use of data across services without having to write any code. You can log in to the Azure preview portal and create a new Logic App by navigating to the Web + Mobile group under the New menu.

Under the Web + Mobile menu, click the Logic App option to create a new Logic App. Specify a name for the app and select an App Service Plan.

You can also configure other options, such as the pricing package, resource group, subscription, and location. You can then click the Triggers and Actions tab to configure the trigger logic. The following figure illustrates using the Facebook connector to post a recurring message from a weather service to the timeline.


Posted by Sandeep Chanda on April 7, 2015

Last week, Microsoft released an alpha version of TypeScript 1.5, its language for application-scale JavaScript development. The language now features a plug-in for Sublime Text, one of the most popular editors. The 1.5 version incorporates new concepts from the ES6 and ES7 specifications in the form of Modules, which were introduced in ES6, and Decorators. TypeScript 1.5 syntax allows you to define external modules and import them ES6 style, with the default import/export feature incorporated as well.

To define an export, you can use the export keyword before a function in the .ts file that will be referenced:

export function greet(person) {
    return "Hello, " + person;
}

For a default export, you can use the default keyword:

export default function greet(person) {
    return "Hello, " + person;
}

In your calling script, you can now reference the file containing the greet function and then import it.

import greet from "<the referenced file name>";

You can import multiple functions by wrapping them in curly braces.

import { greet, …} from "<the referenced file name>";

Alternatively, you can import everything into a namespace object using the following syntax:

import * as greeter from "<the referenced file name>";

TypeScript 1.5 will also support Decorators, a proposed ES7 feature. Decorators are a superset of metadata annotations and allow you to annotate and modify classes and properties at design time. The syntax is to precede the property or class you want to annotate with the decorator name, prefixed by @. For example, the following syntax shows how to create a read-only property:

class Person {
  @readonly
  name() { return `${this.first} ${this.last}`; }
}

Among other examples of decorators, you can use @memoize to memoize an accessor.

To build TypeScript from source in Node.js, create a Git clone of the repository and then install the Jake build tool and the dependencies:

npm install -g jake
npm install


Posted by Sandeep Chanda on March 26, 2015

For many years now, Dominick Baier and his team at Thinktecture have been relentlessly pursuing the cause of providing a lightweight alternative to costly server technologies for implementing really simple claims-based identity solutions. Their IdentityServer framework has graduated into an enterprise-class identity suite, with many large corporations leveraging it for single sign-on. With the release of IdentityServer3, it is now an OWIN/Katana based framework with hostable components to support SSO in modern web applications, supporting all modern identity specifications such as OpenID Connect and OAuth 2.0. It is very easy to configure IdentityServer3 in your ASP.NET MVC or Web API application.

First, you need to install the relevant NuGet packages: Microsoft.Owin.Host.SystemWeb and Thinktecture.IdentityServer3. Next, you need to set up an OWIN Startup class that replaces the ASP.NET host. You can create a Startup.cs file in your ASP.NET MVC project and call the UseIdentityServer extension method on IAppBuilder to set up IdentityServer in your OWIN host.

public void Configuration(IAppBuilder app)
{
    var options = new IdentityServerOptions
    {
        SigningCertificate = <implementation to fetch the certificate>,
        Factory = Factory.Create()
    };

    app.UseIdentityServer(options);
}

You must also decorate the class with the OwinStartup attribute:

[assembly: OwinStartup(typeof(<your startup class>))]

In addition, in your Web.config file you must set the runAllManagedModulesForAllRequests attribute to true to allow IdentityServer resources to be loaded correctly.

It is also possible to specify the clients that will leverage the identity server for authentication, as well as the provider supplying the identity information from a user database or LDAP repository. This completes the IdentityServer configuration, and you can browse the /identity/.well-known/openid-configuration URL to discover the endpoints.
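
For example, a factory wired up with IdentityServer3's in-memory options might look like the following sketch inside the Startup class (the client, user, and URL values are purely illustrative):

var factory = new IdentityServerServiceFactory()
    .UseInMemoryClients(new[]
    {
        // An illustrative MVC client using the implicit flow.
        new Client
        {
            ClientName = "MVC Client",
            ClientId = "mvc",
            Flow = Flows.Implicit,
            RedirectUris = new List<string> { "https://localhost:44300/" }
        }
    })
    .UseInMemoryUsers(new List<InMemoryUser>
    {
        new InMemoryUser { Username = "alice", Password = "secret", Subject = "1" }
    })
    .UseInMemoryScopes(StandardScopes.All);

The resulting factory is what you assign to the Factory property of IdentityServerOptions shown above.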

To add OAuth 2.0 support, IAppBuilder provides the UseJsonWebToken method, which you can configure in your Startup.cs file:

app.UseJsonWebToken(
    issuer: ConfigurationManager.AppSettings["Issuer"],
    audience: ConfigurationManager.AppSettings["Audience"],
    signingKey: signingKey);

You are all set. You can now use the Authorize attribute on your controller actions to authorize resource access and initiate authentication with IdentityServer3. IdentityServer3 will present the login page and, based on the configured identity provider, will let you log in to access the resource. The Authorize attribute is available out of the box in MVC. You can also use the more robust annotated resource authorization feature in IdentityServer3. To do so, install the Thinktecture.IdentityModel.Owin.ResourceAuthorization.Mvc package, and then you can start using the ResourceAuthorize attribute in your controller actions:

 [ResourceAuthorize("Read", "OrderDetails")]

You can now isolate the access control logic (who can read the order details, in the example above) in an AuthorizationManager that invokes the relevant check depending on the resource being accessed.

The AuthorizationManager should be registered as part of the OWIN startup configuration using IAppBuilder's UseResourceAuthorization method.
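
A minimal sketch of such a manager might look like this (the access rule is illustrative; the base class and context types come from the resource authorization package):

using System.Linq;
using System.Threading.Tasks;
using Thinktecture.IdentityModel.Owin.ResourceAuthorization;

public class AuthorizationManager : ResourceAuthorizationManager
{
    // Route authorization checks based on the resource being accessed.
    public override Task<bool> CheckAccessAsync(ResourceAuthorizationContext context)
    {
        var resource = context.Resource.First().Value;
        var action = context.Action.First().Value;

        if (resource == "OrderDetails" && action == "Read")
        {
            // Illustrative rule: any authenticated user may read order details.
            return Task.FromResult(context.Principal.Identity.IsAuthenticated);
        }

        return Task.FromResult(false);
    }
}

// In Startup.Configuration:
// app.UseResourceAuthorization(new AuthorizationManager());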


Posted by Sandeep Chanda on March 20, 2015

Recently, Scott Guthrie announced the release of ASP.NET 5 on his blog. This release features the most radical changes to the popular web development framework in its 15 year history, and the preview is now available for download. According to Scott, this open source release focuses on making the framework leaner, more modular, and optimized for the cloud. It is also highly portable, running with equal ease on Windows, Linux, and Mac based systems. Among the important updates to the core framework, the .NET Core runtime is now deployed as part of the app, allowing you to run multiple versions side by side and keeping the application independent of the framework installed on the host OS. In addition, to keep the framework lightweight, the components are available as NuGet packages and you can install only what your application requires.

Another interesting update for developer productivity in Visual Studio 2015 for ASP.NET 5 is the dynamic compilation feature, which lets changes, such as values assigned to variables, be reflected at runtime and updated in the UI output. This will go a long way toward helping developers debug without wasting time recompiling the entire project.

ASP.NET 5 also comes with MVC 6, which is a more unified model than previous editions, removing the redundancy across Web API, MVC, and Web Pages. This will also aid in seamless transitions from one platform to another. Syntactical improvements in Razor are also a nice addition. By extending the semantics of tags in markup, you can now use a much more robust declarative syntax, such as asp-validation-summary instead of @Html.ValidationSummary in your views.

However, the most important addition, in my opinion, is the native support for Dependency Injection. ASP.NET 5 now provides the Activate attribute, which can be leveraged to inject services via properties, not just in controllers but in filters and views as well.
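
As a minimal sketch of how this property injection looks in the preview bits (IOrderService and GetRecentOrders are hypothetical stand-ins for a service registered with the built-in container):

public class OrdersController : Controller
{
    // The framework activates this property from the service container.
    [Activate]
    public IOrderService OrderService { get; set; }

    public IActionResult Index()
    {
        // Hypothetical service call; the view renders the result.
        return View(OrderService.GetRecentOrders());
    }
}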

