The Time Has Come for the Decentralized Application Architecture: Page 2

The business application architecture many companies use is as archaic as the decades-old centralized mainframe model. Companies would be better served by a decentralized architecture in which their applications and databases are distributed across regional locations and local devices.


The Roadblock to Decentralization: The Corporate Database
The corporate database represents the thorniest problem in adopting decentralized, or distributed, application architectures, for the following reasons:
  • Technology—The database is the most challenging technical aspect of a distributed application model. Traditional replication technologies have not provided two-way sharing of both structured and unstructured data among multiple, heterogeneous databases. No technology has been able to automatically distribute, replicate, and synchronize multiple distributed and heterogeneous databases while still providing corporate IT with centralized control (see the synchronization sketch below this list).
  • Security—Placing valuable data outside corporate offices traditionally has been considered a risk because of the threat of theft or other losses. Partitioning the dataset so that each location receives only the data it requires has been a complex undertaking and difficult to maintain in a changing environment. Furthermore, encrypting data stored on a device historically has been too compute-intensive to be viable protection against theft, and authentication technology has been too expensive to allow local use of biometric or multi-token devices.
  • Economics—Corporate databases are diverse, expensive, and management-intensive. Distributed databases that require local DBA support are expensive to deploy, and purchasing separate licenses for each remote location can quickly become cost-prohibitive.
  • Culture—Most IT personnel have been schooled in the culture of the untouchable centralized corporate database. Vendors of centralized, mainframe-type systems, such as Oracle and IBM, have promoted this view. Adopting a distributed model requires a shift in thinking, from maximizing IT efficiency to maximizing field-worker effectiveness.
    A number of market forces at work today are breaking down these barriers to the distributed application model. For one, cheap disk storage and powerful PCs make it possible and cost-effective to distribute, store, and process vast amounts of data anywhere in the enterprise. Databases used to be centralized out of necessity; today, however, it is quite possible to put the entire customer database on a salesperson's notebook. Additionally, today's corporate emphasis on disaster recovery and high availability, combined with slashed IT budgets, makes an inherently redundant, decentralized application infrastructure far more attractive and cost-effective than today's capital-intensive "fail-over" or "hot-standby" backup schemes.
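    To make the synchronization problem concrete, the sketch below illustrates the kind of timestamp-based, last-writer-wins merge a two-way replication scheme might perform between the central database and a copy carried on a salesperson's notebook. It is a hypothetical in-memory model for illustration only, not a description of any vendor's replication API; the row keys and values are invented.

import java.util.HashMap;
import java.util.Map;

// Illustrative two-way synchronization of one table between a central and a
// remote copy, using per-row modification timestamps and a last-writer-wins
// rule. A hypothetical in-memory model, not a vendor replication API.
public class TwoWaySyncSketch {

    // A row value plus the logical timestamp of its last modification.
    static final class Versioned {
        final String value;
        final long modifiedAt;
        Versioned(String value, long modifiedAt) {
            this.value = value;
            this.modifiedAt = modifiedAt;
        }
    }

    // Copy every row that is newer on one side onto the other side.
    static void synchronize(Map<String, Versioned> a, Map<String, Versioned> b) {
        mergeNewer(a, b);
        mergeNewer(b, a);
    }

    private static void mergeNewer(Map<String, Versioned> from, Map<String, Versioned> to) {
        for (Map.Entry<String, Versioned> e : from.entrySet()) {
            Versioned existing = to.get(e.getKey());
            if (existing == null || e.getValue().modifiedAt > existing.modifiedAt) {
                to.put(e.getKey(), e.getValue());   // the newer write wins
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Versioned> central = new HashMap<String, Versioned>();
        Map<String, Versioned> laptop  = new HashMap<String, Versioned>();
        central.put("cust-42", new Versioned("status=prospect", 100));
        laptop.put("cust-42",  new Versioned("status=closed-won", 200)); // edited in the field
        synchronize(central, laptop);
        System.out.println(central.get("cust-42").value);  // prints status=closed-won
    }
}

    A production system would also need tombstones for deleted rows and site-specific conflict policies, which is precisely where centralized IT control over the synchronization rules comes back into play.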

    Network infrastructure has improved to the point where many systems actually operate in an "occasionally disconnected" mode. Remote workers generally have network connectivity or can gain access to a corporate network with limited effort. These connections—whether wired or wireless—may have limited bandwidth or latency unsuitable for presenting a complex user interface, but they are ideal for synchronizing data at near real-time frequency. Combined with an approach that allows for disconnected operation, this means the application can deliver a high-quality user experience regardless of connection state.
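    A minimal Java sketch of this occasionally connected pattern follows: the application records changes against local storage and never blocks on the network, while a background thread probes for connectivity and drains an outbound queue whenever a link is available. The sync endpoint, probe interval, and Change type are placeholders for illustration, not part of any particular product.

import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of an "occasionally connected" client: the application always works
// against local data, while a background thread opportunistically pushes
// queued changes to a corporate sync endpoint whenever connectivity exists.
// The host, port, and Change type are hypothetical placeholders.
public class OccasionallyConnectedSync implements Runnable {

    static final class Change {
        final String payload;
        Change(String payload) { this.payload = payload; }
    }

    private final BlockingQueue<Change> outbound = new LinkedBlockingQueue<Change>();
    private final String syncHost = "sync.example.corp";   // assumed endpoint
    private final int syncPort = 8443;

    // Called by the application; never blocks on the network.
    public void recordLocalChange(String payload) {
        outbound.add(new Change(payload));
    }

    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                if (isReachable()) {
                    Change change = outbound.poll();
                    while (change != null) {
                        push(change);                 // send over the low-bandwidth channel
                        change = outbound.poll();
                    }
                }
                Thread.sleep(30000);                  // probe again in 30 seconds
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }

    // Cheap connectivity probe: can we open a socket to the sync endpoint?
    private boolean isReachable() {
        Socket s = new Socket();
        try {
            s.connect(new InetSocketAddress(syncHost, syncPort), 2000);
            return true;
        } catch (Exception e) {
            return false;                             // disconnected: keep queueing locally
        } finally {
            try { s.close(); } catch (Exception ignored) { }
        }
    }

    private void push(Change change) {
        // A real system would transmit the change to the central database;
        // this stand-in keeps the sketch self-contained.
        System.out.println("synchronized: " + change.payload);
    }
}

    Because the queue drains incrementally, even a low-bandwidth, high-latency link is sufficient; the user interface never waits on it.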



    To illustrate this, think about a simple trip from home to the airport, during which a wireless network device might operate well on the highway, lose connectivity briefly in an underpass, tunnel, or parking garage, and lose it entirely once the airplane departs. That type of generally available, but occasionally disconnected, access is going to be the typical connectivity model for remote workers for the foreseeable future. Such a network infrastructure cannot support truly continuous access to critical, centralized applications and data, but it is entirely capable of transferring data to provide the highest-quality user experience for applications that support disconnected operation. There is no reason enterprise applications cannot provide the same network transparency that a mail reader does.

    While encrypting data in transit is no longer an issue thanks to widely available strong stream-encryption technology, the additional power of today's computing devices also makes it possible to encrypt data stored on disk without sacrificing basic performance. The advent of biometric devices such as the PC Card thumbprint reader means that strong local authentication can be deployed at very low cost. The net result is that data—whether stored on remote servers or on individual laptops—can enjoy a high level of security at a reasonable cost.
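    As a rough illustration, the standard Java cryptography APIs are sufficient to encrypt records cached on a laptop so that the files are useless to a thief without the key. Key management, and protecting the key with a passphrase or biometric token, are deliberately left out of this sketch.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// Minimal sketch of encrypting locally stored data with AES-GCM via the
// standard Java cryptography APIs. Key storage and management are out of
// scope; the sample record is invented.
public class LocalDataEncryptionSketch {

    public static void main(String[] args) throws Exception {
        // Generate a 256-bit AES key (in practice, derived from or protected by
        // the user's credentials or a hardware token rather than held in memory).
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        byte[] record = "cust-42,status=closed-won".getBytes(StandardCharsets.UTF_8);

        // Encrypt with a fresh random nonce; store the nonce alongside the ciphertext.
        byte[] nonce = new byte[12];
        new SecureRandom().nextBytes(nonce);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, nonce));
        byte[] ciphertext = cipher.doFinal(record);

        // Decrypt on read using the same key and nonce.
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, nonce));
        byte[] plaintext = cipher.doFinal(ciphertext);
        System.out.println(new String(plaintext, StandardCharsets.UTF_8));
    }
}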

    And finally, open-source databases and application servers enable companies to overcome the cost barriers inherent in traditional vendor licensing schemes, making it economically feasible to deploy these system components in multiple locations at the edge of the network.

    The time is ripe for IT departments to decentralize their business applications and move databases and applications to the edge of the network. So, how do they do it?


