ADO.NET Best Practices, Part II

In ADO, inner joins are accomplished using SQL statements within a single batch. With ADO.NET, you can obtain the same data organized in independent but related tables, which greatly simplifies updates. Which is faster? That depends on the application.


ADO.NET has a strong and powerful disconnected model that lets programmers build a web of in-memory objects and relate content coming from different tables and even from different data sources. It's important to think through your code strategy carefully when processing queries and updates on interrelated tables in order to preserve scalability and maintain high performance. Sometimes it's much more effective to split compound queries; sometimes it isn't. Submitting changes in batch mode, exploiting the DataSet and its disconnected model, may be the perfect solution. However, when you need to move dozens of megabytes, the DataSet's serialization mechanism increases the size of the data considerably. In ADO.NET, the magic word is "tradeoffs."

All data-driven applications look alike. They are built around a few basic tasks: fetch data across different tables, join columns, and filter data based on user-specific criteria. Data-driven applications must also format data in such a way that end users can easily and pleasantly consume it. More often than not, the data consists of cross-related tables in which parent/child relationships define the data model and extract specific subsets of information, such as detail views and subtotals.
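
To make the disconnected model concrete, here is a minimal sketch that fills two related tables into a single DataSet and links them with a DataRelation. The connection string and the Northwind-style Customers and Orders tables are illustrative assumptions, not code from this article.

using System;
using System.Data;
using System.Data.SqlClient;

class DisconnectedModelDemo
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        string connString =
            "server=(local);database=Northwind;Integrated Security=SSPI";

        DataSet ds = new DataSet("CustomersOrders");
        using (SqlConnection conn = new SqlConnection(connString))
        {
            // Fill two independent tables from two simple queries.
            SqlDataAdapter da1 = new SqlDataAdapter(
                "SELECT customerid, companyname FROM Customers", conn);
            da1.Fill(ds, "Customers");

            SqlDataAdapter da2 = new SqlDataAdapter(
                "SELECT orderid, customerid, orderdate FROM Orders", conn);
            da2.Fill(ds, "Orders");
        }

        // Relate the in-memory tables; no JOIN ever ran on the server.
        DataRelation rel = new DataRelation("Cust2Orders",
            ds.Tables["Customers"].Columns["customerid"],
            ds.Tables["Orders"].Columns["customerid"]);
        ds.Relations.Add(rel);

        // Navigate parent-to-child rows entirely in memory.
        foreach (DataRow customer in ds.Tables["Customers"].Rows)
        {
            DataRow[] orders = customer.GetChildRows(rel);
            Console.WriteLine("{0}: {1} order(s)",
                customer["companyname"], orders.Length);
        }
    }
}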

So can we conclude that all applications that deal with data are the same and can be built using a common pattern? Of course not. Other aspects, including scalability and performance, influence the design. All distributed applications must be scalable to some extent. If a Web application is open to anonymous users (a shopping site, for example), providing a good, if not optimal, level of scalability is a survival test. Scalability is related to performance, but it can't be identified exclusively with that. Likewise, performance affects scalability, but in no case are they synonyms.

ADO.NET gives you the tools to configure the runtime environment so it can automatically cascade changes from parent to child rows.
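
As a quick illustration of that point, the fragment below wires up a ForeignKeyConstraint so that key updates and deletions cascade from parent to child rows automatically. It assumes a DataSet holding Customers and Orders tables that are not yet linked by a DataRelation; the table, column, and constraint names are assumptions for the example.

using System.Data;

class CascadeDemo
{
    static void EnableCascades(DataSet ds)
    {
        // Bind the child Orders rows to the parent Customers rows.
        ForeignKeyConstraint fk = new ForeignKeyConstraint("FK_Cust2Orders",
            ds.Tables["Customers"].Columns["customerid"],
            ds.Tables["Orders"].Columns["customerid"]);

        // Propagate changes and deletions from parent to child rows.
        fk.UpdateRule = Rule.Cascade;
        fk.DeleteRule = Rule.Cascade;

        // Constraints always live on the child table.
        ds.Tables["Orders"].Constraints.Add(fk);
        ds.EnforceConstraints = true;
    }
}
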
In general, caching data is not an optimal choice for every Web application because it consumes valuable resources on a per-user or per-application basis. As a result, the Web server has less memory available, which might degrade performance. On the other hand, if you're developing an application that needs to be both fast and scalable, you should carefully consider caching data at some level to reduce the workload on middle-tier components and the bottleneck of database connections. The more tiered your system is, the more options you have to scale it out, as long as you can move disconnected caches of data across the tiers.

In the first part of this article, I examined a few recommended approaches for connection and transaction management in ADO.NET applications. In this second installment, I'll tackle SQL-level programming and review a few scenarios in which the quality and structure of SQL statements can make a world of difference between applications.
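
For instance, in ASP.NET you could park a disconnected DataSet in the application-wide Cache and refresh it on a fixed schedule. Here's a minimal sketch; the five-minute expiration, the "Customers" cache key, and the LoadCustomers helper are all assumptions for illustration.

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public class CustomerData
{
    // Hypothetical connection string; adjust for your environment.
    const string ConnString =
        "server=(local);database=Northwind;Integrated Security=SSPI";

    public static DataSet GetCustomers()
    {
        // Serve a previously cached copy if one exists.
        DataSet ds = HttpContext.Current.Cache["Customers"] as DataSet;
        if (ds == null)
        {
            ds = LoadCustomers();
            // Cache for five minutes; every request in between is
            // answered from memory, with no database connection at all.
            HttpContext.Current.Cache.Insert("Customers", ds, null,
                DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        }
        return ds;
    }

    static DataSet LoadCustomers()
    {
        // Fill the DataSet from the database on a cache miss.
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(ConnString))
        {
            SqlDataAdapter da = new SqlDataAdapter(
                "SELECT customerid, companyname FROM Customers", conn);
            da.Fill(ds, "Customers");
        }
        return ds;
    }
}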


Untangle the Database Ball of Wool
Even the simplest Web application can have a rather complex and interrelated structure of databases. While a database with dozens of related tables, views, constraints, and triggers is perhaps not what you deal with every day, I've never seen a real-world application with fewer than a handful of related tables. The problem with such a grid of tables lies with both queries and updates. End users should be able to access and filter related rows, but also to update and delete rows in a way that does not violate the consistency and integrity of the database.

Some databases resort to server-side constraints to block undesired changes. The fact that most database administrators use this strategy to stay on the safe side is no excuse for writing client code that could potentially corrupt the database. Ideally, client applications should submit changes that respect the database structure and its internal update rules (for example, relationships and constraints). ADO.NET has a strong and powerful disconnected model that allows applications to work with a sort of in-memory database. The DataSet object governs the storage of multiple tables, and it enables filtering, sorting, constraints, and cascading changes. This model, though, fits updates more naturally than queries. For compound queries, the hurdle is the way in which an SQL result set is adapted to fit into a DataSet and whether cached data is effective for the application. Let's review a couple of solutions for compound queries.
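
To ground the discussion, here is one way to run a compound query as a single batch of two SELECT statements and map each result set to its own DataTable. The country filter, the query text, and the table names are assumptions for the sake of the example.

using System;
using System.Data;
using System.Data.SqlClient;

class CompoundQueryDemo
{
    // Hypothetical connection string; adjust for your environment.
    const string ConnString =
        "server=(local);database=Northwind;Integrated Security=SSPI";

    static DataSet FetchCustomersAndOrders(string country)
    {
        // One round trip, two result sets: no JOIN, and no customer
        // columns duplicated on every order row.
        const string cmdText =
            "SELECT customerid, companyname FROM Customers " +
            "WHERE country=@country;" +
            "SELECT o.orderid, o.customerid, o.orderdate FROM Orders o " +
            "INNER JOIN Customers c ON c.customerid=o.customerid " +
            "WHERE c.country=@country";

        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(ConnString))
        {
            SqlDataAdapter da = new SqlDataAdapter(cmdText, conn);
            da.SelectCommand.Parameters.AddWithValue("@country", country);

            // Map the first result set to "Customers", the second to "Orders".
            da.TableMappings.Add("Table", "Customers");
            da.TableMappings.Add("Table1", "Orders");
            da.Fill(ds);
        }

        // Relate the two tables in memory, as shown earlier.
        ds.Relations.Add("Cust2Orders",
            ds.Tables["Customers"].Columns["customerid"],
            ds.Tables["Orders"].Columns["customerid"]);
        return ds;
    }

    static void Main()
    {
        DataSet ds = FetchCustomersAndOrders("USA");
        Console.WriteLine("{0} customers, {1} orders",
            ds.Tables["Customers"].Rows.Count,
            ds.Tables["Orders"].Rows.Count);
    }
}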


