ADO.NET Best Practices, Part II

In ADO, inner joins are accomplished using proper SQL statements within a single batch. With ADO.NET, you can obtain the same data organized in independent but related tables, which greatly simplifies updates. Which is faster? That depends on the application.


Splitting Monolithic Queries
Suppose that you are not allowed to use multiple queries in your application. It's not that you have a poor opinion of that solution; it's simply that your DBA strongly disagrees with any suggested change to the set of stored procedures. You would like to manage data using comfortable ADO.NET objects, but your DBA will only let you use old-style stored procedures to get the data. How can you load the result sets into ADO.NET disconnected objects?

Let's consider the following more complex query. It retrieves, grouped by customer, all orders issued in a given year. The query won't retrieve all the orders, though, only those comprising more than 30 items. In addition, the name of the employee who processed each order and the name of the company that placed it are included in the result set.

SELECT o.CustomerID, od.OrderID, o.OrderDate, o.ShippedDate,
       SUM(od.Quantity * od.UnitPrice) AS price,
       c.CompanyName, e.LastName
FROM Orders o
INNER JOIN Customers c ON c.CustomerID = o.CustomerID
INNER JOIN Employees e ON e.EmployeeID = o.EmployeeID
INNER JOIN [Order Details] od ON o.OrderID = od.OrderID
WHERE (YEAR(o.OrderDate) = 1997)
GROUP BY o.CustomerID, c.CompanyName, od.OrderID, o.OrderDate,
         o.ShippedDate, e.LastName
HAVING (SUM(od.Quantity) > 30)
ORDER BY o.CustomerID, od.OrderID

Listing 1 demonstrates how to split the results of the query above. The main procedure—SplitData—gets the DataSet originated by the query and processes it. The input DataSet contains just one table.

The method in Listing 1 makes a copy of the DataSet and then creates two new tables, Employees and Customers. The SplitData method fills these two tables with data taken from the monolithic table. Finally, the method obtains the third table, Order Details, and removes the processed columns from the original table. As a result, you run a traditional stored procedure and pay the price of redundant data moving across the wire, but you gain flexibility because you can handle the disconnected data in distinct, and possibly related, tables. What's the big advantage of this approach? You can now make batch updates.
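
Listing 1 itself isn't reproduced on this page, but the sketch below shows the general shape of the technique in C#. It assumes the one-table DataSet produced by the query above, with the columns CustomerID, OrderID, OrderDate, ShippedDate, price, CompanyName, and LastName; the SplitData helper shown here is illustrative, not the article's exact code.

// Illustrative sketch only; not the article's actual Listing 1.
using System.Data;

static DataSet SplitData(DataSet source)
{
    // Work on a copy so the caller's DataSet is left untouched.
    DataSet ds = source.Copy();
    DataTable orders = ds.Tables[0];
    orders.TableName = "Orders";

    // Build a Customers table out of the distinct CustomerID/CompanyName pairs.
    DataTable customers = new DataTable("Customers");
    customers.Columns.Add("CustomerID", typeof(string));
    customers.Columns.Add("CompanyName", typeof(string));
    foreach (DataRow row in orders.Rows)
    {
        string id = row["CustomerID"].ToString();
        if (customers.Select("CustomerID = '" + id + "'").Length == 0)
            customers.Rows.Add(id, row["CompanyName"]);
    }
    ds.Tables.Add(customers);

    // Build an Employees table out of the distinct employee names.
    DataTable employees = new DataTable("Employees");
    employees.Columns.Add("LastName", typeof(string));
    foreach (DataRow row in orders.Rows)
    {
        string name = row["LastName"].ToString().Replace("'", "''");
        if (employees.Select("LastName = '" + name + "'").Length == 0)
            employees.Rows.Add(row["LastName"]);
    }
    ds.Tables.Add(employees);

    // Drop the now-redundant columns from the monolithic table.
    orders.Columns.Remove("CompanyName");
    orders.Columns.Remove("LastName");
    return ds;
}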

Conducting Cross-Table Updates
When it comes to updating related tables, the order in which your program executes each constituent statement is critical. For example, you can't delete an order without first deleting all of its details. Likewise, you aren't normally allowed to add an invoice detail row without first adding its parent row.

When related tables are involved, changes flow from the parent to the children in different ways, depending on the type of the change you're attempting—update, delete, or insert.

In general, there are two ways of conducting table updates—through a direct connection or batch update. In the case of a direct connection, you typically open a connection and execute the stored procedure or the SQL batch with all the logic and data you need to apply. In the case of a batch update, you typically yield to a middle-tier component, which gets disconnected data from you and takes care of submitting data to the database, possibly in an asynchronous way.

The disconnected approach in a Web scenario requires you to serialize the data that you want to submit from the Web server layer down to an internal tier, which holds a physical connection to the database. The availability of an easy-to-serialize object such as the DataSet and the powerful batch update mechanism of data adapters give you the tools to build highly scalable applications.
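
The DataSet's built-in XML support makes that serialization step nearly free. As a rough sketch (the file name below is hypothetical, and in a real application the diffgram would travel over whatever channel connects the two tiers):

using System.Data;

static void ShipChanges(DataSet data)
{
    // Extract only the rows with pending changes and persist them as a
    // diffgram, which preserves each row's original and current values.
    DataSet changes = data.GetChanges();
    if (changes == null) return;                             // nothing to submit
    changes.WriteXml("pending.xml", XmlWriteMode.DiffGram);  // hypothetical path
}

static DataSet ReceiveChanges()
{
    // On the data tier, rehydrate the DataSet before applying the updates.
    DataSet received = new DataSet();
    received.ReadXml("pending.xml", XmlReadMode.DiffGram);
    return received;
}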

The batch update process passes all the in-memory changes recorded in a DataSet table to the back-end system for permanent storage. In ADO.NET, however, the block of data is not submitted to the database in a single shot. The batch update executes individual statements on the target system, one for each change that needs to be submitted. It opens one connection but still performs multiple statements in a predefined order.

To run a batch update, you don't have to do much more than call the Update method of the data adapter class. This simple mechanism, though, won't work at all with inter-related, hierarchical data. When you deal with hierarchical data, you need client-side cascading changes and custom update logic to build effective applications.
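
For a single, self-contained table, the whole batch update can be as terse as the following sketch. The connection string parameter is an assumption, and SqlCommandBuilder is used here only to keep the example short; production code typically assigns custom insert, update, and delete commands or stored procedures to the adapter instead.

using System.Data;
using System.Data.SqlClient;

static void UpdateCustomers(string connString)
{
    using (SqlConnection conn = new SqlConnection(connString))
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName FROM Customers", conn);

        // Auto-generates INSERT/UPDATE/DELETE commands from the SELECT above.
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

        DataSet ds = new DataSet();
        adapter.Fill(ds, "Customers");

        // ... the application edits ds.Tables["Customers"] while disconnected ...

        adapter.Update(ds, "Customers");   // one statement per changed row
    }
}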

ADO.NET supports two types of constraints: unique and foreign-key constraints. A unique constraint is a restriction on a data column requiring that all of its values be unique. A foreign-key constraint set on a DataTable restricts the action performed when you try to delete or update a value in a column. For example, deleting a value from the parent table can affect rows in the child table in various ways. Often you want to delete the related rows in the child table; in other cases, you'll want a cascading action to set the child rows to null or default values. You can programmatically set the action that works best for your application using the programming interface of the ForeignKeyConstraint class.
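
For example, here is a minimal sketch of wiring a cascading constraint between the Orders and Order Details tables of a DataSet (the table and column names assume the Northwind schema used throughout this article):

using System.Data;

static void AddCascadeRule(DataSet ds)
{
    DataColumn parentCol = ds.Tables["Orders"].Columns["OrderID"];
    DataColumn childCol  = ds.Tables["Order Details"].Columns["OrderID"];

    ForeignKeyConstraint fk = new ForeignKeyConstraint(
        "FK_OrderDetails_Orders", parentCol, childCol);
    fk.DeleteRule = Rule.Cascade;    // deleting an order deletes its details
    fk.UpdateRule = Rule.Cascade;    // key changes propagate to child rows
    // Alternatives: Rule.SetNull, Rule.SetDefault, Rule.None

    ds.Tables["Order Details"].Constraints.Add(fk);
    ds.EnforceConstraints = true;
}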

What really matters, though, is that ADO.NET provides you with the tools to configure the runtime environment to automatically cascade changes from parent to child rows. Such a mechanism allows you to work on a DataSet object, enter changes, and, if needed, have them propagated through the tree. When you have gone through all the changes, ADO.NET guarantees that all the tables in the DataSet have been properly updated. If a change cannot be cascaded, or contains invalid values that would violate one of the relationships, an exception is thrown.

After ADO.NET has updated your tables, you simply take the resulting DataSet and apply whatever update logic best suits your scenario. For example, depending on the relationships set between your tables, you may need to process deleted child rows before parent insertions, and child updates after parent updates. You can easily extract the subset of rows to process using the Select method of the DataTable object. Whatever order your application requires for submitting changes, you have the tools to get ready-to-process rows.
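
As a sketch, again assuming an Orders parent table with an Order Details child, you might extract and submit the pending rows in a safe order like this (the two adapters are assumed to be configured elsewhere with the appropriate commands):

using System.Data;
using System.Data.SqlClient;

static void SubmitInOrder(DataSet ds,
                          SqlDataAdapter ordersAdapter,
                          SqlDataAdapter detailsAdapter)
{
    DataTable orders  = ds.Tables["Orders"];
    DataTable details = ds.Tables["Order Details"];

    // Child deletions first, then parent changes, then child insertions.
    DataRow[] deletedDetails  = details.Select("", "", DataViewRowState.Deleted);
    DataRow[] changedOrders   = orders.Select("", "", DataViewRowState.ModifiedCurrent);
    DataRow[] insertedDetails = details.Select("", "", DataViewRowState.Added);

    detailsAdapter.Update(deletedDetails);
    ordersAdapter.Update(changedOrders);
    detailsAdapter.Update(insertedDetails);
}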


