

The Baker's Dozen: 13 Productivity Tips for ADO.NET 2.0 : Page 3

This installment of "The Baker's Dozen" presents a variety of tips and techniques for productive data handling with ADO.NET 2.0 in Visual Studio 2005.





Tip 8: New Data LOAD Options
In ADO.NET 1.0, populating a DataSet from a back-end database meant either calling the Fill method of a DataAdapter or creating a DataReader and looping through it. ADO.NET 2.0 makes it much easier to build a DataTable when all that exists is a data reader.

SqlDataReader oReader = oMyCommand.ExecuteReader();
DataTable DtTable = new DataTable();
DtTable.Load(oReader, LoadOption.OverwriteChanges);
// note that additional LoadOption enumerations
// exist, such as PreserveChanges and Upsert

ADO.NET 2.0 also provides a new DataTableReader class, which a developer can use to stream data from a DataSet or DataTable via the new CreateDataReader method.
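As a minimal in-memory sketch of this idea (the table and column names here are invented for illustration), a developer can build a DataTable and then stream it back out through a DataTableReader:

```csharp
using System;
using System.Data;

class DataTableReaderDemo
{
    static void Main()
    {
        // Build a small in-memory table
        DataTable dtCustomers = new DataTable("Customers");
        dtCustomers.Columns.Add("ID", typeof(int));
        dtCustomers.Columns.Add("Name", typeof(string));
        dtCustomers.Rows.Add(1, "Smith");
        dtCustomers.Rows.Add(2, "Jones");

        // CreateDataReader returns a forward-only, read-only
        // DataTableReader over the table's current rows
        DataTableReader oReader = dtCustomers.CreateDataReader();
        while (oReader.Read())
            Console.WriteLine("{0}: {1}",
                oReader.GetInt32(0), oReader.GetString(1));
    }
}
```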

Tip 9: More Control Over RowState
ADO.NET 2.0 adds two new methods that give developers greater control over a DataRow's RowState: SetAdded and SetModified. These methods change the RowState property of a DataRow, something ADO.NET 1.0 offered no direct way to do, since RowState is read-only.

// A developer can use either of these to set the
// RowState of an unchanged row
oMyDataRow.SetAdded();
oMyDataRow.SetModified();

Tip 10: SqlBulkCopy
ADO.NET 2.0 provides the ability to bulk copy the contents of a DataReader, DataTable, array of DataRows, or a DataSet to a SQL Server database. Listing 5 demonstrates an example of this welcome new capability to transfer data from one source to another.
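The basic pattern looks roughly like the sketch below; the connection string, destination table, and source DataTable names are placeholders, and Listing 5 contains a complete working example:

```csharp
// Assumes: cDestConnString points at the destination SQL Server
// database, and dtSourceTable is a DataTable already filled from
// the source (both names are hypothetical)
using (SqlConnection oDestConn = new SqlConnection(cDestConnString))
{
    oDestConn.Open();
    using (SqlBulkCopy oBulkCopy = new SqlBulkCopy(oDestConn))
    {
        oBulkCopy.DestinationTableName = "dbo.CustomersCopy";
        // WriteToServer also accepts a DataReader, a DataRow
        // array, or a DataTable with a DataRowState filter
        oBulkCopy.WriteToServer(dtSourceTable);
    }
}
```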

Tip 11: Multiple Active Result Sets (MARS)
SQL Server 2005 supports Multiple Active Result Sets (MARS), which permits multiple open DataReaders on a single connection. MARS has the potential to reduce the number of connections an application needs, since multiple stored procedures or SQL pass-through queries can be active and interleaved on the same connection. This code demonstrates how to use MARS.

// MARS must be enabled in the connection string:
// "...;MultipleActiveResultSets=True"
SqlConnection oMyConn = new SqlConnection(cMyConnString);
oMyConn.Open();
SqlCommand cmd1 = new SqlCommand(
    "select * from Table1", oMyConn);
SqlCommand cmd2 = new SqlCommand(
    "select * from Table2", oMyConn);
SqlDataReader rReader1 = cmd1.ExecuteReader();
SqlDataReader rReader2 = cmd2.ExecuteReader();
// ... read from both readers, then clean up
rReader1.Close();
rReader2.Close();
oMyConn.Close();

Tip 12: Connection Pooling
ADO.NET 2.0 improves connection pooling through two new static methods: ClearPool and ClearAllPools. Both the SQL Server and Oracle providers support these methods. ClearPool clears the pool associated with a specific connection, while ClearAllPools clears all of the provider's connection pools.
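A minimal sketch of the two calls follows; cMyConnString is a placeholder for a real connection string:

```csharp
// Both methods are static on the provider's connection class
SqlConnection oMyConn = new SqlConnection(cMyConnString);

// Clear only the pool associated with this connection's
// connection string
SqlConnection.ClearPool(oMyConn);

// Clear every connection pool for the SqlClient provider
SqlConnection.ClearAllPools();
```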

Tip 13: Connection Statistics
A developer can retrieve Connection Statistics from the SqlConnection class by setting the connection object's StatisticsEnabled property to true and calling the RetrieveStatistics method as shown below. Table 3 lists the statistics.

SqlConnection oMyConn = new SqlConnection(cMyConnString);
oMyConn.Open();
oMyConn.StatisticsEnabled = true;
SqlCommand oCmd = new SqlCommand(
    "SELECT * FROM Authors", oMyConn);
SqlDataReader oReader = oCmd.ExecuteReader();
// consume the results so they count toward the statistics
while (oReader.Read()) { }
oReader.Close();
Hashtable hStats = (Hashtable)oMyConn.RetrieveStatistics();
IDictionaryEnumerator eStats = hStats.GetEnumerator();
string cResults = String.Empty;
while (eStats.MoveNext())
    cResults += eStats.Key.ToString() + " " +
        eStats.Value.ToString() + "\n";
MessageBox.Show(cResults);
oMyConn.Close();

Table 3: Available connection statistics.
In Addition…
Coming up with thirteen tips is always difficult, especially when a particular technology has more than thirteen new capabilities! ADO.NET 2.0 also contains the following enhancements:

  • Support for the new SQL DataTypes (user-defined types, the XML DataType, and large character support)
  • Query notifications to refresh cached data when the original server data changes
  • Promotable transactions
  • Namespace-qualified tables
  • Batch updates
  • New Execute methods for the DataReader
The Baker's Dozen Commentary
An old debate has resurfaced: DataSets versus custom entities, passing XML across boundaries versus passing custom objects, and so on. Here are my twin copper Lincolns:

I'm immediately skeptical of any argument that begins with "never use DataSets" or "always use DataSets." To paraphrase the old expression, the form of an application follows its functionality. I'll freely admit that I come down more on the DataSet side, primarily because the applications I've built have requirements that are well addressed by the built-in capabilities of ADO.NET. Had the applications been different, a more hardcore OOP approach to creating entities might have been preferable.

But for the ubiquitous line-of-business, data-driven applications, the DataSet approach provides many benefits. Tasks such as two-way data-binding, RowFilter/Sorting, XML integration, setting up relationships between data objects, preserving rowstate information, merging changes, and complex reporting operations are much easier with DataSets.

You may have to write additional code, sometimes a substantial amount of it, in the absence of DataSets. Those advocating this approach should ask themselves: is it worthwhile? And with the performance enhancements in ADO.NET 2.0, will custom code to accomplish these tasks execute as quickly as the native ADO.NET 2.0 capabilities? In my opinion, the ability to finish an application more quickly using native .NET capabilities is a fundamental consideration that developers should weigh seriously before opting for approaches that exclude DataSets.

Kevin S. Goff is the founder and principal consultant of Common Ground Solutions, a consulting group that provides custom web and desktop software solutions in .NET, Visual FoxPro, SQL Server, and Crystal Reports. Kevin has been building software applications for 17 years. He has received several awards from the U.S. Department of Agriculture for systems automation. He has also received special citations from Fortune 500 companies for solutions that yielded six-figure returns on investment. He has worked in such industries as insurance, accounting, public health, real estate, publishing, advertising, manufacturing, finance, consumer packaged goods, and trade promotion. In addition, Kevin provides many forms of custom training.