Any Microsoft-sponsored conference, whether it be TechEd or the PDC, ends up being a double-edged sword. You find out about all sorts of great tools and technologies that Microsoft is cooking up for your development pleasure. Then, the other shoe drops, and you find out that you have to wait for months (sometimes years) before you will be able to actually use them.
The technology tease du jour at this year’s PDC was the .NET Framework v2.0, most notably known through its more visible counterpart, Visual Studio .NET Whidbey. I wasn’t able to attend PDC this year, but I have been using Whidbey for a few months now as a member of the Whidbey alpha program. While there are some features of the Whidbey IDE that I would absolutely love to have today (such as more complete IntelliSense and the MasterPages page template designer), the new .NET Framework itself is what I’m really after.
Many new features have been added to the C# language since the v1.1 timeframe (most of which will make it into VB .NET as well). Aside from any necessary bug cleanup, these features appear ready for prime-time, yet they won’t see the bright lights (or monitors, as it were) of software development nirvana until well into next year. Similarly, there are hundreds (actually thousands) of new .NET Framework classes that I am just dying to put into production use that aren’t available right now, but will soon join C# in a big holding pattern, waiting for Whidbey to be stabilized, polished, and prepared to ship.
I’ll be the first one to step forward and say that I am glad that Microsoft is taking the requisite time and precautions to ensure that the next version of the .NET universe is even more robust, secure, and powerful than the sterling product that they have already turned out. I don’t agree with their idea of a coordinated pre-launch of the whole kit and caboodle, though. By their very nature, software development architectures have to be developed in stages. Language enhancements are followed by framework enhancements, which are then followed by tool enhancements.
Tools are the last link in the chain, so they are obviously going to be the last ones finished, but that shouldn’t stop Microsoft from releasing the .NET Framework v2.0 itself. Imagine if Microsoft decided to hold off on shipping Longhorn until the first round of third-party applications for it were finished. We’d have to wait another two presidential terms to get our hands on it. When the OS is done, let us have it. Likewise, when the .NET Framework is done, let us have that, too.
Technologically speaking, the end result would be the same, but it would allow developers who weren’t solely dependent on wizards and fancy developer tools to get a jump on using the new functionality that Microsoft is going to spend the next year or so hyping. The launch of the new .NET Framework without the Whidbey IDE won’t be as sexy or glamorous (did I just associate those words with software development?), but I can guarantee that the decision would be embraced wholeheartedly by developers who aren’t afraid of putting in a little extra effort to tap into a powerful new resource. Hell, I built angryCoder.com using Notepad during the beta days of version 1.0 of the .NET universe, so I am familiar with both the challenges and the rewards of such an endeavor. The productivity gains that I could get from partial classes and generics alone are enough to make it all worthwhile for me.
Of course, Microsoft undoubtedly has ulterior motives that prevent them from exercising their option to release the .NET Framework before Visual Studio .NET Whidbey. With a coordinated launch of an entire .NET universe, Microsoft secures first-mover advantage for itself in the developer tools space. A .NET Framework-only launch would open the door for another vendor to release a competing developer IDE before Whidbey was ready. Developers might jump ship to the competing IDE as an alternative to more rudimentary text editors, even though Whidbey will undoubtedly be the best .NET IDE for version 2.0 (the cynic in me hates to concede this fact so early, but history speaks volumes).
Some of you might argue that Microsoft already does what I want through the combination of their beta program and “go-live” license program. Unfortunately, many companies have strict policies against putting beta software into production (regardless of developer assurances of stability). So Microsoft, consider this an open request for you to give us the goods as they become available, instead of making us wait and then inundating us with new technology. You’ve got us hooked, now reel us in.
Secretary of RowState
I have never been a big fan of transmitting DataSets via Web services because they tend to be pretty bloated. I prefer to send arrays of entity classes via .NET Remoting. However, I had a client recently that wanted to use the batch updating feature of DataSets on a PocketPC device using the .NET Compact Framework. For various reasons (most notably feature lock-in, configuration, and security issues) they wanted to avoid using Remote Data Access (RDA) and Merge Replication to get data from their Web service (the .NET CF doesn’t support Remoting) into SQL Server CE on a PDA.
What I needed to do to get this process working optimally was change the RowState property of each DataRow object to DataRowState.Added inside the Web service before I sent it over the wire to the PocketPC device, so that all it had to do was use a SqlDataAdapter object and a SqlCommandBuilder object to batch insert the contents of the DataSet into SQL Server CE. Unfortunately, the RowState property of the DataRow class is read-only, so I had to get a bit creative.
I came up with two approaches that accomplish what I needed. The first was to serialize the DataSet to an XML string, manually insert the diffgr:hasChanges="inserted" DiffGram hint using string manipulation, then deserialize the string back into a DataSet object:
//...load DataSet into variable ds...
XmlSerializer serializer = new XmlSerializer(typeof(DataSet));
MemoryStream ms = new MemoryStream();
serializer.Serialize(ms, ds);
string xml = Encoding.ASCII.GetString(ms.ToArray());
ms.Close();
// Patch the DiffGram so every row carries the "inserted" hint.
// The table name ("Orders") is illustrative; substitute your own.
xml = xml.Replace("<Orders diffgr:id",
    "<Orders diffgr:hasChanges=\"inserted\" diffgr:id");
StringReader sr = new StringReader(xml);
DataSet ds2 = (DataSet)serializer.Deserialize(sr);
sr.Close();
The approach shown above worked relatively well, but since serialization requires Reflection and a host of other "not so speedy" processes, I sought out a faster solution that was less of a "hack." The approach below is nearly three times as fast as the serialization example:
//...load DataSet into variable ds...
DataSet ds2 = ds.Clone();   // copies the schema, but no rows
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
    // Adding the ItemArray (rather than the DataRow itself) creates
    // a new row in ds2 whose RowState is DataRowState.Added.
    ds2.Tables[0].Rows.Add(ds.Tables[0].Rows[i].ItemArray);
}
Note that you need to clone the source DataSet in order to preserve its schema, which will enable the batch update process on the PocketPC device to work properly. You then loop through the source DataSet and copy each row to the target DataSet. This adds the "inserted" hint to the DiffGram for each row. You have to use the ItemArray property of each DataRow object, because a DataRow reference cannot be assigned to more than one DataTable (it generates an exception).
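On the receiving end, the device-side batch insert could look something like the following sketch. Note that against SQL Server CE the SqlCe-prefixed classes from System.Data.SqlServerCe stand in for the SqlDataAdapter and SqlCommandBuilder mentioned above; the connection string and the "Orders" table name are assumptions for illustration, not details from the original column:

```csharp
using System.Data;
using System.Data.SqlServerCe;

public class BatchInserter
{
    // ds is the DataSet received from the Web service, with every
    // row already marked DataRowState.Added on the server side.
    public static void InsertAll(DataSet ds)
    {
        SqlCeConnection conn =
            new SqlCeConnection(@"Data Source=\My Documents\orders.sdf");
        SqlCeDataAdapter adapter =
            new SqlCeDataAdapter("SELECT * FROM Orders", conn);

        // The command builder generates the INSERT command from the
        // SELECT statement's schema.
        SqlCeCommandBuilder builder = new SqlCeCommandBuilder(adapter);

        // Because each row's RowState is Added, Update() issues an
        // INSERT for every row in the table in one pass.
        adapter.Update(ds, "Orders");
    }
}
```

Since the command builder derives its INSERT statement from the adapter's SELECT, the cloned schema on the server side is what makes this single Update() call sufficient on the device.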
Sadly, the RowState property of the DataRow class is still read-only in the .NET Framework v2.0. I'll request the change, though, and perhaps the Microsoft machine will respond favorably. Until next issue, stay angry and don't follow the rules that suck. My name is Jonathan Goodyear and I am the angryCoder.