Coordinate Your DTS Tasks, Part I: "After the Fact" Data Updates (Page 2)


Business Problem: "After the Fact" Updates
In terms of resource usage, establishing a database connection between an application and a server is one of the most expensive things you can do. Yet because the interface (DTS Designer) makes selecting a connection and placing it in a package so easy, you may be tempted to just select a new connection, point it at a task, and go—ignoring the resulting resource glut. But the right thing to do and the easy thing to do are two different courses of action.

If you need to import a collection of objects, the easy thing to do is to open a new connection and hook it up to each task that performs an operation (insert, update, find a substring, etc.). However, the right thing to do is to reuse a single connection, looping through the collection one instance at a time.
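The contrast can be sketched in a few lines. This is not the article's DTS code—it's a minimal illustration using Python's sqlite3 module as a stand-in for a SQL Server connection, with hypothetical check-register data:

```python
import sqlite3

# Hypothetical check-register rows to import (illustrative values,
# not taken from the article's actual workbook).
checks = [
    ("1001", 250.00),
    ("1002", 125.50),
    ("1003", 980.25),
]

# The expensive step: open ONE connection up front...
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE check_register (check_no TEXT, amount REAL)")

# ...then reuse it, looping through the collection, instead of
# opening a fresh connection for every row.
for check_no, amount in checks:
    conn.execute(
        "INSERT INTO check_register (check_no, amount) VALUES (?, ?)",
        (check_no, amount),
    )
conn.commit()

imported = conn.execute("SELECT COUNT(*) FROM check_register").fetchone()[0]
print(imported)  # 3
```

The wasteful alternative would call `sqlite3.connect()` inside the loop—one connection per row—which is exactly the "select a new connection and go" habit DTS Designer makes so tempting.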

This connection principle lent itself well to a recent project I had to tackle at a client organization. My job was to build a solution that would automatically refresh check register data from Excel, kind of like replication does for databases. The problem was, as far as I knew, replication didn't exist for Excel.



Using a selection of DTS tasks enabled me to rebuild a snapshot between two disparate data sources (Excel and the data warehouse), saving the organization hours of manual reconciliation between their claim data and their check data. By using DTS, I was able to import their data from Excel and compare it against the data warehouse to generate a list of only the checks that required investigation, typically less than $5,000 worth of checks on claims batches of more than $5 million.
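The comparison step itself amounts to a diff between the two sources. Here is a minimal sketch of that idea—the field names and amounts are hypothetical, and plain Python dictionaries stand in for the imported Excel rows and the warehouse query results:

```python
# Hypothetical data: cleared checks from the Excel import and the
# corresponding claim records from the data warehouse.
cleared_checks = {"1001": 250.00, "1002": 125.50, "1003": 980.25}
warehouse_claims = {"1001": 250.00, "1002": 130.00, "1004": 400.00}

# Flag only the checks that require investigation: amounts that
# disagree, checks with no matching claim, claims with no check.
discrepancies = {}
for check_no, amount in cleared_checks.items():
    claim = warehouse_claims.get(check_no)
    if claim is None:
        discrepancies[check_no] = ("no matching claim", amount)
    elif claim != amount:
        discrepancies[check_no] = ("amount mismatch", amount - claim)
for check_no in warehouse_claims.keys() - cleared_checks.keys():
    discrepancies[check_no] = ("no cleared check", warehouse_claims[check_no])

print(sorted(discrepancies))  # ['1002', '1003', '1004']
```

Everything that matches simply drops out, which is how multi-million-dollar claim batches reduce to a short investigation list.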

Basically, I automated the business process of importing cleared checks from Excel workbooks and comparing them to claim records from the data warehouse. However, the unstable nature of claims, which can change state after they are paid, complicates this process. For example, utilization management could decide that a doctor was overpaid, a claim was paid twice, or a claim was underpaid. These facts of life mean that checks often undergo changes after they are written. Likewise, a check could be lost, voided, or reissued.

I needed to account for this dynamic aspect in my solution. A snapshot of the check register was not adequate, because these changes in state occurred sometimes months after the claim was settled. Instead, the solution needed to refresh the check data every time a new batch of claims was reconciled, typically three times a week. So while replication would have been a good solution if the check data were also stored in a database, in this case the Finance department, which was responsible for the checking part of the process, managed the data in Excel. (What finance department doesn't?)
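The refresh pattern that replaces the stale snapshot can be reduced to a truncate-and-reload inside one transaction. Again a hedged sketch, not the article's DTS package—sqlite3 stands in for the warehouse, and the rows and the voided-check scenario are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE check_register (check_no TEXT, amount REAL)")

def refresh_register(conn, rows):
    """Replace the whole check register on every reconciliation run
    (e.g. three times a week) instead of keeping a one-time snapshot."""
    with conn:  # one transaction: old data vanishes only if the reload succeeds
        conn.execute("DELETE FROM check_register")
        conn.executemany(
            "INSERT INTO check_register (check_no, amount) VALUES (?, ?)", rows
        )

# First batch, then a later batch in which check 1002 was voided
# months after the claim was settled.
refresh_register(conn, [("1001", 250.00), ("1002", 125.50)])
refresh_register(conn, [("1001", 250.00), ("1002", 0.00)])

amount = conn.execute(
    "SELECT amount FROM check_register WHERE check_no = '1002'"
).fetchone()[0]
print(amount)  # 0.0
```

Because each run rebuilds the register from the current workbook, late state changes—voids, reissues, adjusted payments—show up automatically on the next reconciliation.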


