

Coordinate Your DTS Tasks, Part I: "After the Fact" Data Updates (Page 3)





The Task to Fulfill the Task
So after examining the pre-packaged tasks in the DTS Designer, I came across the Execute Process task, which lets you execute a process from DTS (much as xp_cmdshell does, but without the usual rights restrictions). Since this task reaches out to the OS, the right to execute it would ordinarily be reserved for members of the administrators role. DTS lets you short-circuit this restriction by executing the command in the security context of the service account running the server, so you get a little immunity from the rights normally required to run OS jobs.

Figure 1 shows a top down view of the first leg of this process.

Figure 1: Top-Down View of the Execute Process Task

The step Run Checkloc.bat is an Execute Process task, which runs a program or operating system command; here, I'm using it to execute a batch file. I want to redirect the output of a DIR command into a text file, and then use the DTS text import task to push the list from the DIR command into a table. The batch file changes the current directory (in two steps, just for clarity's sake), and then the DIR command is executed. It looks for all files ending in .xls and redirects the listing to a text file named Checkloc.txt on my local drive. This file is the input to the text file source connection later on in the package.
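The batch file would look something like the following. This is a minimal sketch: the drive, directory, server, and share names are placeholders, not the article's actual paths.

```bat
REM Checkloc.bat -- sketch only; paths here are illustrative placeholders.
REM Change the current directory in two steps, for clarity's sake:
C:
CD \Checks
REM List every Excel file on the share and redirect the listing into a
REM text file that the package's text file source connection reads later.
DIR \\ServerName\ShareName\*.xls > C:\Checkloc.txt
```

Because the output of DIR, not just the file names, is redirected, the text file also contains the listing's header and footer lines, which is why the package needs cleanup steps later.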

For easier deployment, record all your path references in DTS using the \\ServerName\ShareName designation instead of a mapped drive letter. A mapped drive depends on the machine where the process runs, and the server may not have the drive mapping that the developer set up when creating the package. So preface your paths with \\ServerName\ShareName rather than an explicit drive letter: for example, \\ServerName\ShareName\Checkloc.txt rather than a path beginning with a mapped drive such as F:.

The commands shown in Figure 2 produce a text file that looks like the directory display in a command window. Figure 3 shows the output of the cmd file.

Figure 2: Commands to Produce Text File

By pointing the Text File Source task to this text file, I was able to direct DTS to use it as input into a table. I brought each whole line from the text file into a staging table, then applied Execute SQL tasks one after another to clean up the rows (using the SUBSTRING function and UPDATE commands). As a result, I was left with only the directory name, the file name, and (via a SUBSTRING select) a file type. The file types were embedded in the file names: HH represented one type of check payment and OV another. These were required for grouping the report that Finance creates at the end of its reconciliation process. Embedding data in a file name is generally bad practice, but the IT department doesn't have much influence over Finance's naming conventions.

Figure 3: The Output of the cmd File

I fed the text file into a Transform Data task (see Figure 4). The source for the transformation is the text file created by the batch file. The destination is the scrubbing table, which has one varchar(255) column holding the untransformed input and another column, Cleaned, to hold the output of the transformation process. The transformation type is Copy Column, which does exactly what it says. I then used T-SQL to parse the contents of Col001 into Cleaned, and replaced Col001 with the substring I needed from Cleaned.

Figure 4: The Transform Data Task
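The scrubbing table might look roughly like this. The table name is an assumption for illustration; Col001 and Cleaned are the column names described above.

```sql
-- Sketch of the scrubbing (staging) table. "CheckScrub" is an assumed
-- name; the article does not give the actual table name.
CREATE TABLE CheckScrub
(
    Col001  varchar(255) NULL,  -- raw line copied in from Checkloc.txt
    Cleaned varchar(255) NULL   -- output of the T-SQL parsing steps
)
```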

After the Transform Data task, a sequence of Execute SQL tasks performs cleanup on the imported rows. The tasks' names indicate their jobs. For example, the first Execute SQL task removes the end-of-line characters from each row, which appear as a little box sticking out at the end of the file name; the box stands for the carriage return and line feed characters, which are not displayable. I also had to remove the excess lines imported from the directory listing. I took the line containing the path and appended it to the front of each file name so that I could tell which source each Excel file came from. The last two tasks parse the HH and OV out of the file name and put them in the first column.
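The cleanup steps might look roughly like the following T-SQL. This is a sketch only: the CheckScrub table name is an assumption, and in practice the article uses SUBSTRING with offsets that depend on the exact DIR output format, which isn't reproduced here.

```sql
-- Strip the non-displayable carriage return / line feed characters
-- from the end of each imported row.
UPDATE CheckScrub
SET    Col001 = REPLACE(REPLACE(Col001, CHAR(13), ''), CHAR(10), '')

-- Remove the excess header/footer lines of the directory listing;
-- keep only rows that actually name an .xls file.
DELETE FROM CheckScrub
WHERE  Col001 NOT LIKE '%.xls%'

-- Parse the embedded file-type code (HH or OV) out of the file name
-- so Finance can group by payment type later.
UPDATE CheckScrub
SET    Cleaned = CASE
                     WHEN Col001 LIKE '%HH%' THEN 'HH'
                     WHEN Col001 LIKE '%OV%' THEN 'OV'
                 END
```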

At the end of this process, I had the contents of the directory holding the Excel spreadsheets in a table. My next job was to extract the contents of each of these files into a scrubbing table. That would leave me with the most current version of the check registers to validate the payments of claims for which the checks were issued.
