The experience of answering e-mail for the SQL Server Pro on DevX has made me appreciate the limits of my knowledge and experience. Some questions are funny because they are so broad: “I need to design a Web site using ADO and SQL Server. Please post the code.”
How do you address user comments such as “Performance is slow” or “This takes a long time”? The challenge stems from the subjective language users employ to describe their problems. You could answer a question with a question, but that would make you look like a wise guy: replying with “Slow compared to what?” or “What is a short time for this job?” casts you, the database administrator, in an unfavorable light in the questioner’s eyes.
Here’s how to be helpful without being defensive: prepare in advance for these questions with data that describes the response of your system. While I hesitate to call this “metadata” (strictly speaking, it is not), there is data you can collect about performance: query response times, resource consumption amounts, or the number of transactions in a unit of time. Let’s call these “performance statistics” for lack of a better term. (By the way, check your license agreements; your database vendor may prohibit you from publicizing or discussing your performance test results in public!)
If you maintain a history of your readings, you will build statistics that document system responses against fixed and measurable criteria. These statistics are the very ones that come from your hardware, your configuration, and your own real-life production queries.
Let’s start off with what I think is the most direct way to get information from your installation: executing DBCC PERFMON. This generates a report that highlights three key areas of your server installation:
- Input and output statistics
- Least recently used buffer information
- Network statistics
This snapshot is extremely useful because it contains such a wealth of information, and we can use the output as a template and hack our own “diary” out of the resulting database statistics.
This technique can be extended to other jobs, like monitoring output from SQL Trace files from the ODBC dialog, working from the NT log, or working from the Performance Monitor. The concepts behind this technique are pretty simple; the approach just takes a little getting used to:
- Find or create a text log file that has labels and values.
- Create a table that mimics the text file’s structure.
- BCP the text file into the table.
- Examine the data.
- Graph the results.
That’s the procedure; let’s see how it works with our first foray into trapping statistics, the DBCC PERFMON output.
Step 1: Get the statistics.
According to our work plan, the first thing to do is find or create the text file that contains the statistics we want to trap. As I said above, we’re going to use the one-shot report generated from DBCC PERFMON, so fire up ISQL. (By the way, this is a SQL Server 6.5 plan of attack, because 7.0 already does a lot of this for you, but I’m sure you’ll find a use for the general technique regardless of the version you use.)
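If you’d rather capture the report without an interactive session, ISQL’s -Q and -o switches can run the command and write the result to a file in one shot. (The server name, login, and file name below are placeholders; substitute your own.)

```sql
-- Run interactively in ISQL and save the result set, or do it in one
-- shot from a command prompt (placeholders: server, login, output file):
--   isql -SYourServer -Usa -P -Q"DBCC PERFMON" -o DBCC.TXT
DBCC PERFMON
GO
```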
The first three rows of my output look like this:
Statistic                        Value
---------------------------------------
Log Flush Requests               40.0
Log Logical Page IO              19.0
Log Physical IO                  25.0
After you examine your output, be sure to save the query result to a file. I called mine DBCC.TXT. Watch out for saving it as the default, DBCC.RPT, because Crystal Reports Server has usurped .RPT as its report file extension. I’m not certain whether .RPT files have anything special going on inside of them (special characters and the like), but name your text file with the .TXT extension just to be on the safe side. This report is grist for our future statistics mill.
Step 2: Create a table to store the stats in.
Once you’ve executed the command and generated the report, it’s time to trap the results in a table. (What self-respecting database person would do otherwise?) To get the right data types and lengths for the table, I counted dashes to figure out how big to make the columns and wrote the following statement to create a “bucket” for my DBCC PERFMON output:
CREATE TABLE MyStats (STATISTIC VARCHAR(32), VALUE NUMERIC(10,6))
Step 3: Fill the table with data.
BCP is a black art. It is built with plenty of flexibility, but it is an old command-line utility; as such, it’s much like working with a grumpy old man. Mind your cases on the command line and pay attention to the structure of your input file. To get BCP to work right, you may find yourself prepping your input files. For example, I’ve found it best to be sure that the text file’s output is tab-delimited, with a CHAR(13) at the end of each line. You can check this out in a word processor or text editor that has the “show codes” feature. Confirming that the file is so formatted will save you a lot of grief.
Once you are certain of the above, confirm that the structure of the text file matches the table definition. The only other thing to remember is to use the -c switch in your BCP line so that BCP uses character mode to import your text file. (There is such a raft of issues with BCP that I cannot explore them today. It’s a series all by itself.)
The statement you need to execute is something along the lines of what I show below. (I’m hedging because you must make adjustments to the command line for your server name, user name, and password.) From a command box prompt, enter this:
BCP pubs.dbo.mystats in dbcc.txt -c -SSPI23B -Usa -P
This will load the table with the contents of the text file. Alternatively, if you prefer working from ISQL-w and have the rights to do so, you can run this:
EXEC master..xp_cmdshell "BCP pubs.dbo.mystats in dbcc.txt -c -SSPI23B -Usa -P"
That will do nicely as well.
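Either way, it’s worth a quick sanity check before moving on. A couple of throwaway queries (nothing later depends on them) confirm that the load landed:

```sql
-- How many statistics made it in?
SELECT COUNT(*) FROM MyStats

-- Spot-check the log counters from the sample output above.
SELECT STATISTIC, VALUE
FROM MyStats
WHERE STATISTIC LIKE 'Log%'
```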
Step 4: Examine the data.
You are now on the threshold of being able to read the entrails.
In order to ease the data import, we directly mimicked the structure of the text file. Now, though, we’d like a date and time attached to each reading, so that time becomes part of our system analysis. Rather than edit the file to include the date or change our original data file, we can use a built-in function called GETDATE() to populate a date column in a new table that looks almost like MyStats, except that it has an extra column for the date and time. Here’s the idea for getting the date and time associated with your statistics. Remember that the order of the columns counts as we finesse the population of MyStats2 with INSERT SELECT, like this:
CREATE TABLE MyStats2 (STATISTIC CHAR(32), VALUE NUMERIC(10,6), STATDATE DATETIME NULL DEFAULT GETDATE())
Now that we’ve created our new bucket, we can fill it up with the contents of our DBCC PERFMON report:
INSERT MyStats2 SELECT *, GETDATE() FROM MyStats
If you supply only the two columns from MyStats, you’ll get an error that the select list doesn’t match the target table’s structure; appending GETDATE() to the star makes the column counts line up. This loads the table with the statistic’s name, its value, and the date you did the load. Unfortunately, that’s not exactly the time you ran the statistics, though if this were bound in a stored procedure, I can’t see the latency invalidating the results, unless we’re doing lunar module docking or maybe heart bypass surgery.
Now you can examine your work with:
SELECT * FROM MyStats2
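Once a few loads have accumulated, the date column starts paying for itself. As a sketch, here is how you might follow a single counter over time, using one of the statistics from the sample output (the statistic name must match what DBCC PERFMON actually emitted on your system):

```sql
-- Trend one counter across all the snapshots we've trapped.
SELECT STATDATE, VALUE
FROM MyStats2
WHERE STATISTIC = 'Log Flush Requests'
ORDER BY STATDATE
```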
In order to refresh MyStats2 with new data, you will have to go through the procedure again (without recreating MyStats2, of course):
- Create your measurements by running DBCC PERFMON.
- Save its output to a plain text file.
- Clear the staging table (TRUNCATE TABLE MyStats) so old readings aren’t loaded twice, since BCP appends rather than replaces.
- BCP that text into the staging table. (In our example, that’s MyStats.)
- INSERT SELECT the contents of the staging table into your statistics table. (In our example, that’s MyStats2.)
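The whole refresh could be bound into a stored procedure, which also shrinks the gap between running the statistics and stamping the date. This is a sketch only: the server name, login, and file path in the BCP line are placeholders you must adjust, and it assumes the DBCC PERFMON report has already been saved to dbcc.txt.

```sql
CREATE PROCEDURE LoadStats AS
-- Empty the staging table so old readings aren't loaded twice.
TRUNCATE TABLE MyStats
-- Pull the freshly saved DBCC PERFMON report into the staging table.
EXEC master..xp_cmdshell "BCP pubs.dbo.mystats in dbcc.txt -c -SSPI23B -Usa -P"
-- Stamp the readings with the load time and move them into the history table.
INSERT MyStats2 SELECT *, GETDATE() FROM MyStats
```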
This lets you trap the date with your observations.
Step 5: Graph the data.
Now that you know how to accumulate log data, it won’t be too long before you ask, “Why has the cache hit ratio been falling?” or “How has the ratio of logical to physical reads changed since we bought more RAM for you?” While you can answer these questions from reports or by inspecting the data in ISQL-w, it is much more compelling to take your data out to Excel and graph it. A graph really animates your measurements and makes it easier to follow trends over time.
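Getting the history out to Excel can use the same grumpy old man in reverse: BCP the table out to a tab-delimited text file and open it from Excel. (As before, the server name, login, and output file name are placeholders for your own.)

```sql
-- Export the whole history in character mode, ready for Excel's text import.
EXEC master..xp_cmdshell "BCP pubs.dbo.mystats2 out stats.txt -c -SSPI23B -Usa -P"
```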
Well, that’s this month’s 10-Minute Solution. I think you’ll agree that it is the tip of the iceberg.
Next month I’ll look at other ways to apply this technique and show you what you can learn about your system with it.