I’m working on a database built for reporting purposes. Tables are updated once a month. Reporting queries are taking many hours to complete, some up to 26 hours.
The database has about 30 tables. Two of them have a little over 3 million rows each, and the largest table has about 15 indexes on it.
I checked all of the indexes using DBCC SHOWCONTIG, and everything appears to be in good shape.
What should I be looking at to resolve this problem?
I know that in SQL Server 6.5 several aspects of an index's construction can affect performance: the data type of the indexed columns, the data length of those columns, the order of the columns in the index, and the selectivity (or cardinality) of the index, that is, the number of distinct values it contains.
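As a quick sanity check on that last factor, you can estimate a column's selectivity yourself before deciding whether it deserves an index. This is only a sketch; the table and column names below are hypothetical stand-ins for your own:

```sql
-- Selectivity = distinct values / total rows.
-- A ratio near 1.0 means the column is highly selective and a good
-- index candidate; a ratio near 0 means the index won't help much.
SELECT COUNT(DISTINCT customer_id) * 1.0 / COUNT(*) AS selectivity
FROM sales_fact
```

Running this against each leading column of your 15 indexes may show you a few indexes that are doing little besides slowing down your monthly load.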
In SQL 7, these factors aren’t as critical, but I’ll bet integer keys are still faster than character keys 🙂 .
Another thing that springs to mind is your mention that the database is built for reporting. Are you building cubes in an OLAP product, or making do with a transactional database set aside for reporting? It sounds as though data warehousing techniques, in which the data is staged and aggregated before it is reported on, may be of help to you.
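To give a flavor of the pre-aggregation idea: since your tables change only once a month, you can roll the detail rows up into a summary table at load time and point the reports at that instead. This is a minimal sketch with hypothetical table and column names, not your actual schema:

```sql
-- Build a monthly summary once per load; reports then scan this
-- small table instead of the 3-million-row detail table.
-- sales_fact, order_date, and amount are hypothetical names.
SELECT   customer_id,
         DATEPART(yy, order_date) AS order_year,
         DATEPART(mm, order_date) AS order_month,
         SUM(amount)              AS total_amount,
         COUNT(*)                 AS order_count
INTO     sales_summary
FROM     sales_fact
GROUP BY customer_id,
         DATEPART(yy, order_date),
         DATEPART(mm, order_date)
```

Drop and rebuild the summary as part of the monthly load; a report that formerly ground through millions of detail rows then reads a few thousand pre-aggregated ones.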
You can write to me offline about this further, if you like.