I have another thread about a report "suddenly" getting very slow in MS SQL, but after spending the afternoon on it, the lesson is really more about how to get better performance from MS SQL in general. I figured I'd share what I learned in a new, properly titled thread for anyone searching.
The problem was that I have a grouped report -- by the time it's done, the total count of groups is about 200 for any one run. Each group has lots of little lists in it, like subreports. To build those data strings, I created functions to do the job and included them as calculated fields in the report. I like this model a lot, especially now that I've solved the speed problem. For context, the report covers each activity that each trainer spends with each animal each day. Each training session has its own list of foods, its own list of medicines, etc. So at 200 groups multiplied by 5 "sub lists" per group, I have 1,000 "sub lists". The food list for one session might look like this:
Capelin 1.25 lbs
Clams .50 lbs
Smelt .25 lbs
My report, however, was taking more than 5 minutes to run. And this is a small installation -- larger customers are going to have more animals, more trainers, and more sessions per day, and I had visions of 15-minute report generation times. Clearly, I had to do something.
Here's the select statement I use for food, for instance:
Code:
SELECT DISTINCT Food_Names.Foodname, Session_Food.Qty AS Qty
FROM Session_Food
     INNER JOIN (Food_Master
          INNER JOIN Food_Names
               ON Food_Master.Description = Food_Names.ID)
          ON Session_Food.Food_ID = Food_Master.Description
WHERE Session_Food.Session_ID = :Session_ID
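By the way, if you want to collect timings like the ones below, SQL Server will report them itself. This is just a sketch for running the food query interactively (in SSMS, for example); the literal 42 stands in for the :Session_ID argument and is purely illustrative:

Code:
```sql
-- Ask SQL Server to report CPU and elapsed time for each statement.
SET STATISTICS TIME ON;

SELECT DISTINCT Food_Names.Foodname, Session_Food.Qty AS Qty
FROM Session_Food
     INNER JOIN Food_Master
          ON Session_Food.Food_ID = Food_Master.Description
     INNER JOIN Food_Names
          ON Food_Master.Description = Food_Names.ID
WHERE Session_Food.Session_ID = 42;   -- illustrative value

SET STATISTICS TIME OFF;
```

The timings show up on the Messages tab, which makes it easy to spot which executions are the slow ones.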
So a typical report might run this Select statement (plus some surrounding code) about 200 times -- and with 5 such functions, that's roughly 1,000 executions per report.
With a little analysis, I discovered that the average time to run this function for food was 0.3 seconds. In my sample, 70 executions ran in under 0.27 seconds, but the other 130 all exceeded 1 second -- up to 1.5 seconds. Not terrible performance overall, but there was a clear jump between the "fast" executions and the "slow" ones.
My experiment to fix this worked.
If you look at the Select statement above, you'll see a couple of joins. Having dealt with Alpha's DBFs for a very long time, and knowing a little about the usual trouble spots, I decided to create indexes on the child fields of the joined tables.
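I haven't listed the exact statements, but given the join and WHERE columns in the query above, the indexes would look something like this (the index names are my own invention, and if ID is the primary key of Food_Names it's likely indexed already):

Code:
```sql
-- Index the child columns used in the joins and the WHERE clause.
CREATE INDEX IX_Session_Food_Session_ID ON Session_Food (Session_ID);
CREATE INDEX IX_Session_Food_Food_ID ON Session_Food (Food_ID);
CREATE INDEX IX_Food_Master_Description ON Food_Master (Description);
```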
Voila! For 200 executions of the Foods function, total time dropped from 65 seconds to -- ready for this? -- 4.2 seconds. Wow! Average time dropped from 0.3 seconds per call to 0.02 seconds.
So: indexes on the child fields of a join will go a very, very long way toward getting the kind of performance you need.
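One last tip: SQL Server keeps its own running list of indexes it wishes it had. These are standard system views (nothing specific to my report), and a query like this will often point straight at the child columns of your joins:

Code:
```sql
-- Indexes the optimizer wanted but didn't find, most-requested first.
SELECT mid.statement AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       migs.user_seeks
FROM sys.dm_db_missing_index_details AS mid
     INNER JOIN sys.dm_db_missing_index_groups AS mig
          ON mid.index_handle = mig.index_handle
     INNER JOIN sys.dm_db_missing_index_group_stats AS migs
          ON mig.index_group_handle = migs.group_handle
ORDER BY migs.user_seeks DESC;
```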
Off to make a few people happier now....