I have a budget form used to create budgets for each fiscal year. The form includes several controls that display, by budget subaccount, the data needed to build the budget: total dollars already obligated (by contract) in the fiscal year, planned purchases not yet obligated but serving as placeholders, and so on. The goal is to ensure the user budgets at least that much to cover expected expenses for each subaccount.
To populate a couple of the controls (on the continuous form), I have to run a series of queries. The queries basically tease apart data for three different types of "orders", do some calculations on them related to fiscal-year amortization, then bring them back together in a union. The queries work just fine (all seven of them!) to find the appropriate sum; I then use a DSum lookup on the budget subaccount and department to populate the control.
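For context, the per-row lookup is roughly like the expression below, used as the control source on the continuous form. All the names here (the union query, fields, and controls) are placeholders for illustration, not my actual object names:

```vba
' Hypothetical ControlSource - qryOrderUnion stands in for the final
' UNION query; SubAccountID/DeptID stand in for the row's key fields.
=DSum("AmortizedAmt", "qryOrderUnion", "SubAccountID=" & [SubAccountID] & " AND DeptID=" & [DeptID])
```

As I understand it, an expression like this re-runs the whole query stack once per visible row, which may be part of why even 116 records produce a noticeable delay.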
However, when I open the form there is a one- to two-second delay before the controls populate. I only have a very small set of test data (116 records) in the database right now, and that number will balloon very quickly once I enter ALL the contracts. I am concerned the queries will take so much longer that the form becomes unusable.
So my questions are these:
1) Is the speed of a query inversely proportional to the volume of data in the database, or does the growth of data over time only marginally slow a query?
2) If my concern is valid, what are some options (from a generic design point of view)? Budgets are created separately from orders, so would it be better to run the queries as a cleanup routine after orders are created, and have them populate a table for the budget form to draw from? It seems drawing directly from a table is MUCH faster than running the query series.
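The cleanup-routine idea I have in mind would look something like the two action queries below, run back to back after orders are saved. Again, the table and field names (tblBudgetSums, qryOrderUnion, etc.) are made-up placeholders, and in Access these would be two separate saved queries or two CurrentDb.Execute calls, since Access SQL doesn't allow multiple statements in one query:

```sql
-- Step 1 (hypothetical): clear the summary table.
DELETE * FROM tblBudgetSums;

-- Step 2 (hypothetical): rebuild it from the union query, one row per
-- subaccount/department, so the budget form can bind to the table directly.
INSERT INTO tblBudgetSums (SubAccountID, DeptID, ObligatedTotal)
SELECT SubAccountID, DeptID, Sum(AmortizedAmt)
FROM qryOrderUnion
GROUP BY SubAccountID, DeptID;
```

The form's controls could then do a simple lookup (or join) against tblBudgetSums instead of re-running the seven-query series per row.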
I've made the queries as efficient as I can (in fact, I went too far at one point and broke them), so I don't think there's much to gain there.
Appreciate whatever advice you may have to offer.
Thanks,
Phil