I developed an Excel program that queries a database. Very recently the data source changed (new tables, new structure and everything). Now, I must link to these tables in Access.
In an attempt to keep things working seamlessly, I created a few "master queries" named the same as the tables I queried in the old database. The field names match exactly as well. Each of these master queries consists of other, smaller queries that pull the data together from the linked tables, because it is stored differently than before. For example, it takes about 8-10 sub-queries to get all the data for a property that was previously in a single table in the old database...
In the end, when I run my query, it sees the query name, treats it as the table, and gets the data I need. Although this looks great, it has slowed down greatly as the record count grows, and the count will only continue to grow. I'm looking for ideas to help speed things up.
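For illustration, the master-query pattern can be mimicked in SQLite via Python. All table, view, and column names here (tblOwners, tblAddresses, qrySubAddress, Properties) are hypothetical stand-ins for the real schema:

```python
import sqlite3

# In-memory stand-in for the linked tables; names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE tblOwners    (PropID INTEGER PRIMARY KEY, Owner TEXT);
CREATE TABLE tblAddresses (PropID INTEGER, Street TEXT);

-- A sub-query layered under the master query, as in Access.
CREATE VIEW qrySubAddress AS
    SELECT PropID, Street FROM tblAddresses;

-- "Master query" named like the old table so the Excel side keeps working.
CREATE VIEW Properties AS
    SELECT o.PropID, o.Owner, a.Street
    FROM tblOwners AS o
    JOIN qrySubAddress AS a ON a.PropID = o.PropID;
""")

con.execute("INSERT INTO tblOwners VALUES (1, 'Smith')")
con.execute("INSERT INTO tblAddresses VALUES (1, '12 Main St')")
print(con.execute("SELECT * FROM Properties").fetchall())
# -> [(1, 'Smith', '12 Main St')]
```

Each view layer is re-expanded on every run, which is one reason a stack of 8-10 sub-queries over linked tables gets slower as rows accumulate.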
Here are a few untested thoughts:
• Store most of the historical data in a local table (not linked), then use a union query that pulls history from the local table and newer data from the linked tables?
• Store the data in a SQL Server or SQLite database -- would that be faster?
• Create queries to update the old database with new data? The old database didn't have the performance issues I'm seeing; I suspect they're due to the number of sub-queries required or to the tables being linked.
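The first idea above (local history plus linked current data) can be sketched in SQLite via Python. The table names, columns, and the cutoff date are all hypothetical; the point is the UNION ALL shape:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Hypothetical local (unlinked) archive plus a stand-in for the linked source.
CREATE TABLE HistoryLocal (PropID INTEGER, Amount REAL, AsOf TEXT);
CREATE TABLE LinkedLive   (PropID INTEGER, Amount REAL, AsOf TEXT);

-- UNION ALL is cheaper than UNION: no duplicate-elimination pass is
-- needed when the two sources cover disjoint date ranges.
CREATE VIEW Combined AS
    SELECT PropID, Amount, AsOf FROM HistoryLocal
    UNION ALL
    SELECT PropID, Amount, AsOf FROM LinkedLive
    WHERE AsOf >= '2024-01-01';   -- hypothetical cutoff date
""")

con.execute("INSERT INTO HistoryLocal VALUES (1, 10.0, '2023-06-01')")
con.execute("INSERT INTO LinkedLive VALUES (1, 20.0, '2024-02-01')")
print(con.execute("SELECT COUNT(*) FROM Combined").fetchone()[0])  # -> 2
```

With this shape, the expensive sub-query stack only has to run against the small, recent slice of linked data; the bulk of the rows come from the local table.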
I have set up relationships, joins, and indexes. My largest table is about 70k rows and will grow by about 225 new rows each day.
Thoughts?