I want to load some test data into a table to test .. the performance of an Access database and run a query against it.
So .. yes, 10 million records. The records will be quite small .. but lots of them.
Well .. after some testing, I decided that it is insane to run a SQL statement .. to load the data. I did it the old fashioned way. I made a huge txt file with 11 million+ lines in it. That took a few minutes.
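For anyone wanting to do the same, a file like that can be generated with a short script. This is a minimal sketch in Python; the column names and record layout are made-up examples, not the actual schema:

```python
import csv
import random

def write_test_file(path, rows):
    """Write a tab-delimited text file that Access's import wizard can read.
    Column names and values here are hypothetical placeholders."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["ID", "Category", "Value"])  # header row for the wizard
        for i in range(1, rows + 1):
            writer.writerow([i, random.randint(1, 100), round(random.random(), 4)])

# 11 million+ rows in the real run; a small count here for illustration:
write_test_file("testdata.txt", 1000)
```

Tab-delimited with a header row keeps the import wizard happy and lets it pick up the field names automatically.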
Then .. imported the file into my table in Access.
One hell of a lot faster ... but the jury is still out on how Access will perform when querying the data. That is for another thread maybe.
But I'm looking for normalization / optimization suggestions. My relationships are below .. mainly because I don't know how to delete them.