I just ran a test on this by streaming 1.7 GB of data from an Access module into a text file. The process took only three and a half minutes. Here's the code I used:
Code:
Sub f()
    Dim i As Long  ' Long is plenty for 12 million; no need for Double
    Open "c:\myfile.txt" For Append As #1
    ' Write 12 million comma-delimited records
    For i = 1 To 12000000
        Write #1, "String,String,String,String,String,String,String,String,String,String,String,String,String," & _
            "String,String,String,String,String,String,String,String,String,String,String,String,String," & _
            "String,String,String,String,String,String,String,String,String,String,String,String,String"
        ' Note: don't increment i yourself inside a For loop --
        ' an extra "i = i + 1" here would skip every other record
    Next i
    Close #1
End Sub
More than likely, you ARE overwhelming the program, especially if you are using a wizard. Try streaming the data in with VBA, just as I did above. My code produced 12 million comma-delimited records, but that's all it did. With your attempt you would obviously have to import it all into one table with a recordset, OR break it into a bunch of small tables and issue append operations.
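If you do need the data inside a table rather than a flat file, a minimal sketch of the recordset approach might look like this. This is just an illustration, not tested against your database: the table name "tblImport" and its text field "txtData" are hypothetical placeholders, and it assumes a DAO reference in your Access project.

```vba
Sub ImportLines()
    ' Hypothetical sketch: read a text file line by line and
    ' append each line to a table through a DAO recordset.
    ' Assumes a table "tblImport" with a Text field "txtData".
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim strLine As String

    Set db = CurrentDb
    Set rs = db.OpenRecordset("tblImport", dbOpenTable)

    Open "c:\myfile.txt" For Input As #1
    Do While Not EOF(1)
        Line Input #1, strLine   ' read one record
        rs.AddNew                ' start a new row
        rs!txtData = strLine
        rs.Update                ' commit the row
    Loop
    Close #1

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub
```

Be warned that per-row AddNew/Update is much slower than the raw file I/O above, which is exactly the point of the comparison.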
Give it a try. I think your RAM likes plain file I/O a lot more than programs trying to do the equivalent. After all, my code above ran on a refurbished laptop with a 1.5 GHz processor and 512 MB of RAM. Those are pretty sad specs!