  1. #1
    SteveApa is offline Advanced Beginner
    Windows 7 64bit Access 2010 64bit
    Join Date
    Jan 2015
    Posts
    89

    Running Concatenate LONG string speed - alloc mem issue?

    Hi all...
I am writing to a SQL server. I created a function that builds an INSERT INTO string: I pass it the table name, field names, and values, and it returns the INSERT INTO statement. I concatenate each returned string onto a running string to build one huge string of transactions, then push that string to the server so everything is written at once.

One line is one transaction with 25 field names and their corresponding values. The string length of one line is 800 characters.

In this example I am writing 147k lines to the server. The issue is that it is slow, so I break it down to writing 25k lines (transactions) at a time, which happens 6 times. With this method, each 25k-line batch takes 1.5 minutes to CREATE the string of 25k transactions and 20 seconds to write it to the server. The string of 25k lines = 25k lines * 800 characters per line = 20,000,000 characters.

I suspect the slowness of the concatenation (I am a self-taught 1992 Borland C programmer) is that memory is re-allocated every time I concatenate the new string onto the running string. I added code to time each 1000 lines of concatenation, and the longer it runs, the longer each additional 1000 lines takes. If it is a re-allocation issue, could I allocate the memory first, like malloc in C?
Time to create the running string:
5k lines = 5 s
10k lines = 15 s
15k lines = 35 s
25k lines = 90 s
Writing 25k lines to the server = 20 s

So the trade-off in speed is between how many times I write to the server and how many times I concatenate. I could write each line to the server individually, but that would be a world slower.
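Side note for readers: the growth in the timings above (5 s → 15 s → 35 s → 90 s for 5k/10k/15k/25k lines) is the signature of quadratic-cost repeated concatenation, where each append can copy the whole accumulated string. A minimal sketch of the naive pattern versus a buffered one — in Python rather than VBA, purely to illustrate the principle:

```python
def concat_naive(lines):
    # Appending onto one running string may copy the whole accumulated
    # string on every iteration: O(n^2) total character copies.
    s = ""
    for line in lines:
        s = s + line
    return s

def concat_buffered(lines):
    # Collect the pieces and join once at the end: O(n) total copies.
    return "".join(lines)

# Both produce the same result; only the cost profile differs.
lines = ["x" * 800 for _ in range(1000)]  # 1000 lines of 800 chars each
assert concat_naive(lines) == concat_buffered(lines)
```

In VBA the usual equivalents are writing into a pre-allocated buffer with `Mid$` statements, or, as a later post in this thread does, spreading the appends across many smaller strings so no single string grows huge.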



I could create 5 concatenations of 5k lines each (5k took about 5 s), so 5 * 5 s = 25 s, plus 20 s to write to the server. Then I would do this 6 times to get all 147k lines written.

Any help on optimizing the running concatenation?

    Code:
    bla bla bla...
alldata = rs_local.GetRows(rs_local.RecordCount) ' (column, row): (0, 1) = value in column 0, row 1
rs_local.Close
Set rs_local = Nothing
bla bla bla...


For j = 1 To loop_thru_lines ' loop through rows
    times_in_25000 = times_in_25000 + 1
    For i = 1 To loop_thru_fields ' loop through columns, same as rs_local.Fields.Count
        value_array_1d(i) = alldata(i - 1, j - 1)
    Next i
    sql_string_returned = Insert_Into_Create_Sql_String_For("[" & sql_server_table_name & "]", field_names_array_1d, value_array_1d)
    sql_string = sql_string & sql_string_returned ' I THINK this is the cause of the slowness - needing to reallocate memory every time
    If times_in_25000 = 25000 Then
        times_in_25000 = 0
        times_in = times_in + 1
        Check_Set_Sql_Server_Connection
        SQL_SERVER_ADODB_CN.Execute sql_string
        sql_string = ""
    End If
Next j
Check_Set_Sql_Server_Connection
SQL_SERVER_ADODB_CN.Execute sql_string
    Thanks
    Steve
    Harrisburg, PA
    Last edited by SteveApa; 06-16-2024 at 09:43 PM. Reason: extra detail

  2. #2
    Join Date
    Jun 2010
    Location
    Belgium
    Posts
    1,180
I do hope you're not sending 25,000 separate inserts to the SQL server. If you do, the SQL DBA will be thinking of killing you slowly in 25,000 steps.
Can't you upload the data using SQL pass-through queries instead of creating insert lines? Or use an SSIS package to do the job?

  3. #3
    Minty is offline VIP
    Windows 10 Office 365
    Join Date
    Sep 2017
    Location
    UK - Wiltshire
    Posts
    3,157
Unless those concatenated field names change for every iteration, that is about as inefficient an insert method as you can have.
Even if there were 20 or 50 variations, 50 separate bulk insert statements would be infinitely quicker.
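One way to get such bulk statements is to batch many rows into a single INSERT. A sketch in Python rather than VBA, just to show the string shape — the helper name and the naive quoting are mine, real code should parameterize or escape values properly, and SQL Server caps a single `INSERT ... VALUES` at 1000 row value expressions:

```python
def build_multirow_insert(table, fields, rows):
    # One INSERT with many VALUES tuples instead of one INSERT per row.
    # NOTE: naive quoting for illustration only - never use on untrusted data.
    def lit(v):
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"
    cols = ", ".join("[" + f + "]" for f in fields)
    tuples = ",\n".join("(" + ", ".join(lit(v) for v in row) + ")" for row in rows)
    return "INSERT INTO [" + table + "] (" + cols + ") VALUES\n" + tuples + ";"

stmt = build_multirow_insert("MyTable", ["Id", "Name"], [(1, "a"), (2, "O'Brien")])
```

Sending one such statement per 1000 rows means 147 round trips instead of 147,000.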
    DLookup Syntax and others http://access.mvps.org/access/general/gen0018.htm
    Please use the star below the post to say thanks if we have helped !
    ↓↓ It's down here ↓↓

  4. #4
    SteveApa is offline Advanced Beginner
    Windows 7 64bit Access 2010 64bit
    Join Date
    Jan 2015
    Posts
    89
Hi Minty, the field names don't change. I am pushing 147k lines (25 fields per line) from a local Access table to a SQL Server table with VBA. First I delete all rows in the server table, then INSERT INTO the new lines. I do this once per week. Doing an append query in Access is much slower. Is there any other automated method you would suggest for getting data from a local Access table into a SQL Server table?

I changed the code last night and got it down from 750 s to 176 s. Guessing that memory is reallocated every time I concatenate a new string onto the running string, I created an array of 2000 strings used as ~64k-character blocks: once an array element exceeds 64,000 characters in length I move on to the next one. Now approximately 1700 of the array elements are used, each around 64,000 characters long.

Now it takes 6 seconds to create this array full of transaction strings and 170 seconds to write it to the server in 4 separate pushes.
    Code:
start_timer = Timer
loop_thru_lines = UBound(alldata, 2) - LBound(alldata, 2) + 1 ' number of rows
loop_thru_fields = UBound(alldata, 1) - LBound(alldata, 1) + 1 ' number of columns, same as rs_local.Fields.Count
Dim blocks_of_64k(1 To 2000) As String
Dim block_used As Integer
block_used = 1


For j = 1 To loop_thru_lines ' loop through rows
    For i = 1 To loop_thru_fields ' loop through columns
        value_array_1d(i) = alldata(i - 1, j - 1)
    Next i
    sql_string_returned = Insert_Into_Create_Sql_String_For("[" & sql_server_table_name & "]", field_names_array_1d, value_array_1d)
    If Len(blocks_of_64k(block_used)) > 64000 Then
        block_used = block_used + 1 ' current block is full - move to the next one
    End If
    blocks_of_64k(block_used) = blocks_of_64k(block_used) & sql_string_returned
Next j
sql_string = ""
For i = 1 To 2000
    sql_string = sql_string & blocks_of_64k(i)
    If i = 500 Or i = 1000 Or i = 1500 Or i = 2000 Then
        Check_Set_Sql_Server_Connection
        SQL_SERVER_ADODB_CN.Execute sql_string
        sql_string = ""
    End If
Next i
Last edited by SteveApa; 06-17-2024 at 07:03 AM. Reason: typo

  5. #5
    SteveApa is offline Advanced Beginner
    Windows 7 64bit Access 2010 64bit
    Join Date
    Jan 2015
    Posts
    89
You say, "Can't you upload the data using SQL pass-through queries instead of creating insert lines?" How would I do an Access pass-through of a local table to the SQL server?

  6. #6
    Minty is offline VIP
    Windows 10 Office 365
    Join Date
    Sep 2017
    Location
    UK - Wiltshire
    Posts
    3,157
Okay, that clarifies things a bit, thanks for the added information.
You may well be better off using a BULK INSERT in SQL; that works remarkably efficiently in my experience.

I'm guessing that the SQL server is remote/cloud-based (Azure?) and not local?
Does the SQL table have a lot of indexes on it?

Have you tried SELECT INTO a new SQL table just as a comparison?
It can be a lot quicker in certain situations.

  7. #7
    SteveApa is offline Advanced Beginner
    Windows 7 64bit Access 2010 64bit
    Join Date
    Jan 2015
    Posts
    89
Again: I have a local Access db that creates a local table by calculating and pulling data from different sources (Oracle, Denodo, SQL Server). The local table is 147k lines long with 25 columns. I want to move this local table to a shared SQL server. The Power BI service does a scheduled pull of this data from the SQL server weekly. My understanding is that the SQL server is in the US.

How do I do a "Bulk Insert in SQL"? I am a mechanical engineer, self-taught in Access / VBA / SQL / SQL Server (and C / C++ from the '90s).
How do I write an Access pass-through to read a local table and push it to the server?

  8. #8
    Minty is offline VIP
    Windows 10 Office 365
    Join Date
    Sep 2017
    Location
    UK - Wiltshire
    Posts
    3,157
Okay - so it's a remote SQL Server.
I suspect that will be troublesome with that many records in one go, unless you have a very fast, stable connection, so you may already have programmed your way around the issue as efficiently as you can.

A bulk insert is a specific SQL method of importing an external file into a SQL database.

https://learn.microsoft.com/en-us/sq...l-server-ver16

It basically takes a text file of some description (CSV, fixed width, etc.) and imports it directly from SQL.
You can use SSMS (SQL Server Management Studio) to do it "by hand" or get involved with scripting a solution to use it.

I don't think the pass-through option is going to help here, as there is a limit on the size of statement that can be sent as a pass-through, although I don't know what it is off the top of my head.

So your process would be to take your Access-processed data, export it to a specific compatible text file, then bulk import it into SQL.
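The export step described above can be as simple as writing the table out as a delimited file. A sketch using Python's standard `csv` module (the function name is mine; the delimiter and terminator choices must match whatever FIELDTERMINATOR/ROWTERMINATOR options the server-side BULK INSERT uses):

```python
import csv

def export_rows_to_csv(path, fields, rows):
    # Write a header row plus data rows; server-side, BULK INSERT can skip
    # the header with FIRSTROW = 2 and read the rest of the file directly.
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(fields)
        writer.writerows(rows)
```

From Access itself, `DoCmd.TransferText` or `TransferSpreadsheet` plays the same role of producing the flat file for the server to import.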

  9. #9
    Join Date
    Jun 2010
    Location
    Belgium
    Posts
    1,180
I don't know that you need a text file. You can create an SSIS package that imports data directly from Access.
Working with a text file does have the advantage that you can upload it to the SQL server location, so the import doesn't go over the network.
If the SQL table has a lot of indexes, you can add instructions to the package or script to drop the indexes before the import and add them again after the import has finished.
But in any case, I would start the import from the SQL Server environment with a package or a script, not push the data from Access using VBA.

  10. #10
    SteveApa is offline Advanced Beginner
    Windows 7 64bit Access 2010 64bit
    Join Date
    Jan 2015
    Posts
    89
Thanks. I work for a global company, so I have resources I can ask about creating a procedure on the server that pulls data from an exported text file on a schedule.
But please comment on the speed of what I have done:
From Access VBA, I push a 147k-row local table with 25 columns of data (dates / decimals / strings) to a SQL server. I got it down to 6 seconds to create the string of transactions to push (actually 1700 strings). I do 4 separate pushes of data (concatenating 425 of the strings together for each push) to the server. The pushes take 150 s, so the total time is 156 s. I do this once per week.

The table I am pushing is not indexed; it is a flat table that I have Access create with a make-table query.

Would I expect an automated import of a text file on the server side to be much faster? Network speed is always good and stable.
    Last edited by SteveApa; 06-18-2024 at 06:52 AM. Reason: additional info

  11. #11
    Minty is offline VIP
    Windows 10 Office 365
    Join Date
    Sep 2017
    Location
    UK - Wiltshire
    Posts
    3,157
Just over 2.5 minutes to export 147k rows. I don't think that is too bad given the remote connection to the server.

However, just as an experiment, I have just selected 130k records from an MS Azure-hosted SQL table into a new Access table using SELECT INTO, and it took under 10 seconds.
The reverse - exporting the same table from Access into a new Azure SQL table - has been running for over 3 minutes and looks like it's only about 25% of the way through.

Some of this may be due to the connectivity here: our upload speed is very poor compared to the download connection.

So yes, depending on the state of your connectivity, your current method looks like acceptable performance to me.
