Page 1 of 2
Results 1 to 15 of 17
  1. #1
    robs23 is offline Competent Performer
    Windows 7 64bit Access 2010 32bit
    Join Date
    Apr 2014
    Posts
    109

    Split database - how to make it run faster?


    Hi Guys,

    I recently split my database into two parts: a back-end (tables) and a front-end (forms and the rest). The reason for doing so was that the back-end will be shared by multiple users working on it at the same time. The split itself went well - fast and without any surprises. What surprised me is what happened next - my database (the front-end part) started to work significantly slower! I didn't expect that. There's one form in particular, designed to calculate a new project end date based on several chosen steps (it analyzes the relations between steps and their durations). Before I split my db, it took about 1 minute for my script to calculate and find the end date (it's quite a complicated process), and that was acceptable. Afterwards, it takes about 20 minutes! That is way over the top! How can I bring this back down to a reasonable time?
    I must say that the script heavily uses functions I wrote to retrieve specific pieces of information - for example, to retrieve a step's duration from a related table. Each function opens a DAO recordset, takes out the data and closes the DAO database afterwards. I think this might be the problem, as these functions run many times during the process, and I've heard that it's better to always keep a connection to the back-end database open. And I now think that the statement
    Set db = Nothing
    kills the connection every time, right?
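
    For reference, a lookup function of the kind described probably looks roughly like this (a sketch only - the table and field names tblSteps, StepID and Duration are invented for illustration, not the real schema):

    ```vba
    Public Function GetStepDuration(lStepID As Long) As Long
        Dim db As DAO.Database
        Dim rs As DAO.Recordset
        Set db = CurrentDb
        Set rs = db.OpenRecordset("SELECT Duration FROM tblSteps " & _
                                  "WHERE StepID = " & lStepID, dbOpenSnapshot)
        If Not rs.EOF Then GetStepDuration = rs!Duration
        rs.Close
        Set rs = Nothing
        Set db = Nothing    ' released again on every single call
    End Function
    ```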

  2. #2
    RuralGuy's Avatar
    RuralGuy is offline Administrator
    Windows 7 64bit Access 2013
    Join Date
    Mar 2007
    Location
    8300' in the Colorado Rocky Mountains
    Posts
    12,922
    It does kill the connection. Here are a couple of links to review in case you don't already have them:
    http://www.granite.ab.ca/access/performancefaq.htm
    http://www.accessmvp.com/TWickerath/.../multiuser.htm

  3. #3
    John_G is offline VIP
    Windows 7 32bit Access 2010 32bit
    Join Date
    Oct 2011
    Location
    Ottawa, ON (area)
    Posts
    2,615
    Hi -

    I'm not sure it would make a difference, but did you try just linking to the back end tables, instead of opening and closing a connection to the back end each time?

    One test you could do is make a copy of the back end database on your local hard drive, link or connect to that copy instead of the network copy, and then run your procedures. If they no longer run unacceptably slowly, then you can be quite sure the slowdown is caused by your network.

    How many records are those complex functions having to process?

    John

  4. #4
    orange's Avatar
    orange is online now Moderator
    Windows XP Access 2003
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    Did you streamline your recordsets? Ask only for the fields and records that you need to work on - don't go for all records; select the appropriate ones (use a WHERE clause and/or indexing as required). Don't bring an entire table across the connection just to work on one or two records.

    I wouldn't be dropping the connection. There is info somewhere (I can't find it quickly, but I think it was Albert Kallal or Tony Toews) suggesting that you link to a hidden table and keep that link open throughout your session. Re-establishing a connection repeatedly causes slowdown (IIRC).

    see this from Luke Chung
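
    As a concrete sketch of the difference (table and field names invented):

    ```vba
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Set db = CurrentDb

    ' Wasteful: opens the whole table, dragging every row across the network
    Set rs = db.OpenRecordset("tblDependencies", dbOpenSnapshot)

    ' Better: an indexed WHERE clause lets the engine return only
    ' the rows and fields you actually need
    Set rs = db.OpenRecordset("SELECT StepID, ParentStepID, Duration " & _
        "FROM tblDependencies WHERE StepID = 42", dbOpenSnapshot)
    ```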

  5. #5
    RuralGuy's Avatar
    RuralGuy is offline Administrator
    Windows 7 64bit Access 2013
    Join Date
    Mar 2007
    Location
    8300' in the Colorado Rocky Mountains
    Posts
    12,922
    Orange: I included the link to Tony's site in my post.

  6. #6
    John_G is offline VIP
    Windows 7 32bit Access 2010 32bit
    Join Date
    Oct 2011
    Location
    Ottawa, ON (area)
    Posts
    2,615
    Orange -

    I have noticed something interesting about linked tables - it appears a connection is not established until you actually open one of those tables. When I look at the folder where the linked tables are, there is no .ldb file until I actually open or access one of those linked tables (via a form, recordset, etc.). When I close the table, the .ldb goes away again.

    I don't know how relevant that is to this discussion (!), but it's interesting to note (IMHO).

    john

  7. #7
    RuralGuy's Avatar
    RuralGuy is offline Administrator
    Windows 7 64bit Access 2013
    Join Date
    Mar 2007
    Location
    8300' in the Colorado Rocky Mountains
    Posts
    12,922
    I believe that is exactly how you would want it to operate; otherwise you would lose some control over your system. I have an invisible form that opens from AutoExec and opens a table in the BE. It is just a housekeeping form and is the last one to close when exiting. That persistent connection is important.
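
    One common sketch of such a persistent connection, done in code rather than with a form (the back-end path is a placeholder):

    ```vba
    ' In a standard module of the front-end
    Private m_dbBackEnd As DAO.Database

    Public Sub OpenPersistentConnection()
        ' Hold a database reference for the whole session so the
        ' lock file on the back-end is created once and kept open
        If m_dbBackEnd Is Nothing Then
            Set m_dbBackEnd = DBEngine.OpenDatabase("\\server\share\BackEnd.accdb")
        End If
    End Sub

    Public Sub ClosePersistentConnection()
        If Not m_dbBackEnd Is Nothing Then
            m_dbBackEnd.Close
            Set m_dbBackEnd = Nothing
        End If
    End Sub
    ```

    Call the first from AutoExec (or the startup form) and the second when the application exits.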

  8. #8
    robs23 is offline Competent Performer
    Windows 7 64bit Access 2010 32bit
    Join Date
    Apr 2014
    Posts
    109
    Quote Originally Posted by John_G View Post
    did you try just linking to the back end tables, instead of opening and closing a connection to the back end each time?
    How do I do that?

    Quote Originally Posted by John_G View Post
    One test you could do is make a copy of the back end database on your local hard drive, link or connect to that copy instead of the network copy, and then run your procedures. If they no longer run unacceptably slowly, then you can be quite sure the slowdown is caused by your network.
    I did this already - when the back end is on my local drive, my script runs in about the original time (1 minute).

    Quote Originally Posted by John_G View Post
    How many records are those complex functions having to process?
    Quite many. The total number of possible steps to be chosen is about 80. The total number of records containing dependencies between steps is about 4500. Of course, it doesn't transfer the whole table of 4500 records, only a recordset containing the steps that the given step depends on; the most dependent step depends on about 60 other steps. My script goes through all 80 possible steps, checks whether each is chosen and tries to calculate its deadline (latest possible date of completion). If it can't (for example, one of the superior steps doesn't yet have its own deadline), it skips it and goes on. When it hits the last step and there are still steps without deadlines, it goes through the process one more time (skipping those already calculated) until all deadlines are found. Sorry if my explanation isn't clear enough, but I find it really hard to explain..

    Quote Originally Posted by orange View Post
    Did you streamline your recordsets? Only ask for the fields/records that you need to work on--don't just go for all records that must be transferred for your use --select the appropriate records (use where clause and/or indexing as required). Don't be bringing an entire table across the connection to work on one or two records.
    Of course. I create recordsets transferring exactly what matches my query. The thing is, I transfer them many times during the script's run. It's hard to estimate, but my functions are probably called about 100-200 times per run, and each call starts with
    Set db = CurrentDb
    and ends with
    Set db = Nothing
    So, in other words, I believe my script opens and closes the connection that many times. Am I right?

    Quote Originally Posted by orange View Post
    I wouldn't be dropping the connection. There is info somewhere Ican't find it quickly but I think it was
    Albert Kallal or Tony Toews) who suggest you link to a hidden table and keep that link throughout your session. Reestablishing a connection repeatedly causes slowdown
    And that's what I want to do. Two questions:
    - what is the best way of doing this? As you guys mentioned, linking to a "dummy" table might not keep the connection open all the time..
    - if I set my front ends to keep the connection open for their whole sessions, will that make the back end read-only for other users?

    Robert

  9. #9
    orange's Avatar
    orange is online now Moderator
    Windows XP Access 2003
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    Sorry Allan, I didn't mean to repeat; I should have recognized the granite.ab site as Tony's.

    Here's the link I was searching for regarding persistent connection.
    The link from FMS(Luke Chung) has many performance tips.

    Robert,
    I recommend you look at the tips in the links provided. There are suggestions there.
    No, the BE will not be read only.

    Good luck.

  10. #10
    robs23 is offline Competent Performer
    Windows 7 64bit Access 2010 32bit
    Join Date
    Apr 2014
    Posts
    109
    Hi Guys,

    Ok, I looked into this and applied FMS's solution (a module that opens a connection at log-in and closes it at log-out). Now I can see that the .laccdb file is created at start and removed only at log-out. Unfortunately, this hardly improves my script's speed at all. What else can I do? I already changed the tables' subdatasheet property to [NONE], as that had also been pointed out as a possible solution.
    I really think I'm missing something.. I've already applied the two - in my opinion - most speed-affecting solutions, and the speed has hardly changed at all.. I'm afraid I'll have to rebuild my script, and I really wouldn't like to do that.
    Another thing that crossed my mind: is there any - I don't really know what to call it - log with a detailed timeline of execution? I mean, I'd like to see which line of code was executed at what time. I think that would let me see where the biggest time losses are...

    Robert

  11. #11
    orange's Avatar
    orange is online now Moderator
    Windows XP Access 2003
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    Robert,

    The Immediate window keeps the last x number of messages from Debug.Print (I think about 150).
    You can get quite sophisticated with logging. For a relatively quick approach, I'd suggest the following.

    In your VBA, in each sub or function, I have used this sort of thing:
    Debug.Print Now & " " & "Entering XXXXXXX"
    and
    Debug.Print Now & " " & "Exiting XXXXXXX"
    to get a trace of program flow.

    For this you have to add these lines manually at the start and exit of each sub and function, substituting the procedure name for the XXXXXXX.
    If you want to monitor at a finer level of detail, you could put similar code wherever you want it.
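
    Since Now only has one-second resolution, a variant using Timer (seconds since midnight, with fractions) can show where time goes within a second - a sketch:

    ```vba
    Public Sub TraceStep(sLabel As String)
        ' Prints the seconds elapsed since the previous call;
        ' the very first call prints the time since midnight
        Static dblLast As Double
        Debug.Print Format(Timer - dblLast, "0.00") & "s  " & sLabel
        dblLast = Timer
    End Sub

    ' Usage inside a procedure:
    '   TraceStep "Entering CalcDeadlines"
    '   ...
    '   TraceStep "Finished pass 1"
    ```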

    Good luck and let us know how it goes.

  12. #12
    John_G is offline VIP
    Windows 7 32bit Access 2010 32bit
    Join Date
    Oct 2011
    Location
    Ottawa, ON (area)
    Posts
    2,615
    Hi -

    In many cases, when you execute a select query or create a recordset, the whole table - 4500 records - is transferred so that the final selection can be made - and from your description, you do this a lot. In other words, even if the final recordset contains only 10 records, the entire table must be transferred to build it, because all the processing is done locally.

    If you are only querying that table of 4500 records and not actually updating it, then before you start your calculations you could make a local copy of it in the front-end and use that for all your calculations. 4500 records is not very many, really, and would not take long to copy.
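
    A sketch of that approach (table names invented):

    ```vba
    ' Copy the linked dependency table into a local working table,
    ' run the calculations against the copy, then drop it
    CurrentDb.Execute "SELECT * INTO tblDepsLocal FROM tblDependencies", dbFailOnError
    ' ... run the deadline calculations against tblDepsLocal ...
    CurrentDb.Execute "DROP TABLE tblDepsLocal", dbFailOnError
    ```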

  13. #13
    robs23 is offline Competent Performer
    Windows 7 64bit Access 2010 32bit
    Join Date
    Apr 2014
    Posts
    109
    @orange : thanks for your answer. I was rather hoping for a tool that already exists, but hey - beggars can't be choosers, right? I think I can get very sophisticated with this. By the way, can I get, e.g., the number of the code line the interpreter is currently at?

    @John_G : I wanted to avoid this, but perhaps it can be a life saver here. I think I'll just drop those 2 or 3 tables that the script uses into the front end at the beginning of the run, and delete them from the front end when the script ends.. If it turns out, however, that creating/deleting those tables takes a considerable amount of time, I might be better off keeping them in the front end all the time and only updating them from the back end periodically..

    BTW, since running a split database takes more time to fetch/add data from the back end, I plan to add a progress bar between different screens. I already have a procedure prepared that grows the progress bar (made using a rectangle) to a given percentage value (from 0 to 100%). It all works well. Until now I've only used it during my script's run, by measuring how many records it has been through / how many are left. Is there any way to measure, for example, a form's loading state? Is there any method I can use?
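
    The rectangle-based update described above might be sketched like this (control names invented; the bar rectangle sits on top of a fixed track rectangle):

    ```vba
    Public Sub UpdateProgress(frm As Form, dblFraction As Double)
        ' Resize the bar to the given fraction of the track's width
        If dblFraction < 0 Then dblFraction = 0
        If dblFraction > 1 Then dblFraction = 1
        frm!boxBar.Width = CLng(frm!boxTrack.Width * dblFraction)
        frm!lblPercent.Caption = Format(dblFraction, "0%")
        frm.Repaint    ' force a redraw while the loop is still running
    End Sub
    ```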

  14. #14
    orange's Avatar
    orange is online now Moderator
    Windows XP Access 2003
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    Robert,

    If you get more sophisticated, you could:
    -add a boolean flag to turn this debugging on or off (that way you can leave the debug code in the procedure)
    -create a procedure that adds the logging code to existing or new subs/functions
    -write the information to a table or a file rather than the Immediate window

    Let us know or show us some code if you go this route.
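
    A sketch of the table variant (assuming a local table tblLog with a Date/Time field LogTime and a text field LogMsg - names invented):

    ```vba
    Public g_bTraceOn As Boolean    ' flip to False to silence all logging

    Public Sub LogIt(sMsg As String)
        If Not g_bTraceOn Then Exit Sub
        CurrentDb.Execute "INSERT INTO tblLog (LogTime, LogMsg) VALUES (Now(), '" & _
            Replace(sMsg, "'", "''") & "')", dbFailOnError
    End Sub
    ```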

  15. #15
    robs23 is offline Competent Performer
    Windows 7 64bit Access 2010 32bit
    Join Date
    Apr 2014
    Posts
    109
    Hi,

    I decided to make local copies of the tables my script uses at application launch and remove them when the app ends.. I don't think any other way would bring the desired performance.
    One more time, big thanks to all of you for your priceless ideas and pointers!

    Robert
