  1. #16
    orange (Moderator)
    Windows XP | Access 2003
    Join Date: Sep 2009
    Location: Ottawa, Ontario, Canada; West Palm Beach FL
    Posts: 16,870
    I'm missing something. The title of the post indicated a SLOW process. The keyword approach, if applicable to your situation, changes a sequential read against ~419000 records, to an indexed search of ~3000 records.
    You should always work with normalized tables; then if necessary "de-normalize/restructure" for optimal performance.


    I don't understand the need for aggregate queries when trying to find a record or records.
    Minimize the size of the set of records involved and do subsequent processing if necessary. It's a bit like reading an entire recordset and then ignoring all but the 2 you need. Get the 2 records you need, then do the processing.
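    A rough sketch of the idea, using made-up names (tblKeywords, tblRecordKeyword, KeywordID, RecordID and ID are only for illustration and are not in your database): match exactly against the small, indexed keyword table, and only then touch Table1 for the matching rows, instead of running a wildcard scan over every ContactName.
    Code:
    SELECT T.*
    FROM (Table1 AS T
        INNER JOIN tblRecordKeyword AS RK ON T.ID = RK.RecordID)
        INNER JOIN tblKeywords AS K ON RK.KeywordID = K.KeywordID
    WHERE K.Keyword = "james";
    Only tblKeywords (~3000 rows) is searched via its index; the ~419000 rows of Table1 are never read sequentially.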
    Last edited by orange; 05-02-2015 at 05:01 AM.

  2. #17
    keviny04 (Competent Performer)
    Windows 7 64-bit | Access 2007
    Join Date: Apr 2015
    Posts: 128
    Originally Posted by orange:
    I'm missing something. The title of the post indicated a SLOW process. The keyword approach, if applicable to your situation, changes a sequential read against ~419000 records, to an indexed search of ~3000 records.
    You should always work with normalized tables; then if necessary "de-normalize/restructure" for optimal performance.
    I don't understand the need for aggregate queries when trying to find a record(s).
    Minimize the size of the set of records involved and do subsequent processing if necessary. It's a bit like reading an entire recordset and then ignoring all but the 2 you need. Get the 2 records you need, then do the processing.
    Did you look at the code and query I returned?
    Before your thoughts run off on a tangent any further, which I've seen you do to other posters on occasion, let me state point blank, as I have multiple times already, what I need right now: I need to know why there is a 1.5 to 2 second slowdown in aggregate queries on a half-million-row table. Your keyword approach, though nice, didn't answer that question. Your code and query didn't either. That is why I titled this thread the way I did: slow. Even though I could avoid the slowness by not using aggregate queries, I would still like to know the answer to the original question.

    Just to answer one of your tangential questions: my actual aggregate query is not just for finding a record. The actual query in my project is a summation query that returns records containing the substring being sought, hence the use of aggregate functions. The queries I posted in this thread were all "stripped-down" versions of the real thing, made only to demonstrate the slowdown, which is the only subject I'm asking about. I stripped them down to keep readers from latching onto tangential issues that were not relevant to my central question. Apparently, it didn't work.
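    Just so the shape is clear (this is not the real query, and the Amount field is a stand-in name), it is along these lines:
    Code:
    SELECT Table1.ContactName, Sum(Table1.Amount) AS TotalAmount
    FROM Table1
    WHERE Table1.ContactName Like "*james*"
    GROUP BY Table1.ContactName;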

  3. #18
    orange (Moderator)
    Windows XP | Access 2003
    Join Date: Sep 2009
    Location: Ottawa, Ontario, Canada; West Palm Beach FL
    Posts: 16,870
    Sorry for all the tangential things like normalization and use of indexes.
    I've gone back to your database to see where you have a summation. I didn't see one.
    Still unclear on exactly what you are trying to do.
    Perhaps you could tell us more about the summation, and show us the real query that uses it.

    As I said, the keyword approach may not be applicable, but if it is, it may get you some speed: it changes a sequential read against ~419000 records, to an indexed search of ~3000 records.

    After looking at all of your queries, I'll offer code for two guesses:

    Group and Count for "*james*"
    Code:
    SELECT Table1.ContactName, Count(Table1.ContactName) AS CountOfContactName
    FROM Table1
    GROUP BY Table1.ContactName
    HAVING (((Table1.ContactName) Like "*james*"));
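    If the HAVING clause turns out to be part of the cost, a variant worth trying (same Table1 and ContactName names as above) filters with WHERE before the grouping, so non-matching rows never reach the aggregation; for this query the results are the same, since the condition only involves the grouped field.
    Code:
    SELECT Table1.ContactName, Count(Table1.ContactName) AS CountOfContactName
    FROM Table1
    WHERE Table1.ContactName Like "*james*"
    GROUP BY Table1.ContactName;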
    DCount without grouping
    Code:
    ?DCount("contactname","table1","Contactname Like '*James*'")
     5159
    Good luck with your project.
