Page 1 of 2
Results 1 to 15 of 21
  1. #1
    fishhead is offline Competent Performer
    Windows 7 32bit Access 2010 32bit
    Join Date
    Apr 2015
    Posts
    167

    Primary keys vs indexes

Can someone help me better understand how the data is pulled from the back end to the front end over a network?

My back end database has a table with approx 20k records in it.
If only one user is on the network and their front end application is pulling data off the back end, things go rather quickly.
As soon as another user is on the network, the speed slows down greatly.

I've ruled out any networking issues (I'm running a gigabit switch and have pretty beefy hardware/speed etc.)



I found out last week the value of the primary key setting in each table. Just to make sure I have it correctly: the primary key is used for the field you are most likely going to be searching by - correct?
There are also indexing options in the table which seem to allow me to choose which fields would typically be searched and "index" them, with an ascending or descending order option. From what I understand this will also greatly speed up the searches used by the front end via queries etc.

So getting back to my question: when the front end asks the back end for a record, is the front end pulling all the records in the specific table or tables and finding the specific record once the data is on the front end computer? Or is the front end asking for a record, and the back end looking only at the primary key field and then the indexed fields - finding the correct record and then sending only that single record back to the front end?

    Does it slow things down if my table has 1 primary key field and 3 or 4 indexed fields?

Lastly, when the front end is running a query I can see the lower right-hand corner of the screen showing the progress of the query, but it doesn't show which specific query is being run. Is there a way to find out the query names, as a couple of these queries are taking a long time to process (especially with more than one user on the network)?

    thanks in advance.

  2. #2
    ranman256's Avatar
    ranman256 is online now VIP
    Windows Vista Access 2010 32bit
    Join Date
    Apr 2014
    Location
    Kentucky
    Posts
    9,521
    The primary key is there to prevent duplicates - e.g. in a tStates table, key the ST field
    (NY, FL, etc.) so you can't enter a duplicate.

    Index a field to make lookups/joins to another table faster -
    say in the tEmployee table, index the ST field to join to tStates.

    (but you wouldn't really do it for states.)
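    As a sketch of that idea (illustrative DDL - the table and field names here are just for the example), a small states table keyed on the abbreviation would be:

    Code:
    CREATE TABLE tStates (
        ST TEXT(2) CONSTRAINT pkST PRIMARY KEY,
        StateName TEXT(50)
    );

    Trying to insert "NY" a second time then fails with a key violation, which is exactly the duplicate prevention described above.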

  3. #3
    CJ_London is offline VIP
    Windows 10 Access 2010 32bit
    Join Date
    Mar 2015
    Posts
    11,397
    I found out last week the value of the Primary key setting in each table. Just to make sure i have it correctly, the primary key is used for the field you are most likely going to be searching by - correct?
    No - it is used to uniquely identify a record and is usually an autonumber. Other than identifying the record it has no meaning and is not normally visible to the user.
    There are also Indexing options in the table which seem to allow me to choose which fields would be typically searched and "index" them with an ascending or decending order option. From what i understand this will also greatly speed up the searches bused by the FrontEnd via queries etc.
    you just index the field, ascending and descending is a matter for presentation. All fields which you regularly search/join/sort on should be indexed - the exception being fields with a very limited range of values such as booleans.

    So getting back to my question, when the Frontend asks the backend for a record is the front end pulling all the records in the specific table or tables and finding the specific record once the data is on the Frontend computer? or is the Frontend asking for a record and the backend is only looking at the primary key field and then the indexed fields only - finding the correct record than sending only that single record back to the front end?
    It depends on how it asks - if the front end is a form and the form recordsource is a table or query name, all records will be pulled and subsequently filtered. It will only use the PK field if it is specifically used, i.e. WHERE PK=123. A criteria of WHERE lastname='smith' does not use the PK field.

    example

    you have a form with a recordsource of 'TableA' - all records will be pulled when the form is opened
    you have a form with a recordsource of 'SELECT * FROM TableA' - all records will be pulled when the form is opened
    you have a form with either of the above, but open with docmd.openform and set the WHERE parameter - all records will be pulled when the form is opened, but the records selected will be displayed first - this is actually a filter, not a criteria

    To solve the problem,

    1. give the form an empty recordset - i.e. your recordsource is something like 'select * from TableA WHERE False'
    2. now set a criteria - if using docmd.openform, pass what you had in the WHERE parameter in the OpenArgs parameter
    3. In the form's Load event put some code along the lines of:

    Code:
    Me.RecordSource = Replace(Me.RecordSource, "False", Me.OpenArgs)

    Note - make sure you do not sort the form until you have retrieved the data.

    Alternatively, perhaps your form has a combo to select a record and code along the lines of:

    Code:
    Me.Filter = "ID=" & cboSearch
    Me.FilterOn = True

    Just replace that code with:

    Code:
    Me.RecordSource = Replace(Me.RecordSource, "False", "ID=" & cboSearch)

    clearly, if you want to do multiple searches then the code can't replace False the second time, because you've already replaced it the first time. So instead your code might be something like:

    Code:
    Me.RecordSource = Replace(Me.RecordSource, Mid(Me.RecordSource, InStr(Me.RecordSource, "WHERE") + 6), "ID=" & cboSearch)
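    Putting steps 1-3 together, the form's Load event might look something like this (a sketch - it assumes the saved recordsource ends in WHERE False and that OpenArgs holds a valid criteria string):

    Code:
    Private Sub Form_Load()
        'swap the False placeholder for the criteria passed via OpenArgs
        If Not IsNull(Me.OpenArgs) Then
            Me.RecordSource = Replace(Me.RecordSource, "False", Me.OpenArgs)
        End If
    End Sub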

    Other factors such as using lookups in tables will slow things down, because when you retrieve the primary table all the lookup tables are retrieved as well - the same goes for subdatasheets.

    And on your forms you may have combos, listboxes and subforms. Usually the first two will be relatively short lists, but I've seen combos where the user is required to select a customer from a table with 25k records. Subforms also load the entire dataset and subsequently filter using the LinkMasterFields/LinkChildFields properties. Instead, use code in the main form's Current event, similar to the above, to modify the subform recordsource.


    Does it slow things down if my table has 1 primary key field and 3 or 4 indexed fields?
    No, indexes speed retrieval. If you have many indexes then inserting/updating can take a bit longer because the db has to update the indexes as well as the data.

    As far as query performance is concerned, indexing is the most important factor. This is followed by not using an initial * in a Like criteria (it prevents the index being used), and avoiding domain functions and subqueries. Also look at what is being processed - a 'view', i.e. a 'master' query whose results are subsequently filtered elsewhere in your application, will be a lot slower than if you can apply the criteria first.
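    To illustrate the Like point (field and values made up for the example):

    Code:
    WHERE lastname Like "*mit*"    <- leading *, the index cannot be used, full scan
    WHERE lastname Like "Smi*"     <- no leading *, the index can be used
    WHERE lastname = "Smith"       <- exact match, fastest if lastname is indexed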

    see this link about indexing https://www.access-programmers.co.uk...rmance.291268/

  4. #4
    orange's Avatar
    orange is online now Moderator
    Windows 10 Access 2016
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    See Primary Key

    You can put some Debug.Print statements into your code (e.g. before and after a query execution) to show how much time the query took to run.

    Here's a sample:

    Code:
    .....
    20       Debug.Print Now & "  -  " & "Starting to process tblTelemetryEvents "
              Dim intFile As Integer
              Dim DelSQL As String
    30       DelSQL = "DELETE From tblTelemetryEvents;"
    .....

    And some output from the Debug.Prints:
    Code:
    27-Jul-20 6:24:07 PM  -  Starting to process tblTelemetryEvents 
    27-Jul-20 6:24:07 PM  -  Deleted old contents of the tblTelemetryEvents 
    27-Jul-20 6:24:07 PM  -  Checking to ensure no contiguous State codes false/false|ture/true
     record # 1 has desc false  OK to review remaining records
    27-Jul-20 6:24:07 PM ** contiguous True found at record 17014 so Delete it
    27-Jul-20 6:24:07 PM  -  Creating the output file 'c:\users\jack\documents\BellInternetDrops...txt'
    27-Jul-20 6:24:07 PM  -  Outages processed: 388
    27-Jul-20 6:24:07 PM  -  Finished
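    Along the same lines, the VBA Timer function can bracket a query if you want elapsed seconds rather than timestamps (tblTelemetryEvents is reused from the sample above - substitute your own query):

    Code:
    Dim t As Single
    t = Timer
    CurrentDb.Execute "DELETE FROM tblTelemetryEvents;", dbFailOnError
    Debug.Print "Delete took " & Format(Timer - t, "0.00") & " seconds"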
    Good luck.
    Last edited by orange; 07-27-2020 at 05:43 PM. Reason: spelling

  5. #5
    ssanfu is offline Master of Nothing
    Windows 7 32bit Access 2010 32bit
    Join Date
    Sep 2010
    Location
    Anchorage, Alaska, USA
    Posts
    9,664
    Quote Originally Posted by fishhead View Post
    Just to make sure i have it correctly, the primary key is used for the field you are most likely going to be searching by - correct?
    No, the PK field is mostly used to link tables and is set to hold unique values. The PK field in one table links to the FK field in a different table (one-to-many relationship).
    I use an autonumber type field as the PK field in all of my tables.

    Maybe start by watching Indexing Fields in Microsoft Access Database Tables and Databases with Microsoft Access 10 - Field Indexing, Primary Keys, Index Creation

    Also see Microsoft Access Tables: Primary Key Tips and Techniques

  6. #6
    Join Date
    Apr 2017
    Posts
    1,673
    Does every user have his/her own FE?

    Are users all in same LAN as BE?

  7. #7
    fishhead is offline Competent Performer
    Windows 7 32bit Access 2010 32bit
    Join Date
    Apr 2015
    Posts
    167
    I'm sort of following what you're saying here....

    Here's what im seeing at my end.

    On the front end I'm running a query asking for 4 fields of data, and it is returning 27k+ records very fast.
    So that tells me either all the records are being pulled from the back end into my front end very fast, OR my front end is just showing me the 27k+ records that are sitting on the back end.

    I then run a 2nd query based on the 1st query, asking for one specific record only. The thinking is: if the data is already on my front end and I'm only looking for 1 of the 27k+ records, this should also happen quickly - but it doesn't. This query takes approx 3 seconds to run (which doesn't seem like a lot, but it is only one part of a larger function that is taking over 10 seconds to run).

    If the 2nd query is sorting through the 27k+ records while they are still on the back end computer and only showing me the result on the front end, I don't understand why it takes 3 seconds to do this when pulling all the records takes only 1 second. It seems to me the "load on the system" is in the volume of records, not the 1 specific record?

    Can you help me understand what's happening here?
    btw I've also indexed the table on the back end by the field that I'm searching on in the FE query.

    thanks in advance.

  8. #8
    CJ_London is offline VIP
    Windows 10 Access 2010 32bit
    Join Date
    Mar 2015
    Posts
    11,397
    On the Frontend i'm running a query asking for 4 fields of data and it is returning +27k records very fast.
    are you sure it is fast? - all records have not been returned until the record count figure in the navigation bar at the bottom of the query has been populated. Access will show the first few records - typically a 'screenful' - as soon as it can, but that does not mean the entire recordset has been returned.

    I then run a 2nd query based on the 1st query asking for a specific record only. The thinking is if the data is on my Frontend already and i'm only looking for 1 of the +27k records this should also happen quickly but it doesn't - this query will take approx 3 seconds to run.
    because Access has to rerun the first query. The fact that you have already run it is irrelevant. Think of it this way: you have previously taken a train from London to Edinburgh (your first query). Now you are catching a train from London to a station on the Edinburgh route (your second query). What impact does your earlier train journey have on your current journey?

    Your objective should be to retrieve as few records as possible - so your criteria should be applied as soon as possible, i.e. within your first query, so the second query does not need any criteria - and probably isn't even necessary.
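    In SQL terms, instead of a first query that pulls everything and a second that filters it, put the criteria into the first query (table, field and form names here are illustrative):

    Code:
    SELECT Field1, Field2, Field3, Field4
    FROM tblMyTable
    WHERE SomeIndexedField = [Forms]![frmSearch]![txtValue];

    With SomeIndexedField indexed, the back end can jump straight to the matching rows and only those rows travel over the network.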

    Another example - you have a bucket of different coloured balls and want to know how many green balls are in the bucket. Your first query takes each ball out and separates the balls into each colour group (your first query). Then you count the green balls (your second query). Wouldn't it be quicker to take each ball out and if it is green, count it?

    To refine this slightly, this example assumes no indexing - the balls are randomly located in the bucket. If the bucket was indexed, the balls would be in separate layers for each colour so you could just go to the green layer and count what is there.

    it might be an idea to post the SQL of your two queries - we may be able to suggest improvements

  9. #9
    sgtdetritus's Avatar
    sgtdetritus is offline Novice
    Windows 7 64bit Access 2010 32bit
    Join Date
    Sep 2018
    Posts
    19
    I find myself in a similar situation. I have been trying to improve DB performance in Access since we all had to go and work from home. We used to all work on the on-site LAN, but now we all get onto the LAN through a VPN. As you can imagine, we have taken an enormous hit on performance. I have explored migrating to SQL Server or some other solution and been shut down by the IT gods, so I am left with pure Access solutions.

    Anyway, I have 40k records in my main working table with a lot more columns than I probably should (but that was necessary at the time). When running queries to put records into a form for editing, I sometimes have to build the queries into layers due to the fact that I will have 9 or 10 criteria in order to narrow things down to a workable record set.

    If what I read above is correct, simply indexing one of the search criteria is not going to result in a performance boost, is that correct? If so, should I index each criteria?

    Thanks for your help

  10. #10
    CJ_London is offline VIP
    Windows 10 Access 2010 32bit
    Join Date
    Mar 2015
    Posts
    11,397
    If so, should I index each criteria?
    the easy answer is yes (any field that is joined/filtered/sorted), but it depends on what is in the field. If a wide variety of values, definitely yes. If a lot of nulls, I would say yes, but set the index to ignore nulls. If only a few values (such as booleans or something like male/female or ethnic group) then probably not.
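    Both kinds of index can be created with DDL run once against the back end, if you prefer that to the table designer (names illustrative):

    Code:
    CREATE INDEX idxLastName ON tblCustomers (LastName);
    CREATE INDEX idxNoteRef ON tblCustomers (NoteRef) WITH IGNORE NULL;

    The WITH IGNORE NULL clause is the DDL equivalent of setting the index's Ignore Nulls property in the designer.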

    It is a case of try it and see, there is no hard and fast rule.

    A bigger impact on performance when you have slow connections is to minimise the data that needs to be brought back - see comments in this thread

    I sometimes have to build the queries into layers due to the fact that I will have 9 or 10 criteria in order to narrow things down to a workable record set.
    this smacks of an Excel approach - queries should be ordered to bring back the least number of records from the get-go

  11. #11
    sgtdetritus's Avatar
    sgtdetritus is offline Novice
    Windows 7 64bit Access 2010 32bit
    Join Date
    Sep 2018
    Posts
    19
    Ajax, so for a point of clarity: when I layer the queries, the first query will bring back everything with a certain vendor name from the table; the next query will bring back all the results of the first query within a particular date range. Is that what you mean by an Excel approach? The table itself sits in the back end and the queries are all on the front end. I was taught long ago, when I built my first few builds, that this is the efficient approach when dealing with complex sets of variables in a query. It is entirely possible I was misinformed.

  12. #12
    sgtdetritus's Avatar
    sgtdetritus is offline Novice
    Windows 7 64bit Access 2010 32bit
    Join Date
    Sep 2018
    Posts
    19
    Should I spin mine off for a separate question? I don't want a duplicate thread, but I also don't want to hijack a thread either...

  13. #13
    orange's Avatar
    orange is online now Moderator
    Windows 10 Access 2016
    Join Date
    Sep 2009
    Location
    Ottawa, Ontario, Canada; West Palm Beach FL
    Posts
    16,716
    Can you show us the design of your table(s)?
    And tell us briefly what your searches involve-even an example or 2.

  14. #14
    sgtdetritus's Avatar
    sgtdetritus is offline Novice
    Windows 7 64bit Access 2010 32bit
    Join Date
    Sep 2018
    Posts
    19
    Well the table is very wide, it's a byproduct of what I have to do to load new records. The source data comes in from a huge SAP report twice a month. The table lives on a shared drive on the LAN. All of the columns are needed on one form or another. I trimmed out what I don't need on the level of the SAP report so this is actually less than what we used to have to cope with. ID Number is the primary key since it is unique.
    Claim Vendor Submit To Vendor PN Desc SN On PN Ret SN Off Remove Date A/C Model A/C SN Defect Corrective Action RO PO ID Number Partner Name (sold_to_name) Return Note ID (ret_note) Return Note Text (ret_note_desc) Purchase Order PI Payer Id (po_pi) Vendor Denial Text (ven_den_code_desc) Disp__Code_Description (Disp Code Descr) Vendor Contract Ref Contract Spend Type (007) Order Type (obj_type) GL Date Special Instructions CIC Hours LVR WB Hours Labor Hours Labor Rate Total Labor Part Quantity Material Completion ManUp VendorBA External Invoices Freight Material Handling Vendor Handling Fees Shop Materials Total Net Cost Submit Yes or No Group To Action Comments Submit Date SAP Claim # Accting Doc # Credit Memo # Accepted/Denied Amount Paid Agreed Adjustment Balance DENIAL CODE Vendor Reply Vendor labor hours Vendor Labor Rate TOTAL MIsc Fees USD Handling Fee USD Freight USD Overal Claim Value USD Balance Due Complete Submission comments Paid Date Balance net-paid FY Model Calc Total Labor Calc Total Spend Submit By Credit Entered By Credit Entered Date SAP Entry By SAP Entry Date Last Review by Last Review Date Vendor Material Cost AC ID EIS Date Warranty Claim Age Submittal Age MSTOOL date
    (One sample record followed - claim 12295972 for PN GL542-3101-25, UNIT, CONTROL DISPLAY, EMS - but its column alignment did not survive the paste.)

    Here is the SQL from one of the queries that moves at an absolute crawl.

    SELECT [2016 & 2017 Master File].Claim, [2016 & 2017 Master File].Vendor, [2016 & 2017 Master File].[Submit To Vendor], [2016 & 2017 Master File].PN, [2016 & 2017 Master File].Desc, [2016 & 2017 Master File].[SN On], [2016 & 2017 Master File].[PN Ret], [2016 & 2017 Master File].[SN Off], [2016 & 2017 Master File].[Remove Date], [2016 & 2017 Master File].[A/C Model], [2016 & 2017 Master File].[A/C SN], [2016 & 2017 Master File].Defect, [2016 & 2017 Master File].[Corrective Action], [2016 & 2017 Master File].RO, [2016 & 2017 Master File].PO, [2016 & 2017 Master File].[ID Number], [2016 & 2017 Master File].[Partner Name (sold_to_name)], [2016 & 2017 Master File].[Return Note ID (ret_note)], [2016 & 2017 Master File].[Return Note Text (ret_note_desc)], [2016 & 2017 Master File].[Purchase Order PI Payer Id (po_pi)], [2016 & 2017 Master File].[Vendor Denial Text (ven_den_code_desc)], [2016 & 2017 Master File].[Disp__Code_Description (Disp Code Descr)], [2016 & 2017 Master File].[Vendor Contract], [2016 & 2017 Master File].[Ref Contract], [2016 & 2017 Master File].[Spend Type (007)], [2016 & 2017 Master File].[Order Type (obj_type)], [2016 & 2017 Master File].[GL Date], [2016 & 2017 Master File].[Special Instructions], [2016 & 2017 Master File].[CIC Hours], [2016 & 2017 Master File].[LVR WB Hours], [2016 & 2017 Master File].[Labor Hours], [2016 & 2017 Master File].[Labor Rate], [2016 & 2017 Master File].[Total Labor], [2016 & 2017 Master File].[Part Quantity], [2016 & 2017 Master File].Material, [2016 & 2017 Master File].[Completion ManUp VendorBA], [2016 & 2017 Master File].[External Invoices], [2016 & 2017 Master File].Freight, [2016 & 2017 Master File].[Material Handling], [2016 & 2017 Master File].[Vendor Handling Fees], [2016 & 2017 Master File].[Shop Materials], [2016 & 2017 Master File].[Total Net Cost], [2016 & 2017 Master File].[Submit Yes or No], [2016 & 2017 Master File].[Group To Action], [2016 & 2017 Master File].Comments, [2016 & 2017 Master 
File].[Submit Date], [2016 & 2017 Master File].[SAP Claim #], [2016 & 2017 Master File].[Accting Doc #], [2016 & 2017 Master File].[Credit Memo #], [2016 & 2017 Master File].[Accepted/Denied], [2016 & 2017 Master File].[Amount Paid], [2016 & 2017 Master File].[Agreed Adjustment], [2016 & 2017 Master File].[Balance net-paid], [2016 & 2017 Master File].[DENIAL CODE], [2016 & 2017 Master File].[Vendor Reply], [2016 & 2017 Master File].[Vendor labor hours], [2016 & 2017 Master File].[Vendor Labor Rate], [Vendor labor hours]*[Vendor Labor Rate] AS TOTAL, [2016 & 2017 Master File].[MIsc Fees USD], [2016 & 2017 Master File].[Handling Fee USD], [2016 & 2017 Master File].[Freight USD], [2016 & 2017 Master File].[Overal Claim Value USD], [2016 & 2017 Master File].[Balance Due], [2016 & 2017 Master File].Complete, [2016 & 2017 Master File].[Submission comments], [2016 & 2017 Master File].[Paid Date], [2016 & 2017 Master File].[Calc Total Labor], [2016 & 2017 Master File].[Calc Total Spend], [2016 & 2017 Master File].[Submit By], [2016 & 2017 Master File].[Credit Entered By], [2016 & 2017 Master File].[Credit Entered Date], [2016 & 2017 Master File].[SAP Entry By], [2016 & 2017 Master File].[SAP Entry Date], [2016 & 2017 Master File].[Last Review by], [2016 & 2017 Master File].[Last Review Date], [2016 & 2017 Master File].[Vendor Material Cost], [2016 & 2017 Master File].[EIS Date], [2016 & 2017 Master File].[Warranty Claim Age], [2016 & 2017 Master File].[Submittal Age], [2016 & 2017 Master File].[EIS Date], [2016 & 2017 Master File].[Warranty Claim Age], [2016 & 2017 Master File].[Submittal Age], [2016 & 2017 Master File].[EIS Date], [2016 & 2017 Master File].[Warranty Claim Age], [2016 & 2017 Master File].[Submittal Age], [2016 & 2017 Master File].[MSTOOL date]
    FROM [2016 & 2017 Master File]
    WHERE ((([2016 & 2017 Master File].Vendor) Like "*" & [Forms]![Front Form]![VendorName] & "*") AND (([2016 & 2017 Master File].[Submit Yes or No]) Like "Yes") AND (([2016 & 2017 Master File].Complete)<>Yes))
    ORDER BY [2016 & 2017 Master File].Claim;



    The query is displayed to the user in a split form, as some prefer to use a datasheet view and some like the form that shows one record at a time. The query in question is on the front end of the split database, which is in .accdr format since my users only have the runtime and not the full version of Access 2010 (I don't want them poking around and changing things anyway). Their application has a front form that they can use to specify a vendor to narrow things down. I know the dang thing is big, hairy, and inelegant, but this is what I have.

    What should I Index, and how?

  15. #15
    CJ_London is offline VIP
    Windows 10 Access 2010 32bit
    Join Date
    Mar 2015
    Posts
    11,397
    so for a point of clarity, when I layer the queries, the first query will bring back everything with a certain Vendor name from the table. the next query will bring back all the results of the first query with a particular date range. Is that what you mean by an excel approach?
    yes - and it would appear you are using an Excel approach to table design as well

    include the dates in your first query and do away with the second query. Plus, your vendor and date fields should be indexed. You are also sorting, which takes time, so the Claim field should also be indexed.
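    As a sketch, run once against the back end (this assumes GL Date is the date field you filter on - substitute whichever field it really is):

    Code:
    CREATE INDEX idxVendor ON [2016 & 2017 Master File] (Vendor);
    CREATE INDEX idxGLDate ON [2016 & 2017 Master File] ([GL Date]);
    CREATE INDEX idxClaim ON [2016 & 2017 Master File] (Claim);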

    [2016 & 2017 Master File].Vendor) Like "*" & [Forms]![Front Form]![VendorName] & "*")
    the initial * here prevents the use of indexing, and I would think it should be = rather than Like anyway. Also the Vendor field, being text, will potentially be slow - you should be using a primary key for the vendor - look at the link I provided in post #3 of this thread.

    [2016 & 2017 Master File].[Submit Yes or No]) Like "Yes"
    should be ="Yes", assuming this is a text field. Otherwise drop the quotes, as you have here:

    ([2016 & 2017 Master File].Complete)<>Yes
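    Putting the three criteria together, a corrected WHERE clause might look like this (a sketch - it assumes Vendor is matched exactly and Complete is a Yes/No field):

    Code:
    WHERE [2016 & 2017 Master File].Vendor = [Forms]![Front Form]![VendorName]
      AND [2016 & 2017 Master File].[Submit Yes or No] = "Yes"
      AND [2016 & 2017 Master File].Complete = False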



    table and field names should not start with numbers and should not include non-alphanumeric characters such as #()& - these all have meaning in SQL and VBA. Ideally you should not have spaces either. You are also using reserved words, which will not help either.

    The query is displayed to the user in a split form as some prefer to use a data sheet view and some like the form that shows one record at a time.
    I suspect it is returning too many records.

    Their application has a front form that they can use to specify a vendor to narrow things down.
    you need to narrow it down further before returning any records. Also consider not returning so many fields until the user requests them.

    you appear to have one big table as downloaded from SAP. Suggest you take a look at this link https://www.access-programmers.co.uk.../#post-1714358 for an idea of what you may need to do.

    We used to all work from the on site LAN but now we all get onto the LAN from a VPN. As you can imagine we have taken an enormous hit on performance. Since I have explored and been shut down by the IT gods on trying to migrate to SQL server or some other solution, so I am left with pure access solutions.
    normalising your data, using indexes and minimising network traffic will all help, but moving to SQL Server will have minimal effect, since the problem is the VPN. To address the VPN problem your IT people need to boost the size of the VPN pipeline, but even then you may only see an improvement of 20-30%, if that. If working from home is to be the new normal, you need to invest in Terminal Server or Citrix.


