Page 2 of 2
Results 16 to 21 of 21
  1. #16
    Minty is online now VIP
    Windows 10 Office 365
    Join Date
    Sep 2017
    Location
    UK - Wiltshire
    Posts
    3,001
    If you need to calculate currency conversions at the rate applicable on the specific transaction date, and will reuse that information all over the place, I would store the exchange rate at the time as part of your transaction line data.



    That way you have the original data and can easily convert it with a simple inline bit of maths, and if the exchange rate is ever wrong you can amend it without messing with the original data.

    It may sound slightly un-normalised (it sort of is), but sometimes the pain of complex calculations and multiple lookup joins will make things unmanageable.
    If the rates change on a daily basis, then you are only really storing relevant data, to be honest.

    Just my 2 cents worth.
    DLookup Syntax and others http://access.mvps.org/access/general/gen0018.htm
    Please use the star below the post to say thanks if we have helped!
    ↓↓ It's down here ↓↓
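    To make Minty's suggestion concrete, here's a rough sketch in Python with SQLite standing in for the Access/Azure back end. All table and column names here are invented for illustration; the point is that the rate lives on the transaction line, conversion is inline maths, and a wrong rate can be amended without touching the original figure.

    ```python
    import sqlite3

    # Hypothetical transaction-line table: the exchange rate captured on the
    # transaction date is stored alongside the original amount.
    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE TransactionLines (
            LineID    INTEGER PRIMARY KEY,
            TxnDate   TEXT,
            AmountUSD REAL,   -- original amount, never modified
            FxRate    REAL    -- USD->AUD rate captured on the transaction date
        )
    """)
    con.execute("INSERT INTO TransactionLines VALUES (1, '2021-04-01', 200.0, 1.31)")

    # Converting is a simple inline bit of maths, no lookup joins needed.
    amount_aud = con.execute(
        "SELECT AmountUSD * FxRate FROM TransactionLines WHERE LineID = 1"
    ).fetchone()[0]
    print(round(amount_aud, 2))  # 262.0

    # If the stored rate turns out to be wrong, amend it without touching
    # the original USD figure.
    con.execute("UPDATE TransactionLines SET FxRate = 1.29 WHERE LineID = 1")
    ```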

  2. #17
    Micron is offline Virtually Inert Person
    Windows 10 Access 2016
    Join Date
    Jun 2014
    Location
    Ontario, Canada
    Posts
    12,737
    You could post a pic of your db relationships in a new thread and at least link to this one for explanations of the business. It might even be better to just copy/paste what's relevant from the explanation into your new thread, just to keep everything tidy and together. However, I know nothing about Azure and can only guess that you can migrate these tables (or at least repeat the structure) to it.

    I don't see an issue with the data being in Excel, as there are several ways to get it into Access. If that data is manually input, why not get rid of the spreadsheets and input directly into Access, either into your linked Azure tables or ones local to the db? The same goes if the Excel data is pushed or pulled from another application rather than being keyed in: why not go directly from that app to Access and simplify the whole process?

    As for your last question, I can't really relate to what you're saying even after I review the above. Perhaps I'm missing the part that explains it, but my eyes just start glazing over when I read a lot of unfamiliar jargon. Must be an age thing. Are you saying that something has a cost, but the current cost depends on a date range? And the cost data needs to be historical as well?
    The more we hear silence, the more we begin to think about our value in this universe.
    Paraphrase of Professor Brian Cox.

  3. #18
    stildawn is offline Competent Performer
    Windows 10 Office 365
    Join Date
    Oct 2013
    Posts
    185
    Quote Originally Posted by Minty View Post
    If you need to calculate currency conversions by rate applicable on the specific transaction date and will reuse that information all over the place, I would store the exchange rate at the time as part of your transaction line data.

    That way you have the original data and can easily convert it with a simple inline bit of maths, and if the exchange rate is ever wrong you can amend it, without messing with the original data.

    It may sound slightly un-normalised (it sort of is) but sometimes the pain of complex calculations and multiple look up joins will make things unmanageable.
    If the rates change on a daily basis, then you are only really storing relevant data to be honest.

    Just my 2 cents worth.
    There are no exchange rate conversions, sorry; I was using the name of the data type for the fields. They are just dollar amounts (USD for the FRT rates and AUD for the locals, but the user already knows this).


    Quote Originally Posted by Micron View Post
    You'd post a pic of your db relationships in a new thread and at least link to this one for explanations of the business. Might even be better to just copy/paste what's relevant about the explanation into your new thread just to keep everything tidy and together. However, I know nothing about Azure and can only guess that you can migrate these tables (or at least repeat the structure) to it.
    I have migrated to Azure for the current version, but it's a band-aid solution: all I do is pull all the data from Azure on application start and populate the "badly designed" local tables.

    Currently my tables have no relationships between them; it's the queries that put them all together.

    Quote Originally Posted by Micron View Post
    I don't see an issue with the data being in Excel as there are several ways to get it into Access. If that data is manually input, why not get rid of the spreadsheets and input directly into Access, either into your linked Azure tables or ones local to the db? Same if the Excel data is pushed or pulled from another application rather than being keyed in. Just go directly from that app to Access and simply the whole process?
    The data isn't manually input; that is the main point of the whole project, to reduce manual data entry as much as possible.

    We receive costs from shipping lines (carriers) in various formats (Excel, PDF, email, etc.), and using Excel I push that data into Access all at once, in less than a minute. We are talking 300+ lines of data per carrier per month at least, so manual data entry is not an option; otherwise the whole project has no point.

    Quote Originally Posted by Micron View Post
    As for your last question I can't really relate to what you're saying even after I review the above. Perhaps I'm missing the part that explains it, but my eyes just start glazing over when I read a lot of unfamiliar jargon. Must be an age thing. Are you saying that some thing has a cost but the current cost depends on a date range? The cost data needs to be historical as well?
    I'll try to explain it using a real life example.

    Our client wants to import some TVs to sell, so they ask us for a quote to arrange the international shipping. Due to the number of TVs, we need to load them into a full 20-foot shipping container (20GP).

    There are various shipping lines (Carriers) that provide shipping between the factory in Shanghai, China (POL) and our client in Sydney, Australia (POD).

    We ask the shipping line "ANL" (Carrier) for their costs to us (this actually happens automatically). They send us a base cost for a 20-foot shipping container (20GP Cost) of $200.00 from Shanghai (POL) to Sydney (POD), valid for April (Valid From / Valid To). They also tell us the time it takes for their vessels to sail between Shanghai and Sydney, which is 16 days (Transit).

    However, this month there is also an extra cost to cover the ship's fuel (20GP BAF) of $100.00, which is valid for the last 2 weeks of April (Valid From / Valid To).


    The sales support employee receives all this data from the shipping lines (Carriers) and puts it into our Freight Rate Program (Access database).

    Our salesman, who was asked to quote the client, looks up ANL (Carrier) rates for Shanghai (POL) and Sydney (POD) valid for April (Valid From / To). The Freight Rate Program (Access database) then shows the "ALL IN", which is the 20-foot container base cost (20GP Cost) plus any additionals, in this case the ship's fuel cost (20GP BAF).

    Because the fuel cost was only valid for the last 2 weeks of April, he sees 2 rows: one valid for 1/4/21 to 14/4/21, which is just the base cost (20GP Cost) of $200.00 (20GP All In; this is what the chain of queries resulting in Master_Data2 produces), and one valid for 15/4/21 to 30/4/21, which is base cost (20GP Cost) $200.00 + fuel cost (20GP BAF) $100.00 = $300.00 ALL IN (20GP All In). Both rows also show the transit time (Transit).

    In real life there would be around 10 different Carriers, not just ANL, and the salesman would pick the best option based on cost and transit time.


    So in summary, the Master_Data2 chain of queries adds the base costs and the additional costs together to get one cost that covers everything, and it does this by comparing the valid dates and making sure only the applicable costs are added together.
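    The date-splitting logic described above can be sketched in plain Python (rather than the actual Access SQL of the Master_Data2 queries, which we haven't seen): split the timeline at every Valid From / Valid To boundary and sum only the charges applicable in each segment. The charge names come from the TV example; the function itself is hypothetical.

    ```python
    from datetime import date, timedelta

    def all_in_rows(charges):
        """charges: list of (name, amount, valid_from, valid_to) tuples.
        Returns (segment_start, segment_end, all_in_total) rows."""
        # Boundary dates: each charge contributes its start date and the day
        # after its end date, so segments begin and end on rate changes.
        bounds = sorted({c[2] for c in charges} |
                        {c[3] + timedelta(days=1) for c in charges})
        rows = []
        for start, nxt in zip(bounds, bounds[1:]):
            end = nxt - timedelta(days=1)
            # Only sum charges whose validity covers the whole segment.
            total = sum(amt for _, amt, f, t in charges if f <= start and end <= t)
            if total:
                rows.append((start, end, total))
        return rows

    charges = [
        ("20GP Cost", 200.0, date(2021, 4, 1),  date(2021, 4, 30)),
        ("20GP BAF",  100.0, date(2021, 4, 15), date(2021, 4, 30)),
    ]
    for start, end, total in all_in_rows(charges):
        print(start, end, total)
    # 2021-04-01 2021-04-14 200.0
    # 2021-04-15 2021-04-30 300.0
    ```

    This reproduces the two rows from the example: base cost only for 1-14 April, base plus BAF for 15-30 April.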

  4. #19
    Micron is offline Virtually Inert Person
    Windows 10 Access 2016
    Join Date
    Jun 2014
    Location
    Ontario, Canada
    Posts
    12,737
    Thanks for that very detailed and clear explanation. So I'm thinking: why not treat this like a PO and PO Line Items scenario, with the difference being that your PO line items are the carrier details and costs for a move, sorted by Carrier? Thus you might have 12 records for 10 carriers because 2 of them have extras. A summary cost per group might be doable if that query is a crosstab that can include a calculation as a row header, but I'm probably getting ahead of myself there. Each record would have the valid to/from dates as separate fields. One could even display the balance of days left between now and the expiry of the cost. The parent data here seems to be the transaction (the movement of TVs), so I'm seeing that as the PO header, aka the job quote. Whatever you end up with for a tables design, it will likely be far better than what you have now if it's done right and follows/parallels one standard approach or another.

    In these cases, most of the time we have the know-how to set up tables but lack the business process knowledge, and the OP has the opposite problem. Sometimes it's a toss-up as to who has to learn which in order to figure it all out and get someone started down the right path. Either way, it's a process, I think. At any rate, I can see why this all stems from Excel, and I think you are justified in using Access, because Excel is not a database and will let you down if you try to make it be one.
    Last edited by Micron; 04-12-2021 at 11:40 PM. Reason: clarification
    The more we hear silence, the more we begin to think about our value in this universe.
    Paraphrase of Professor Brian Cox.
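    The quote-header / line-items layout Micron describes might look something like this, again with SQLite standing in for the Access/Azure tables. Every table and column name here is invented; it just shows one parent record per job quote, one child record per carrier charge, and a summary cost per carrier with the days left before the earliest charge expires.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE Quotes (               -- the 'PO header': one job quote
            QuoteID   INTEGER PRIMARY KEY,
            ClientRef TEXT,
            POL       TEXT,                 -- port of loading
            POD       TEXT                  -- port of discharge
        );
        CREATE TABLE QuoteCharges (         -- the 'PO line items': one row per charge
            ChargeID  INTEGER PRIMARY KEY,
            QuoteID   INTEGER REFERENCES Quotes(QuoteID),
            Carrier   TEXT,
            Charge    TEXT,
            Amount    REAL,
            ValidFrom TEXT,
            ValidTo   TEXT
        );
    """)
    con.execute("INSERT INTO Quotes VALUES (1, 'TVs', 'Shanghai', 'Sydney')")
    con.executemany("INSERT INTO QuoteCharges VALUES (?,?,?,?,?,?,?)", [
        (1, 1, 'ANL', '20GP Cost', 200.0, '2021-04-01', '2021-04-30'),
        (2, 1, 'ANL', '20GP BAF',  100.0, '2021-04-15', '2021-04-30'),
    ])
    # Summary cost per carrier, plus days left (as of 12 April 2021) before
    # the earliest charge on the quote expires.
    for row in con.execute("""
            SELECT Carrier, SUM(Amount),
                   CAST(julianday(MIN(ValidTo)) - julianday('2021-04-12') AS INTEGER)
            FROM QuoteCharges WHERE QuoteID = 1 GROUP BY Carrier
        """):
        print(row)   # ('ANL', 300.0, 18)
    ```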

  5. #20
    CJ_London is offline VIP
    Windows 10 Access 2010 32bit
    Join Date
    Mar 2015
    Posts
    11,397
    I’m on my phone and cannot see the db, but a few points on performance:

    1. Ensure all fields used for joins and criteria are indexed.
    2. Limit your data using criteria as soon as possible in your queries - having a 'master' query is the polar opposite. This may mean repeating SQL code across different queries.
    3. Avoid UDFs and domain functions - they won't transfer to Azure anyway.
    4. Avoid union queries - they ignore indexing and are frequently indicative of poor design.
    5. Use numeric fields for linking/criteria wherever possible - indexing is more efficient for larger datasets.
    6. Avoid an initial * when using Like criteria wherever possible, as it negates the use of indexing. Train users to enter it if required.
    7. Investigate using non-standard joins, which enable linking tables rather than using subqueries.
    8. Avoid left and right joins wherever possible - applying relationship rules will help prevent their excessive use.

    A lot of this comes from good normalised design; if your design is based on Excel structured tables that are not normalised on import, you will not be able to apply many of these.

    Be aware that moving to Azure is unlikely to produce any performance benefits (it could even be slower due to network performance) unless your data is properly normalised.
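    Points 1 and 6 above can be demonstrated with SQLite's query planner (the Jet/ACE engine behind Access behaves similarly in spirit, though its tooling differs). With an index on the criteria field, an equality test can seek the index, while a Like with a leading wildcard forces a full table scan. Table and index names here are made up.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE Rates (POL TEXT, POD TEXT, Cost REAL)")
    con.execute("CREATE INDEX idx_pol ON Rates(POL)")

    def plan(sql):
        # EXPLAIN QUERY PLAN returns rows whose last column describes the step.
        return " ".join(r[3] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

    # An indexed equality criterion can use the index to seek directly...
    print(plan("SELECT * FROM Rates WHERE POL = 'Shanghai'"))
    # ...but a leading wildcard negates the index and scans the whole table.
    print(plan("SELECT * FROM Rates WHERE POL LIKE '%hanghai'"))
    ```

    The first plan reports a search using `idx_pol`; the second reports a scan of `Rates`, which is exactly the cost point 6 warns about on large datasets.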

  6. #21
    Micron is offline Virtually Inert Person
    Windows 10 Access 2016
    Join Date
    Jun 2014
    Location
    Ontario, Canada
    Posts
    12,737
    How would I know my design attempts are not optimal?
    Build and test using only basic queries to start with. Or build the tables and relationships and post a pic of the relationships for commentary. Getting the tables right is the most important part. Once that happens, getting the data from Excel can be tackled. One way is to link the spreadsheets and then use Access queries to get the data into the proper tables if the sheets are not normalized (which they most likely are not).
    The more we hear silence, the more we begin to think about our value in this universe.
    Paraphrase of Professor Brian Cox.
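    That last step, using queries to get linked-spreadsheet data into normalised tables, can be sketched like this: treat the linked sheet as a staging table, then run append queries to split it into parent and child tables. SQLite stands in for Access here, and all names are invented.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE SheetImport (          -- stands in for the linked worksheet
            Carrier TEXT, POL TEXT, POD TEXT, Cost REAL
        );
        CREATE TABLE Carriers (CarrierID INTEGER PRIMARY KEY, Name TEXT UNIQUE);
        CREATE TABLE Rates (
            RateID    INTEGER PRIMARY KEY,
            CarrierID INTEGER REFERENCES Carriers(CarrierID),
            POL TEXT, POD TEXT, Cost REAL
        );
    """)
    con.executemany("INSERT INTO SheetImport VALUES (?,?,?,?)", [
        ("ANL",    "Shanghai", "Sydney", 200.0),
        ("ANL",    "Shanghai", "Sydney", 300.0),
        ("Maersk", "Shanghai", "Sydney", 250.0),
    ])
    # Append queries: first the distinct carriers, then the rates keyed to them.
    con.execute("INSERT INTO Carriers (Name) SELECT DISTINCT Carrier FROM SheetImport")
    con.execute("""
        INSERT INTO Rates (CarrierID, POL, POD, Cost)
        SELECT c.CarrierID, s.POL, s.POD, s.Cost
        FROM SheetImport s JOIN Carriers c ON c.Name = s.Carrier
    """)
    print(con.execute("SELECT COUNT(*) FROM Carriers").fetchone()[0])  # 2
    print(con.execute("SELECT COUNT(*) FROM Rates").fetchone()[0])     # 3
    ```

    In Access the same thing is two append queries run against the linked sheet, after which the staging data can be discarded.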


