I have a grid that prices by Width and Height. It's an x/y lookup table that people currently read manually from measurements. I am trying to determine the proper way of normalizing the data.
My unit of measure is the eighth of an inch (0.125); no metric units are involved.
I only have about 200 "grids" to normalize; however, each "grid" often gets quarterly updates. I would like to structure things in a way that makes updating a breeze.
Columns are always widths, rows are always lengths.
While thinking this through, I ran into two choices:
1) Use a query in "steps": if a measured width falls between two columns, it takes the value from the next larger column. I have never done this, but I believe it can be done.
Example: 36.125 is priced by Column F because it is greater than 36.
This seems logical, as it saves me the time of normalizing 200 grids.
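To show what I mean by option 1, here is a minimal sketch of the "step" lookup in Python. The breakpoints and prices below are made-up placeholder values, not our real pricing; the idea is just that a measurement rounds up to the next column/row breakpoint at or above it.

```python
import bisect

# Hypothetical grid: sorted column/row breakpoints plus a price matrix.
width_breaks = [12, 24, 36, 48, 60]   # column headers (widths, inches)
height_breaks = [15, 30, 45, 60]      # row headers (heights, inches)

# prices[row][col] is the price for height_breaks[row] x width_breaks[col]
prices = [
    [10, 12, 15, 19, 24],
    [12, 14, 18, 23, 29],
    [14, 17, 22, 28, 35],
    [17, 21, 27, 34, 42],
]

def step_lookup(width, height):
    """Round a measurement UP to the next breakpoint and return its price.

    An exact match (e.g. 36) uses its own column; anything over it
    (e.g. 36.125) steps up to the next larger column."""
    col = bisect.bisect_left(width_breaks, width)
    row = bisect.bisect_left(height_breaks, height)
    if col == len(width_breaks) or row == len(height_breaks):
        raise ValueError("measurement exceeds the grid")
    return prices[row][col]
```

The same "first breakpoint greater than or equal to the measurement" rule can be expressed in a query (e.g. a TOP 1 subquery ordered by breakpoint), so the raw 200 grids would not need to be expanded at all.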
2) Modify the data to incorporate the eighths.
Adding rows in 0.125-inch increments from 15 through 86 leaves me with 569 rows.
Adding columns in 0.125-inch increments from 12 through 96 leaves me with 673 columns.
This would result in over 382K unique values for each "grid".
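As a quick sanity check on those counts, the arithmetic for option 2 works out like this (working in whole eighths to avoid floating-point drift):

```python
def count_steps(lo, hi):
    """Number of values from lo to hi inclusive, stepping by 1/8 inch."""
    return int(hi * 8) - int(lo * 8) + 1

rows = count_steps(15, 86)   # 569 rows
cols = count_steps(12, 96)   # 673 columns
cells = rows * cols          # 382,937 cells per grid
```

Across 200 grids that is over 76 million rows to maintain and re-import every quarter, which is why option 1 looks attractive.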
Our measurements are often priced against multiple grids to find the best option. We average about 30 measurements each time we price.
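The "price against multiple grids, keep the best" step could be as simple as the sketch below. The vendor names and pricing functions are purely hypothetical stand-ins for real grid lookups:

```python
# Each "grid" here is a hypothetical pricing function for illustration only.
grids = {
    "vendor_a": lambda w, h: 0.05 * w * h + 10,
    "vendor_b": lambda w, h: 0.04 * w * h + 18,
}

def best_price(width, height, grids):
    """Price one measurement against every grid and keep the cheapest."""
    return min(
        ((name, fn(width, height)) for name, fn in grids.items()),
        key=lambda pair: pair[1],
    )
```

Run per measurement (about 30 at a time in our case), this returns the winning grid and its price for each one.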
I have added an image from Excel to give you a visual on this; hopefully it helps.
On a side note:
I am planning on using this with an Access web database (which I have never built before), and there will be multiple user devices (if that matters).
Thank you for taking the time to read this. We've been using Excel with a local file, but I think this is the best way to keep our pricing from being distributed to our competition.