What is the QuickEst Way of Cultivating Your Cost Data?
When it comes to providing a feasibility report to a client, how do you do it?
Below I have set out two ways of reaching a feasibility estimate: the first through traditional methods, the second using a proprietary system. I have also estimated the time spent on each approach.
Sit back and enjoy the story…
There are two ‘like for like’ mid-sized PQS firms, ‘Redcar Surveying Ltd’ and ‘Bluecar Surveying Ltd’ (the company names are fictional). Tim is an associate QS for the red team; Tom holds a similar position with the blues. Both QSs have been tasked by their respective bosses with providing a feasibility estimate for an upcoming project, ideally using past project data.
Tim starts this task by acquiring the key to the basement, where previous jobs are archived. Once inside the data den, Tim spends approximately an hour looking through hundreds of boxes (blowing the dust to the floor) to find three historic projects similar to the one proposed. Tim then returns the key and sits at his desk to analyse the data. He spends two hours getting the information into a spreadsheet; the process takes this long because the historic jobs did not have a consistent elemental breakdown, so Tim has to ‘map’ like-for-like data and roll it up to a consistent level to report from. The next step for Tim is to use Google to try to find values for the location and time indices. Once found, Tim spends another 30 minutes recalculating the elements to rebase them to the proposed building’s base date and location, 30 more minutes normalising the data to the proposed footprint, and yet another 30 minutes running averages on the rebased data.
Now comes the interesting part: trying to get a report out of said spreadsheet that is presentable and represents the values and standards of Redcar Surveying Ltd. This took 90 minutes. SIX HOURS LATER – we have a PDF to email to the client…
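As an aside for readers unfamiliar with those rebasing and normalising steps, here is a minimal sketch of the arithmetic Tim is doing by hand in his spreadsheet. It is written in Python purely for illustration; the element names, index values and floor area are assumptions of mine, not data from either firm or from QuickEst.

```python
# A minimal sketch of the rebasing arithmetic described above: take a
# historic project's elemental costs, rebase them to the proposed
# building's time and location using published indices, then normalise
# to a rate per square metre of floor area. All names and figures here
# are illustrative assumptions, not real tender or index data.

# Hypothetical historic project: elemental costs (GBP) at its own base
# date and location, plus its gross internal floor area (GIFA, m2).
historic_costs = {
    "Substructure": 450_000,
    "Frame": 820_000,
    "External Walls": 610_000,
}
historic_gifa_m2 = 5_200

# Hypothetical index values (e.g. a tender price index and regional
# location factors relative to a national baseline).
historic_time_index = 152
proposed_time_index = 171
historic_location_factor = 0.94
proposed_location_factor = 1.08

# Combined rebasing factor: bring the historic costs forward in time
# and across to the proposed location.
rebase = (proposed_time_index / historic_time_index) * (
    proposed_location_factor / historic_location_factor
)

# Rebase each element and normalise to GBP per m2 of GIFA - the rate
# that would then be averaged across the comparable projects.
rates_per_m2 = {
    element: cost * rebase / historic_gifa_m2
    for element, cost in historic_costs.items()
}

for element, rate in rates_per_m2.items():
    print(f"{element}: £{rate:,.2f}/m2")
```

Repeat that for each comparable project, for every element, and for every feasibility estimate, and the six hours start to make sense.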
Tom arrives late to work on his ‘feasibility estimate’ day. But Tom is not fazed by this. Tom knows that the longest part of his process is waiting for his PC to reboot. Once Tom has logged in and checked Facebook, Twitter, SnapChat, LinkedIn, Tumblr, Pinterest, Flickr, Vine and BeBo (does that still exist?!), he can start his stopwatch. Tom logs into CATO and creates a new project (1 minute). Tom launches QuickEst (now you thought the title of this piece was a typo, didn’t you – be honest) and creates his proposed building (2 minutes). Tom then sets up his search criteria and searches the realms of historic data that Bluecar Surveying hold in their QuickEst History Database (2 minutes). Tom is now presented with a table of like projects, already normalised to footprint and rebased to the proposed time and location. Tom can sort this data to narrow it down to three ‘very like’ projects (2 minutes). The three projects are presented side by side in a standardised elemental breakdown showing elemental costs per square metre (or per square foot), and Tom is able to cherry-pick the elements he wishes to omit that might skew his averages. Tom is happy with his selections and creates the estimate (2 minutes). Tom then creates a report using the company standard template (consistent with all previous feasibility reports) and sends the PDF to the client (2 minutes). Tom has spent 11 minutes creating a well-presented, professional feasibility estimate showing cost per square metre on an elemental basis. He can also produce an audit trail of the data that went into the report should the question arise, “How did you reach that figure?” Just for the record, that will take 1 minute! Tom saved 5 hours and 49 minutes versus his learned opponent.
The moral of the story is: would you rather spend circa £150 in man-hours to create an error-prone estimate, or does the attractiveness of each estimate costing you in the region of £4 in man-hours appeal (based on the associate QS being paid £25/hour)?
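For transparency, the working behind those figures uses only the timings and the hourly rate quoted above: Tim’s six hours cost 6 × £25 = £150, while Tom’s eleven minutes cost 11/60 × £25 ≈ £4.58 – roughly a 97% reduction in man-hour cost per estimate.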
I urge you to think long and hard about the value of your cost data and how quickly and efficiently it can be farmed to your advantage.
Comments

Senior Cost Manager | Commercial Manager | Senior Quantity Surveyor | Senior Cost Engineer | Cost Control | Building | Infrastructure | Hardscape & Landscape | Hotels | Healthcare | FIDIC | CostX | BIM (9 years ago):
Interesting
A2Z Cloud CEO | Global Top Tier Zoho Premium Partner: Where Systematisation Meets Results (9 years ago):
Interesting, and great justification for computerising and normalising your approach – but you skipped the bit on the cost of getting the historical data normalised and into Tom’s system in the first place. I presume he didn’t just wave a magic wand. That cost needs to be represented, as does the investment in QuickEst in the first place. It’s clear this type of system is the way to go for any “costing” projects, not just QSs. With regard to historical data, we can either take the view “leave it in the basement data den” and only capture new projects into the system, so that a year down the line we start to see the benefit, or we can import the data from our last 50 projects / one year’s work… either way, the above story doesn’t give a “true” cost for Tom. Just saying :)
Doing Recruitment *Properly*! (9 years ago):
Rummaging in basements can be far more rewarding, sometimes…!
Senior Cost Engineer at Target Engineering Construction Co LLC (9 years ago):
Now I will study CATO!