BIM and Digital Twins, if there is no Value, why are you doing it?
From the beginning, BIM, and more recently Digital Twins, were designed to improve Value.
Value to the end user, value to the client, value to the delivery partners, value for all.
Have we lost sight of this? Are we focusing too much on slavishly following standards, writing spreadsheets, creating 3D pictures or defining schemas, rather than asking the question “How does this bring Value to me, my organisation, my client and their end users?”
I’m not saying that these activities don’t bring value, but if you can’t define that value, and measure, monitor and record it, how do we know there actually is value in it, and what is the most productive level to take it to?
Whilst looking at whether there is value in something, it is worth stating that value isn’t necessarily something with a pound sign attached. It can come from many areas, such as environmental, societal, reputational and safety benefits, and so many others that are difficult to quantify through simple numerical calculations.
Over the last 15 years I have been running an innovation programme for UK industry called COMIT, which has been trying to bridge the gap between emerging technologies and the embedded, change-averse culture in construction. One of the questions that I frequently get asked is:
“We have this cool piece of technology that is going to save lots of time and effort, but no one will put their hand in their pocket to pay for it. Who do you think ought to be funding this innovation?”
My response is nearly always the same: “Follow the value. If nobody sees a value in your cool piece of technology, or you can’t articulate that value to them, then perhaps you shouldn’t be doing it.” The same needs to be said of the data we create or collect to form our BIM model or Digital Twin.
We are all guilty of getting distracted by shiny things that catch our eye, and who is to say that our own morale and enthusiasm, raised by having and using them, has no value to our organisation? It is that soft value which is so difficult to quantify and measure, but which I believe could be the key to the future of our society.
Whether your focus is on the value of BIM or of Digital Twins, both boil down to one key ingredient: data. That data has to be trusted, well managed and handled correctly. Given meaning, it becomes information; given context, that information becomes knowledge; and when applied to resolve a problem, task, decision or question, that knowledge becomes wisdom. At each step, the data becomes more valuable to its user and supplier.
There is a great historical example of this in the work that Abraham Wald, a Hungarian-Jewish statistician, did during the Second World War.
Early in the conflict it was noticed that many of the aircraft returning from missions over Nazi-occupied Europe had a significant number of bullet holes and much damage in the wing tips, main fuselage and tail. Scientists originally concluded that these were the areas being targeted by the enemy, and therefore the most vulnerable, and that they should be armour plated. This was an obvious but very erroneous conclusion.
Abraham Wald pointed out the critical flaw in their analysis by looking at what that raw data meant, giving it context and applying it, and in doing so he markedly increased its value.
Instead of looking at where the holes were, he understood the meaning of the data: the aircraft returning safely had not been hit in the engines or cockpit, which meant that an aircraft could take substantial damage in the other areas and survive.
Putting this into context, it was realised that the aircraft being shot down were most likely being hit around the cockpit and engines.
Finally applying this knowledge, they set about putting armour plating around the cockpit and engines which greatly increased the survivability of the aircraft. This sequence had turned simple raw data into wisdom.
This is a great example, but we need to delve into that raw data a little further. No matter how good it looks, the value of that data depends on whether we can find it amongst the huge volume presented to us on a daily basis and, when we do locate it, whether we can trust it or are going to have to do some sort of time-consuming validation exercise.
I would hope that by now most people interested in BIM and Digital Twins will have heard of the NIST (National Institute of Standards and Technology) report from 2008, which stated that up to 40% of an engineer’s time was wasted searching for and validating information. In 2017, the COMIT team undertook some anonymous interviews and research in Europe and found that, due to the increased volume of data and the reliance on digital means, this could be as high as 80%! That is potentially four days in every five spent finding information, because it wasn’t well managed or classified, and then validating it, because they justifiably couldn’t trust what they saw.
When we put this information into the context of buried services in the UK, it is estimated that over £1bn is lost to the economy every year through not having trusted data on where underground pipes and cables are. This cost comes either from the delays of not trusting the data and having to hand-dig trial pits to ascertain where and what the service is, or from the disruption when that service is hit, causing leaks, explosions or regional power outages.
A great example of this was the Sydney Light Rail project, before which 500 known subsurface utilities were mapped for relocation; during construction a further 400 were discovered, leading to delays of almost 1.5 years!
This is just one data type; there are many similar examples across all industries, all impacted because data wasn’t procured, managed, trusted or, to put it more bluntly, valued by those that interacted with it.
Data can also impact the market value of an organisation. In a recent study undertaken by Highways England, they calculated the value of their data at around £60bn! This is a significant amount when you consider that the corporate organisation itself is valued at about £300bn and its tangible assets at £115bn. Whilst they will not be selling this data, the exercise really helped to focus the business’s minds on how they could better procure, manage and disseminate it.
What was really interesting to me was the use of the “Black Hole” metaphor, which has a significant relevance when we are looking at those elusive soft values. The idea is that we don’t measure a black hole by observing it, as it reflects no light, but by measuring its impact on the things around it.
In a recent article on the initiative, it was reported that Highways England had identified £1.2 million of efficiencies that could be realised by using this “Black Hole” method.
This study of existing data value is important because, no matter how good we get at procuring good quality data, if it is brought back into our business and mixed with poor quality data, it will be devalued and could become worthless. It only takes one person in an organisation to have a bad data experience and declare that what they are getting from the corporate system is at best tainted and at worst dangerous; others will then dismiss whatever is in the system and spend significant time either creating their own data, validating what is there or squirrelling away copies on their own machines.
Therefore, what we need is to be smart both at procuring value and at valuing what we already have.
An excellent article written by Julian Schwarzenbach of Data and Process Advantage Limited describes a pragmatic approach to the absolute value of existing data: don’t try to measure it at all, but look instead at the potential incremental benefits made with it.
For example, Julian writes:
If you assume that your data has a value of £X and that this is a large and unspecified figure (using the Highways England example, this could be a significant percentage of your corporate valuation), then you can perhaps think about how this value X will change if:
- If you are able to create new products and services using your data that are worth £A (which should be straightforward to calculate), then the value of your data is now £X+A
- If you identify process inefficiencies using the data with an annualised impact of £B, then removal of these avoidable costs will increase the value of the data to £X+B. Again, it should be straightforward to calculate B
- If you identify poor decision making caused by poor or missing data with an annualised impact of £C, then these are avoidable costs that will increase the value of the data to £X+C. Again, it should be straightforward to calculate C (although this may be reduced by any costs of improving the data)
- If you identify that some of your site survey data can be sold to third parties, then this sales revenue £D could be viewed as increasing the value of your data to £X+D
- If you use improved data governance to reduce the number of data stores (by combining some) then the cost of data migration for future projects could be reduced by £E (along with a reduction in the risk associated with data migration), so again the data value could be viewed as £X+E
- Etc.
Therefore, if you improve how you manage and exploit your data, your overall data valuation could have increased to £X+A+B+C+D+E+....
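To make the arithmetic above concrete, here is a minimal sketch in Python. The increment figures are invented for illustration, and the code is my own reading of Julian’s list rather than anything he has published; the only point it demonstrates is that the unknowable baseline £X never needs to be pinned down for the measurable uplift to be reported.

```python
# Minimal sketch of the incremental approach to data value described above.
# The baseline value X is deliberately left symbolic: we never try to measure it,
# we only total the increments we can actually calculate. All figures are invented.

increments = {
    "A: new products and services built on the data":     250_000,
    "B: annualised process inefficiencies removed":       120_000,
    "C: avoidable costs of poor decisions removed":        80_000,
    "D: survey data sold to third parties":                 40_000,
    "E: reduced cost and risk of future data migration":    60_000,
}

def incremental_value(increments: dict[str, float]) -> float:
    """Return the measurable uplift sitting on top of the unmeasured baseline £X."""
    return sum(increments.values())

for label, value in increments.items():
    print(f"{label}: £{value:,.0f}")
print(f"Overall data valuation: £X + £{incremental_value(increments):,.0f}")
```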
There is a problem, of course: a recent study highlighted by Davin Crowley Sweet from Highways England demonstrated that, year on year, the top priority for CEOs is data, yet their OPEX and CAPEX fixation continues to be on technology.
Whether data is for CAPEX or OPEX, it needs to follow some basic rules for it to be valuable. In 2014 the Bank of England set out five dimensions for measuring quality in their Data Quality Framework document, which are equally applicable to how we value data:
- Relevance
- Accuracy and Reliability
- Timeliness and Punctuality
- Comparability and Coherence
- Accessibility and Clarity
Relevance
This is the degree to which the data meets the needs of the end user. Whether that end user is part of the operations team, the delivery partner or the consumer, if the data isn’t relevant to their needs then it has no value to them. On the flip side, if that data is exactly what they need to carry out their primary task in meeting their company objectives, in an easy and efficient manner, then it could be exceedingly valuable. When we see the term “Level of Information Need” when referring to data in the BIM and Digital Twin standards, this could perhaps be read as “Level of Information Value”.
Accuracy and Reliability
A piece of data only needs to be as accurate as the end user needs it to be. There is little Value in having a sub-millimetre-accurate laser scan of a stretch of blacktop on the highway. The more accurate you make something, the more it potentially costs to generate, verify, manage and distribute. If data is both relevant and accurate enough for the end user to carry out their primary task, then it is worth more.
The less I can rely on a piece of data to help me make good decisions, the less I value it. As the Gemini principles point out, trust has a significant value of its own.
Timeliness and Punctuality
If I need to carry out a construction activity on a specific date, or need to make a financial decision before a public enquiry, I will need the relevant, trustworthy data before that deadline so it can help me. If it arrives late then its value can be little or nothing.
Comparability and Coherence
Comparability increases our understanding of the data in front of us and puts it in the context of its historical or intended values. For example, it lets us know if the data is within an acceptable range, or whether it indicates a gradual degradation over time. This not only allows us to ensure our business is moving in the right direction to achieve its objectives but also to ensure our assets are performing as designed. Comparability increases the value of data through context.
If data isn’t coherent and we struggle to understand it, the chances are we will either ignore it and search elsewhere for an answer, or waste a large amount of time trying to work out what it means, leaving that data worth nothing!
Accessibility and Clarity
No matter how relevant, accurate, punctual, comparable and coherent the data, if it is not accessible to the end user at the time they need to utilise it, then it might as well not be there! Alongside this, the data needs to be presented in an unambiguous way that supports and promotes any associated data. When we want to listen to music whilst travelling on public transport, we will probably use noise-cancelling earphones, removing the white noise and presenting just the sounds we want. That same process is needed to strip out the masses of data that is just white noise and give us the clarity needed to make quick decisions.
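Purely as an illustration of how these five dimensions could be used day to day, the sketch below turns them into a simple weighted scorecard for a single dataset. The weights, the 0 to 5 scale and the example scores are my own assumptions and are not part of the Bank of England framework; in practice they would be agreed with the end users who rely on the data.

```python
# Hypothetical scorecard against the five data quality dimensions discussed above.
# Weights and scores (0 to 5) are invented for illustration only.

WEIGHTS = {
    "relevance": 0.30,
    "accuracy_and_reliability": 0.25,
    "timeliness_and_punctuality": 0.15,
    "comparability_and_coherence": 0.15,
    "accessibility_and_clarity": 0.15,
}

def quality_score(scores: dict[str, float]) -> float:
    """Weighted 0-5 score; a missing dimension scores zero and drags the total down."""
    return sum(weight * scores.get(dimension, 0.0) for dimension, weight in WEIGHTS.items())

asset_register = {
    "relevance": 4.5,
    "accuracy_and_reliability": 3.0,
    "timeliness_and_punctuality": 4.0,
    "comparability_and_coherence": 2.5,
    "accessibility_and_clarity": 3.5,
}

print(f"Asset register quality: {quality_score(asset_register):.2f} out of 5")
```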
The Human dimension
When considering data for our BIM and Digital Twin models, we can be forgiven for concentrating our efforts on what has been generated by technology, be it hardware or software. This does, however, miss out a large volume of data produced by humans, whether it is exchanged digitally or simply by physically talking to each other.
When dealing with human-generated data we must keep in mind how it can be verified as true. In recent history much has been made of fake news, delivered over social media platforms to deliberately mislead or influence a population who might not know any better towards the wrong conclusion. This could be done for many reasons, not many of them for the good of society!
To this end, to ensure that human-generated data is valuable to its consumer, the following should be taken into consideration:
· Provenance – Are you looking at an original piece of information?
· Source – Who created the original piece of information?
· Date – When was the piece of information created?
· Location – Where was this piece of information created?
· Motivation – Why was the piece of information created?
These checks against human-generated data could equally be applied to any data in an existing system, to verify that it has a level of reliability that ensures the information inside your existing business and asset information models can be trusted and therefore has a value.
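A minimal sketch of how those five checks might be recorded against a single piece of human-generated data is shown below. The field names, the example record and the pass rule (all five questions answered) are my own assumptions for illustration, not part of any standard.

```python
from dataclasses import dataclass, fields
from datetime import date
from typing import Optional

@dataclass
class ProvenanceRecord:
    """The five checks applied to a piece of human-generated data.
    Any field left as None is an unanswered question and counts against trust."""
    provenance: Optional[str] = None   # are we looking at the original piece of information?
    source: Optional[str] = None       # who created it?
    created_on: Optional[date] = None  # when was it created?
    location: Optional[str] = None     # where was it created?
    motivation: Optional[str] = None   # why was it created?

    def unanswered(self) -> list[str]:
        return [f.name for f in fields(self) if getattr(self, f.name) is None]

    def is_trustworthy(self) -> bool:
        # Simple rule for illustration: all five questions must be answered.
        return not self.unanswered()

site_note = ProvenanceRecord(
    provenance="original site diary entry",
    source="J. Smith, site engineer",
    created_on=date(2021, 3, 4),
    location="Plot 7, north elevation",
    motivation=None,  # why it was written down was never recorded
)
print("Trustworthy:", site_note.is_trustworthy())
print("Unanswered checks:", site_note.unanswered())
```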
Value in Capex
The impact data has on a project starts much earlier than many imagine. Our existing assets and business systems are constantly giving us data during their daily operation and maintenance cycles. In its most basic form this is collected by humans simply observing how something is performing; when it is no longer doing so efficiently, this is flagged up through complaints. The situation might be caused by something needing replacement, upgrading, augmenting or simply decommissioning. Whatever intervention is undertaken will be dictated not only by asset data, but also by business data: financial projections, political will and a plethora of other considerations. The value of the data in making this decision is significant, as it will decide the direction of the CAPEX activities.
Throughout the project, decisions, questions and tasks will need to be resolved by many stakeholders. ISO 19650 presents this in its lifecycle diagram, with packages of information exchanged between them. In this case, the stakeholders are divided into delivery team, client, user and authority; the authority represents organisations such as local government planning departments giving consent, or fire regulation officers signing off on construction. There will be multiple information exchanges needed throughout the lifecycle of the project, and these are most commonly defined in an EIR (Exchange Information Requirements) package. This document should include the Project Information Requirements to identify project metrics, the Functional Information Requirements to set out the basic functional data needed to ensure the outcomes are met, and the Asset Information Requirements to set out the more detailed asset-specific data.
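Purely as an illustrative sketch, the shape of such an EIR package could be captured in data along the following lines. The class and attribute names are my own shorthand and the example entries are invented; they are not terms or content mandated by ISO 19650.

```python
from dataclasses import dataclass, field

@dataclass
class ExchangeInformationRequirements:
    """Illustrative container for an EIR package: the three requirement sets
    described above, plus the stakeholders exchanging the information."""
    project_information_requirements: list[str] = field(default_factory=list)     # project metrics
    functional_information_requirements: list[str] = field(default_factory=list)  # function data tied to outcomes
    asset_information_requirements: list[str] = field(default_factory=list)       # detailed asset-specific data
    stakeholders: tuple[str, ...] = ("delivery team", "client", "user", "authority")

eir = ExchangeInformationRequirements(
    project_information_requirements=["programme milestones", "capital cost to date"],
    functional_information_requirements=["design occupancy per floor", "target energy use intensity"],
    asset_information_requirements=["AHU manufacturer, model and serial number", "fire damper locations"],
)
print("PIR items:", len(eir.project_information_requirements))
print("FIR items:", len(eir.functional_information_requirements))
print("AIR items:", len(eir.asset_information_requirements))
```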
Data used by the financial exchanges looking at the risk of raising money for the project will have a large economic impact. The less data they see and, more importantly, can trust, the higher the financial risk and therefore the higher the interest rates and overall cost of delivery as they service the investment loan. This same risk-based data is used by the insurance industry to set the premiums needed to cover all CAPEX activities.
During the start-up phase, the client organisation will hand over to the delivery team a large amount of existing data from both their asset and business systems. Frustratingly, even though they use this data for their day-to-day operational activities, they will nearly always declare that it can’t be trusted and that it is the responsibility of the delivery partner to check and resurvey if required. This devalues the data to the point where it is worth nothing, and the delivery partner creates everything from scratch, at considerable cost back to the client!
During the project itself, different parties will both create and consume data at different times, depending on the actions they need to conduct, the decisions they need to make and the questions they need answering. That data will have varying value depending on its impact on the outcomes defined by the consumer through the client organisation.
For example, a piece of data used to ensure the correct sequencing of the construction, which saves the contractor time and money, could arguably have little impact on the consumer who will live in the building for the next 50 years, yet a simple piece of data, such as a user guide that ensures they can operate the air conditioning in their apartment, might be the most valuable to them.
It is interesting to note that the data initially generated by the end user (consumer or customer) defines the outcomes for the project, and in turn defines the data created by the smallest member of the supply chain (e.g. a widget manufacturer); that data is the cheapest to create and yet could be the most valuable piece of data passed back up the line to that end user!
It has been jokingly noted in the past that “Security isn’t a dirty word”, and the right level of security for the data you create, manage and disseminate can increase its value simply by providing trust.
Once a piece of data is captured, it needs to be kept secure, and the end user needs to be confident that unauthorised modification hasn’t taken place. A couple of the key requirements for a common data environment are good security and an audit trail, allowing those using the data it contains to be confident in what they are using. Think about the many crime scene investigator programmes on television. When the investigators discover a piece of evidence, it gets bagged up into an evidence bag to prevent it from being contaminated or tampered with. This evidence bag must have a solid audit trail from the time of discovery all the way through until it is submitted as evidence in court. If that trail is broken and there is potential for unauthorised tampering, the evidence is no longer admissible in court and so is greatly devalued. It is exactly the same with data!
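One way to picture the evidence bag in data terms is a hash-chained audit trail, where each entry seals the one before it, so any unauthorised change breaks every later seal. The sketch below is a generic illustration of that principle in Python, not a description of how any particular common data environment implements its audit trail.

```python
import hashlib
import json

def seal(entry: dict, previous_hash: str) -> dict:
    """Add a hash covering this entry plus the previous entry's hash,
    chaining the records together like a sealed evidence bag."""
    payload = json.dumps({**entry, "previous": previous_hash}, sort_keys=True)
    return {**entry, "previous": previous_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(trail: list[dict]) -> bool:
    """Recompute every hash; any edit to an earlier entry breaks every later seal."""
    previous = "genesis"
    for record in trail:
        body = {k: v for k, v in record.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["previous"] != previous or record["hash"] != expected:
            return False
        previous = record["hash"]
    return True

trail = []
prev = "genesis"
for event in [{"actor": "surveyor", "action": "uploaded point cloud"},
              {"actor": "engineer", "action": "approved revision P02"}]:
    record = seal(event, prev)
    trail.append(record)
    prev = record["hash"]

print("Trail intact:", verify(trail))
trail[0]["action"] = "uploaded edited point cloud"   # unauthorised change
print("Trail intact after tampering:", verify(trail))
```

Running this shows the trail verifying cleanly until the first record is edited, after which the check fails: exactly the broken chain of custody described above.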
This security-protected audit trail, ensuring we can be confident about the trustworthiness of the data from start to finish, is described in the Hackitt Report (Building a Safer Future), which identified a unanimous concern over the creation, handover and maintenance of fire safety information, especially in the private building sector. The lack of well-structured and trustworthy information has caused a number of challenges, including:
- It is unclear whether any changes have been made between original design and the completion of construction which may have an impact on the building safety strategy.
- The building owner does not have the required up-to-date information to be able to easily and effectively manage building safety across its life cycle.
- When refurbishing a building, it will be difficult to ascertain what effects any changes may have on building safety.
An interim report identified the absolute need for a “Golden Thread” so that information could be tracked from its point of origin, through any changes of content or ownership and kept up to date, so when an incident occurs the emergency services concerned can quickly take action based on the information provided with the knowledge that it is of high quality and they can trust it.
The report pointed out that, before a project can progress from one stage of design or construction to the next, the information being handed over must demonstrate this Golden Thread to the Joint Competent Authority (JCA) in order to gain permission to proceed.
One of the recommendations it makes is to have full material and manufacturer product data for the things that are fulfilling your functional requirements. These product data sheets and templates need to be accurate, standardised, up to date and delivering the information that is needed, not what the salesman wants you to know! It also recommends a whole raft of disaster-related information that is specified by the end user. This means that, when creating the templates for the information requirements, you should talk to the fire, ambulance, police and other services to discover what information they will need when they respond to an incident at your facility.
The bottom line is that there needs to be full accountability for the information in your Asset Information Model and Project Information Model, and that clients need to be better at procuring that data. To be able to procure it properly, we need to understand its value.
As we reach the end of the CAPEX phase, there will be the inevitable scramble for the finish, where the digital asset, and the data that has been so carefully created and curated, will be left by the wayside as the demand to hand over the concrete and steel in an operational state takes priority in the minds of the project managers. This is because the data is not seen as a valuable commodity: unlike the physical asset, failing to deliver it on time and on budget doesn’t have the same impact on project completion and the final payment for the job.
Commissioning
We commission physical assets to ensure we are getting the value for money we expect. But when was the last time you read a commissioning plan for the digital asset? If the client organisation fails to commission both the physical and the digital, then they are potentially devaluing both.
Value in Opex
Operations isn’t just about operating the built environment portfolio, but about how it and the whole business deliver the outcomes that define the very reason for the organisation’s existence. Outcomes and their benefits are well defined in both the Government Soft Landings documentation and the BS 8536 series.
Those business systems and the data that they hold reflect how the organisation is delivering against its objectives and key performance indicators. But some data will be more valuable than other data, so how can we pinpoint what that is and ensure the IT department doesn’t waste money linking every aspect of every system together?
This process of defining key business data through outcomes and critical success factors, into Organisational Information Requirements and on into Asset Information Requirements, is relatively simple, the most important part being the ability to tie individual pieces of monitored data to the statements made by the executive board in their business strategy. If you need to demonstrate the value of data and get the C-level board to buy into the digital message, then demonstrate how it lets them see the performance of the organisation not just through intuition, but through hard data!
Using a simple three-column deduction method, the client organisation can take the C-level executive strategy, which will be full of media and marketing language, extract plain-language outcome statements, set critical success factors against them and then identify the data which will help monitor whether the company has achieved them! This has a significant value to the organisation in its ability to make critical and valuable decisions.
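As a hedged sketch, the output of that deduction could be captured as simply as this; the strategy statements, outcomes, critical success factors and data items are all invented to show the shape of the exercise rather than any real strategy.

```python
# Illustrative three-column deduction: strategy statement -> plain-language
# outcome and critical success factor -> the data that monitors it.
# All content is invented to show the shape of the exercise, not real strategy.

deduction = [
    {
        "strategy_statement": "We will be the most customer-focused operator in the region",
        "outcome": "Road users experience fewer unplanned closures",
        "critical_success_factor": "Unplanned closures reduced by 15% year on year",
        "monitored_data": ["closure events by cause", "mean time to reopen"],
    },
    {
        "strategy_statement": "We will lead the sector on carbon reduction",
        "outcome": "Operational energy use falls across the estate",
        "critical_success_factor": "10% reduction in kWh per square metre by 2025",
        "monitored_data": ["half-hourly meter readings", "floor area per building"],
    },
]

for row in deduction:
    print(f"CSF: {row['critical_success_factor']}")
    print(f"  -> data to collect: {', '.join(row['monitored_data'])}")
```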
As well as defining the data needed in each of the business systems, which also need to be connected to a central reporting system, this exercise will help to define any asset-specific information.
The plain-language questions here will help the Information Management team to test their BIM model or Digital Twin, to ensure it carries out the function and purpose it was designed for and is delivering the value it promised.
The same exercise can be done with the Project Information Requirements during the CAPEX phase.
Maintenance
No, not maintaining the physical, but maintaining the digital. Like any asset, if you fail to do regular maintenance on your data, checking that anything not updated in an assured, automated way is up to date and hasn’t become corrupted or accidentally deleted, then its value will degrade just as a physical asset’s would. This activity costs money to carry out, so there is even more reason to ensure that your organisation is only collecting, managing and disseminating data that has a value. This maintenance activity is something that all employees need to be involved in, and providing a simple method for reporting potential data issues will help focus the efforts of the maintenance teams, just as much as a member of the public reporting a pothole in the road!
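A minimal sketch of what such a digital maintenance round might look like is shown below; the annual review interval, the checksum approach and the example records are assumptions for illustration only.

```python
import hashlib
from datetime import datetime, timedelta

# Illustrative "digital maintenance" round: flag records that have not been
# reviewed within an agreed interval, or whose content no longer matches the
# checksum stored when they were last verified. All records are invented.

REVIEW_INTERVAL = timedelta(days=365)   # assumed annual review cycle

records = [
    {"name": "fire damper schedule", "content": "rev P03 ...",
     "checksum": hashlib.sha256(b"rev P03 ...").hexdigest(),
     "last_reviewed": datetime(2020, 6, 1)},
    {"name": "pump O&M manual", "content": "rev P01 ...",
     "checksum": "deadbeef",             # stored checksum no longer matches the content
     "last_reviewed": datetime(2024, 1, 15)},
]

def maintenance_report(records, now=None):
    """Return a list of data maintenance issues to hand to the 'digital pothole' team."""
    now = now or datetime.now()
    issues = []
    for r in records:
        if now - r["last_reviewed"] > REVIEW_INTERVAL:
            issues.append(f"{r['name']}: review overdue")
        if hashlib.sha256(r["content"].encode()).hexdigest() != r["checksum"]:
            issues.append(f"{r['name']}: content does not match stored checksum")
    return issues

for issue in maintenance_report(records):
    print("FLAG:", issue)
```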
Soft Value and Sentiment
As previously discussed, the most difficult yet most important value to realise and prioritise is the soft value: something hard to grasp or measure unless we use methodologies such as the Black Hole method.
Back in 2012 I helped to set up and run the Crossrail Information Management Academy, which over time morphed into the Digital Advancement Academy. As well as sharing digital knowledge and experiences, we took on the difficult task of creating future goodwill and sentiment towards the company. This is almost impossible to measure with raw data, as many of the positive impacts were only felt years down the line, when those who had visited the academy heard the Bentley name, remembered their positive experience and therefore looked on us more favourably and trusted what our colleagues had to say and offer. This value is very subtle and almost imperceptible, but that reputation for trust and valuable partnership can be the most valuable of all.
Conclusion
To move forward in the digital revolution, we must get better at understanding the value of data and how to increase its value through trust, purpose and function. Organisations of all types and at every level must get better at procuring data at the right cost and ensuring it isn’t devalued through poor management, handling or understanding. Without this we will never be able to properly fund the digital aspirations and move to the next phase.