Core Logging and the Digital Revolution
J.L. Orpen
Drilling a borehole is an adventure into the unknown, the successful charting of which depends on the skills of two professionals: the driller and the geoscientist – who are assisted by support personnel and specialists.
The diamond driller is tasked to:
Workflows are straightforward. Once the rig is set up, supervisors ensure a constant water supply and that mechanics are on hand to keep the rig operational. After each drill run, the driller’s assistants empty the inner-tube; add a rod, or rods, to the drill-string (depending on the length of the run); clean and pack the core into trays; and shovel sludge from the ditch that recirculates water to the settling sump, to name a few of their duties. The borehole path is surveyed and is normally also scanned using different geophysical sondes. Drill-bit and drilling-mud advisors are usually on call to help the driller keep the drill-string singing with a steady hum.
Variations in pitch, or a vibration of the drill-string, will prompt the driller to check the rig’s gauges to find the cause and rectify any issues. Discipline is maintained on-site for safety reasons, as well as to remain alert for any changes in the system, since reaction time is critical. For example, if a slowing penetration rate combined with reduced water return is left unattended, the drill bit can “burn in” and close the borehole, requiring a re-drill.
Finished bits are kept for inspection, together with a record of the meterage each has drilled, since their features show how they have been used or abused. For instance, a rounded crown with no remaining waterways and burn marks on the sides of the bit is not good for a driller’s track record. In contrast, an experienced driller who uses an old bit to penetrate bad ground is respected for trying to get the best possible intact core recovery from a high-risk zone, without ruining a good bit in the process.
Drillers are seldom given the support they require, however, to consistently deliver quality core. As discussed in previous articles in this series, a drilling contract clause requiring 95% minimum core recovery is common. In addition, the driller’s standard key performance indicator is usually the number of meters drilled per shift. However, although both these measures may make good sense to an accountant:
Triple-tube drilling is meant to address many of these issues, for geotechnical projects in particular; but the thicker kerf of the drill bit, which is required to accommodate the three core barrels (outer, inner, and the splits), is not ideal when cutting hard rock. Besides, as a frustrated drill contractor once asked, “Why triple-tube if there are no geos around when the top split is lifted? In the old days, we could never get away with this – the geos understood drilling then.”
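To make the recovery clause concrete, here is a minimal sketch in Python, using invented run lengths that are not from any real campaign, of how a per-run log can satisfy a 95% campaign average while still losing half the core in the one zone that matters:

```python
# Hypothetical drill runs as (metres drilled, metres of core recovered).
# The short 1.5 m run represents a high-risk zone drilled cautiously.
runs = [(3.0, 3.0), (3.0, 3.0), (3.0, 3.0), (3.0, 3.0),
        (3.0, 3.0), (3.0, 3.0), (1.5, 0.75)]

drilled = sum(d for d, _ in runs)
recovered = sum(r for _, r in runs)
print(f"Campaign average: {100 * recovered / drilled:.1f}% recovery")  # ~96%

worst = min(100 * r / d for d, r in runs)
print(f"Worst run: {worst:.0f}% recovery")  # 50% -- the high-risk zone
```

A contract that rewards only the campaign average, or metres per shift, gives the driller no incentive to nurse that 1.5 m run.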
The geoscientist gathers data from the core and bore, as specified by the customer’s needs, for example:
Drill campaigns are therefore commissioned for different purposes and, in the past, clip-board workflows were used to reliably log high-quality data to suit investors and engineers alike.
Then, late last century, machines driven by algorithms forced a rethink – not only of how data are acquired, used and analysed, but also of the corporate structure of the major companies.
Corporate structure – Outsourcing: As the digital revolution grew, service companies were seen to be more agile than the majors in keeping up to date with hardware and software developments. They also attracted top-flight consultants, for geotechnical projects especially, so that more clients benefited from their expertise than if the consultants had been employed by a single exploration or mining company. For these reasons, the majors switched to outsourcing work, which pleased the shareholders, as more financial and legal directors could be elected to the company boards, leaving most operational and technical decisions to EXCO management.
This brought with it the risk associated with reduced technical expertise on mining company boards. Lacking the requisite knowledge among their members, boards could no longer provide oversight of management at the level required. They left themselves exposed to actions by management that in the past would have been mitigated by their more technical colleagues and, even more importantly, were often not even aware of what it was they lacked the knowledge to address (Dan Wood, pers. comm., 2022).
Drill core logging was one of the first tasks to be put out to tender, but this was not necessarily a good move:
It is not surprising, therefore, that the quality of data gathering from core and bore has degraded – as has been observed in many fora and by several of the aforementioned top-flight consultants. It must be acknowledged, though, that this categorically does not mean that the graduates of the last few decades are any less capable than their predecessors; it is purely the lack of proper training that has let them down.
Data acquisition – Big Data and Deep Dives: In the natural sciences, the characteristics, composition and dynamics of the litho-, hydro-, atmo- and bio-spheres are researched by collecting, measuring and describing fresh, representative samples. These data are analysed to create and update predictive models, used to assess the likelihood of drought, for example, or the optimal slope angle for excavating a stable open-cast pit, or whether the mineralized intersections found in drill core can be visualized as an orebody.
The influence of digital innovation on scientific research in general has been massive, with increased data collection capabilities requiring rapid advances in Big Data to store vast volumes of information whilst keeping these data readily available for meaningful, well-structured Deep Dive analyses – without which there is little point in acquiring ever-swelling quantities of information.
The impacts on diamond drilling and logging have been no less impressive. These include the declassification of military-grade gyroscopic navigation systems, boosting the accuracy of borehole-path surveys; advances in accelerometry, helping core orientation tools sense the geographic vertical plane relative to the core cylinder more accurately; and major improvements in hyperspectral scanners, downhole geophysical sondes and core photogrammetry.
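To illustrate what these survey tools feed into, the sketch below implements the minimum curvature method, the standard way of converting successive downhole survey stations (measured depth, inclination, azimuth) into a 3D borehole path. The station values are invented for illustration:

```python
import math

def minimum_curvature_step(md1, inc1, azi1, md2, inc2, azi2):
    """Position change between two survey stations, using the minimum
    curvature method. Depths in metres, angles in degrees."""
    i1, i2 = math.radians(inc1), math.radians(inc2)
    a1, a2 = math.radians(azi1), math.radians(azi2)
    # Dogleg: the total angular change between the two station vectors.
    cos_dl = (math.cos(i2 - i1)
              - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1)))
    dl = math.acos(max(-1.0, min(1.0, cos_dl)))
    # The ratio factor fits a circular arc between the stations;
    # it tends to 1 as the dogleg tends to 0 (a straight hole).
    rf = 1.0 if dl < 1e-9 else (2.0 / dl) * math.tan(dl / 2.0)
    half_md = (md2 - md1) / 2.0
    d_north = half_md * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2)) * rf
    d_east = half_md * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2)) * rf
    d_down = half_md * (math.cos(i1) + math.cos(i2)) * rf
    return d_north, d_east, d_down

# Two invented gyro stations 30 m apart on a gently deviating hole.
print(minimum_curvature_step(100.0, 60.0, 45.0, 130.0, 61.5, 47.0))
```

Summing these steps station by station gives the hole's position in 3D space; the accuracy of the whole geological model inherits the accuracy of these measurements.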
These advances generate data to be fed into Machine Learning systems guided by Artificial Intelligence, with the ultimate aim of making the robotic mining of deep orebodies a reality, since the temperature and pressure environment at depth makes any other extraction method uneconomic and unsafe.
Hence, there is no shortage of promising tool development supported by relevant software programming. Under normal circumstances, such progress would yield steadily improving results, along with confident modelling and reliable predictions. Yet the trend is the reverse, prompting the question: Why?
Taking stock: Given the above, change is clearly needed, which requires a discussion of how to progress the science of subsurface mapping.
Some would argue that this Earth Science sub-discipline might be more advanced had the switch to contract logging not occurred, since experienced, but now redundant, geoscientists would have systematically battle-tested any innovations. The counter-argument is that innovation would more likely have been stifled, since the majors are mostly reluctant to use new techniques until they have been proven; the mining industry is generally seen as conservative and slow to adapt to change.
Others recognize that the service companies were strengthened by the switch, and that they have been more open to adopting new technologies in their quest to gain competitive advantage, so that on balance the advances made have probably kept up with the global pace of digital development.
Nonetheless, there are at least two weaknesses to overcome in order to improve matters, so that verifiably top-quality data collection will be the norm on all drilling campaigns. The two overriding factors are: first, that the drilling process itself remains poorly understood by the geoscientists whose work depends on it; and second, that software development has focused on "cleaning" flawed data rather than on correcting how the data are acquired.
The first factor is the most perplexing. For reasons unknown, there is a general reluctance to take this weakness seriously – even though it goes against scientific principle that technicians, and not the professionals, are relied on to take the measurements needed to map the subsurface, and often do so without any corroboration. Nowhere else in science is the tool used to collect the samples for analysis so poorly understood by those whose work directly depends on it.
The second came about because, in the early days of the digital revolution, data quality was not yet an issue, so new programming efforts could focus on developing novel analytical routines and modelling packages. When data acquisition standards slipped, however, rather than writing software to help rectify input deficiencies at source, creative data validation practices were developed, ostensibly to improve data quality. The adage "garbage in, garbage out" became a truism.
Whereas data validation is essential, to go further and state that data should be "'cleaned' of errors and further processed in a way that makes them easier to measure, visualise and analyse for a specific purpose" (Paul & Griffin, 2021), whilst also making assumptions to improve perceptions, would seem to be irrational. By preferring such cleaning over corrective measures that ensure accuracy and reliability at the point of collection, software development in the field of subsurface mapping has been left largely rudderless.
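By way of contrast, here is a minimal sketch of validation without "cleaning" (Python with pandas assumed; the run IDs, thresholds and column names are illustrative, not from any real system): out-of-range values are flagged and traced back to the rig or core shed, never silently clipped or deleted:

```python
import pandas as pd

# Illustrative per-run log; a recovery above 100% usually signals a
# depth-measurement error rather than "extra" core.
log = pd.DataFrame({
    "run_id": [101, 102, 103],
    "recovery_pct": [98.0, 104.0, 37.0],
})

# Flag, don't fix: each record keeps its original value plus an audit trail.
log["qaqc_flag"] = ""
log.loc[log["recovery_pct"] > 100, "qaqc_flag"] = "check block depths against driller's tally"
log.loc[log["recovery_pct"] < 85, "qaqc_flag"] = "confirm loss zone before modelling"
print(log)
```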
It is not surprising, therefore, that few geoscientists relish the prospect of long-term core logging in the way that others enjoy field mapping. There is little pride to be had in generating data for customers who are already prejudiced against their quality and predisposed to "clean" them of errors, irrespective of the effort put in to achieve excellence – which largely goes unnoticed anyway, given the dearth of good QA/QC auditors, who have themselves received poor training. What are the chances of honest professional advancement in such circumstances? And where are the professional bodies that seek to have geoscientists registered in order to practise geology?
Where to from here? The analogy with field mapping is apposite. As was noted in the previous article of this series, and paraphrased here: mapping requires the outcrops to be precisely located, their composition and features to be described, any structures found to be measured, and representative rock samples to be collected for analysis.
Loggers are required to do exactly the same work, except that their ‘outcrops’ are drill core. Hence the question: Why are the skills of subsurface mapping not similarly taught at undergraduate level?
Comprehending the workings of the drilling process is far easier than the first-year exercise of understanding the properties of light and the operation of a polarizing microscope. It would be a great step forward, and one that is long overdue, if students were taught how to record the measurements needed to properly reference logged data in the framework of 3D space. Distortions in a geological model due to such framework errors are a major factor in the failure of many projects.
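A back-of-envelope illustration of why such framework errors matter: a hypothetical systematic azimuth error of just 2 degrees, left uncorrected down a straight hole, displaces every logged intersection laterally by roughly depth × sin(error):

```python
import math

error_deg = 2.0  # hypothetical systematic azimuth error
for depth_m in (100, 250, 500):
    lateral_m = depth_m * math.sin(math.radians(error_deg))
    print(f"{depth_m:>4} m down-hole: ~{lateral_m:.1f} m lateral displacement")
```

At 500 m that is a shift of roughly 17 m, easily enough to distort an orebody wireframe or misplace an underground development drive.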
There are also many opportunities for much-needed research, not only in the broadened horizons opened up by the digital revolution, but also in the engineering of the drill rig itself – to automate borehole depth measurement and core-recovery logging, for example.
That such instruction and research should be conducted at universities is essential – to set and maintain standards and strive for excellence. Without such intervention, the science of subsurface mapping is in danger of becoming moribund, leading to error-prone modelling that is further and further removed from reality.
Moreover, the failure of mining company managements and boards to require this level of knowledge of their geoscientific staff poses a major personal risk to these bodies. Notwithstanding the changes that occurred [as a result of the digital revolution], nothing changed in terms of the previous responsibilities to collect, provide and report accurate data, which since the 1970s have increasingly been accompanied by legal enforcement via the various reporting codes, such as the JORC Code in Australia. The competent persons reporting mineral resource estimates are still legally responsible for them, and mining company CEOs and their boards of directors remain responsible for any mining disaster shown to be the result of poorly collected mine-development data. (Dan Wood, pers. comm.)
Reference: Paul, J & Griffin, M 2021, 'The importance of robust validation of geotechnical data and systematic domaining in defining appropriate rock mass parameters as inputs for modelling and design', in PM Dight (ed.), SSIM 2021: Second International Slope Stability in Mining, Australian Centre for Geomechanics, Perth, pp. 357–372, https://doi.org/10.36487/ACG_repo/2135_22
Comments:

Product and business strategy professional
Hi John, can we discuss the "rudderless software development" comment a bit more please. After 15+ years in the software development business at Maptek, MineSight/Hexagon, Micromine and now IMDEX, I question that any of these companies were, or are, "rudderless". I am a geologist and have worked with and for many excellent geologists (and some pretty good engineers and surveyors as well) in these companies, all of whom have experience with the geological data collection, management and modelling process. Are you suggesting that a commercial enterprise developing software for the industry is somehow responsible for defining how mining companies collect data, and for rejecting what we define as poor-quality data? I am not sure that the mining companies would accept that kind of oversight, or that the commercial enterprise would be commercial for long. The lack of agreed-upon data collection and quality standards in mining is, to me, the bigger issue. We have CPs in JORC who sign off on a report, but there are no hard and fast rules. What if we had a rule stating that a downhole survey using an accurate (defined accuracy of xxx) north-seeking gyro was required for any hole longer than 100 m before it could be used in a measured resource or proven reserve? What if we had a rule that core orientation marks had to pass some agreed-upon QA process before the structural readings were allowed to be used in a geotechnical report? These are not rules that a software company can impose (with or without a rudder); this is an industry issue.
Senior Manager: Geotechnical Support Operations
Great article John (I like your term osmotic learning). I feel that graduate programs in companies are focussing more on 'how to be a manager' than on the critical practical in-field time and hands-on experience (osmotic learning) that sets graduates up for the future. I am disappointed by the number of geotechnical engineers who have hardly logged any core or done any mapping after 5-10 years in the industry. Companies don't seem to value these skills in their internal technical people, while still requiring these professionals to scope out and take charge of data collection programs and then use the data generated for design and analysis purposes. I don't believe we always set our people up for success, which is leading to a hollowing-out of technical understanding within companies, in turn exposing them to risks further down the track.
Director and Principal Geotechnical Engineer at Bastion Geotechnical Pty Ltd
Another quality post John. I am going to revisit after I've digested my half bottle of Pinotage. But yes, in my haze, I fully agree with your sentiments. This is a great foundation for my current and future contracts: "How do we capture appropriate geotech data, balancing cost, resourcing, discipline sentiment, and regulatory requirements?" Keep up the good work mate!