Core logging: Optimizing best practice (Part Two).

© Dr. J.L. Orpen, 30/09/2021.

Background.

Life exists because Earth is a dynamic, rocky planet with an oxygen-rich atmosphere and a hydrosphere. Its earliest memories are held in the isotopic composition of hardy minerals that date from 200 million years after it was born in the solar system some 4.54 billion years ago. By its billionth birthday, igneous, sedimentary and metamorphic processes had created many different lithologies to form a permanent crust, the growth, destruction and deformation of which continue today. Thus, the biosphere is supported by crustal regeneration processes that resupply Earth’s surface with the minerals required for the continuation of life – hence, the total biomass that can exist on Earth is limited by the mineral resupply rate.

However, in a geological instant, Homo sapiens has developed a lifestyle that requires more resources than may be provided through the natural cycle. Extraction of these minerals has artificially increased Earth’s carrying capacity, boosting agriculture and enabling all the other activities that support civilization to grow and flourish, but this extravagance cannot be sustained indefinitely.

Indeed, the exponential increase in the consumption rate of raw materials in this century alone has created an immediate, burgeoning crisis. Life forms other than domesticated organisms are robbed of their daily needs and the landscape is decorated with mountains of waste, the growth of which is unaffected by recycling attempts. This has brought about adverse global change, causing a mass extinction whose impact will exceed that of the five previous events. Thus, if humanity is to survive for a while longer at least, intelligent management of the remaining known resources is required, coupled with an urgent search for more.

The problem.

Exploring for more ore, though, is becoming an order of magnitude more difficult than it was a few decades ago, since most of the bodies that outcrop have been found. Hence, apart from the deep sea, the exploitation of which is likely to precipitate our demise if only from the point of view of strategic defense (Race to the bottom: the disastrous blindfolded rush to mine the deep sea (msn.com)), the focus is on locating hidden ore deposits.

Discovering a ‘blind ore body’ though poses serious prospecting challenges. Mineralization processes only enrich infinitesimally small parts of the Earth’s crust, so that finding pods that do not have any surface expression requires a detailed understanding of the regional geology, combined with a healthy dose of reasoned imagination to define drilling targets – many of which will be barren.

Thus, the process of locating a potential deposit, proving its feasibility, then committing to its development is long and costly. Added to which, deep mining techniques call for much larger blocks of ground to be prepared for caving and extraction, lengthening the lead time before any ore is processed – the first point at which the economics of a project can be somewhat reliably estimated.

Relevance to logging.

What has this to do with drill core logging?

Simply put, the data derived from diamond drilling underpins most geological models developed for everything from exploration through to mine planning. Unfortunately though, as was pointed out in Part One of this paper (https://www.dhirubhai.net/pulse/core-logging-optimizing-best-practice-part-one-john-orpen), despite a worldwide daily production of kilometers of drill core, experienced loggers remain few and far between. As a result, data quality is generally sub-optimal and, by inference, the modelling is sub-standard.

That this situation cannot persist is amply evident. Indeed, governments recognize the seriousness of the issue to the extent that several are actively compiling critical mineral lists (CANADA'S CRITICAL MINERALS LIST 2021 (nrcan.gc.ca), 35 Minerals Absolutely Critical to U.S. National Security (visualcapitalist.com), and Backing Australia's critical minerals sector | Export Finance Australia), yet many geoscience departments globally are being defunded. The decision-makers are very badly informed.

Considerations.

Two matters need to be resolved to ensure top-quality data is consistently obtained from diamond drilling. First, it is necessary to establish why there is a reluctance to study the finest rock samples that money can buy while they are still at their pristine best. Secondly, it must be asked why the activity is not viewed as a career.

The first question is difficult. In any natural science, except geology, it is axiomatic that to avoid data loss from decomposition, all specimens, whether biological or physical, have to be described and preserved intact immediately after acquisition. This is especially true when examining rocks, as it is those rare, delicate intervals – altered by shearing, metasomatism and the like, and consequently quick to weather – that mark the conduits for ore-bearing fluids. No cogent argument can be made for ignoring this fact.

The second question is easier, as it relates largely to the methodology of data capture. Although procedures are improving, on far too many projects core shed recruits are still given an archaic manual along with log table templates that mix point and interval logging on the same sheet. Edits to the log formats and look-up codes are often not possible because of an equally archaic, fixed database structure. Such prescriptive logging does not encourage recording information outside the column headings. Comments are provided for, but are seldom read by the modellers. Typically, it is only after escaping the core shed that the ‘scientists’ are schooled in data analysis and begin to appreciate the implications of sub-standard data gathering.

On top of this, many are erroneously taught how to ‘correct’ so-called errors made by the drillers, especially where standard depth registration of the core produces overlapping intervals. This is a classic case of two wrongs not making a right – and the fault lies with the geologist, not the driller (at least not always, unless the latter is cleverly under-reporting core loss: https://www.dhirubhai.net/pulse/drill-core-quality-borehole-depth-measurement-missing-john-orpen).
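
As an aside, flagging such discrepancies is straightforward; it is the ‘correcting’ that is the problem. The following minimal Python sketch (the interval layout is an assumption made for illustration) flags runs whose recorded depths overlap, so they can be queried at the rig rather than edited away:

```python
# Illustrative QC check: flag drill runs whose depth intervals overlap
# instead of silently "correcting" them. Runs are assumed to be supplied
# as (from_depth, to_depth) tuples in downhole order.
from typing import List, Tuple

def find_overlaps(runs: List[Tuple[float, float]]) -> List[int]:
    """Return the indices of runs that start shallower than the previous
    run ended - an overlap to query with the driller, not to edit away."""
    overlaps = []
    for i in range(1, len(runs)):
        if runs[i][0] < runs[i - 1][1]:
            overlaps.append(i)
    return overlaps

# Example: run 2 starts at 12.3 m although run 1 ended at 12.5 m.
runs = [(9.5, 12.5), (12.3, 15.4), (15.4, 18.4)]
print(find_overlaps(runs))  # -> [1]
```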

Thus a career path with promotion scales does not exist. Instead, there are commercial assertions that advances in core sensing technologies, downhole geophysics, artificial intelligence and machine learning will reliably replace the human element, when in reality these innovations should be welcomed for augmenting the development of top-class observational skills – for which reward mechanisms should be clearly evident.

Perceptions.

Drill rigs are never deployed on projects where little is known about the geology of the ground being sampled. Boreholes are targeted to investigate specific objectives – ranging from testing a mineralization model during greenfield exploration for example, to establishing the optimum slope angle for the continued excavation of an opencast pit. Thus, the decision to mount a drilling campaign is discussed and budgeted for by all parties – the senior geologists/geotechnical engineers responsible for implementing and managing the project, together with the mining/civil engineers and metallurgists, who require the detailed knowledge of the ground to be delivered within a fixed time frame to plan their works effectively.

Then, once the infrastructure has been established to support the operation (drill sites surveyed, water requirements for the rigs sorted, core shed constructed to a design that enables a clear workflow to log, then split the core for assay, as well as sample for geotech tests, etc.), the loggers are brought in. They are often set to work with little introduction to the importance of their job, and the engineer customers, though invested in the accuracy of the results, show little interest – as evidenced by their lack of visits to the core facility. Furthermore, productivity is measured in terms of meters logged per shift, QC routines can be erratic, and the findings of QA audits are seldom followed up – hence the incentive to strive for excellence is low.

In such an environment logging tasks become mechanical. Frustrations rise as backlogs build because of the time taken to complete clerical data capture, followed by validation after transcription from paper logs to digital. This easily turns to exasperation, especially if it is discovered, after the rigs have left, that silly errors were made – like the oriented core not being consistently marked top- or bottom-side, or the drillers’ stick-up logs revealing that core barrel lengths were seldom corrected after extensions were added for the orientation tools.

This pen-picture of a logger’s experience summarises the worst-case findings that have been encountered over years of auditing, mainly in Southern Africa, although similar conditions are known to exist elsewhere. A small case study serves to round off the sketch.

On geotechnical programs it is not uncommon to find two logs, called Joints and Structure Sets, recording exactly the same features, except that the first records the descriptors and angles for each and every natural fracture, whereas the second summarises these data by drill run.

Hence, for oriented drill runs, it would be sensible to generate the Structure Sets log from the Joints log by:

1. Counting the joints per run occurring in each alpha angle set (0 to 30 degrees, 31 to 60 degrees, and 61 to 90 degrees).

2. Selecting the most ‘high-risk’ descriptor code for each set from the range logged (a minimal sketch of both steps follows this list).
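
By way of illustration, the two steps reduce to a few lines of code. The Python sketch below is hypothetical throughout – the record layout, the descriptor codes and their risk ranking all stand in for whatever a given project’s logging codes define:

```python
# Sketch of steps 1 and 2 above: bin each joint by alpha angle, then keep
# the count and the highest-risk descriptor code per alpha set.
from collections import defaultdict

ALPHA_SETS = [(0, 30), (31, 60), (61, 90)]  # degrees, as in step 1

# Hypothetical ranking: a higher number means a higher geotechnical risk.
RISK_RANK = {"RO": 1, "SM": 2, "SL": 3, "CL": 4}

def alpha_set(alpha):
    """Return the (lo, hi) alpha angle set a measurement falls into."""
    for lo, hi in ALPHA_SETS:
        if lo <= alpha <= hi:
            return (lo, hi)
    return None

def summarise_run(joints):
    """Collapse one drill run's Joints records into Structure Sets entries:
    a joint count plus the highest-risk descriptor code per alpha set."""
    sets = defaultdict(lambda: {"count": 0, "code": None})
    for j in joints:
        key = alpha_set(j["alpha"])
        if key is None:
            continue
        entry = sets[key]
        entry["count"] += 1
        if entry["code"] is None or RISK_RANK[j["code"]] > RISK_RANK[entry["code"]]:
            entry["code"] = j["code"]
    return dict(sets)

# One oriented run holding three natural joints:
run = [{"alpha": 25, "code": "RO"},
       {"alpha": 28, "code": "CL"},
       {"alpha": 70, "code": "SM"}]
print(summarise_run(run))
# -> {(0, 30): {'count': 2, 'code': 'CL'}, (61, 90): {'count': 1, 'code': 'SM'}}
```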

This is seldom done, however, and on a few projects I have semi-quantified the logging quality by comparing the results. Those shown for the borehole depicted in Figure 1 are typical. The graph plots, for each drill run, the number of fractures counted for the Structure Sets minus the number of individually measured Joints. The marked positive bias is due to mechanical as well as natural fractures being counted for the sets – an error that is surprisingly common.
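
For reference, the comparison behind Figure 1 is a simple per-run subtraction. A minimal sketch, assuming the counts from each table have already been tallied into run-to-count mappings:

```python
# Sketch of the Figure 1 check: for each drill run, the Structure Sets
# count minus the Joints count. A positive bias suggests mechanical
# breaks were also counted for the sets. Run IDs are illustrative.
def count_bias(joints_counts, sets_counts):
    runs = sorted(set(joints_counts) | set(sets_counts))
    return {run: sets_counts.get(run, 0) - joints_counts.get(run, 0)
            for run in runs}

print(count_bias({"R1": 5, "R2": 3}, {"R1": 7, "R2": 3}))
# -> {'R1': 2, 'R2': 0}
```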

Figure 1: Comparing the number of fractures logged for the same drill runs in two tables: Joints and Structure Sets.

Descriptor code matching was evaluated for four characteristics – Infill Texture, Fill Type, Micro Roughness, and Wall Rock Competency – by comparing the code used for each characteristic in the Structure Sets log against all the codes recorded in the Joints log for each oriented drill run.

Figure 2 graphs the results, the spread of which is again quite typical. A 100% similarity shows equivalent codes were found for all four characteristics; 75% means three codes matched, 50% two, 25% one, and 0% none.
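
For completeness, this similarity metric is easily reproduced. In the Python sketch below the field names and codes are assumptions made for illustration; the logic – checking whether each Structure Sets code appears anywhere among that run’s Joints records – is the point:

```python
# Sketch of the Figure 2 metric: for one oriented drill run, what fraction
# of the four Structure Sets descriptor codes can be matched anywhere in
# that run's Joints records? Field names and codes are illustrative.
CHARACTERISTICS = ["infill_texture", "fill_type", "micro_roughness", "wall_rock"]

def descriptor_similarity(set_codes, joint_records):
    """set_codes: {characteristic: code} from the Structure Sets log.
    joint_records: list of {characteristic: code} dicts from the Joints log.
    Returns the matched fraction as a percentage (0, 25, 50, 75 or 100)."""
    matched = sum(
        1 for ch in CHARACTERISTICS
        if set_codes.get(ch) in {j.get(ch) for j in joint_records}
    )
    return 100.0 * matched / len(CHARACTERISTICS)

sets_entry = {"infill_texture": "VN", "fill_type": "QZ",
              "micro_roughness": "RO", "wall_rock": "HD"}
joints = [{"infill_texture": "VN", "fill_type": "CA",
           "micro_roughness": "RO", "wall_rock": "HD"}]
print(descriptor_similarity(sets_entry, joints))  # -> 75.0 (Fill Type unmatched)
```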

Figure 2: Comparing descriptor equivalence between two logs of the same fracture sets.

Quite how such poorly correlated data can support meaningful geotechnical analysis is a mystery – one that many an astute logger has also puzzled over.

Conclusion.

Most field-oriented geoscientists dream of mapping in areas with near 100% exposure. Drill core offers that experience, with the added bonus that the rock is fresh with an abundance of textures and structures that have urgent tales to tell, ranging from hades to oases, before they fade and crumble into dust. All that is wanting is the landscape – but this is no drawback when relationships, be they unconformable, intrusive, thrust, or metasomatized, are often laid bare with more clarity than in the field.

So there is plenty to excite the inquisitive. Part Three will look at setting up a logging system that encourages as much “peripheral” information as possible to be recorded, within a time frame that does not detract from acquiring the information required to fulfill the purpose for which the diamond drilling was commissioned.

In closing, the history of Olympic Dam is a perfect example of the benefits of such an approach – as a few quotes from an article on its discovery (The Olympic Dam Story | Australasian Science Magazine) clearly show:

  • “Western Mining made some great discoveries in the 1950s, 1960s and early 1970s by defying conventional thinking and going wherever science and the imaginations of its geologists told it to explore.”
  • “Woodall scoured Australia’s universities for the most brilliant young geoscientists before they even graduated, and implemented a host of what today’s human resource experts would call innovative talent management programs.”
  • “Looking back from here, it’s easy to think that the sheer size of Olympic Dam made its discovery inevitable. But it is sobering to realise that other Olympic Dams might still lie beneath the South Australian outback, their existence unknown because the nearest exploration hole glided by just a few metres away.”

Comments.

Dallas Davis

President, Dalmin Corporation & Edge Exploration Inc.

Very useful articles, John.

Very nicely put, a great report. On top of all of this there is “ScanLine orientation bias” to be taken into consideration. In fact a lot of the data that geologists collect from drill cores is biased and inaccurate.

Joe Seery

Director and Principal Geotechnical Engineer at Bastion Geotechnical Pty Ltd

Nice work John. My only point of contention is grouping 0-30, 30-60, 60-90, preferring to do this following calculation of dip and dip direction from alpha/beta (and really can put the AI to good use summing all alphas within these classes). The objective is to avoid where one class in one borehole (say 0-30) may be one other class in another borehole (say 30-60) on account of hopefully off-perpendicular to orebody drilled borehole orientations being different between the two boreholes. But agreed, if only drilling in one dip and azimuth, this is a good method that will suit certain ore deposit types.
