An Overview of Drillhole Spacing Analysis (DHSA)
Julio Solano MAusIMM CP(Geo), CCRR (Colombia)
Mineral Resource Manager at Sun Valley Investments (SVI)
During the lifetime of a mining project, uncertainty is closely tied to data availability, with data spacing serving as a crucial factor in decision-making. In the exploration phase, drilling is typically targeted at specific areas to define the deposit's boundaries, faults, and geological structures, often without regular spacing. Regular spacing becomes necessary when assessing global resources and classifying resources and reserves (Pinto, 2015).
To build a reliable model of uncertainty, sufficient data is required for geostatistical methods. At this stage, data spacing becomes critical, as it helps determine the optimal spacing needed to achieve a defined level of uncertainty. For instance, transitioning from global resource calculations to annual production requires more drilling, and moving from yearly to daily operational scales necessitates even more drill holes. Thus, data availability directly influences the level of uncertainty at each step. Data spacing studies aim to establish the relationship between regular drill hole spacing and uncertainty, allowing predictions of uncertainty levels based on the spacing used (Pinto, 2015).
To define a complete uncertainty model for resource classification, three elements need to be considered (Rossi and Deutsch, 2014):
In this article, we will explore three approaches to measuring estimation uncertainty related to drill hole spacing and the size of the volume being estimated. Each method yields distinct outcomes depending on whether global or local estimation uncertainty is being assessed.
Relative and Absolute Precision
Precision describes how tightly a distribution sits around its mean; it is a measure of the narrowness of that distribution (Wilde, 2010). It can also be expressed as the probability of a value falling within an interval around the mean. Precision is calculated after the expected value has been computed, by counting how many realisations of the simulated variable fall within a range of the expected value: the more often values fall inside that range, the lower the uncertainty (Pinto, 2015). Relative precision is measured by comparing the estimates with each other, while absolute estimation precision is determined using an exhaustive data set of closely spaced data generated from a single conditional simulation realisation.
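As an illustration, a minimal sketch of this counting exercise is shown below, assuming the realisations are held in a NumPy array of shape (number of realisations, number of blocks) and using a hypothetical ±15% interval around the expected value:

```python
import numpy as np

def precision_within_tolerance(realisations, tolerance=0.15):
    """Fraction of realisations falling within +/- tolerance of the
    expected value (e-type mean), computed block by block.

    realisations : array of shape (n_realisations, n_blocks)
    tolerance    : relative half-width of the interval around the mean
    """
    e_type = realisations.mean(axis=0)            # expected value per block
    lower = e_type * (1.0 - tolerance)
    upper = e_type * (1.0 + tolerance)
    inside = (realisations >= lower) & (realisations <= upper)
    return inside.mean(axis=0)                    # per-block precision in [0, 1]

# Example: 100 realisations of 500 blocks (synthetic lognormal grades)
rng = np.random.default_rng(42)
sims = rng.lognormal(mean=0.0, sigma=0.5, size=(100, 500))
p = precision_within_tolerance(sims, tolerance=0.15)
print(f"Mean precision at +/-15% of the mean: {p.mean():.2f}")
```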
Proportional Effect
A linear relationship between the mean grade and the standard deviation of the grades contributing to that mean indicates that the variance depends on the local mean. Because kriging produces a kriging variance from a single variogram, it is a poor estimator of local grade variability unless this proportional effect is accounted for.
A simple method of testing for the proportional effect is to compute the means and variances within a moving window of a given dimension, first in the raw (untransformed) space and then after a Gaussian transformation, and to produce a scatter plot of these two statistics.
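A minimal sketch of this moving-window test is given below, assuming the samples sit in a pandas DataFrame with hypothetical x, y and grade columns; the same statistics can be recomputed on normal-score transformed grades to confirm that the effect disappears in Gaussian space:

```python
import numpy as np
import pandas as pd

def windowed_mean_variance(df, window=50.0, x="x", y="y", grade="grade"):
    """Mean and variance of grades inside square moving windows.
    A roughly linear mean-vs-standard-deviation scatter indicates a
    proportional effect in the raw (untransformed) grades."""
    ix = np.floor(df[x] / window).astype(int)
    iy = np.floor(df[y] / window).astype(int)
    stats = df.groupby([ix, iy])[grade].agg(["mean", "var", "count"])
    return stats[stats["count"] >= 10]    # keep windows with enough samples

# Example with synthetic data whose variance is tied to the local mean
rng = np.random.default_rng(0)
n = 5000
xy = rng.uniform(0, 1000, size=(n, 2))
local_mean = 1.0 + xy[:, 0] / 500.0       # mean drifts with easting
grades = rng.lognormal(np.log(local_mean), 0.4)
data = pd.DataFrame({"x": xy[:, 0], "y": xy[:, 1], "grade": grades})

stats = windowed_mean_variance(data, window=100.0)
corr = np.corrcoef(stats["mean"], np.sqrt(stats["var"]))[0, 1]
print(f"Correlation of window mean vs. standard deviation: {corr:.2f}")
```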
Merging Simulations
Simulations should be conducted for both geology and grade estimation. Merging of simulations should be performed at the point scale to preserve simulation variability. For simplicity, merging should be done by aligning the realizations (i.e., matching grade realisation 1 with categorical realisation 1). This process is repeated for each zone and each realisation, resulting in merged simulations that remain at the point scale.
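The matching of realisations can be illustrated with a small sketch, assuming (hypothetically) that the categorical realisations and the per-zone grade realisations are stored as point-scale NumPy arrays of equal shape:

```python
import numpy as np

def merge_realisations(cat_realisations, grade_realisations_by_zone):
    """Merge categorical and grade realisations at the point scale.

    cat_realisations           : int array (n_real, n_points) of zone codes
    grade_realisations_by_zone : dict {zone_code: array (n_real, n_points)}

    Realisation i of the categorical model is matched with realisation i
    of each zone's grade model, so simulation variability is preserved.
    """
    n_real, n_points = cat_realisations.shape
    merged = np.full((n_real, n_points), np.nan)
    for i in range(n_real):
        for zone, grades in grade_realisations_by_zone.items():
            mask = cat_realisations[i] == zone
            merged[i, mask] = grades[i, mask]
    return merged

# Example: 3 realisations, 10 points, two zones (codes 1 and 2)
rng = np.random.default_rng(1)
cats = rng.integers(1, 3, size=(3, 10))
grades = {1: rng.lognormal(0.0, 0.3, (3, 10)),
          2: rng.lognormal(1.0, 0.5, (3, 10))}
print(merge_realisations(cats, grades))
```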
Which Attributes?
Depending on where the risks are, the attribute(s) could be one or several of the following (Verly, 2014):
Validation of Simulations
The realizations have to be checked carefully:
How many Realisations?
Precision is proportional to the number of realizations (Deutsch, 2002). The number of realisations will affect the uncertainty and precision, with uncertainty decreasing with increasing number of realisations (Pinto, 2015). The number of realisations required for conditional simulation studies depends on several factors, including the objectives of the study, the complexity of the spatial variability, and the level of uncertainty you are willing to accept. If the goal is to quantify the uncertainty of estimates (e.g., for resource estimation), a larger number of realizations (typically 50 to 100) is often recommended. This provides a robust assessment of the variability and helps in calculating confidence intervals. For scenario-based studies (e.g., optimizing mine planning or processing strategies), fewer realizations (10 to 30) might suffice, especially if computational resources are limited.
In deposits with high variability or complex geology, more realisations are needed to capture the full range of possible outcomes. This ensures that the simulations adequately represent the uncertainty and variability of the deposit. For more homogeneous or less complex deposits, fewer realisations may be sufficient.
Finally, the number of realisations can also be constrained by the available computational power and time.
The number of realisations needed for a conditional simulation study can be estimated from the coefficient of variation (CV) and the standard error of the mean (SE), once a desired level of precision has been chosen, often expressed as a percentage of the mean (E).
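As a rough sketch of that calculation (an approximation based on the standard error of the mean, not a prescribed formula), the required number of realisations follows from SE = σ/√n and E = z·SE/mean, giving n = (z·CV/E)²:

```python
import math

def realisations_needed(cv, precision, z=1.96):
    """Approximate number of realisations so that the standard error of
    the mean is within +/- precision (as a fraction of the mean) at the
    confidence level implied by z (1.96 ~ 95%).

    Derived from SE = sigma / sqrt(n) and precision = z * SE / mean,
    which gives n = (z * CV / precision)**2.
    """
    return math.ceil((z * cv / precision) ** 2)

# Example: CV = 0.5, target precision of +/-10% of the mean at ~95% confidence
print(realisations_needed(cv=0.5, precision=0.10))   # about 97 realisations
```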
1. The Combination of Elementary Extension Variances (C2EV)
C2EV is a global measure of estimation variance (KVAR) that is better suited to 2D accumulation. However, by reducing a 3D model to a series of benches, it can also be applied to 3D problems. This method is particularly useful when drill holes are widely spaced and an estimate of the global estimation error is required. It is important to note that the method assumes the errors made for all blocks being estimated are independent of the errors for other blocks and that there is typically only one sample per block, usually located at the block center.
In the presence of a proportional effect, C2EV would be an inappropriate estimator of estimation variance. The methodology described here is largely derived from Davis (1997) and Verly, Postolski and Parker (2014).
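The combination step itself is simple once the elementary extension variances are known (they come from the variogram and the sample-block geometry, which is not shown here). A minimal sketch under the independence assumption stated above, with a hypothetical tonnage weighting, is:

```python
import numpy as np

def global_estimation_variance(extension_variances, tonnages=None):
    """Combine elementary (per-block) extension variances into a global
    estimation variance, assuming block errors are independent.

    With weights w_i proportional to block tonnage (equal by default):
        sigma2_global = sum(w_i**2 * sigma2_E_i)
    """
    ext = np.asarray(extension_variances, dtype=float)
    if tonnages is None:
        tonnages = np.ones_like(ext)
    w = np.asarray(tonnages, dtype=float)
    w = w / w.sum()
    return float(np.sum(w**2 * ext))

# Example: 400 blocks on a bench, each with an extension variance of 0.8 (grade units^2)
sigma2 = global_estimation_variance(np.full(400, 0.8))
print(f"Global estimation variance: {sigma2:.5f}")   # 0.8 / 400 = 0.002
```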
Procedure
Limitations
2. Conditional Simulation (CS) versus Ordinary Kriging (OK)
Conditional simulation-based DHSA studies take many forms, but they are all aimed at quantifying the reduction in risk resulting from increased drilling. DHSA based on simulation offers advantages over other methods, such as C2EV, as it can also be used to quantify uncertainty, including uncertainty around tonnage and grade at selected cut-offs.
This method involves performing a dense-grid conditional simulation (SGS or Turning Bands) over the area of interest. This dense grid, also known as the 'exhaustive data set,' consists of 'true' values used for comparison with block estimates (e.g., OK). The exhaustive data set is sampled multiple times to generate several fictitious drillholes, which are then used as input data for block estimation across various block sizes required for mineral resource evaluation and mine planning. This method is suitable for computing both global and local estimation uncertainty (Wilde, 2010; Pinto, 2015; Rossi and Deutsch, 2014).
This is a method used to map resource uncertainty and is useful for designing optimal sampling or drilling grids. However, the term "optimal" is used somewhat loosely because if the quality and quantity of available data are not truly representative of the orebody characteristics, the outcome may not be optimal. For example, studies conducted during the exploration stage (resource definition) often lack the dense drilling necessary to capture the short-scale variability observed in short-term models (an issue known as the "information effect").
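The following is a deliberately simplified 1-D sketch of the idea, with a synthetic smooth field standing in for the dense SGS/Turning Bands grid and an inverse-distance estimate standing in for OK; it only illustrates how resampling the 'exhaustive' values at different spacings yields an error-versus-spacing relationship:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for the dense 'exhaustive' simulation: a smooth 1-D field on a 1 m grid
n = 2000
truth = np.convolve(rng.normal(0, 1, n + 200), np.ones(50) / 50, mode="same")[:n] + 2.0

block = 20  # block size in grid nodes
true_blocks = truth[: n // block * block].reshape(-1, block).mean(axis=1)

for spacing in (10, 25, 50, 100):                    # fictitious drillhole spacings (m)
    samples_x = np.arange(0, n, spacing)
    samples_v = truth[samples_x]
    # Crude inverse-distance block estimate as a stand-in for OK
    block_centres = np.arange(block / 2, n // block * block, block)
    estimates = []
    for c in block_centres:
        d = np.abs(samples_x - c) + 1e-6
        w = 1.0 / d**2
        estimates.append(np.sum(w * samples_v) / np.sum(w))
    err = np.asarray(estimates) - true_blocks
    print(f"spacing {spacing:4d} m: RMS block error = {np.sqrt(np.mean(err**2)):.3f}")
```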
Procedure
3. Multiple Conditional Simulation Realizations
This approach builds on the previous methods by applying conditional simulation (SGS), which typically requires the computation of a large number of realisations, often in the order of hundreds. The method provides both local and global uncertainty estimates. However, the computations and post-processing of results can be very time-consuming.
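For the post-processing step, a sketch of the kind of per-block summaries typically extracted from the stack of realisations (e-type mean, conditional variance, probability interval width) is shown below, assuming the realisations are available as a (realisations × blocks) array:

```python
import numpy as np

def summarise_realisations(realisations, p_low=0.05, p_high=0.95):
    """Per-block uncertainty summaries from a stack of realisations."""
    e_type = realisations.mean(axis=0)                 # expected value
    cond_var = realisations.var(axis=0)                # conditional variance
    lo, hi = np.quantile(realisations, [p_low, p_high], axis=0)
    return {"e_type": e_type, "cond_var": cond_var,
            "p90_width": hi - lo}                      # 90% interval width

rng = np.random.default_rng(3)
sims = rng.lognormal(0.2, 0.4, size=(200, 1000))       # 200 realisations, 1000 blocks
out = summarise_realisations(sims)
print(f"Average 90% interval width: {out['p90_width'].mean():.3f}")
```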
Procedure
Other Approaches
Recent novel approaches have emerged to address the challenge of optimizing drillhole spacing in mining. Pinto and Deutsch (2017) introduced a high-resolution method for evaluating drillhole spacing, focusing on the calculation of data spacing where spacing must be inferred from limited, either regularly or irregularly spaced drilling data. This approach integrates various geological, geostatistical, and economic factors to optimize the drillhole spacing, with the aim of determining an equivalent regular drillhole spacing that would provide the same data density.
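One common convention for such an equivalent spacing (a sketch of the general idea, not necessarily the exact published algorithm) is to take the distance r to the n-th closest composite at each node, treat the local density as n data per circle of area πr², and convert it to the side length of a square grid of the same density, ds = r·√(π/n):

```python
import numpy as np
from scipy.spatial import cKDTree

def equivalent_spacing(data_xy, grid_xy, n_data=10):
    """Equivalent regular data spacing at each grid node.

    Takes the distance r to the n-th nearest datum, treats the local
    density as n data per circle of area pi*r**2, and converts it to the
    side length of a square grid with the same density:
        spacing = r * sqrt(pi / n)
    """
    tree = cKDTree(data_xy)
    r, _ = tree.query(grid_xy, k=n_data)
    r_n = r[:, -1]                              # distance to the n-th datum
    return r_n * np.sqrt(np.pi / n_data)

# Example: irregular drilling over a 1 km x 1 km area, spacing mapped on a 50 m grid
rng = np.random.default_rng(11)
collars = rng.uniform(0, 1000, size=(150, 2))
gx, gy = np.meshgrid(np.arange(25, 1000, 50), np.arange(25, 1000, 50))
nodes = np.column_stack([gx.ravel(), gy.ravel()])
ds = equivalent_spacing(collars, nodes, n_data=10)
print(f"Median equivalent spacing: {np.median(ds):.0f} m")
```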
Afonseca and Miguel-Silva (2022) propose an integrated methodology for determining optimal drillhole spacing, particularly emphasizing the transition from exploration to ore control phases in mining projects. This integrated approach simultaneously analyzes both raw uncertainty and model uncertainty. The key justification for this method is that existing DHSA workflows in the literature often overlook the distinctions between these two types of uncertainty. Commonly, DHSA algorithms are selected without a thorough analysis of their uncertainty outputs, which can lead to misleading results and suboptimal decision-making. Unlike available solutions that typically focus on either raw or model uncertainty, this approach examines both simultaneously, as well as their interrelationship, to enhance decision-making for models at different stages of mine development.
General Limitations
The following limitations must be understood:
References
Afonseca, B.C. and Miguel-Silva, V. (2022). Defining optimal drill-hole spacing: A novel integrated analysis from exploration to ore control. Journal of the Southern African Institute of Mining and Metallurgy, vol. 122, no. 6, pp. 305-316.
Boucher, A., Dimitrakopoulos, R. and Vargas-Guzman, R.A. (2004). Joint simulations, optimal drillhole spacing and the role of the stockpile. In: Leuangthong, O. and Deutsch, C.V. (Eds), Geostatistics Banff 2004. Springer, The Netherlands, pp. 35-44.
Cabral Pinto, F. A. and Deutsch, C. V. (2017). Calculation of High Resolution Data Spacing Models. In J. L. Deutsch (Ed.), Geostatistics Lessons. Retrieved from https://geostatisticslessons.com/lessons/dataspacing
Rossi, M.E. and Deutsch, C.V. (2014). Mineral Resource Estimation. Springer, Dordrecht. 332 pp. ISBN 978-1-4020-5716-8, 978-1-4020-5717-5.
Journel, A.G. and Huijbregts, C.J. (1978). Mining Geostatistics. Academic Press, New York.
Pinto, F. (2015). Guide to Data Spacing and Uncertainty Analysis. Edmonton: CCG-University of Alberta.
Verly, G., Postolski, T. and Parker, H. (2014). Assessing Uncertainty with Drill-Hole Spacing Studies - Applications to Mineral Resources. In: Dimitrakopoulos, R. (Ed.), Orebody Modelling and Strategic Mine Planning, pp. 109-118. Perth, WA: Australasian Institute of Mining and Metallurgy.
Wilde, B.J. (2010). Data spacing and uncertainty (Master's thesis). University of Alberta, Edmonton, Alberta, Canada.