An Overview of Drillhole Spacing Analysis (DHSA)
Adapted from: https://www.geologyforinvestors.com/drill-hole-planning/

During the lifetime of a mining project, uncertainty is closely tied to data availability, with data spacing serving as a crucial factor in decision-making. In the exploration phase, drilling is typically targeted at specific areas to define the deposit's boundaries, faults, and geological structures, often without regular spacing. Regular spacing becomes necessary when assessing global resources and classifying resources and reserves (Pinto, 2015).

To build a reliable model of uncertainty, sufficient data is required for geostatistical methods. At this stage, data spacing becomes critical, as it helps determine the optimal spacing needed to achieve a defined level of uncertainty. For instance, transitioning from global resource calculations to annual production requires more drilling, and moving from yearly to daily operational scales necessitates even more drill holes. Thus, data availability directly influences the level of uncertainty at each step. Data spacing studies aim to establish the relationship between regular drill hole spacing and uncertainty, allowing predictions of uncertainty levels based on the spacing used (Pinto, 2015).

To define a complete uncertainty model for resource classification, we need to consider three elements (Rossi and Deutsch, 2014):

  • A volume related to a production period,
  • A precision error, and
  • A probability of being within ±error.

In this article, we will explore three approaches to measuring estimation uncertainty related to drill hole spacing and the size of the volume being estimated. Each method yields distinct outcomes depending on whether global or local estimation uncertainty is being assessed.

Relative and Absolute Precision

Precision refers to the spread of a distribution around the mean; it is a measure of the narrowness of a distribution (Wilde, 2010). Precision can also be expressed as the probability that a value falls within an interval around the mean. It is calculated after the expected value has been computed, by counting the number of realizations in which the simulated variable falls within a defined range of the expected value: the more often a value falls inside that range, the lower the uncertainty (Pinto, 2015). Relative precision is measured by comparing the results of the estimates with each other, while absolute estimation precision is determined using an exhaustive data set of closely spaced data generated from a single conditional simulation realisation.
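As a minimal sketch of how this count is made (assuming a 1-D NumPy array of simulated values for a single block or volume, and a tolerance expressed as a fraction of the expected value; the function name and numbers are illustrative):

    import numpy as np

    def precision_within(realizations, tol=0.10):
        """Fraction of realizations falling within +/- tol of the expected value."""
        expected = realizations.mean()                        # expected (e-type) value
        lower, upper = expected * (1 - tol), expected * (1 + tol)
        inside = (realizations >= lower) & (realizations <= upper)
        return inside.mean()                                  # probability of being within +/- tol

    # Illustrative use with 100 synthetic block grades
    rng = np.random.default_rng(42)
    sims = rng.normal(1.2, 0.15, size=100)
    print(f"P(within +/-10% of the mean) = {precision_within(sims):.2f}")

The higher this fraction, the higher the precision and the lower the uncertainty for that volume.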

Proportional Effect

A linear relationship between the mean grade and the standard deviation of the grades contributing to that mean indicates that the variance depends on the local mean. Because kriging generates a kriging variance based on a single variogram, it is a poor estimator of local grade variability unless the proportional effect is accounted for.

A simple method of testing for the proportional effect is to compute the means and variances within a moving window of a given dimension, first in the raw (untransformed) space and then after a Gaussian transformation, and to produce a scatter plot of these two values.
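A minimal sketch of such a test, assuming composite grades along a single profile (spatial 2D/3D windows would work the same way) and using SciPy only for a rank-based normal-score transform; the synthetic lognormal data and function names are illustrative:

    import numpy as np
    from scipy.stats import norm, rankdata

    def window_stats(values, window=20):
        """Means and standard deviations within non-overlapping moving windows."""
        means, stds = [], []
        for start in range(0, len(values) - window + 1, window):
            w = values[start:start + window]
            means.append(w.mean())
            stds.append(w.std(ddof=1))
        return np.array(means), np.array(stds)

    def normal_score(values):
        """Simple rank-based Gaussian (normal-score) transform."""
        ranks = rankdata(values)
        return norm.ppf((ranks - 0.5) / len(values))

    # Skewed raw grades: the window std grows with the window mean (proportional effect)
    rng = np.random.default_rng(7)
    raw = rng.lognormal(mean=0.0, sigma=0.8, size=400)

    m_raw, s_raw = window_stats(raw)
    m_ns, s_ns = window_stats(normal_score(raw))
    print("raw corr(mean, std):     ", round(np.corrcoef(m_raw, s_raw)[0, 1], 2))
    print("Gaussian corr(mean, std):", round(np.corrcoef(m_ns, s_ns)[0, 1], 2))

A high correlation in raw space combined with a near-zero correlation after the transform is the pattern illustrated by the figure below.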


Figure: Means and standard deviations of Gaussian-transformed Au grades within the moving windows. The correlation between the Gaussian variables is low; the proportional effect has been removed.

Merging Simulations

Simulations should be conducted for both geology and grade estimation. Merging of simulations should be performed at the point scale to preserve simulation variability. For simplicity, merging should be done by aligning the realizations (i.e., matching grade realisation 1 with categorical realisation 1). This process is repeated for each zone and each realisation, resulting in merged simulations that remain at the point scale.
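A minimal sketch of this realisation-by-realisation merge, assuming point-scale arrays shaped (n_realisations, n_points) for both the categorical and grade simulations; the names and shapes are illustrative:

    import numpy as np

    def merge_realisations(cat_sims, grade_sims, zone_codes):
        """Merge categorical and grade realisations at the point scale.

        cat_sims:   (n_real, n_points) simulated zone / rock-type codes.
        grade_sims: dict mapping zone code -> (n_real, n_points) grades
                    simulated within that zone.
        Grade realisation i is matched with categorical realisation i.
        """
        n_real, n_points = cat_sims.shape
        merged = np.full((n_real, n_points), np.nan)
        for i in range(n_real):                      # match realisation indices
            for code in zone_codes:
                mask = cat_sims[i] == code           # nodes belonging to this zone
                merged[i, mask] = grade_sims[code][i, mask]
        return merged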

Which Attributes?

Depending on where the risks are, the attribute(s) could be one or several of the following (Verly, 2014):

  • Grade,
  • Deleterious element,
  • Thickness,
  • Proportion of ore (indicator variable),
  • Metal (tonnage x grade).

Validation of Simulations

The realizations have to be checked carefully:

  1. Departures from strict stationarity (all statistical properties remain constant).
  2. Bi-Gaussian hypothesis testing: lag (h-scatter) cloud plots, indicator variograms, or the madogram/variogram ratio.
  3. Careful visual inspection of the realisations can reveal numerical artifacts, edge effects, high grades in known low grade areas (and vice versa), unrealistic continuity or randomness.
  4. The simulated values at the data locations should be extracted and plotted (scatterplots) against the data values. The mean values should be similar, the mean squared error should be low, and the slope of regression should be close to one. The points on the accuracy plot should be close to the 45-degree line and the average variance (a measure of precision) should be low (see the sketch after this list).
  5. The histogram, variogram, and other statistical parameters, such as the correlation coefficient with secondary data or other simulated variables, should be reasonably well reproduced.
  6. The variograms of the simulated realizations should be closer to the experimental points than the fitted model.
  7. Swath plots in principal directions show that the realizations reasonably match gradational trends.
  8. Run the simulation over the full grid, avoiding clipping by domain boundaries (edge effects).
  9. The resources from the realisations reconcile with production data and have understandable differences from legacy models (when reconciliation is possible).
  10. The average of many realizations matches a kriged model constructed with a reasonable search.
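A minimal sketch of the conditioning check in item 4, assuming the simulated values have already been extracted at the composite locations into an array shaped (n_realisations, n_data); the names are illustrative:

    import numpy as np

    def conditioning_check(data_values, sims_at_data):
        """Compare simulated values at the data locations with the data themselves."""
        sim_mean = sims_at_data.mean(axis=0)                 # e-type value at each data location
        mse = np.mean((sim_mean - data_values) ** 2)         # should be low
        slope = np.polyfit(data_values, sim_mean, 1)[0]      # regression slope, should be near 1
        print(f"data mean : {data_values.mean():.3f}")
        print(f"sim mean  : {sims_at_data.mean():.3f}")      # means should be similar
        print(f"MSE       : {mse:.4f}")
        print(f"slope     : {slope:.3f}")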

How Many Realisations?

Precision is proportional to the number of realizations (Deutsch, 2002). The number of realisations will affect the uncertainty and precision, with uncertainty decreasing with increasing number of realisations (Pinto, 2015). The number of realisations required for conditional simulation studies depends on several factors, including the objectives of the study, the complexity of the spatial variability, and the level of uncertainty you are willing to accept. If the goal is to quantify the uncertainty of estimates (e.g., for resource estimation), a larger number of realizations (typically 50 to 100) is often recommended. This provides a robust assessment of the variability and helps in calculating confidence intervals. For scenario-based studies (e.g., optimizing mine planning or processing strategies), fewer realizations (10 to 30) might suffice, especially if computational resources are limited.

In deposits with high variability or complex geology, more realisations are needed to capture the full range of possible outcomes. This ensures that the simulations adequately represent the uncertainty and variability of the deposit. For more homogeneous or less complex deposits, fewer realisations may be sufficient.

Finally, the number of realisations can also be constrained by the available computational power and time.

You can estimate the number of realisations needed for a conditional simulation study using the coefficient of variation (CV) and the standard error of the mean (SE). You need to decide on the desired level of precision, often expressed as a percentage of the mean (E).
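As a rough sketch, assuming the realisation outcomes are approximately independent so that the standard error of the mean scales with 1/sqrt(n), the required number of realisations is

    n ≈ (z × CV / E)²

where z is the standard normal score for the chosen confidence level (1.645 at 90%). For example, with CV = 0.5 and a target precision of E = 10% of the mean at 90% confidence, n ≈ (1.645 × 0.5 / 0.10)² ≈ 68 realisations, which sits within the 50 to 100 range quoted above.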


1. The Combination of Elementary Extension Variances (C2EV)

C2EV is a global measure of estimation variance (KVAR) that is best suited to 2D accumulations. However, by reducing a 3D model to a series of benches, it can also be applied to 3D problems. This method is particularly useful when drill holes are widely spaced and an estimate of the global estimation error is required. It is important to note that the method assumes that the estimation errors for the blocks are independent of one another and that there is typically only one sample per block, usually located at the block center.

In the presence of a proportional effect, C2EV would be an inappropriate estimator of estimation variance. The methodology described is largely derived from Davis (1997) and Verly et al. (2014).

Procedure

  1. Define a large, idealized block representing a production volume, e.g., quarterly (Measured category) or annual (Indicated and Inferred categories). This initial block should not be too small, or else it may invalidate the independence-of-errors assumption. Consider block size and geometry.
  2. For a particular domain and variable, calculate the CV of the composites and the standardized variogram model. If a correlogram is used, then the kriging variance (KV) in step 5 below should be multiplied by the square of the coefficient of variation (CV²).
  3. Set up a number of grids with dimensions equivalent to the drillhole spacings to be tested inside the domain. Practical consideration should be given to the orientation of the drillholes in relation to the mineralization.
  4. Krige (OK) the value of the blocks for the selected production volume and drillhole spacing. Block discretization is required. Recall that the estimation variance requires only a variogram model and the geometry of the sample points; it does not require the actual grades of the variables.
  5. Compute the estimation variance (KV) for that single block using only the central sample (100% of the weight). For example, if using a monthly volume, the KV for an annual period is KV(Yr) = KV/12 and for a quarterly period KV(Q) = KV/3 (this assumes that the errors between the production periods are independent).
  6. Calculate the global relative estimation precision for the monthly volume at that drill spacing: RSE = sqrt(KV) × CV × 1.645 at the 90% confidence level. For a quarterly production volume, RSE = sqrt(KV(Q)) × CV × 1.645 at the 90% confidence level (see the sketch after this list).
  7. Repeat steps 4 to 6 for each drillhole spacing.
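A minimal sketch of steps 5 and 6, assuming a standardized kriging variance for the monthly volume and independent errors between production periods; the numbers and function name are illustrative:

    def global_relative_precision(kv_monthly, cv, n_periods, z=1.645):
        """Global relative estimation precision at the chosen confidence level.

        kv_monthly: standardized kriging variance for a monthly production volume.
        cv:         coefficient of variation of the composites in the domain.
        n_periods:  number of monthly volumes combined (3 = quarterly, 12 = annual).
        z:          1.645 for a 90% confidence level.
        Assumes the errors between production periods are independent, so the
        variance of the combined volume is kv_monthly / n_periods.
        """
        kv_combined = kv_monthly / n_periods
        return (kv_combined ** 0.5) * cv * z

    # Illustrative values: KV = 0.20, CV = 0.8, annual volume, 90% confidence
    print(f"{global_relative_precision(0.20, 0.8, 12):.1%}")   # about +/-17%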


Figure: Global relative estimation precision (annual, quarterly) at 95% C.L.

Limitations

  • It does not account for geologic behavior.
  • The variogram model can be very subjective.
  • The methodology assumes that the error on the production volumes has a Gaussian distribution.
  • The method is unable to account for material above a cut-off grade.
  • It is particularly useful and simple to apply when quantifying the risk for large volumes such as quarterly or yearly production volumes.
  • It is important that the variable being analyzed does not display a proportional effect.

2. Conditional Simulation (CS) versus Ordinary Kriging (OK)

Conditional simulation based DHSA studies take many forms but they are all aimed at quantifying the reduced risk resulting from increased drilling. DHSA based on simulation offers advantages over other methods, such as C2EV, as it can also be used to quantify uncertainty, including uncertainty around tonnage and grade for selected cut-offs.

This method involves performing a dense-grid conditional simulation (SGS or Turning Bands) over the area of interest. This dense grid, also known as the 'exhaustive data set,' consists of 'true' values used for comparison with block estimates (e.g., OK). The exhaustive data set is sampled multiple times to generate several fictitious drillholes, which are then used as input data for block estimation across various block sizes required for mineral resource evaluation and mine planning. This method is suitable for computing both global and local estimation uncertainty (Wilde 2010, Pinto 2015, Rossi and Deutsch 2014).

This is a method used to map resource uncertainty and is useful for designing optimal sampling or drilling grids. However, the term "optimal" is used somewhat loosely because if the quality and quantity of available data are not truly representative of the orebody characteristics, the outcome may not be optimal. For example, studies conducted during the exploration stage (resource definition) often lack the dense drilling necessary to capture the short-scale variability observed in short-term models (an issue known as the "information effect").

Procedure

  1. Utilizing an existing data set, determine the block size (SMU) to be estimated. Define the number of spacings to evaluate.
  2. For each domain compute and model 3D variograms for each attribute to be simulated/estimated.
  3. Generate n (e.g., 30 or 50) whole-orebody Sequential Gaussian Simulations (SGS), merging geology and grades. The points (nodes) should be sufficiently fine (e.g., 1 × 1 × 1 m) so that an SMU comprises at least 8 points.
  4. Validate simulations (as defined above).
  5. Create fictitious drillholes by extracting drill patterns from each combined simulation at the required spacings (see the sketch after this list).
  6. Composite the simulated drillholes to the same composite size as used in resource modelling.
  7. Re-block the simulation to the required SMU size.
  8. Estimate the grades in models at the same support as the re-blocked simulations (SMU) and using the same estimation methodology as used for resource modelling.
  9. Validate the resource models.
  10. Calculate the difference (reconciliation) between each re-blocked simulation and the estimated tonnes, grade and metal for each period, design, cut-off grade, or material type.
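A minimal sketch of step 5, assuming the dense simulation is stored as flat arrays of node coordinates and values and that the fictitious holes are vertical, collared on a regular grid anchored at the model origin; the names and tolerance are illustrative:

    import numpy as np

    def extract_fictitious_holes(x, y, spacing, tol=0.5):
        """Boolean mask of simulation nodes forming vertical fictitious drillholes.

        x, y:    node easting and northing (1-D arrays, one entry per node).
        spacing: drillhole spacing to test, e.g. 25.0 for a 25 x 25 m grid.
        tol:     half the node spacing; nodes this close to a collar are kept.
        """
        dx = np.mod(x - x.min(), spacing)
        dy = np.mod(y - y.min(), spacing)
        on_x = (dx <= tol) | (dx >= spacing - tol)
        on_y = (dy <= tol) | (dy >= spacing - tol)
        return on_x & on_y          # all elevations kept -> vertical holes

    # Illustrative use: keep simulated grades at a 25 x 25 m pattern
    # mask = extract_fictitious_holes(node_x, node_y, spacing=25.0)
    # fictitious_grades = simulated_values[mask]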

Figure: Workflow for the Conditional Simulation (CS) versus Ordinary Kriging (OK) method.
Figure: Example of results for annual variability at 25 m spacing.

3. Multiple Conditional Simulation Realizations

This approach builds on previous methods by applying conditional simulation (SGS), which typically requires the computation of a large number of realizations, often in the order of hundreds. The method provides both local and global uncertainty estimates. However, the computations and post-processing results can be very time-consuming.

Procedure

  1. Utilizing an existing data set, select the production volume (e.g., annual, quarterly volume). Define the number of spacings to evaluate.
  2. Compute and model 3D variograms for each attribute to be simulated/estimated. A common practice is to use an omni-directional variogram with a range equal to half the domain size.
  3. Generate 10-30 whole orebody SGS (merging geology + grades) from the existing data set.
  4. Validate simulations (as defined above).
  5. Select one realisation that best represents the input data statistics and variogram – this then becomes the “target” realisation ("truth model").
  6. Create fictitious drillholes by extracting drill patterns with the simulated grades from the "target" realisation at the required spacings.
  7. For each of the sub-sets of data drawn generate K SGS realizations (for example K = 100). Note: depending on the required precision it may be necessary to run more than 100 realisations.
  8. For each volume of interest rank the K realisations for each grid by increasing grade and read off the values at the 5th percentile and the 95th percentile to give the central 90% confidence interval (±5%).
  9. For each grid size, subtract P05 from P95 and divide the difference by the P50 (median) value to express the width of the 90% confidence interval relative to the median (see the sketch after this list).
  10. Alternatively, the simulated nodes can be block-averaged at SMU support to represent production volumes, such as annual or quarterly. The standard deviation of the simulated blocks and the probability that the simulated grade of a block falls within, for example, 5% or 10% of the expected value can be used as measures of uncertainty.
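A minimal sketch of steps 8 and 9 for one volume and one spacing, assuming the K simulated grades of the volume of interest are stored in a 1-D array; the synthetic numbers are illustrative:

    import numpy as np

    def relative_ci_width(volume_grades):
        """Width of the central 90% confidence interval relative to the median."""
        p05, p50, p95 = np.percentile(volume_grades, [5, 50, 95])
        return (p95 - p05) / p50

    # Illustrative use: K = 100 realisations of an annual volume at one spacing
    rng = np.random.default_rng(1)
    annual = rng.normal(1.0, 0.04, size=100)
    print(f"relative 90% CI width = {relative_ci_width(annual):.1%}")

Repeating this for each spacing gives the relationship between spacing and uncertainty from which the target spacing can be read off.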

Figure: Workflow for Multiple Conditional Simulation Realizations (adapted from Pinto, 2015).


The conclusion from this example is that to achieve ±5% precision at a 90% level of confidence, a maximum spacing of 125 × 125 m is required.

Other Approaches

Recent novel approaches have emerged to address the challenge of optimizing drillhole spacing in mining. Pinto and Deutsch (2017) introduced a high-resolution method for evaluating drillhole spacing, focusing on the calculation of data spacing where spacing must be inferred from limited, either regularly or irregularly spaced drilling data. This approach integrates various geological, geostatistical, and economic factors to optimize the drillhole spacing, with the aim of determining an equivalent regular drillhole spacing that would provide the same data density.

Afonseca and Silva (2022) propose an integrated methodology for determining optimal drillhole spacing, particularly emphasizing the transition from exploration to ore control phases in mining projects. This integrated approach simultaneously analyzes both raw uncertainty and model uncertainty. The key justification for this method is that existing DHSA workflows in the literature often overlook the distinctions between these two types of uncertainty. Commonly, DHSA algorithms are selected without a thorough analysis of their uncertainty outputs, which can lead to misleading results and suboptimal decision-making. Unlike available solutions that typically focus on either raw or model uncertainty, this approach examines both simultaneously, as well as their interrelationship, to enhance decision-making for models at different stages of mine development.

General Limitations

The following limitations must be understood:

  • The presence of a proportional effect indicates non-stationarity, as the variance is not constant across the study area. This directly impacts the decision of whether to assume stationarity in the model. The proportional effect will affect uncertainty depending on whether the histogram is negatively or positively skewed: depending on the proportion of values with high or low uncertainty, the expected uncertainty may increase or decrease (Pinto, 2015).
  • Most simulation methods have difficulty dealing with non-stationary behaviors, such as trends.
  • Reliance on the modelled variogram, which can be very subjective even in the presence of sufficient data.
  • The nugget effect also plays an important role in the uncertainty. Because variables are simulated on a high-resolution grid, the nugget effect at the simulation scale may not be very important. However, when the model is scaled up, low and high values average out and uncertainty decreases; a high nugget effect will decrease uncertainty faster than a low one. Care must be taken when modelling the nugget effect in the variograms for data spacing and uncertainty studies (Pinto and Deutsch, 2017).
  • Computationally, a higher nugget effect can increase the number of required realizations (Afonseca and Silva, 2022).

References

Afonseca, B.C. and Miguel-Silva, V. (2022). Defining optimal drill-hole spacing: A novel integrated analysis from exploration to ore control. Journal of the Southern African Institute of Mining and Metallurgy, vol. 122, no. 6, pp. 305-316.

Boucher, A., Dimitrakopoulos, R. and Vargas-Guzman, R.A. (2004). Joint simulations, optimal drillhole spacing and the role of the stockpile. In: Leuangthong, O. and Deutsch, C.V. (Eds), Geostatistics Banff 2004. Springer, Dordrecht, pp. 35-44.

Cabral Pinto, F. A. and Deutsch, C. V. (2017). Calculation of High Resolution Data Spacing Models. In J. L. Deutsch (Ed.), Geostatistics Lessons. Retrieved from https://geostatisticslessons.com/lessons/dataspacing

Rossi, M.E. and Deutsch, C.V. (2014). Mineral Resource Estimation. Springer, Dordrecht. 332 pp. ISBN 978-1-4020-5716-8, ISBN 978-1-4020-5717-5.

Journel, A.G. and Huijbregts, C.J. (1978). Mining Geostatistics. Academic Press, New York.

Pinto, F. (2015). Guide to Data Spacing and Uncertainty Analysis. Edmonton: CCG-University of Alberta.

Verly, G., Postolski, T. and Parker, H. (2014). Assessing Uncertainty with Drill-Hole Spacing Studies - Applications to Mineral Resources. In: Dimitrakopoulos, R. (Ed.), Orebody Modelling and Strategic Mine Planning, pp. 109-118. Australasian Institute of Mining and Metallurgy, Perth, WA, Australia.

Wilde, B.J. (2010). Data Spacing and Uncertainty (Master's thesis). University of Alberta, Edmonton, Alberta, Canada.



