Impact of Reblocking on Pit Optimization
Reblocking, also known as super-blocking, is a block-aggregation technique that reduces the size of a block model. It has become common practice in pit optimization: grade models created by geologists routinely exceed the capacity of most pit optimizers on the market today. A grade model with 100 million non-air blocks is not uncommon, whereas most pit optimizers can only handle models with fewer than 10 million blocks. As a result, mine planning engineers often need to super-block their grade models to much smaller dimensions before attempting a pit optimization run.
This raises a question: what is the impact of super-blocking on NPV (Net Present Value)?
In general, there are two types of reblocking: 1) reblocking by grade, which aggregates several blocks (up to hundreds) into a super-block and assigns the average grade to it; this introduces dilution due to averaging (the so-called smoothing effect); 2) reblocking by dollar value, which aggregates blocks into a super-block and assigns it the sum of the dollar values of the individual blocks; this type of reblocking does not introduce dilution.
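The difference between the two types can be sketched numerically. All figures below are hypothetical (grades, tonnages, and a toy economic model of tonnes × (grade × $10/g − $3/t processing cost), floored at zero since sub-economic material is left as waste):

```python
# Toy 1-D example: four small blocks merged into one super-block (factor 4).
grades = [2.0, 0.1, 0.1, 0.1]   # g/t: one high-grade block among low-grade neighbours
tonnes = 100.0                  # tonnes per small block (assumed uniform)

def block_value(grade, t):
    # Hypothetical economics: $10 revenue per g/t per tonne, $3/t processing cost;
    # material that cannot pay for processing is left as waste (value floored at 0).
    return max(0.0, t * (grade * 10.0 - 3.0))

# Type 1: reblock by grade -- average first, then value the super-block once.
avg_grade = sum(grades) / len(grades)                    # 0.575 g/t
by_grade = block_value(avg_grade, tonnes * len(grades))  # ~1100: diluted by averaging

# Type 2: reblock by dollar value -- value each block, then sum.
by_value = sum(block_value(g, tonnes) for g in grades)   # 1700: no dilution

print(by_grade, by_value)
```

The averaging in Type 1 smears the high-grade block across its sub-grade neighbours, which is exactly the smoothing effect described above.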
Type 1) is usually done by the end user prior to a pit optimization run, while Type 2) is normally carried out internally by a pit optimizer at the scheduling stage. Type 1) is required when the individual block size is much smaller than the project's SMU (selective mining unit), since not reblocking would then artificially inflate the value of the project. Type 2) is recommended only if the block size is the same as, or close to, the SMU of the project.
This article is about the impact of Type 1) on pit optimization.
Ideally, comparisons should be based on NPV (Net Present Value). However, there is currently no consensus on a standard scheduling algorithm. As a valid fallback, we can instead look at the effect of super-blocking on the profit of the optimal ultimate pit as determined by the industry-standard Lerchs-Grossmann algorithm.
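For readers unfamiliar with it, the ultimate pit problem the Lerchs-Grossmann algorithm solves is equivalent to a minimum-cut computation on a precedence graph. A minimal sketch on a hypothetical 2-D section (toy block values, pure-Python Edmonds-Karp max-flow; this is not FlowPit's implementation):

```python
from collections import defaultdict, deque

def max_flow(cap, s, t):
    """Edmonds-Karp: augment along shortest residual s-t paths until none remain."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        path, v = [], t                       # reconstruct the augmenting path
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] for u, v in path)
        for u, v in path:                     # push flow, update residual capacities
            cap[u][v] -= push
            cap[v][u] = cap[v].get(u, 0) + push
        flow += push

# Toy section, keys are (bench, column); positive = ore profit, negative = waste cost.
values = {(1, 0): -1, (1, 1): -1, (1, 2): -1, (1, 3): -1, (2, 1): 5, (2, 2): -2}
# 45-degree precedence: a bench-2 block needs the three bench-1 blocks above it removed.
prec = {(2, 1): [(1, 0), (1, 1), (1, 2)], (2, 2): [(1, 1), (1, 2), (1, 3)]}

cap = defaultdict(dict)
for b, v in values.items():
    if v > 0:
        cap["s"][b] = v                       # source -> profitable block
    else:
        cap[b]["t"] = -v                      # costly block -> sink
for b, above in prec.items():
    for a in above:
        cap[b][a] = float("inf")              # mining b forces mining a

profit = sum(v for v in values.values() if v > 0) - max_flow(cap, "s", "t")
# The optimal pit = non-source nodes still reachable from "s" in the residual graph.
seen, q = {"s"}, deque(["s"])
while q:
    u = q.popleft()
    for v, c in cap[u].items():
        if c > 0 and v not in seen:
            seen.add(v)
            q.append(v)
pit = sorted(b for b in seen if b != "s")
print(profit, pit)  # the high-grade block pays for its three overlying waste blocks
```

Here the $5 block is mined along with the three $-1 blocks above it, for a profit of $2, while the $-2 block and its extra overburden are left in the ground.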
Even with this slightly changed criterion, i.e., profit (instead of NPV) versus reblocking factor, quantifying the effect is still not an easy task, as most commercial pit optimizers are not designed to handle big block models, a capability that is a must for this exercise.
Using FlowPit, ThreeDify's ultra-fast and scalable pit optimizer, we conducted a series of ultimate pit optimization runs on a real-life block model with about 94 million (2 m x 2 m x 2 m) blocks, using a 45-degree slope angle. The result is shown below:
With a reblocking factor of 1, i.e., no reblocking at all, the profit in the ultimate pit is around $44 million. With a reblocking factor of 2 applied in each of the three directions, the total number of blocks is reduced to around 12 million and the profit drops by 6.4%, a sizable reduction. At a reblocking factor of 3, the total number of blocks is reduced to around 3.5 million and a cumulative profit loss of 11.2% is incurred. Further reblocking incurs less severe additional losses, but they are still sizable.
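The quoted block counts follow directly from the geometry: a uniform factor r applied in all three directions divides the block count by r cubed. A quick check against the numbers above:

```python
# Back-of-envelope check of the block counts quoted above.
blocks = 94_000_000  # non-air blocks in the original model
for r in (1, 2, 3):
    millions = blocks / r**3 / 1e6
    print(f"factor {r}: ~{millions:.1f} M super-blocks")
# factor 1: ~94.0 M, factor 2: ~11.8 M ("around 12 million"), factor 3: ~3.5 M
```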
What is the effect of reblocking on ore/waste/metal tonnage then? Take a look at the figure below:
There is only a slight change in ore tonnage (the red curve) between different reblocking factors, due to the effect of dilution. The changes in metal quantity (the purple curve) and waste tonnage (the greenish-yellow curve) are more pronounced: both drop noticeably at the first reblocking. Except at reblocking factors of 7, 10 and 11 (which are not of practical use), waste tonnage decreases as the reblocking factor increases. This implies that, in general (although not always), less waste is mined as the reblocking factor increases, resulting in a steeper pit slope angle (since ore tonnage remains more or less constant over a range of reblocking factors). A steeper slope angle is generally expected as the block size increases, due to the discrete nature of block models and the fact that slope angles are calculated using block centers.
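The block-center effect can be illustrated with a simple, purely hypothetical geometry: if a 45-degree wall is defined between block centers over a fixed pit depth, roughly half a block of horizontal span is lost at the crest and at the toe, so the realized crest-to-toe angle steepens as the block size grows:

```python
import math

# Illustrative geometry only (not taken from the FlowPit runs above): a 45-degree
# wall measured between block centers over a fixed depth D loses ~half a block of
# horizontal span at each end, steepening the realized angle as block size b grows.
D = 100.0  # hypothetical pit depth, metres
for b in (2.0, 4.0, 8.0, 16.0):
    realized = math.degrees(math.atan2(D, D - b))
    print(f"block size {b:>4} m: ~{realized:.1f} degrees")
```

The effect is small at 2 m blocks and grows steadily with the reblocking factor, which is consistent with the steepening trend observed in the runs.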
The takeaway from this exercise: reblocking can materially affect the profit of your project and should therefore be carried out only when you understand its impact on your project.
This exercise uses a uniform reblocking factor in all three directions; no attempt is made to analyse the impact of non-uniform reblocking factors on profit and pit slope angles. Also, as each project is unique, it is best to conduct a series of reblocking runs to find the best reblocking factor for your project. Tools for such reblocking analysis are already available; just be sure to choose a pit optimizer that can handle tens of millions of blocks in a timely manner.
Happy informed reblocking...
SUPERINTENDENT TECHNICAL SERVICES at NORGOLD (BISSA GOLD SA)
9 years ago: In our line of work, what I think is key is how your block model compares with the actual (i.e., the mill vs. block model reconciliation), and whatever block model is used in that exercise can be used for your optimization, because you know and understand how to bring it close to the actual. So, in short, I don't think reblocking should affect the outcome of an optimization exercise, provided the reconciliation is fully understood.
Retired Gentleman
9 years ago: I have had more than 25 years' experience in pit optimization, which has shown over and over again that appropriate reblocking has little impact on the selection of a robust pit design, so I was initially surprised by this post! The process described in the above article as reblocking is actually what used to be called regularization in Datamine, whereby all the individual blocks are averaged together to find a new average for the larger “superblock”. As the blocks get bigger, it is inevitable that sub-grade material will get included in the average. Since the bigger block carries just one value, any full block-by-block evaluation (not just pit optimization) will clearly show that, as the block size increases, the blocks become heavily diluted, often to the point where they can no longer be considered ore grade. The combined extra dilution (material that has to be processed but generates insufficient revenue to pay for its processing) and loss of resource (because the large block is now below cut-off) will quickly compound losses. So I do understand the outcome described here. The important piece of understanding missing is that during any successful pit optimization process you need to consider the grade distribution (as well as you can model it) that can be expected when you come to mine the material. There are two aspects to understanding this: the volume-variance effect and the SMU (selective mining unit). There is a fair bit of earlier geostatistical literature that cloaks these concepts in magic, but they are pretty simple. Firstly, the volume-variance effect: smaller samples will show more variability. Drill hole intersections measuring perhaps a metre of core length can be expected to show variability in grade, and perhaps some very high grades.
If, however, when you come to mine this, the mineralization is narrower than the excavator bucket, some low grade will inevitably be included in a bucketful, and by the time the truck is filled the grade will have become even more averaged. In mathematical terms, the variance decreases with the size of the unit being sampled, and this relationship probably holds over a whole geological domain, if not the whole mineral body. The selective mining unit is a concept, rather than a physical entity, representing the smallest unit you are able to successfully mark out and then mine as a single defined unit. Unfortunately, selective mining units can vary with mining technique, bench size, equipment size, and mineralization orientation and continuity, to mention just a few factors. They can be difficult to predict precisely; however, they are likely to be small, perhaps a few truck loads. Often this is smaller than the block size on which the geostatistical estimate can be safely based. So the pit optimization process needs a mechanism that can preserve the grade distribution at this SMU size, and a widely used approach is the idea of parcels (or partial blocks). These are smaller units within the bigger block that preserve the grade distribution (they may not occupy a known position; it is just known that they are somewhere within that block). When the pit optimizer is evaluating the block, it still has to decide whether or not to mine the whole block, but you can read through the parcels and decide which parcels are above cut-off, and therefore mined at a profit, and which are below cut-off, and therefore discarded as waste. The safe step, now widely known as reblocking, is to maintain all the parcels as the block size is increased. So it doesn't matter what the super-block size is: your estimate of the above cut-off material will be the same.
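The parcel mechanism described above can be sketched numerically. The tonnages, grades and economics here are all hypothetical (value taken as tonnes × (grade × $10/g − $3/t processing)):

```python
# Hypothetical parcels inside one super-block: (tonnes, grade in g/t). The super-block
# is mined (or not) as a whole, but each parcel keeps its own grade for the cut-off call.
parcels = [(100.0, 2.0), (100.0, 0.1), (100.0, 0.1), (100.0, 0.1)]

def parcel_value(parcels, cutoff=0.3):
    # Each parcel goes to its best destination: processed if above cut-off, else waste.
    return sum(t * (g * 10.0 - 3.0) for t, g in parcels if g >= cutoff)

# Averaging the same material into a single grade dilutes it below its true worth.
tonnes = sum(t for t, _ in parcels)
avg = sum(t * g for t, g in parcels) / tonnes
diluted = max(0.0, tonnes * (avg * 10.0 - 3.0))

print(parcel_value(parcels), diluted)  # the parcelled value survives reblocking intact
```

Because the parcel list travels with the super-block, the above cut-off estimate is the same at any reblocking factor, whereas the averaged value is not.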
I am not familiar with ThreeDify's FlowPit software; however, if it is not capable of handling parcels (partial blocks), then clearly it should not be used for reblocking. This should in no way be extrapolated to the many other pit optimisation packages that do honour parcels, predominantly those based on the Lerchs-Grossmann algorithm, and specifically the Geovia/Whittle programs. Most established mining packages (e.g. Surpac, Vulcan, Minemap, Micromine, Datamine) will do this safe reblocking (and create parcels) as the model is passed out to the pit optimiser.
Mathematical Programming | Algorithm Development | Production & Strategic Planning for Optimized Decision-Making
9 years ago: In addition to the dilution issue, the efficiency of the solution algorithm (which, as I understand it, is a new implementation of LG) should be taken into account. When you use fast solution techniques, you need to verify their capability to obtain an optimum (or at least near-optimum) solution. That can be done either by comparing FlowPit at a reblocking factor of 1 against packages which use reblocking, or by comparing FlowPit with standard solvers (e.g., CPLEX) on small to medium-size instances.
Principal Engineer/Office Manager, RESPEC, Reno
9 years ago: I would be careful about saying here that the project makes more using small blocks than it would using larger blocks. This is more a matter of dilution, and the right answer is to dilute the model to a selective mining unit, or to include dilution and ore loss factors that model this acceptably (I prefer the first method). When you do the reblocking in a program like Whittle, you don't actually dilute the interior of the model, as it maintains the partials of the various materials inside the model; you just lose the ability to state exactly where they sit within it. Thus, when reblocking in this manner, you really don't lose value. What you lose is resolution on the outer portions of the pit shells. Just a couple of thoughts.