
By Wendy Ware, Principal Geometallurgist and Brenton Crawford, Chief Geoscientist

Across keynotes, presentations, side conversations, and panel sessions, one pattern was impossible to miss: geometallurgy’s future hinges on three things – Trust in the inputs, Transparency in the models, and Translation between disciplines. These themes formed the quiet spine of the conference, shaping every discussion from sampling and QA/QC to digital twins and team capability.


The 2025 SAIMM Geometallurgy Conference in Johannesburg brought together more than 80 professionals from mining companies, METS providers, academia, and research groups – all centred on one question: how do we make geometallurgy future-ready?

The collective answer was straightforward: shift from collecting data to connecting it, and from describing the rock to predicting how it will behave – and then using that insight to guide decisions across the value chain.

First cohort of Datarock’s Data Science Introduction for Geometallurgists course, delivered on 13 October 2025.

Trust – the hard work beneath every model

Sampling and the foundations of trust

Talks from the Global Mining Guidelines Group (GMG), represented by Tricia Scott, and Anglo American Kumba Iron Ore (Lorena Tafur), highlighted something everyone knows but few operationalise: sampling rigour and QA/QC are the bedrock of predictive modelling.

Confidence in data – and its spatial context – is now as important as the models themselves. Without trust, ML-augmented geomet simply cannot scale.
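One way to make that sampling rigour concrete is a duplicate-pair precision check. The sketch below is a minimal, illustrative example of the idea, assuming paired original/duplicate assays; the 20% acceptance threshold and the synthetic values are placeholders, not a QA/QC standard.

```python
# Hedged sketch: absolute relative difference (ARD) check for field duplicates.
# The 20% threshold and the sample pairs are illustrative, not a standard.

def ard(original: float, duplicate: float) -> float:
    """Absolute relative difference between a sample and its field duplicate."""
    mean = (original + duplicate) / 2.0
    if mean == 0.0:
        return 0.0
    return abs(original - duplicate) / mean

def duplicate_pass_rate(pairs, threshold=0.20):
    """Fraction of duplicate pairs whose ARD falls within the threshold."""
    passed = sum(1 for o, d in pairs if ard(o, d) <= threshold)
    return passed / len(pairs)

# Synthetic assay pairs (g/t): three agree well, one clearly fails.
pairs = [(1.00, 1.05), (0.50, 0.52), (2.10, 2.00), (0.80, 1.60)]
rate = duplicate_pass_rate(pairs)
```

Checks like this are cheap to run continuously, which is what turns QA/QC from a reporting exercise into an operational trust signal.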

The role of leadership in building trust

Presentations from First Quantum’s Kansanshi Mining PLC (Lucy Little and Axel Kottgen) reinforced that leadership buy-in and team integration transform geomet from isolated, reactive testwork into a shared organisational system. Their journey demonstrated that culture and structure – not just technology – are essential to making geomet “stick”.

These stories anchor a simple truth: trust is built before the model, not after it.

Transparency – from uncertainty to “living models”

Understanding the fog

As Glen Nwaila from the University of the Witwatersrand put it:

“Your model is a scout, not an oracle. Every model lives in a fog of uncertainty – our task is to know where that fog begins.”

It was a reminder that transparency is not about exposing weaknesses; it’s about making uncertainty usable. As machine learning becomes embedded in geomet workflows, clarity on what the model knows, what it infers, and where its blind spots lie becomes an operational requirement – not a courtesy.

Transparency is what turns a model from a black box into a decision tool.

Living models and digital twins

Machine learning, geostatistics, and implicit modelling are converging into what many are calling living models – dynamic, spatially aware representations of the orebody and the host rocks that evolve as new information arrives.

Valterra Platinum (Caillan Govender and team) provided a compelling real-world version of this future: a continuously updated, geomet-enabled block model that operates as a digital twin, showing how classical geostatistics and modern analytics can coexist within one spatial framework. Their work showed that transparency is not just philosophical – it is architectural. When model lineage, update logic, and validation flows are clear, different modelling approaches reinforce rather than contradict each other.

Where this is heading

The trajectory is already visible:

ESG, tailings, geology, processing, and reconciliation data will increasingly sit inside the same transparent model lineage, allowing teams to trace how each data stream influences the next decision.

Transparency is no longer about disclaimers. It is about:

  • Model integrity: coherent architecture, consistent assumptions
  • Lineage: knowing how a model was created and how it has evolved
  • Update logic: understanding how new data shifts predictions
  • Uncertainty clarity: being explicit about confidence, limits, and risk

In short, transparency turns complexity into something that organisations can interrogate, challenge, and trust.
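The lineage and update-logic ideas above can be sketched as a minimal version record for a living model. The field names here are illustrative assumptions, not any vendor's schema; the point is that each update carries its inputs and assumptions with it.

```python
# Hedged sketch: a minimal lineage record for a "living" block model.
# Field names are illustrative; real systems track far richer metadata.
from dataclasses import dataclass

@dataclass
class ModelVersion:
    version: int
    inputs: list            # data streams that fed this update
    assumptions: str        # key modelling assumptions, stated explicitly
    parent: "ModelVersion | None" = None

    def lineage(self):
        """Walk back through parents to reconstruct how the model evolved."""
        chain, node = [], self
        while node is not None:
            chain.append(node.version)
            node = node.parent
        return list(reversed(chain))

v1 = ModelVersion(1, ["drill assays"], "stationary grade domains")
v2 = ModelVersion(2, ["drill assays", "RC chip imagery"],
                  "domain boundaries updated from imagery", parent=v1)
```

With lineage recorded this way, anyone can answer "how did this model come to be?" without archaeology.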

Translation – the human system that makes geomet work

If trust is the foundation and transparency is the frame, translation is the human system that makes geomet work. It is the capability that turns technical insight into organisational action, and it was one of the strongest recurring themes across the conference.

From siloed experts to T-shaped capability

A consistent thread – and central in Datarock’s keynote – was the growing need for T-shaped capability within geomet teams. As workflows integrate geology, metallurgy, geostatistics, and machine learning, the traditional model of isolated specialists is no longer enough. The discipline now demands people who can go deep in their domain and work confidently across others.

The diagram above captured this shift clearly.

  • The vertical of the T represents depth: the specialised knowledge that underpins technical credibility.
  • The horizontal represents breadth: systems thinking, communication, and the ability to work together across the value chain.

When these two dimensions meet, the real translators emerge:

Geologists who understand data science; metallurgists who think spatially; data scientists who grasp orebody behaviour. These people form the connective tissue of modern geomet – linking datasets, reconciling perspectives, and building the shared context required for integrated decisions.

The human foundation of ML-Augmented Geometallurgy

These T-shaped practitioners are the human foundation of ML-Augmented Geometallurgy. They bridge the gap between models and meaning, ensuring that predictions are not just technically sound but usable. They understand both the mechanics of the model and the operational decisions it informs, and they can translate between the language of geology, processing, data, and planning.

In an environment where models update continuously and uncertainty must be communicated clearly, translation is no longer a soft skill – it is a strategic capability. The organisations that invest in it will be the ones that turn connected data into confident decisions.

Bringing the three together – how the industry moves forward

The future of geometallurgy will not be defined by who has the most data or the flashiest technology, but by who connects that data best and puts it to use. Across every talk, three requirements emerged:


  1. Trust – in the data, models, and teams interpreting them.
  2. Transparency – in assumptions, uncertainty, and model lineage.
  3. Translation – between disciplines, systems, and people.

As Adam Johnston from Cancha reminded us: 

“The sophistication of your geomet model should match the technical experience of your team and the decision-making sophistication of your stakeholders – build confidence first, add complexity later.”

Datarock’s contribution – ML-Augmented Geometallurgy in practice

Datarock delivered both a keynote and a hands-on workshop at SAIMM, outlining how machine learning, computer vision, and spatial modelling are reshaping geomet workflows in practice. The sessions introduced two foundational frameworks that underpin what we describe as ML-Augmented Geometallurgy – an approach designed to build on, not replace, traditional testwork and classical geoscience.

What ML-augmented geometallurgy is

At its core, ML-Augmented Geomet extends conventional, testwork-driven workflows by integrating machine learning, computer vision, and data fusion. These tools allow teams to infer rock characteristics and processing behaviour from datasets not traditionally considered geomet inputs, such as high-resolution imagery, spectral data, bulk chemistry, and modal mineralogy.

The approach does not abandon classical methods; it strengthens them. By combining geostatistics with modern analytics, it becomes possible to:

  • increase spatial coverage without proportionally increasing testing costs,
  • improve prediction confidence by blending complementary datasets, and
  • better align geological understanding with downstream value-chain decisions.

In short, ML-Augmented Geomet uses the strengths of both domains to deliver more consistent, scalable, and decision-ready information.
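One simple way to picture "improving prediction confidence by blending complementary datasets" is inverse-variance weighting of two independent estimates of the same rock property. The sketch below is a toy illustration with synthetic numbers, not a description of any production workflow.

```python
# Hedged sketch: inverse-variance blending of two independent, unbiased
# estimates of the same property (e.g. a testwork-based value and an
# imagery-based proxy). All numbers are synthetic.

def blend(est_a, var_a, est_b, var_b):
    """Combine two estimates; the blended variance never exceeds either input."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    var = 1.0 / (w_a + w_b)
    return est, var

# Testwork suggests a Bond work index near 14 kWh/t with low uncertainty;
# an imagery proxy suggests 16 kWh/t with higher uncertainty.
est, var = blend(14.0, 0.5, 16.0, 2.0)
```

The blended estimate (14.4) sits closer to the more certain source, and its variance (0.4) is lower than either input's, which is the mathematical core of why fusing complementary datasets tightens uncertainty.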

Two conceptual frameworks introduced

Datarock’s keynote introduced two frameworks that help formalise this shift:

  • Rock DNA: defines rock behaviour through three quantifiable components: whole-rock geochemistry, bulk modal mineralogy, and multi-scale texture. Together, these form the “genetic code” of rock behaviour and a consistent foundation for model-driven prediction.
  • Geomet meets material science: adapts the materials science tetrahedron to geometallurgy. ML-Augmented Geomet links rock behaviour, or response (structure, processing, properties, and performance), to rock characteristics (the primary variables, or Rock DNA), providing a physics-aware bridge between cause (rock attributes) and effect (rock behaviour and value outcomes).
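The Rock DNA idea can be sketched as a feature vector assembled from the three components named above. The specific variables below are illustrative placeholders, not Datarock's actual feature set.

```python
# Hedged sketch: assembling a "Rock DNA" feature vector from whole-rock
# geochemistry, bulk modal mineralogy, and texture metrics.
# Variable names are illustrative placeholders.

def rock_dna_vector(geochem: dict, mineralogy: dict, texture: dict) -> list:
    """Concatenate the three components into one numeric vector,
    using a fixed (sorted) key order so models see consistent inputs."""
    vector = []
    for block in (geochem, mineralogy, texture):
        vector.extend(block[k] for k in sorted(block))
    return vector

sample = rock_dna_vector(
    geochem={"SiO2_pct": 62.1, "Fe_pct": 4.3},
    mineralogy={"quartz_pct": 35.0, "sulphide_pct": 2.1},
    texture={"grain_size_mm": 0.8, "hardness_index": 0.6},
)
```

Keeping the three components in one consistent representation is what makes downstream classification and prediction repeatable across deposits.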

These frameworks enable two key solution pathways:

  • Classification – defining geomet domains using measurable characteristics, not subjective boundaries.
  • Prediction – forecasting behaviour and performance across the value chain from those same characteristics.
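The two pathways can be illustrated with a toy example: a domain defined by a measurable cut-off (classification), then a per-domain response used for forecasting (prediction). The cut-off and the recovery values below are invented for illustration only.

```python
# Hedged sketch: classification into geomet domains by a measurable
# characteristic, then prediction of a downstream response per domain.
# The cut-off and recovery figures are invented.

def classify_domain(sulphide_pct: float) -> str:
    """Assign a domain from a measurable characteristic, not a drawn boundary."""
    return "high_sulphide" if sulphide_pct >= 2.0 else "low_sulphide"

EXPECTED_RECOVERY = {"high_sulphide": 0.92, "low_sulphide": 0.78}  # invented

def predict_recovery(sulphide_pct: float) -> float:
    """Forecast a value-chain response from the same measurable input."""
    return EXPECTED_RECOVERY[classify_domain(sulphide_pct)]
```

In practice the mapping would be a trained model rather than a lookup table, but the structure is the same: measurable characteristics in, behaviour and performance out.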

Datarock Chip – broadening the data people already have

Datarock also showcased Chip, a new capability within tactical geomet workflows. Chip converts standard RGB photographs of RC chips into numerical descriptors of colour, shape, and morphology – providing a rapid, low-cost textural signal that can support domaining and ML-based prediction.
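As a rough sketch of the kind of colour descriptor such a workflow might start from, the example below computes per-channel statistics on a synthetic "photo". This is not Datarock's implementation; it only illustrates turning pixels into numbers.

```python
# Hedged sketch: simple colour descriptors from an RGB chip-tray photo,
# represented here as a list of (r, g, b) pixel tuples.
# Not Datarock's method -- just per-channel summary statistics.

def colour_descriptors(pixels):
    """Per-channel mean and range for a list of (r, g, b) pixels."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    ranges = [max(p[c] for p in pixels) - min(p[c] for p in pixels)
              for c in range(3)]
    return {"mean_rgb": means, "range_rgb": ranges}

# Synthetic image: 16 uniform mid-grey pixels, so means are 128, ranges are 0.
desc = colour_descriptors([(128, 128, 128)] * 16)
```

Even descriptors this simple become useful once they are attached to depth intervals and fed into domaining or prediction models alongside assays.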

For many attendees, this was the “aha” moment: realising that datasets long considered peripheral – or ‘non-geomet’ – can carry real predictive power. When integrated properly, they can strengthen models, tighten uncertainty, and provide earlier visibility into rock behaviour than traditional methods alone.

Closing reflection – from measurement to meaning

The next phase of geometallurgy is here – one where machine learning and geoscience work together to bridge measurement and meaning.

Our challenge is no longer whether we can collect the data. It is whether we can connect it, interpret it, and trust it enough to make decisions that matter.