The future of many mining organisations is not in the hands of the hard-nosed CEOs or the bushy-bearded, sunburned geologists.
No, in the mining industry to come it is the data miners who will rise up and lead the industry forward.
These computer jockeys are the ones who will guide the mining industry into the promised land of higher productivity, lower costs and better recoveries.
This became clear on Monday when the Australian Institute of Geoscientists held a symposium on big data.
The AIG was trying to get its members thinking about ways to make use of the tools coming their way, but the talks had wider ramifications for miners.
Many of the talks, particularly those immediately before and after lunch, revolved around using the technology for geoscience – just the sort of thing to put someone who gets excited about big diggers and trucks to sleep.
However, as has been shown recently, large companies such as Rio Tinto are throwing a lot of intellectual – as well as actual – capital at this.
One of the symposium attendees, SGS Resource Management director Dr Julian Vearncombe, commented that over the course of the day he had heard a lot about what could be done with big data but few actual examples.
Clearly Vearncombe did not see the Supply Side column earlier this month on what Rio Tinto is using it for.
Here is another example from the oil and gas world.
Woodside had a failure on one of its production facilities. As the story goes, it locked two of its best plant engineers and two data scientists in a room together to see if there was any way the failure could have been predicted.
The project did not start off well, with the plant types sure there was no way the calamity could have been foretold.
However, once the furniture stopped flying and discussion started flowing it was found that there was a lot of data from plant systems that could be used to help predict failures.
In the end they were sure that not only could that particular failure have been predicted two or so weeks in advance, but that there were a host of other potential failures they could identify as well.
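The anecdote is light on detail, so the sketch below is purely illustrative rather than a description of Woodside’s actual systems: it shows, in Python, the simplest kind of rule a data scientist might run over historical plant logs – flagging readings that drift well outside their recent normal range. The function name and the synthetic temperature trace are invented for the example.

```python
from statistics import mean, stdev

def rolling_zscore_alerts(readings, window=20, threshold=3.0):
    """Return the indices of readings that sit more than `threshold`
    standard deviations away from the mean of the preceding `window`
    readings - a crude early-warning flag."""
    alerts = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# Synthetic bearing-temperature trace: stable for a long stretch, then a
# slow upward drift in the lead-up to a hypothetical failure.
temps = [70.0 + 0.1 * (i % 5) for i in range(200)]
temps += [70.0 + 0.5 * i for i in range(1, 30)]

print(rolling_zscore_alerts(temps))  # indices where the drift becomes visible
```

Real plant systems feed far richer models than this, but the principle is the same: the warning signs are often already sitting in the historical data.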
This is the type of approach Rio Tinto has taken with its Analytics Excellence Centre in Pune, India.
It is also what Caterpillar is trying to do with its step into big data mining through information technology start-up Uptake.
Maxwell Geoservices owner Viv Preston said while the big players were starting to leap into the big data fray, Australian firms seemed to be slow on the uptake.
He said an Accenture survey found that 82% of organisations globally recognised big data could have value to them.
“In Australia it’s 52%,” Preston said.
“Globally 55% of senior managers recognise the importance of data management. In Australia it’s 38%.
“Productivity is a major issue in the mining industry at the moment.”
Yet, as Preston pointed out, labour productivity is at the same level it was 25 years ago.
“Are the data managers of today operating at the same level we were 25 years ago?” he asked.
“Do we see value for money spent on data management?”
Preston said the mining industry had a generation gap problem, which was leading to a range of silos in the business.
“The left hand doesn’t know what the right hand is doing.
“We’ve got to try to get all the units into a central database and get people working off that central database.”
Preston argues that anything miners do with big data has to start from the CEO.
“It can be driven by middle and lower management but at the end of the day CEOs have to let it happen and, as we’ve seen from the Accenture data, the CEOs aren’t getting it,” he said.
“There has to be a strategic approach to big data.”
Getting into big data is not a cheap or easy exercise for any company.
Data storage is a big issue.
These days when people talk about big data they are talking about petabytes of information.
A petabyte is one thousand million million bytes. That is one thousand terabytes. That is more bites than there were in all of the Jaws movies.
Imagine the size of the device needed to store that.
A top-end USB drive – not one of the thumb drives given out at trade shows – holds 5TB.
The latest tape cartridges from IBM hold about 7TB.
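As a rough back-of-the-envelope comparison – a sketch only, using the device capacities quoted above – the arithmetic looks like this:

```python
import math

PETABYTE_TB = 1_000   # 1 petabyte = 1,000 terabytes

usb_drive_tb = 5      # top-end USB drive capacity quoted above
tape_tb = 7           # latest IBM tape capacity quoted above

print(math.ceil(PETABYTE_TB / usb_drive_tb))  # 200 USB drives to hold a petabyte
print(math.ceil(PETABYTE_TB / tape_tb))       # 143 tapes to hold a petabyte
```

At roughly 200 drives or 143 tapes per petabyte, storage at this scale is clearly not a trivial purchase.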
Then there is the software required to make sense of all of that data.
As CSIRO research group leader Dr Robert Woodcock put it, making the best out of big data was about “finding the signal in the noise”.
That means a lot of computing power and clever code to sort through all of the information streaming into an organisation from its systems and make sense of it.
However, it is finding that signal and making sense of the data streaming in that will take miners forward.
And it will probably also mark the rise of the geeks.