Today, oil producers are in a position to capture more detailed data in real time than ever before, at lower cost and from previously inaccessible areas. Oilfields, for example, are now essentially connected end to end, enabling companies to harvest and analyse the ever-larger amounts of data generated by people and assets along the oil value chain at ever-higher frequencies.
As oil and gas firms venture into harder-to-reach and more challenging environments, while at the same time connecting more and more hardware to the internet, these data volumes will continue to grow and become more complex.
Big data is a term for data sets – structured, unstructured, relational, non-relational – that are so large or complex that traditional data processing applications are inadequate. In the oil and gas sector, such data is typically generated by sensors and machine-to-machine (M2M) technologies within a facility.
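As a loose illustration of the kind of M2M sensor stream described above, the sketch below simulates wellhead pressure messages and organises them per sensor for basic summary statistics. All names and values here (sensor IDs, the `pressure_psi` metric, the readings themselves) are hypothetical, invented purely for illustration:

```python
import random
import statistics
from collections import defaultdict

# Simulated M2M messages from five hypothetical wellhead sensors.
# A real feed would arrive continuously over a facility network.
random.seed(42)
messages = [
    {"sensor_id": f"WH-{i % 5}", "metric": "pressure_psi",
     "value": random.gauss(2500, 50)}
    for i in range(1000)
]

# Group the raw stream per sensor - unorganised readings only become
# useful once they are structured for analysis.
by_sensor = defaultdict(list)
for msg in messages:
    by_sensor[msg["sensor_id"]].append(msg["value"])

# Summarise each sensor's readings (mean and spread).
for sensor, values in sorted(by_sensor.items()):
    print(sensor, round(statistics.mean(values), 1),
          round(statistics.stdev(values), 1))
```

The point of the sketch is the grouping step: raw machine 'dialogues' carry little value until they are organised into a structure an analyst can query.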
The number of M2M technologies in use in the oil and gas sector could rise from 423,000 in late 2013 to 1.12mn by 2018, according to analyst firm Berg Insight. Each ‘dialogue’ between the machines feeds into the wider Internet of Things (IoT). The greatest value of this data gathering lies in its compounding impact: building an organised portal of useful data today places a company at the forefront of innovation in five or 10 years’ time. These systems have evolved and matured to operate alongside today’s IT systems and standards.
These are impressive numbers by any standard. The ability to produce and harvest growing amounts of big data with the intention of finding correlations and trends presents a unique opportunity for companies across the globe to optimise operations and drive efficiencies to unprecedented levels, thus producing major cost savings. This is also true for the global oil and gas industry, particularly in the Middle East.
For oil and gas companies, the key will be to develop strategies and systems that integrate and manage these increased data volumes to use them in smarter, faster ways. If they don’t manage to make this data work for them, they risk becoming less competitive. The reason is simple: access to data ultimately affects and determines project economics, and plays a crucial role in the application and operation of technologies, such as enhanced oil recovery (EOR) in oilfields.
This is of particular relevance at a time when oil prices have fallen by about half over the past three years – from levels of above $100 a barrel – undermining the economics of many projects currently under implementation, and putting new ones into doubt. With oil prices likely to remain in the $50- to $60-a-barrel range for some time, companies will seek to boost efficiencies and optimise operations to get maximum value out of their investments.
Putting in place the technology infrastructure needed to harvest big data is one thing; making sense of this data is quite another, even for the largest of businesses. While gathering, storing, and processing data has become increasingly sophisticated over the past few years, arguably the greatest challenge today is turning the massive amounts of raw data into insightful information. Yes, technology is critical to handling these streams of data but, by itself, it isn’t a silver bullet. What big data needs is robust analysis that is relevant to a particular business, in this case, the oil industry.
And even though data analysis itself isn’t anything new for companies – it has, for example, always played an important role in decision-making processes in the oil industry – analysing the vast volumes of information generated daily in today’s world of big data brings a different set of challenges. People with the appropriate skills are critical for companies seeking to exploit big data. This includes everything from specific expertise in big data – being able to understand, collect and preserve it – to knowledge of statistics, maths and data visualisation techniques.
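To make concrete the kind of basic statistical skill referred to above, the following minimal sketch applies a simple z-score screen to a series of invented daily flow-rate readings, flagging values far from the mean for engineering review. The figures and the two-standard-deviation threshold are illustrative assumptions, not an industry method:

```python
import statistics

# Hypothetical daily flow-rate readings (barrels/day) from one well;
# the final value simulates an abnormal drop worth investigating.
readings = [1200, 1195, 1210, 1188, 1205, 1198, 1192, 1201, 940]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# Simple z-score screen: flag readings more than two standard
# deviations from the mean as candidates for review.
anomalies = [(i, r) for i, r in enumerate(readings)
             if abs(r - mean) / stdev > 2]
print(anomalies)
```

Trivial as it is, the example shows the shift the article describes: the value lies not in collecting the numbers but in an analyst knowing which statistical test turns them into a decision.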
Meeting these requirements won’t be easy, however. McKinsey Global Institute has estimated that, by 2018, the US alone will face a shortage of 140,000 to 190,000 people with deep analytical skills across various industries, as well as a shortage of 1.5mn managers and analysts capable of analysing big data and making decisions based on the findings. This skills shortage means that, according to consultancy Gartner, more than 85% of Fortune 500 organisations won’t be able to effectively exploit big data this year.
One important way of addressing this issue is for oil companies to place more emphasis on developing analytical, big data, and other relevant skills internally. For the national oil companies (NOCs) in particular, such as the ones operating in the GCC region, the benefits would be twofold: they would develop their national workforce on the one hand; on the other, they would reduce their outsourcing requirements to third parties and build up their own capabilities to analyse the data that sits at the core of their operations.
This latter point is of particular importance because, ultimately, it’s the people who work in a company – and who understand how the oil and gas supply chain integrates, and what impact any actions along it may have – who are best placed to use the knowledge they gain from analysing big data to identify potential optimisations and efficiencies. The more people from within an NOC are engaged in this process, the more the company at large can potentially benefit.
Educating employees on the value of the data they are using, and the need to handle it systematically and rigorously, will therefore be essential to reaping the benefits that come with the application of new technologies. Beyond this, there needs to be greater collaboration between industry and academia at a general level, while partnerships between solution providers and NOCs could lay the foundations for a long-term solution to the skills shortage in big data.