Mark Claxton, director of Tessella’s energy division, has spent more than 30 years in IT, most of it in oil and gas operations worldwide. Today, he says, the role of technology in the sector is more important than ever.
I worked in the UK developing models for the nuclear industry, and that was the point when people started to think about doing reservoir modelling. The same fluid dynamics, three-phase models for oil and gas were just starting to be created then, so I was working with various people who were involved in that space.
I then spent over a decade working on the Joint European Torus, the nuclear fusion experiment in Oxfordshire. The link between that and the oil industry is that there is a very large amount of data that needs to be processed and managed.
My role was to look at this large amount of data. For nearly 20 years that has primarily been in oil and gas, mostly looking at risk assessment codes, a lot of upstream activity, subsurface codes, and petroleum prediction and calculations.
One of my jobs in the early days was trying to make codes run as quickly as possible, so you could get results in 12 hours instead of 48. This problem still exists today. People want their answers quickly, and because we have ever larger amounts of data you still have to do things in parallel.
Oxy are now running hundreds of multi-core simulations simultaneously. Thirty years ago you were lucky to run five simultaneously. We are seeing quite a lot of work downstream, particularly around corrosion, which is of interest to the upstream community.
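For readers curious what "doing things in parallel" can look like in practice, here is a minimal sketch, not Tessella's or Oxy's actual tooling: the `run_simulation` function below is a hypothetical stand-in for an expensive simulation run, dispatched across worker processes with Python's standard `multiprocessing` module.

```python
from multiprocessing import Pool


def run_simulation(seed):
    # Stand-in for an expensive, independent simulation run
    # (a real reservoir model would take hours, not milliseconds).
    total = 0.0
    for i in range(1, 1000):
        total += (seed * i) % 7
    return seed, total


def run_batch(seeds, workers=4):
    # Farm independent runs out to a pool of worker processes.
    # Because the runs share no state, the speed-up is close to
    # the number of cores available.
    with Pool(processes=workers) as pool:
        return dict(pool.map(run_simulation, seeds))


if __name__ == "__main__":
    results = run_batch(range(8))
    print(f"{len(results)} runs completed")
```

The key property being exploited is that each run is independent, so a batch of hundreds of simulations can be spread across however many cores (or machines) are on hand.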
Some of the challenges are around the different modelling being done to deal with quite interesting topography, and how to actually access the reservoir to get the most out of it, rather than leaving 50% or 70% of the oil behind.
Going back over the last 30 years, I have come across situations where we were presented with tapes and it was absolutely impossible to find out what was on them, because the software used to create them no longer existed. All we had was, essentially, a lot of useless data.
Today we have a digital preservation issue. The idea is that if you spend millions of dollars gathering data that has value, you can't just put it on a CD or store it away in an archive and then expect to be able to use it in 20 years’ time.
The low price of oil has everybody thinking about doing things incredibly efficiently. There is a lot more planning and preparation in the industry.
People are not going in blind and they have got a very good understanding of what they expect to find.
Particularly around the risk assessment modelling, there is a lot of fantastic work being done in trying to predict what would happen if an event occurred and how to make sure that the impact on people or the environment is minimal or non-existent.
I think automation is clearly where things are heading. If you think about it, there are interesting drivers there.
If you were somewhere incredibly harsh and you didn’t want many people there, you would want to automate as much as possible. Similarly, if you are in an area that is unstable, you want as few people as possible there and you would want a plant that effectively looks after itself.
I think the legacy of the dropping price of oil over the next few years will be people wanting to revisit how to get a barrel out as safely, efficiently and cost-effectively as possible, with minimal impact on the environment.
What else do you need to know about Mark?
Claxton heads up Tessella’s business in the oil and gas and nuclear industries. He joined Tessella back in the early 1980s as a programmer.