In this special guest feature from Scientific Computing World, Gemma Church writes that the oil and gas market aims to use more simulation to ensure sound decision-making as the world makes better use of renewable energy.
The oil and gas industry is at a tipping point as we move into a brave, renewable world and begin to leave fossil fuels behind.
This trend has been particularly prevalent in the UK where, for the first time, fossil fuels are expected to make up less than half of all the nation’s generated electricity this year. The country also reached a record stretch of 18 consecutive coal-free days in May.
What does this mean for the simulation sector? Well, increasing global energy needs, coupled with a greater pool of energy resources, could increase demand for modelling techniques, as Richard Yen, senior vice-president, strategic solutions team and global automotive business at Altair, explained: "We are at a very interesting crossroads with regards to energy. We see companies being conservative in evaluating their projects. This has actually put a greater emphasis on simulation as a means of ensuring that they are making sound decisions."
This is an important point. Simulation and modelling are helping businesses unlock the true potential of traditional ways of working in the oil and gas industry. They will also help us evaluate unconventional energy sources, including renewables, nuclear and natural gas, and their associated processes. But, to harness the true power of energy sources old and new, simulation and modelling need to take a more prominent and earlier role in the processes governing the energy sector. Yen explained: "As the industry becomes more safety and cost-conscious, and as it blends into a mix of conventional and unconventional sources, it has become more important for the industry to have a good grasp of the field characteristics before production starts. For this, companies need to rely more on upfront simulations, and tie those to the field data."
This change is happening, albeit at a slow pace, with companies focusing on a handful of novel use cases, as Ahmad Haidari, global industry director of energy and process and power industries at Ansys, explained: "While we have not seen a major sudden shift in our work, we have seen, in the modelling and simulation space, a bigger emphasis on reducing waste, preventing spills, and a greater use of electrified systems and distributed power generation. Furthermore, as the industry's focus shifts to the use of electro-mechanical systems, replacing pure mechanical and hydraulic systems, we've found an increasing adoption of Ansys' solutions for electromagnetic systems."
Increased efficiency and improved cost control are also "the focus of simulation and modelling" when it comes to optimising the traditional methods of exploration, production, transport and refinement of fossil fuels, according to Andy Cai, an applications engineer for Comsol.
"Customers are more interested in the integration of numerical simulation with their everyday workflows to predict, improve or correct their operational effectiveness. Trending concepts, such as smart drilling and modular production, also rely heavily on numerical calculations," Cai explained.
Haidari agreed that the biggest focus is "driving down costs" for oil and gas companies. "Besides workforce reduction and reducing capital expenditure, these companies are also reducing operating costs through standardization, digital transformation, and increasing the performance and reliability of their existing assets."
Realizing such cost reductions under increasing budgetary pressure requires a fresh approach. Guy Gueritz, energy industry manager EMEA at Nvidia, said: "There is a greater focus on, and investment in, more accurate methods of simulation for improved reservoir recovery rates, and in ways to reduce carbon emissions. The industry is investing in renewables, and our initiatives in AI are relevant to all energy production sources and their value chains, from upstream oil and gas through power generation, production and distribution, to retail use cases."
Nvidia recently worked with IBM to jointly develop the 28-Pflop Pangea III supercomputer, which will be used by energy company Total to "more accurately locate new resources and better assess potential revenue opportunities," according to a statement.
Each node is based on the IBM Power AC922 architecture, pairing two IBM Power9 processors with six Nvidia Tesla V100 Tensor Core GPUs. With more than 3,000 Tesla V100s providing more than 90 per cent of the compute power, Total will be able to process the vast amounts of data required in seismic modelling and get more accurate insights, faster than ever before.
Machine learning
This need for computational speed is widespread, and it is revolutionizing the way many experts in the oil and gas sector now work, with automation taking a more dominant role in the market.
Haidari said: "Oil and gas companies are increasingly relying on digital technology to improve their operational efficiency, reduce manpower through automation, develop autonomous systems for drilling and offshore platforms, and increase the use of remotely operated systems and equipment for inspection."
EPCC, the supercomputing and data-services centre based at the University of Edinburgh, is collaborating with geoscience consulting firm Rock Solid Images (RSI) on one such machine learning program, to reduce exploration drilling risk for oil and gas companies.
Nick Brown, a research fellow at EPCC, said: “There is a wealth of subsurface data collected by borehole drilling, but as this is real-world data for each specific well, the data itself is noisy and bits can be missing. Also, the petrophysicists aren’t necessarily interested in the raw data from the tool, but instead higher-level information, such as mineralogy composition, porosity of the rock and fluid saturation.”
It currently takes an expert petrophysicist more than seven days to perform a full petrophysical interpretation of a single well and generate this high-level information. At that rate, even a thousand wells would represent roughly 19 years of work for one expert. Brown added: "When you bear in mind there is raw data for many thousands of wells, then the fact that it is such a manual and time-consuming process to interpret each one becomes a major barrier to being able to exploit all this information appropriately."
A machine-learning algorithm has been developed to speed up the process. But, due to the nature of geology, different measurements can change significantly from one level to the next, and data was missing from the system, which was a tricky challenge to overcome, as Brown explained: "Techniques such as interpolation were not particularly useful here; instead, a machine-learning technique that supported missing data values was needed."
“We adopted boosted trees using the XGBoost library, otherwise known as gradient boosting. This approach relies on the idea of decision tree ensembles, where a model consists of a set of classification or regression trees and features of the problem are split up among tree leaves.”
Here, each leaf holds a score associated with that feature, and as one walks the tree, scores are combined which then form the basis of an overall prediction answer. Usually, a single tree is not sufficient for the level of accuracy required in practice and so an ensemble of trees, where the model sums the prediction of multiple trees together, is used.
“As one trains a boosted trees model, the trees are built one at a time, with each new tree helping to correct the errors made by previously built trees. This is one of the factors that makes boosted trees so powerful, and they have been used to solve many different machine-learning challenges,” Brown said.
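To make the idea concrete, here is a minimal, illustrative sketch of gradient-boosted trees on well-log-style data, using the XGBoost library Brown mentions. The features, target and missing-data fraction are invented for illustration and bear no relation to RSI's actual data or model; the point is that XGBoost copes with gaps natively, learning a default split direction for missing values at each tree node.

```python
import numpy as np
import xgboost as xgb

# Synthetic stand-in for well-log data: four features (imagine gamma ray,
# density, resistivity and sonic logs) and a target such as porosity.
# All values are invented for illustration.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(scale=0.1, size=1000)

# Real borehole data is noisy with gaps, so knock out ~10% of the values.
# XGBoost handles the resulting NaNs without interpolation.
X[rng.random(X.shape) < 0.1] = np.nan

model = xgb.XGBRegressor(
    n_estimators=200,   # trees are built one at a time...
    learning_rate=0.1,  # ...each correcting the errors of the ones before it
    max_depth=4,
)
model.fit(X, y)
print(model.predict(X[:5]))  # predictions work even on rows with missing values
```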
While boosted trees can handle the missing data values, there are still some challenges to overcome. For example, there is still uncertainty in the resulting predictions and this information needs to be available to the petrophysicist when they are making a decision around the risk of drilling in a specific location.
The boosted-tree models are also extremely sensitive to hyperparameter selection. Brown said: "There are around 10 interconnected hyperparameters that significantly impact the accuracy of prediction of the model, and it's really hard to find the right combination. We used the hyperopt library, which automatically searches the hyperparameter space to find optimal hyperparameters, but we had to add extra parallelization functionality via MPI (the message passing interface) to run it on our Cray supercomputers."
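As a rough illustration of what such a search looks like, the sketch below uses hyperopt's fmin with the Tree-structured Parzen Estimator to tune a few XGBoost hyperparameters. The search space, data and evaluation metric are assumptions for illustration, and EPCC's custom MPI parallelization layer is not shown.

```python
import numpy as np
import xgboost as xgb
from hyperopt import fmin, tpe, hp, Trials
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data, as in the earlier sketch; real inputs would be well logs.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 0.5 * X[:, 0] + rng.normal(scale=0.1, size=500)

# A few of the roughly ten interconnected hyperparameters Brown describes.
space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3)),
    "subsample": hp.uniform("subsample", 0.5, 1.0),
    "min_child_weight": hp.quniform("min_child_weight", 1, 10, 1),
}

def objective(params):
    model = xgb.XGBRegressor(
        n_estimators=100,
        max_depth=int(params["max_depth"]),
        learning_rate=params["learning_rate"],
        subsample=params["subsample"],
        min_child_weight=int(params["min_child_weight"]),
    )
    # hyperopt minimises the objective, so negate the cross-validated score
    return -cross_val_score(model, X, y, cv=3).mean()

best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=Trials())
print(best)
```

Each trial here runs serially; the MPI layer Brown describes distributes those trials across supercomputer nodes, which is what cut training time so dramatically.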
"It was well worth doing this extra parallelization, however, as the model training time went down from a couple of days to around an hour or less. So, this significantly improved our productivity," he added.
As a result, RSI's petrophysicist experts no longer need to complete this mundane and time-consuming work. Instead, the machine learning approach can "very quickly" highlight whether a well is likely to be of interest, according to Brown.
Speed matters
Speed of calculation at the scales used in the oil and gas industry is another key issue, as Yen explained: “As you can imagine, oil and gas works on really large scales. As a result, the ability of the simulation tools to handle large models with a very high fidelity is important.”
He added: "It is one thing for the products to have the ability to solve these massive problems, and another to be able to solve them quickly. With this in mind, Altair has invested significantly in providing our customers access to our solvers as appliances, which can be hosted on-premise and in the cloud. These technologies enable our customers to run these huge models in a seamlessly stretchable environment."
However, the oil and gas industry also works on small scales, and it can be difficult to achieve the required speed of calculation because of the multi-scale nature of the systems being simulated. Cai said: "The system scale varies from the atomic level to kilometres of grids. The multi-scale modelling capabilities and the flexibility of coupling arbitrary physics in Comsol Multiphysics are in high demand for this purpose."
Comsol recently modelled corrosion of a deep-water pipeline system, a highly non-linear case at large scale. The project involved electrochemistry modelling with external currents, pipe coatings and galvanic cathodic protection.
Cai said: "The major challenge comes from the geometric scales of the physics system. Pipelines are hundreds of kilometres long in a 3D water environment. One has to capture as much detail as possible in the variation of the applied current and electrochemistry over-potential, in order to capture and simulate possible failure in the system, as well as proposing preventive solutions. This poses great challenges to the computational cost and convergence of the numerical simulation," he added.
The Pipeline Flow Module and an infinite element domain were used to simplify the most expensive parts of the numerical problem, namely the pipeline system and the far-field water environment, into a lower-dimensional calculation without sacrificing the accuracy of the result.
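The dimensional-reduction idea can be illustrated outside Comsol with a toy model. A long coated pipeline under cathodic protection behaves like a DC transmission line: with longitudinal pipe resistance R_l per metre and coating leakage conductance g per metre, the potential shift V(x) along the line satisfies V'' = R_l * g * V, so it decays exponentially away from the drain point. The sketch below (all parameter values invented for illustration, and no relation to the actual Comsol model) shows how far protection reaches down such a line.

```python
import numpy as np

# Toy 1D attenuation model for cathodic protection on a long coated pipeline.
# This is a stand-in for the dimensional reduction described above, not the
# actual Comsol model; all parameter values are illustrative.
R_l = 1e-5   # longitudinal pipe resistance, ohms per metre (assumed)
g = 1e-4     # coating leakage conductance, siemens per metre (assumed)
V0 = 0.85    # potential shift applied at the drain point, volts (assumed)

alpha = np.sqrt(R_l * g)            # attenuation constant, 1/m
x = np.linspace(0, 200_000, 2001)   # 200 km of pipeline
V = V0 * np.exp(-alpha * x)         # semi-infinite line solution of V'' = R_l*g*V

# Distance at which protection falls below a hypothetical 0.1 V threshold
reach = x[V >= 0.1].max()
print(f"Protection extends roughly {reach / 1000:.0f} km from the drain point")
```

Reducing the pipeline to a 1D problem like this, while treating the surrounding seawater with infinite elements, is what keeps a hundreds-of-kilometres corrosion model computationally tractable.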
Human first
While machine-learning techniques start to gain a foothold in the market, there are human challenges to address too. Cai explained: "We have observed a great number of scientists and engineers leaving their jobs due to budget issues in traditional exploration and production. Talent is also attracted to trending sectors, such as internet companies, which slows the adoption of the latest technologies in traditional operations."
This, in turn, adversely affects the oil and gas industry's drive for cost efficiency. Cai added: "Regarding numerical simulation, the frequent movement of modelling experts and engineers negatively affects the use of established models and simulations, and the high turnover rate leads to higher costs for training and the development of newer models."
However, somewhat counterintuitively, machine learning could help address this human issue. Automation frees experts from time-consuming and mundane tasks to find new ways of working, as we have seen with the EPCC project. As a result, the oil and gas industry could benefit not only from increased efficiencies but also from the development of exciting new frontiers, in the real world and in the simulation and modelling domain.
This story appears here as part of a cross-publishing agreement with Scientific Computing World.