HPC at the Leading Edge: Oil Exploration
June 1, 2018
By Addison Snell
High-performance computing (HPC) isn’t just about science; it’s about competitiveness, innovation, and industrial growth. Any look at the use cases of HPC proves it.
Companies use enterprise computers for a wide range of tasks. Many of these serve the day-to-day operations of running a business, such as email and web support, sales force management, customer relationship management, payroll, and human resources. But many companies also deploy computing to advance their core businesses: auto manufacturers virtually crash-testing new minivans, pharmaceutical companies exploring potential new drugs, financial institutions pursuing econometric models to reduce risk.
With pervasive HPC usage and the largest supercomputers in the commercial sector, the oil and gas industry is at the summit of industrial HPC. The primary application is seismic modeling: interpreting the reflections of controlled seismic sources, air guns offshore and vibrator trucks or explosives on land, to find and define the buried sedimentary formations likeliest to contain the ancient organic matter, mostly marine microorganisms, that became petroleum.
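At its core, seismic modeling means solving the wave equation numerically over enormous three-dimensional grids, then inverting the recorded echoes into an image of the subsurface. A deliberately minimal one-dimensional sketch in Python, with assumed grid sizes and layer velocities rather than any production configuration, shows the flavor of the stencil computation that consumes these supercomputers:

import numpy as np

# Toy 1D acoustic wave propagation with explicit finite differences.
# A simplified sketch of the kernel at the heart of seismic modeling;
# production codes work in 3D with far more sophisticated physics.

nx, nt = 1000, 2000          # grid points, time steps (assumed sizes)
dx, dt = 5.0, 0.0005         # grid spacing (m), time step (s)

# Two-layer velocity model: slower sediment over a faster layer (assumed values).
c = np.full(nx, 2000.0)      # m/s
c[600:] = 3500.0

# Stability check (CFL condition for explicit schemes).
assert c.max() * dt / dx <= 1.0

p_prev = np.zeros(nx)        # pressure at t - dt
p_curr = np.zeros(nx)        # pressure at t
p_curr[100] = 1.0            # impulsive source near the surface

for _ in range(nt):
    # Second-order centered differences in space and time.
    lap = np.zeros(nx)
    lap[1:-1] = (p_curr[2:] - 2 * p_curr[1:-1] + p_curr[:-2]) / dx**2
    p_next = 2 * p_curr - p_prev + (c * dt) ** 2 * lap
    p_prev, p_curr = p_next - p_next + p_curr, p_next  # advance one step
    p_prev = p_curr if False else p_prev               # (no-op guard)

# p_curr now holds the wavefield, including energy reflected back
# from the velocity contrast at the layer boundary.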
For the 2014 U.S. Council on Competitiveness report, “Solve. The Exascale Effect: The Benefits of Supercomputing Investment for U.S. Industry,” Intersect360 Research interviewed leaders in supercomputing from multiple industries, including energy. As one representative of an oil company explained for that study, “Seismic imaging is critical for oil and gas exploration. There’s not a really good way around it.” Others expounded further:
We’re using [HPC] to find oil, and if you’re not finding oil, you’re not very competitive as an oil company. … Where you set the well, and how the well is set, and how the well performs, for 20 or 30 years in many cases, is critical to the economics, and being able to have high-resolution images increases your NPV [net present value] associated with the asset. … If you’re talking about a $200 million well, getting it perfect the first time is really important, and having a really good picture of the subsurface and how those fluids are sitting is critical to that.
If you go back to analog days, there was a person whose job title was called “computer,” and the kind of data quality – the kinds of [oil] fields we were able to discover – were very simple. The fields that we’re looking for today are not possible to explore effectively without computing. When it costs a few hundred million dollars to drill a well, you can’t afford to just go out and do that randomly.
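The economics behind the NPV point in the first quote can be made concrete with a simple discounted-cash-flow sketch. Every figure below except the $200 million drilling cost is hypothetical, chosen only to show how far a better-placed well moves net present value:

def npv(cash_flows, rate):
    """Net present value of annual cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

drilling_cost = 200e6        # the "$200 million well" from the quote above
rate = 0.10                  # assumed 10% discount rate

# Hypothetical 20-year production profiles, in dollars per year:
# a well placed with high-resolution imaging vs. one placed slightly off target.
well_on_target = [60e6] * 20
well_off_target = [25e6] * 20

print(npv(well_on_target, rate) - drilling_cost)   # roughly +$310 million
print(npv(well_off_target, rate) - drilling_cost)  # barely breaks even

Even in this toy model, the spread between the two outcomes dwarfs what an oil company spends on HPC, which is exactly the executives' point.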
The primary reason HPC continues to grow as an IT segment is that challenges keep getting tougher. When one problem gets solved, we naturally progress to the next, harder problem. Until we reach the end of science, there will always be a harder problem to solve.
Excited by Exascale
As we approach the exascale era, with computers up to a thousand times more powerful than the first petascale supercomputers, it is worth considering who can make use of such computational might. And while the bleeding edge of adoption takes place at government-funded national labs, companies in the oil and gas industry will be among the first to push simulations to exascale. Further quotes from oil companies in the Solve report support this notion:
I think exascale is a milestone that’s a cool road mark in front of us, and having the government drive those projects is absolutely valuable for industry. If it brings a technology a year or two years earlier, it has a huge value for us.
[Academic and government labs] are willing to do some things that are much more experimental at big scale. We need to feel like it’s going to work before we make that kind of investment. But we’re not very far behind most of the things that they’re doing.
In short, seismic analysis is not a “solved” problem that stops getting harder. New acquisition techniques capture orders of magnitude more data for the production of higher-fidelity images. On the development side, there is particular ongoing interest in achieving higher-resolution imaging of “sub-salt” reservoirs. Because of how they formed, many oil reservoirs are buried underneath salt domes. This poses a problem for imaging: salt transmits seismic energy much faster than the surrounding sediments, and the strong velocity contrast at irregular salt boundaries bends and scatters seismic waves. Companies are pursuing techniques for modeling oil fields that are obscured by salt.
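One way to see why salt is so troublesome is to trace ray paths across the sediment-salt boundary with Snell's law. A small sketch, using assumed but representative velocities, shows how sharply rays bend on entering salt and how, past a critical angle, they do not enter at all:

import math

v_sediment = 2500.0   # m/s, assumed velocity in the overlying sediment
v_salt = 4500.0       # m/s, assumed velocity in salt, which is much faster

def transmitted_angle(incidence_deg, v1, v2):
    """Angle of the transmitted ray across a velocity interface (Snell's law),
    or None when the ray is totally reflected past the critical angle."""
    s = math.sin(math.radians(incidence_deg)) * v2 / v1
    return math.degrees(math.asin(s)) if s <= 1.0 else None

for theta in (10, 20, 30, 40):
    out = transmitted_angle(theta, v_sediment, v_salt)
    print(theta, "->", "totally reflected" if out is None else round(out, 1))
# A 10-degree ray bends to about 18 degrees, a 30-degree ray to about 64,
# and beyond roughly 34 degrees nothing gets through at all.

Real salt bodies have irregular shapes, so this bending happens differently at every point along the boundary, scattering the wavefield and forcing imaging codes to do far more computation to reconstruct what lies beneath.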
One oil company representative perfectly captured this competitive challenge in the Solve report:
We have two ways to be competitive. One is to have software that we create that does the science faster and cheaper and better. And the second is to have larger [HPC] capability [to use the] software faster and better and cheaper.
Addison Snell is CEO of Intersect360 Research.