
Big Oil’s Favorite Toy: Supercomputers (2018)

39 points | ganeumann | 6 years ago | wsj.com

36 comments


technofiend|6 years ago

Shell Oil had a Cray-1 and an nCUBE when I worked there in 1989. Chevron had a Cray-1 and a raft of IBM mainframes in 1990. Chevron's data center was set up for visitors, so every cluster had a sign describing that system's computing power. Total compute power was very important because it meant they could do more analysis on seismic data to refine their oil and gas lease bids.

Chevron had so much seismic data they required (from memory) six robot tape libraries. They were ganged together so the picker robots could hand tapes between cabinets in case all drives in a given cabinet were in use. It was cool as hell to watch the camera mounted right above the picker flying around this dark wall of tapes to go grab one.

One of Shell's quality training videos was about The Guy Who Lost The Tape: a seismic data tape was mislabeled and lost, causing Shell problems with bidding on a lease. Their Deming-style quality training was all about preventing that sort of thing from happening again. I dare say their data was worth more in total than the hardware.

beerandt|6 years ago

If you knew the effort and cost that go into large-scale nearshore and offshore seismic and non-seismic surveys, you wouldn't think twice about putting the value of the data well above the value of the hardware.

Offshore deepwater (~2500ft+) non-seismic surveys might cost 6-7 figures per day to operate, and might fill up a hard drive every 1-3 days.

Depending on how many drives' worth of data fit on a tape, the raw data can get very expensive, very quickly, even before it's been processed, analyzed, etc.
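The arithmetic above can be made concrete with a back-of-envelope sketch. All numbers here are hypothetical, chosen only to span the "6-7 figures per day, one drive every 1-3 days" ranges from the comment, with an assumed ~10 TB drive:

```python
# Rough cost per terabyte of raw survey data, under assumed figures:
# day rate of the survey operation, days to fill one drive, drive size.

def cost_per_tb(day_rate_usd, days_per_drive, drive_tb):
    """Acquisition cost attributed to each terabyte of raw data."""
    return day_rate_usd * days_per_drive / drive_tb

# Cheap end: $100k/day, a 10 TB drive filled daily.
low = cost_per_tb(100_000, 1, 10)       # $10k per TB
# Expensive end: $10M/day, one drive every 3 days.
high = cost_per_tb(10_000_000, 3, 10)   # $3M per TB
print(low, high)
```

Even at the cheap end, each terabyte of raw data represents tens of thousands of dollars of ship time before any processing happens.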

eru|6 years ago

Do you have any good resources on that Deming-style quality training?

CharlesColeman|6 years ago

Aren't they Big Oil's old favorite toy? In the 90s, a local university near me had a Cray XMP that was a gift from some oil company after they'd decommissioned it.

52-6F-62|6 years ago

My grandmother was an analyst for Gulf/Shell going back to the 60s. She was a resident FORTRAN/COBOL expert who was recalled in the 90s after retirement.

Data and systems analysis at scale is definitely not new to them.

hackcasual|6 years ago

I'm in software dev and my dad was in geophysics; he's always had a much better work computer than me.

baroffoos|6 years ago

Exactly. I have been hearing about this stuff for ages. Oil companies optimized raping the land decades ago.

petschge|6 years ago

When going to training classes on optimal I/O strategies for new supercomputers, there is usually a bunch of people working for oil companies. The simulations I do for plasma physics are considered "easy" there because they need little input, generate only a few tens of terabytes of output, and are by and large compute-bound.

tyfon|6 years ago

I was working as a hired SGI/Sun admin via ABB for Statoil in 1997 in Stavanger. The HQ was packed with Ultra 60s with Creator3D or Elite3D cards; some even had both. We had to keep a few hundred of these running, and they were used to explore seismic data visually.

Those machines were pretty powerful for those days, almost at supercomputer level. One cost more than a year's salary for me back then.

But Big Oil has always loved computers.

mikorym|6 years ago

What kind of signal analysis do the oil companies do these days (like wavelet transforms)?

My (in my opinion failed) HonsBSc project was on signal analysis of GC-MS (Gas Chromatography coupled to Mass Spectrometry) signals.

If I had to do it again in 2019, machine learning would be a much more pertinent focus. However, without today's computing power and the ubiquitous programming libraries for that purpose, there are actually other ways of approaching such data (like wavelets).

Wavelet transforms were invented quite a while ago [1], but I think seismic data analysts were among the first to really investigate the applications of that field. The other application is compression (and loading over an internet connection) [2].

[1] https://en.wikipedia.org/wiki/Haar_wavelet

[2] https://en.wikipedia.org/wiki/Lifting_scheme
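A minimal sketch of the idea, using the Haar wavelet [1], the simplest member of the family: one transform level splits a signal into pairwise averages (a coarse approximation) and pairwise differences (detail coefficients), and the split is exactly invertible. The signal values below are made up for illustration, and the input length is assumed to be even:

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar transform:
    pairwise averages (coarse part) and pairwise differences (details)."""
    avgs = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    diffs = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return avgs, diffs

def haar_inverse(avgs, diffs):
    """Exact reconstruction from averages and differences."""
    out = []
    for s, d in zip(avgs, diffs):
        out += [s + d, s - d]
    return out

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0]
avgs, diffs = haar_step(signal)
assert haar_inverse(avgs, diffs) == signal
```

Compression comes from the detail coefficients: in smooth data most differences are near zero and can be quantized or dropped, which is also what makes progressive loading over a network practical.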

Jordanpomeroy|6 years ago

Given a sparse data set recorded by methods that have known constraints and limitations, generate a physical model that could conceivably produce measurements matching the given data.
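A toy illustration of that inverse problem, stripped to one parameter: the "physical model" here is a single homogeneous layer with unknown slowness (1/velocity), the forward model predicts travel time as slowness times offset, and we pick the slowness whose predictions best fit the sparse, noisy picks. All numbers are made up for illustration:

```python
def fit_slowness(distances, times):
    """Least-squares estimate of slowness s in the forward model t = s * d."""
    num = sum(d * t for d, t in zip(distances, times))
    den = sum(d * d for d in distances)
    return num / den

# Sparse "survey": a few source-receiver offsets (km) and picked times (s),
# roughly consistent with a true slowness of 0.5 s/km plus noise.
distances = [1.0, 2.5, 4.0, 6.0]
times = [0.52, 1.24, 2.01, 2.98]

s = fit_slowness(distances, times)
print(f"estimated slowness: {s:.3f} s/km, velocity: {1/s:.2f} km/s")
```

Real seismic inversion replaces this one scalar with millions of subsurface cells and a wave-propagation forward model, which is exactly why it eats supercomputer time.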

dreamcompiler|6 years ago

Schlumberger was a major force in AI in the 1980s when I worked there. So was Texas Instruments, which started as an oil well instrumentation company. Computing has always been important to Big Oil.

mogadsheu|6 years ago

Strangely enough, I had a couple of classes with Xukai.

Former energy VC with the Norwegian state energy co here. Two of the companies we invested into were HPC related—one of them developed node controller solutions for parallel processing, the other developed a platform to analyze massive amounts of subsurface data more quickly.

When drilling a single exploratory well costs as much as some IPOs, in some cases with a <10% expected chance of success, there's plenty of love for supercomputing in this industry.

kjs3|6 years ago

Geotechnical users were always the second or third biggest users of supercomputers after national security users. Big oil always had the most astonishing data centers.

riskneutral|6 years ago

Nothing new. Supercomputing has been used in oil and gas exploration for decades.

eru|6 years ago

Actually, the news for me is that oil and gas exploration still has computing needs that are considered significant today.

Basically, I can see that they needed supercomputers in the 80s. But I might have assumed that their computing needs wouldn't keep scaling with supply, so that by now just renting some GPUs from Amazon might be enough.

johnthescott|6 years ago

can i hear an amen? and a huh? who did the research at wsj for this article?