
Monday, May 14, 2007

Deep down


Last month, Andrew Gould, the CEO of Schlumberger, gave a pep talk at an open house.

SCHLUMBERGER OPEN HOUSE
Schlumberger businesses and technologies demonstrations will include subsurface fluid sampling, integrated well completions, robotic tractors in a wellbore, reservoir modeling software, and geophysical seismic exploration.
10:00 a.m. to 4:00 p.m., Zachry Lobby

OPEN PRESENTATION
Andrew Gould
Chairman and CEO, Schlumberger
TITLE: Engineering Challenges (and Successes) in the Search for Oil and Gas
4:00 p.m., Room 102 Zachry


The open presentation attracted a large crowd. During it, I was intrigued by Andrew's statement that Schlumberger was positioning itself to be a provider of services for carbon burying (sequestration) technology. When you think about it, though, it makes sense: they have already devised many of the services and technologies needed for this type of undertaking.

The room was full of people who looked like they wanted to be hired, so it was difficult to get any of them to ask questions at the very end of the talk. Pissing off the CEO of the company you want to join is a very compelling argument for not speaking up, or so they believe... So I ended up having to do the dirty deed, but I was in fact really interested in several answers.

I mentioned Schlumberger in this blog a while back because of their ability to get signals from 3,000 meters underground using pulsed-mud telemetry, in the process generally known as Logging While Drilling. The main point was that, in order to save about $200K to $300K per day, they had to gather data at the drill site in real time so that they could steer the drill bit (yes, drill bits can go horizontal). Some people at Sandia have devised a Disposable Fiber Optic Telemetry System, but it does not seem to have gained any traction in that industry. The pulsed-mud bit rate was equivalent to an "astonishing" 30 bits per second last time I checked. My question to Andrew was: have you done better in the past few years? The answer looked like a big maybe. He mentioned a new technology that uses some type of radio transmitter between each of the drill rods, but it did not seem to be a system currently used in the field.

The mud communication system is an amazing piece of inventiveness, and the communication aspect of it is one of the most interesting problems to work on. Because of the very harsh constraints on the system (pressure, temperature, ...), I am barely surprised that there isn't a better solution, but I also think they should think outside the box on this one. My take would probably include using compressed sensing, so that the power drawn by the electronics in the measuring bit can be decreased tremendously. Heat generation by the computers/electronics of the measuring bit is non-trivial, as there is little in the way of cooling at those depths (the soil surrounding the bit is already warmer than the inside). Because of the high-temperature environment, one also has to develop better electronics to cope with it (see Sandia's presentation on electronics development and the need for new technology such as SOI).
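To make the compressed sensing idea concrete, here is a minimal toy sketch (made-up dimensions, nothing to do with Schlumberger's actual telemetry): a sparse "downhole" signal is recovered from far fewer random projections than samples via basis pursuit cast as a linear program, which is the kind of trade that would let the bit transmit (or store) fewer numbers.

```python
# Toy compressed-sensing recovery: k-sparse signal, m << n random measurements,
# L1 reconstruction (basis pursuit) written as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                                   # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # sparse "measurement" vector
Phi = rng.standard_normal((m, n)) / np.sqrt(m)                # random sensing matrix
y = Phi @ x                                                   # only m numbers to send uphole

# min ||x||_1 s.t. Phi x = y, over variables [x, u] with -u <= x <= u
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([Phi, np.zeros((m, n))])
A_ub = np.block([[np.eye(n), -np.eye(n)], [-np.eye(n), -np.eye(n)]])
b_ub = np.zeros(2 * n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * n, method="highs")
x_hat = res.x[:n]
print("recovery error:", np.linalg.norm(x_hat - x))
```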

I then asked a question about the Canadian tar sands and the use of technology such as heat pipes to transfer energy from geothermal wells all the way up to the tar sands in order to warm them up so that they become less viscous and therefore more economical to retrieve from the ground. The answer suggested there is already a program called "HTPT" that looks at that. HT may mean High Temperature, but I am not sure what PT stands for.

And then I asked the "forward looking" question: if you wanted to differentiate yourself from your competitors in the next two or three years, where would you put your money? The answer was interesting because I was not expecting it. The way I interpreted what he said was: data fusion, that is, how do you combine the large amount of data produced in the field to get a clearer picture of your oil field (not just in three dimensions but also over time)? When I went to talk to the engineers at the different booths after the presentation, they did not seem to have a view of what that entailed. One of the reasons mentioned was that most customers were not willing to put money into this type of analysis, so the company did not have a specific research team dedicated to it. The company itself is known for dealing with very large amounts of data and making sense of them for its customers. Yet summarizing that knowledge seems to be a difficult undertaking that most customers are only willing to do in-house. I am sure that an enterprising person with views on this issue could help them out. There is no reason to believe that the developments in dimensionality reduction of the past few years should not be considered for these gigantic datasets.

Data fusion is also something of a buzzword, so it may be productive to define what it means here. In the measuring bit, there are different kinds of instruments, including neutron generators, radiation detectors, NMR and electromagnetic sensors. Some of the current work seems to have correlated seismic and flow measurements in order to provide a better assessment of the borehole condition. A data fusion scheme would therefore aim at correlating all the measurements from these different types of sensors in order to provide additional information about both the location of the measuring bit and the time-dependent geological conditions around it.
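As a purely illustrative sketch of what "correlating two sensor families" could look like (entirely fictitious data, and canonical correlation analysis is just one technique one might reach for, not anything Schlumberger described):

```python
# Hypothetical fusion toy: find the shared component between a block of
# "seismic-like" channels and a block of "flow-like" channels over time.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
t = np.linspace(0, 100, 2000)
latent = np.sin(0.3 * t)                                  # fictitious shared downhole state
seismic = np.column_stack([latent + 0.5 * rng.standard_normal(t.size) for _ in range(4)])
flow = np.column_stack([2.0 * latent + 0.5 * rng.standard_normal(t.size) for _ in range(3)])

cca = CCA(n_components=1)
s_proj, f_proj = cca.fit_transform(seismic, flow)
corr = np.corrcoef(s_proj[:, 0], f_proj[:, 0])[0, 1]
print("leading canonical correlation between the two sensor families:", round(corr, 3))
```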

In order to do that, one has to compare measurements with computations. One current generic concern is the ability to do inversion with Monte Carlo codes such as MCNP (this is a very difficult problem because solving the inverse problem requires many runs of the forward computation by MCNP) or with faster but coarser deterministic methods. One changes many different parameters (sensitivity studies) in order to figure out the distribution of parameters for the situation of interest.
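A toy sketch of that inversion-by-forward-sampling idea follows; the forward model below is a cheap made-up stand-in for an expensive MCNP or deterministic transport run, and the parameter names are purely illustrative.

```python
# Toy inversion: sample formation parameters, run a (stand-in) forward model,
# and keep the parameter sets whose prediction matches the measurement.
import numpy as np

rng = np.random.default_rng(2)

def forward_model(porosity, density):
    # Stand-in for an expensive transport calculation: maps fictitious
    # formation parameters to a simulated detector count rate.
    return 1000.0 * np.exp(-2.0 * density) * (1.0 + 0.5 * porosity)

measured = forward_model(0.20, 2.3) * (1.0 + 0.01 * rng.standard_normal())

# Sensitivity study over (porosity, density); simple rejection keeps the
# combinations whose forward prediction lies within 2% of the measurement.
samples = rng.uniform([0.0, 1.8], [0.4, 2.8], size=(20000, 2))
predictions = forward_model(samples[:, 0], samples[:, 1])
accepted = samples[np.abs(predictions - measured) / measured < 0.02]
print("accepted parameter sets:", len(accepted))
print("porosity/density means:", accepted.mean(axis=0))
```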

Since MCNP or deterministic codes have many different parameters and run in finite time, one needs tools that provide a way of "interpolating" between parameter families one has not explored computationally. In the end, this problem is not unlike the one faced in nuclear engineering when running a complex thermal-hydraulics code: the Experimental Probabilistic Hypersurface tries to help in that respect.
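One common way to do that kind of interpolation is to fit a surrogate over the few runs one can afford; the sketch below uses a Gaussian process as a stand-in (it is not the Experimental Probabilistic Hypersurface itself, and the "expensive code" is a placeholder function).

```python
# Surrogate over a handful of expensive forward runs: predict (with an
# uncertainty) at parameter values never actually run through the code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_code(p):
    # Placeholder for a long-running MCNP / thermal-hydraulics calculation.
    return np.sin(3.0 * p) + 0.1 * p**2

train_p = np.linspace(0.0, 2.0, 8).reshape(-1, 1)        # the few affordable runs
train_y = expensive_code(train_p).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
gp.fit(train_p, train_y)

query = np.array([[0.77]])                               # an unexplored parameter value
mean, std = gp.predict(query, return_std=True)
print(f"surrogate prediction: {mean[0]:.3f} +/- {std[0]:.3f}")
```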
