
18 January, 2017

Breakthrough Big Data and Deep Learning in Today's Oil Industry: Interview with Kamal Hami-Eddine

Innovators in Geoscience Series

New analytical techniques that work with massive volumes of data of extremely wide variety are enabling geoscientists and engineers to understand the nature and extent of reservoirs in ways never before possible. Welcome to an interview with Kamal Hami-Eddine of Paradigm, who explains big data and deep learning as they are used today in petroleum exploration and production. Kamal will be presenting at the AAPG Deepwater / Big Data GTW.

Extracted reservoir bodies using Democratic Neural Network Association
What is your name and your relationship to big data?

My name is Kamal Hami-Eddine, of Paradigm, and I studied applied statistics, probabilities, and stochastic processes. This is how I was introduced to big data problems. I was studying in a city where the aircraft industry is big, and a major challenge for that industry was to monitor and learn from all the measurements taken during flights, in order to limit maintenance costs. At the time the problem was unsolvable, but a lot of research was done to find ways of transforming idle data into information. That said, I worked a great deal on machine learning, and on neural networks more specifically, so naturally these days it is all about big data and deep learning.

What is big data and how is the concept often misunderstood?

Big data, I guess, gets misunderstood simply because of its name: we often retain only “big.” This leads to the common belief that the oil industry has been doing big data forever. We have big data, but we do not yet leverage much information from it. Big data is not only big; it is about volume, velocity, and variety, and the capacity to use it to solve business problems.

How did the concept of big data come about? What were the needs it addressed?

The first paper mentioning big data was published in 1997, but many researchers had been working on these topics for a long time before that. “Saving All the Bits” (1990) by Peter J. Denning states the problem clearly and discusses the hardware issues encountered at the time; its introduction expresses the need to handle big data for the earth sciences. The explosion of data volume and variety, accelerated exponentially by the democratization of the internet, drove the need to store more and more data. So big data was not born to address a problem; it was the problem. The first challenge was to store the data, and then to be able to retrieve it from anywhere on the planet in a second. In parallel, we all realized the huge value of this mass of available data, and the focus turned to performing data mining at that scale so that every domain could benefit from it.

What were some of the early applications of big data in the oil industry?

Early applications tried to monitor the maintenance of rig components based on sensor measurements. The idea is to anticipate any break in production due to equipment failure. It is basically an attempt to benefit from a huge volume of measurements by applying artificial intelligence based on time series analysis. This application leverages the infrastructure implemented for big data and relies heavily on machine learning. It still does not fully address data variety, as it mostly relies on structured data.
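To make the idea concrete, here is a minimal, hypothetical sketch of the kind of time-series check such predictive-maintenance systems build on: flagging sensor readings that deviate sharply from the recent past using a rolling z-score. The window size, threshold, and signal values are illustrative assumptions, not details from the interview.

```python
# Hypothetical sketch: flag anomalous sensor readings with a rolling z-score.
# Window size and threshold are illustrative choices.
from statistics import mean, stdev

def anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings that lie more than `threshold`
    standard deviations from the mean of the preceding `window` values."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# A mostly steady pressure signal with one spike at index 8.
signal = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.9, 10.0, 25.0, 10.1]
print(anomalies(signal))  # -> [8]
```

Real systems would of course learn far richer models from many correlated sensor channels, but the principle of comparing live measurements against a baseline learned from history is the same.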

Some major big data and deep learning actors have started to develop prototypes for the oil industry to analyze geophysical and geological data; however, as far as I know, there is no publication yet that can be qualified as the first big data application in field exploration and development. On our side, we have started to investigate how deep learning could assist interpreters in their day-to-day tasks.

What are some of the most exciting applications for big data that you see today?

I am biased: for me, big data cannot go without deep learning. For daily usage, the development of autonomous vehicles is, to me, the application that will have the biggest impact on the non-digital world. However, the ones that look really impressive to me are those dealing with the live processing of satellite images, which can stack up to several terabytes a day to transfer, process, and analyze. For example, the Copernicus project and Sparkin data analyze land movement from satellite images, which could have an impact on mining, civil engineering, or even the oil and gas industry.

How can big data open up areas for oil and gas exploration and development that were previously not open?

By reducing costs, preserving information as people move on, and assisting geoscientists. Big data can provide tools to help manage our huge databases and to get the most out of the exabytes of documents available. A lot of unstructured information contains knowledge which, if it could be presented in an intelligible way, would help reduce overall risk. All decision makers would have access to an unprecedented level of business information. On another front, geoscientists will be empowered through easier access to analogs, and through assistance tools that ease tedious tasks so they can concentrate on what creates their value.

Where is big data in oil and gas going?

Everywhere. The question is more about the pace. It can move very fast in the domain of geoscientist assistance, thanks to deep learning. The same is true of production analysis and text mining. The user experience of software may also be transformed by the huge improvements made in natural language processing.

Can you recommend a few articles and books?

Several very good MOOCs are available on the main platforms as an introduction to big data and deep learning. They may be a good way to get into the topic in a structured manner.

The Awesome Deep Vision list on GitHub provides links to very interesting scientific papers in this domain, which has helped me a lot in following the trends.

Otherwise as an introduction: