Keeping a Lid on Exploration Costs: The Art of Collaboration

Posted by Dan Pigat on Mon, Jan 19, 2015


With the price of oil cut in half over the second half of 2014, there is renewed and increased pressure on the costs of oil & gas exploration.

With a deep water well costing anything up to US$100 million plus, and with much of the ‘low-hanging fruit’ of the world’s oil & gas fields long since depleted, operators are pushing into more remote, technically complex and capital-intensive fields to sustain long-term production.

According to Reuters, the cost of discovering a barrel of oil has risen three-fold over the last decade (http://uk.reuters.com/article/2010/02/11/idUKLDE6191WK), with the Canadian tar sands, the Arctic and deep water fields in the Gulf of Mexico and off Brazil far more technically challenging than their more conventional predecessors in the North Sea or Middle East.

However, all is not lost! Just as exploration challenges have increased, so have the technologies available to more accurately map out prospective oil & gas reserves and reduce the operator’s nightmare scenario - dry wells.

Such technologies range from broadband and wide-azimuth seismic and dual-sensor towed streamers through to reservoir modeling innovations and other data sources that complement seismic, such as electromagnetic data and gravity gradiometry.

It’s technologies such as these that - when combined - can map out prospective reservoirs more accurately and in greater detail than ever before and identify fields that were previously invisible.

Yet, the issues over cost remain. How can one draw these different technologies into one integrated and cost-effective workflow? How can operators create a workflow that is both flexible enough to incorporate these new tools, yet also robust enough to deal with the huge and varied amounts of data?

IT giant EMC, in a recent report on the oil & gas sector (http://uk.emc.com/collateral/emc-perspective/h9521-lever-cloud-oil-gas-ep.pdf), puts the challenge well:

“As the volume of data grows and the analytical tools for learning it increase in sophistication, the challenge of marshaling rapidly growing data can be overwhelming. It’s a problem of both scaling (database, computing power and applications) and integrating (assembling and harmonizing data from multiple sometimes new sources).”

For me, the answers to these challenges are both cultural and technological.

On the cultural side, there needs to be a change in mindset – an understanding that the whole is very much greater than the sum of its parts and that it is only through a focus on collaboration and through an exchanging of ideas and data within the workflow that a true picture of the oil & gas subsurface can be generated. The days of expensive niche technologies and fragmented, unproductive asset teams must be consigned to the past.

Similarly, there needs to be technological change as well. This requires an embracing of new technologies and the creation of a flexible and accessible exploration workflow. It also needs to mark an end to the dependence on large, unwieldy mainframe computers; the terabytes of data they generate; and the proprietary, stand-alone software packages that make workflows so fragmented.

The good news is that many of these enabling technologies are now available, meeting both the scaling and integration requirements that EMC references in its report. Much of this can be seen in cloud-based computing.

There’s the collaborative nature of the cloud, for example, where different data sources and technologies can be accessed from different devices by different members of the asset team. This can be crucially important in harmonizing multiple data sources.

There’s also the elastic data capacity and on-demand computing power that the cloud offers, mitigating the high costs of ownership of the complex IT and multiple servers so often seen in oil & gas exploration.

Finally, there are the scaling issues, where huge exploration files need to be accessed by different people in a flexible web-based and mobile applications environment. Again, the cloud and technologies such as PureWeb are enabling this by transferring data-intensive business applications into a mobile environment without compromising the data.

Of course, many of the advanced technologies used in exploration today can be eye-wateringly expensive. I have no doubt, however, that the costs of exploration can be brought down dramatically if collaboration and the combining of different data sources are at the heart of exploration activities moving forward.

Learn about PureWeb
Tags: Blog