By Joshua Thorp, Geoscience manager, Searcher Seismic

THE never-ending search for oil and gas supplies is a high-stakes, high-risk endeavour that requires some of the largest capital expenditures of any industry. Despite past successes, the Australian oil and gas sector faces increasing complexity and pressure on several fronts: operational, environmental and political. As such, any tool or service that can reduce risk, lower cost or accelerate results is highly valued within the industry.

While recent advancements in technology such as the Internet of Things (IoT) have significantly raised standards in exploration, drilling and production, machine learning (ML) and deep learning (DL) hold even greater potential for the sector. Together, these technologies allow O&G organisations to better manage and understand the massive data sets they generate daily.

Where is the data coming from?

Over the last decade the oil and gas industry has undergone a massive digital transformation – with data at the heart of the evolution. In fact, a recent report found that the global market for big data in oil and gas is expected to exceed US$10 billion in value by 2026.

Data is being created faster than ever before as companies expand their sources, gaining access to sensor data, well data, oceanographic data, geological data, environmental data and more. Innovations in 3D imaging and drilling operations, for example, have seen IoT sensors in particular come into their own; they are now used to record key metrics such as fluid pressure, temperature and composition during oil and gas production.
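As a rough sketch of what that telemetry can look like in software, the snippet below models a single downhole reading and a simple range check. The schema, field names and pressure threshold are illustrative assumptions, not any vendor's actual format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One telemetry sample from a downhole IoT sensor (hypothetical schema)."""
    well_id: str
    timestamp: datetime
    fluid_pressure_kpa: float
    temperature_c: float
    gas_fraction: float  # fraction of gas in the produced fluid, 0..1

def is_anomalous(r: SensorReading, max_pressure_kpa: float = 35_000.0) -> bool:
    """Flag readings outside an assumed safe operating envelope."""
    return r.fluid_pressure_kpa > max_pressure_kpa or not (0.0 <= r.gas_fraction <= 1.0)

reading = SensorReading("well-042", datetime.now(timezone.utc), 27_500.0, 88.4, 0.31)
print(is_anomalous(reading))  # False: within the assumed envelope
```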

The more equipment is connected, the more data is generated. As a result, the industry is swimming in petabytes of data and faces a new problem: understanding how this data can be used more effectively to meet global demand for resources.

The answer lies in how those data sets are handled and processed. Advancements in ML and DL, for example, have been critical to successful resource discovery and extraction, and companies are now prioritising big data in an effort to improve oil and gas operations.

Through the use of ML and DL, data can be used to understand the impact of salt deposits on machinery, helping to gain efficiencies across oil and gas sites. In addition, DL can help teams locate deposits that may have been overlooked, or were once inaccessible, due to legacy systems and a lack of innovation in the field.
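To make the idea concrete, here is a minimal sketch of one common deep learning approach: training a small convolutional network to classify 2D seismic amplitude patches as prospective or not. The architecture, patch size and random placeholder labels are illustrative assumptions, not Searcher Seismic's actual workflow.

```python
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    """Tiny CNN that scores a seismic amplitude patch (hypothetical model)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 1),  # logit: likelihood the patch contains a target feature
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = PatchClassifier()
patches = torch.randn(8, 1, 64, 64)           # batch of 64x64 amplitude patches
labels = torch.randint(0, 2, (8, 1)).float()  # placeholder labels for the sketch
loss = nn.BCEWithLogitsLoss()(model(patches), labels)
loss.backward()                               # one illustrative training step
```

In practice, the labelled patches would come from interpreted surveys, and a trained model would then be swept across a full volume to highlight candidate regions for a geoscientist to review.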

Drilling into the power of data
According to the government’s latest Resources and Energy Quarterly, Australia is set to become the world’s largest gas exporter by 2020 and is forecast to export 78 million tonnes of LNG in 2019-2020. As the industry expands to meet supply needs, technology and data will continue to play a pivotal role in the sustainability and success of oil and gas exploration.

However, it’s not as easy as it sounds. On the one hand, oil and gas companies are dealing with highly sensitive, incredibly valuable data. On the other, that data is useless without the IT infrastructure and tools to understand, use and store it effectively.

This has paved the way for multi-client data providers such as Searcher Seismic to take the data, clean it up and deliver it to oil and gas companies to draw insights from. Gathering and packaging the data is often a long, labour-intensive process involving many manual steps. To better serve the industry, organisations like Searcher Seismic are constantly looking for ways to shorten the data handling process while extracting the greatest possible insight from the data.

In a bid to maximise compute performance, Searcher Seismic turned to the cloud for the storage it needed to house all of its seismic data. However, the organisation found cloud-based data storage was slow, inflexible and expensive. In particular, the slow pace and high cost of data transfer could not meet its clients’ needs. In fact, at the cloud transfer rate, it would have taken years to ingest all of its data, and hundreds of thousands of dollars a month to store the sets. This is the ‘large data-set challenge’ in practice, highlighting the unique demands and requirements of the oil and gas industry.
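A back-of-the-envelope calculation shows why transfer rates dominate at this scale. The archive size and link speed below are assumptions chosen for illustration, not Searcher's actual figures.

```python
# Illustrative ingest time for a multi-petabyte archive over a sustained
# cloud upload link. Both input figures are assumptions.
archive_pb = 5       # assumed archive size: 5 PB
link_gbps = 1.0      # assumed sustained upload speed: 1 Gbit/s

archive_bits = archive_pb * 1e15 * 8
seconds = archive_bits / (link_gbps * 1e9)
print(f"{seconds / 86_400 / 365:.1f} years")  # ~1.3 years at full link saturation
```

Even with the link fully saturated around the clock, a single petabyte-scale archive takes on the order of a year to upload, before storage costs are even considered.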

Searcher then approached Pure Storage, a leading data solutions provider, which provided a bespoke solution – the FlashBlade data hub – with big data and analytics at its core. Compared to the cloud, FlashBlade delivered a 20-30x improvement in data processing speed, allowing the company to ingest 300TB of data a day. It also gives Searcher a flexible hybrid cloud platform that can easily send data to any location, including all the major cloud providers, without incurring additional costs.
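For context, the quoted daily volume converts to a sustained rate as follows; decimal units (1 TB = 1e12 bytes) are assumed, and the implied prior cloud rate simply divides by the article's own 20-30x figure.

```python
# Convert the article's 300 TB/day figure to a sustained byte rate,
# and back out what the earlier cloud rate would have been at 20-30x slower.
flashblade_tb_per_day = 300
gb_per_s = flashblade_tb_per_day * 1e12 / 86_400 / 1e9
print(f"FlashBlade: {gb_per_s:.2f} GB/s sustained")  # ~3.47 GB/s
for speedup in (20, 30):
    print(f"implied prior cloud rate at {speedup}x: {gb_per_s / speedup:.3f} GB/s")
```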

Working with the right solutions provider has enabled Searcher to introduce valuable new services for clients, such as web-based visualisation. For example, Searcher now has a web-based platform, Saismic, where clients can log in and view their data even before it has been delivered to them.

Where previously seismic data could take up to three months to be integrated into a client’s system, Searcher can now give geoscientists the ability to look at that data in real time. In addition, a 20-step, people-intensive process has been reduced to the click of a button – vastly improving data delivery times, reducing the potential for errors and, most importantly, freeing up employee time for higher-value projects.

Where do we go from here?
The “digital oil field” is about more than delivering cost-effectiveness to the oil and gas industry. In addition to boosting access to, and the supply of, resources, data from the field is also vital to addressing safety and environmental concerns. Enhanced processes, built on that data, have been crucial to managing risk, improving safety indicators and addressing compliance. However, to be successful, it’s vital that O&G organisations have clean, usable data available across their operations.