Digital transformation and advances in data have provided a wealth of information for companies in the oil and gas sector, enabling project teams to derive real business value and make better business decisions. However, having the correct skills, processes and methodologies in place to undertake a data clean-up or migration project is essential to drive results.
In today’s fast-paced environment, it’s crucial to overcome data and analytics challenges and utilise digital transformation more effectively. Getting this right can lead to new business opportunities, enhanced efficiency, and significant savings. When it comes to data, however, Subsurface and Wells teams often face a range of problems, from inherent data quality issues to internal resourcing challenges.
During a project, vast and expanding quantities of data are acquired over time in disparate formats, and then distributed, loaded, and duplicated repeatedly among different databases. Information may be incomplete, inaccurate, or uncertain, and access to critical, accurate project data is often too slow for real-time decision making.
In data management, data accuracy is the first and most critical component of the data quality framework. Nonetheless, with technology advancing at speed and a relentless appetite for data, the temptation is to keep populating new datastores without considering the accuracy of the data they hold.
Whilst data clean-up initiatives may be undertaken once a project is completed, the results require ongoing maintenance, which is often neglected. As a result, the cycle begins again with the same data silo or database, and often the same low-quality data.
In addition, a system of record or master datastore within Subsurface and Wells is often not clearly defined from the outset, leading to duplication issues and sometimes even loss of original data. A reluctance to save just the final copy is also quite common, increasing the risk of errors when referencing outdated versions.
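The duplication problem described above is often the first thing a data manager has to quantify. As a minimal, hypothetical sketch (the record layout and well names here are invented for illustration), variant spellings of the same well identifier across datastores can be flagged by normalising names before comparing them:

```python
# Hypothetical sketch: flagging duplicate well records across datastores.
# Well names such as "15/9-F-12" and "15/9 F 12" frequently refer to the
# same well once separators and case are ignored.
import re

def normalise_name(name: str) -> str:
    """Collapse whitespace and common separators so variant spellings compare equal."""
    return re.sub(r"[\s/_-]+", "", name).upper()

def find_duplicates(records):
    """Group records whose normalised well names collide; return only the clashes."""
    seen = {}
    for rec in records:
        seen.setdefault(normalise_name(rec["well"]), []).append(rec)
    return {key: recs for key, recs in seen.items() if len(recs) > 1}

records = [
    {"well": "15/9-F-12", "source": "corporate_db"},
    {"well": "15/9 F 12", "source": "project_store"},
    {"well": "16/1-A-1",  "source": "corporate_db"},
]
duplicates = find_duplicates(records)  # one clash: the two 15/9-F-12 variants
```

In practice the matching rules are richer (aliases, UWI/UBHI identifiers, fuzzy matching), but even a crude pass like this exposes how widely a single well can be duplicated across silos.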
The protracted absence of a centralised, integrated subsurface system of record over many years is highly problematic, and by settling on a system largely without relational integrity, the data management process becomes considerably more difficult.
With an increasing focus on data science and the use of scientific methods, processes, and algorithms to extract knowledge and insights from data, the need for accurate data is paramount.
One of the first tasks performed when doing data analytics is to clean the dataset you’re working with. The insights you draw from your data are only as good as the data itself, so it’s no surprise that analytics professionals spend an estimated 80% of their time preparing data for use in analysis.
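What that preparation work looks like varies by dataset, but a first pass usually means standardising units and handling missing or unusable values. As an illustrative sketch only (the mixed-unit depth strings below are an assumed input format, not from any particular system), cleaning well total depths into consistent metres might look like this:

```python
# Hypothetical sketch of a first-pass clean on well-header depth values,
# assuming depths arrive as strings in mixed units (metres or feet) with
# blanks and placeholders for missing data.
FT_TO_M = 0.3048  # international foot

def clean_depth(raw):
    """Return total depth in metres, or None when the value is unusable."""
    if raw is None or str(raw).strip() == "":
        return None
    text = str(raw).strip().lower()
    if text.endswith("ft"):
        # Convert feet to metres, keeping one decimal place.
        return round(float(text[:-2].strip()) * FT_TO_M, 1)
    if text.endswith("m"):
        text = text[:-1].strip()
    try:
        return float(text)
    except ValueError:
        return None  # placeholders such as "n/a" fall through to here

rows = ["3120", "10236 ft", "", "n/a", "2875 m"]
cleaned = [clean_depth(r) for r in rows]
```

Each rule here encodes a decision about the data (which unit is canonical, what counts as missing), which is exactly why this stage consumes so much of an analyst's time.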
These data irregularity issues are not new or unique to a single company; they are universal. What is new is the many efficient and accurate ways that the sector is tackling the issue.
A cross-industry collaboration has addressed the challenge of storing, organizing, migrating, and accessing subsurface data with the OSDU Data Platform. Developed by The Open Group OSDU™ Forum, this open-source software enables unprecedented flexibility in the use of data between applications and domains.
Despite the OSDU Data Platform addressing several of the most common data management issues for the sector, the time and skills needed to take advantage of this may be limited, or even missing completely in some companies.
Since the 2014 oil price decline, the industry has been pushing forward with recovery. However, the pace of recovery has been slow, and many companies have been forced to review and cut operating expenditure (OpEx), including adapting projects, changing existing business models, and revising staffing levels. Over time, data management staff levels have been reduced and teams are now significantly leaner, often only having time for functional work.
Whilst the oil price decline has continued to impact business operations, compounded by the Covid-19 pandemic and now the evolving Russia-Ukraine conflict, these economic factors are not the only drivers for change. Many companies are now proactively outsourcing digital domain expertise to acquire new skills and digital knowledge at pace, as well as to bolster their existing teams.
Highly skilled digital consultants are being appointed to work with internal staff, passing on knowledge of the tools and expert technical know-how, and providing clear, well-defined processes and documentation.
Having effective communicators who can articulate clearly across all technical levels within the organisation is essential and can expedite any quality improvement to clean and transform data. The deployment of experienced, skilled subsurface petroleum data managers and geotechnologists in this area is enabling companies to achieve high quality data and more accurate decision making.
If the industry is to embrace platforms such as OSDU, a potentially complete solution for efficiently storing, managing, and publishing raw and edited data in one integrated system, then ensuring that data input is complete, accurate, reliable, and accessible is more important than ever. Getting this right requires the right people, with the right expertise, to ensure the data provides the right results and a real return.
Tina Roberts is responsible for leading the Data Services team within E&P Consulting’s Oil and Gas business, and for developing and promoting E&P services to clients in these areas. A geologist with 25 years’ experience as an explorer and technical subsurface data specialist for oil and gas companies, Tina has a rare insight into the demands of the business as well as the opportunities new technologies can bring.