McKinsey reported that most oil and gas operators have not maximized the production potential of their assets. A typical offshore platform, according to the 2017 report, runs at about 77% of its maximum production potential. Industry-wide, the shortfall comes to about 10 million barrels per day, or $200 billion in annual revenue.
To help optimize production, operators might consider adopting advanced analytics, which combines engineering, data science, and computing power to enable businesses to forecast yields or maximize industry assets.
With the adoption of analytics, it follows that AI would find its way into the oil and gas industry. Numerous companies now claim to assist engineers and data scientists with aspects of their roles, such as predictive maintenance on equipment, forecasting supply and demand, and streamlining routine processes.
We researched the space to better understand where predictive analytics comes into play in the oil and gas industry and to answer the following questions:
- What types of predictive analytics applications are currently in use in oil and gas?
- What tangible results have predictive analytics driven in oil and gas?
- Are there common trends among these innovation efforts? How could these trends affect the future of oil and gas?
This report covers vendors offering software across two applications:
- Predictive Maintenance
- Business Intelligence
This article intends to provide business leaders in the oil and gas space with an idea of what they can currently expect from AI in their industry. We hope it allows them to garner insights they can confidently relay to their executive teams so they can make informed decisions about AI adoption. At the very least, this article intends to reduce the time business leaders spend researching AI companies with whom they may (or may not) be interested in working.
Predictive Maintenance
GE Digital’s Predix
GE Digital, a subsidiary of General Electric, offers Predix, which the company claims can help oil and gas businesses create automated analytics models that could help in the predictive maintenance of their industrial equipment using machine learning.
GE Digital explains that the application’s machine learning algorithms are able to process data that sensors collect, such as equipment and parts performance, environmental data, and weather conditions, among others. The algorithms then compare this data against the ideal performance data contained in the database. If the algorithms find discrepancies between the current and ideal state, the application sends an alert to technicians, who in turn conduct predictive maintenance or part replacement.
For instance, suppose one turbine at a wind farm is performing below optimal levels. The technician gathers the data collected by sensors about the turbine’s parts, such as the blades and axle, and might also review the wind speed. The turbine’s performance history is then compared with that of other turbines at the wind farm.
The in-house technician may then share this information, through a connected tablet or smartphone, with the field technician, who is prompted to inspect the underperforming turbine in person. Repairs are then conducted if needed.
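GE Digital does not publish Predix’s internal logic, but the compare-and-alert flow described above can be illustrated with a minimal sketch. The turbine readings, baseline, and tolerance below are hypothetical stand-ins for the reference data a real system would keep in its database.

```python
# Minimal sketch of a deviation-and-alert flow like the one described above
# (not GE's actual Predix code). All readings and thresholds are hypothetical.
import pandas as pd

# Hypothetical hourly power output (kW) per turbine, plus wind speed (m/s)
readings = pd.DataFrame({
    "turbine_id": ["T1", "T1", "T2", "T2", "T3", "T3"],
    "power_kw":   [1480, 1510, 1495, 1470, 1120, 1090],
    "wind_mps":   [11.8, 12.1, 11.9, 11.7, 11.9, 12.0],
})

# The fleet median output stands in for the "ideal" reference performance
# that a production system would store in its database.
fleet_baseline = readings["power_kw"].median()
tolerance = 0.15  # flag anything more than 15% below baseline (arbitrary choice)

per_turbine = readings.groupby("turbine_id")["power_kw"].mean()
underperformers = per_turbine[per_turbine < fleet_baseline * (1 - tolerance)]

for turbine_id, avg_kw in underperformers.items():
    # In a real deployment this would push an alert to a technician's device.
    print(f"ALERT: {turbine_id} averaging {avg_kw:.0f} kW vs baseline {fleet_baseline:.0f} kW")
```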
Below is a short 3-minute video demonstrating how Predix works:
GE Digital claims to have helped the Administracion Nacional de Combustibles, Alcohol y Portland (ANCAP) of Uruguay. The state-owned company provides the fuel that heats homes and businesses and fuels cooking equipment, agricultural machines, and transportation. The company was faced with the challenge of managing large amounts of data and needed a solution to make processes more efficient, optimize energy consumption, integrate operational data from various sources, and ensure the company’s sustainability.
The company turned to Predix’s human-machine interface and supervisory control and data acquisition (HMI-SCADA) application called iFIX. The project was implemented in five distribution plants, with more than 1,000 screens deployed. Employees at each location learned to use the system to input critical data.
With the centralized system, ANCAP was reportedly able to track field data, such as gas and liquid flow rates, composition analyzers, and tank levels and volumes. The system also enabled the company to monitor the performance rates of equipment, daily throughputs from process units, and weather conditions. This data was automatically processed, formatted to a spreadsheet, and uploaded to the government’s website.
With this data, ANCAP was able to calculate and project the efficiency of its furnaces, although the case study did not explain how efficiency was defined. The data was recorded on the server to enable the team to study trends in furnace performance.
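Since the case study does not define efficiency, one common proxy in furnace monitoring is thermal efficiency: useful heat absorbed by the process divided by the energy content of the fuel burned. The sketch below uses that assumed definition with invented figures; it is not ANCAP’s or GE Digital’s calculation.

```python
# Assumed definition of furnace thermal efficiency for illustration only:
# heat absorbed by the process stream divided by the fuel energy input.
# All figures below are made up.

def furnace_efficiency(fuel_flow_kg_h: float, fuel_lhv_mj_kg: float,
                       heat_absorbed_mj_h: float) -> float:
    """Return thermal efficiency as a fraction of fuel energy input."""
    fuel_energy_mj_h = fuel_flow_kg_h * fuel_lhv_mj_kg
    return heat_absorbed_mj_h / fuel_energy_mj_h

# Example: 1,200 kg/h of fuel gas at ~46 MJ/kg, with 44,000 MJ/h absorbed.
eff = furnace_efficiency(fuel_flow_kg_h=1200, fuel_lhv_mj_kg=46, heat_absorbed_mj_h=44000)
print(f"Estimated furnace efficiency: {eff:.1%}")  # ~79.7%
```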
According to GE Digital, implementing Predix allowed ANCAP to cut the time spent on routine processes by 60% and save 20% more fuel. Management also reportedly stopped asking employees for manual reports.
GE Digital also lists Exelon, Gerdau, Spomlek, Lek Pharmaceuticals, and the City of San Luis Obispo as some of its clients.
Vincent Yates is the Chief Data Scientist at GE Digital. He holds a PhD in Statistics from the University of California, Berkeley. Previously, Yates served as Head of Data Science in Seattle at Uber, Director of Analytics Engineering at Zillow Group, and Office Analytics Team Manager at Microsoft.
Maana
Maana is a California-based company with over 100 employees. The company offers software called the Knowledge Platform, which it claims can help oil and gas companies predict operational outcomes and help employees make better-informed decisions using machine learning and natural language processing.
Maana claims its software can mine, process, and analyze a company’s unstructured data, such as job reports, maintenance records, equipment sensors, weather data, call center records, and other types of media from disparate company sources. Then, Knowledge Platform’s natural language processing algorithms interpret the data from these reports. The machine learning algorithms crawl through the platform’s database to find patterns that are similar to the current problem the company is aiming to solve.
When the algorithms find similarities, the system then provides feedback in the form of graphs to show the trend of, for instance, equipment performance or expenditures. This allows the company’s subject matter experts to interpret the graphs and make recommendations on how to solve the problem.
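Maana does not disclose how its algorithms are implemented. As a rough illustration of matching a current problem against historical records, the sketch below uses TF-IDF vectors and cosine similarity over invented maintenance reports; this is an assumption about the general technique, not Maana’s actual method.

```python
# Minimal sketch of matching a current problem against historical maintenance
# records. The reports and query text are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

historical_reports = [
    "Pump P-101 tripped on high vibration; bearing wear found on inspection.",
    "Compressor C-3 discharge temperature high due to fouled intercooler.",
    "Pump P-203 seal leak after cavitation during low suction pressure event.",
]

current_problem = "Vibration alarm on pump, suspected bearing degradation."

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(historical_reports + [current_problem])

# Compare the current problem (last row) against each historical report.
scores = cosine_similarity(doc_matrix[-1], doc_matrix[:-1]).ravel()
for report, score in sorted(zip(historical_reports, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {report}")
```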
Below is a short 2-minute video demonstrating how the Knowledge Platform works:
Maana claims to have helped an unnamed Fortune 20 oil company make optimal pump selections, increase billable hours, and reduce overall costs using the application. The company’s maintenance experts used the platform to collect data related to existing pump operations. The data came in the form of run-and-pull reports, pump failure reports, pump sensor data, and high-frequency data flows. Much of the data described past inspections of failed pumps retrieved from wells and was reported by field employees.
Aside from language-based data, the company also collected detailed sensor data during pump operations. The application was able to classify the data and recognize the patterns that could lead to pump failure.
According to Maana, this enabled employees to validate their hypotheses and identify the causes of pump failures, predict similar scenarios in the future, and choose the right pump for each kind of well. Overall, this allowed the company to conduct maintenance that would reduce the risk of pump failures and production downtime.
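To make the pump-failure prediction concrete, here is a minimal sketch of training a classifier on sensor features to flag pumps at risk of failure. The features, labels, and model choice are hypothetical assumptions; Maana’s actual models are not public.

```python
# Illustrative sketch of predicting pump failure from sensor features.
# The data is synthetic and the model choice is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: vibration (mm/s), temperature (deg C), flow rate (m3/h)
vibration = rng.normal(3.0, 1.0, n)
temperature = rng.normal(70, 8, n)
flow = rng.normal(120, 15, n)
# Synthetic label: failures are more likely with high vibration and temperature.
failure = ((vibration > 4.0) & (temperature > 75)).astype(int)

X = np.column_stack([vibration, temperature, flow])
X_train, X_test, y_train, y_test = train_test_split(X, failure, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```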
Maana also lists Shell, General Electric, Airbus, Maersk, and Chevron as some of its clients.
Steven Gustafson is Chief Scientist at Maana. He holds a PhD in Computer Science from the University of Nottingham. Previously, Gustafson served as R&D Leader, Knowledge Discovery Lab at General Electric Global Research.
Business Intelligence
Hortonworks
Hortonworks is a San Francisco-based company with about 1,300 employees. The company offers the Hortonworks Data Platform (HDP), an open-source application that processes large datasets from multiple sources, which it claims can help oil and gas companies predict well yields and maintain equipment.
Hortonworks claims that HDP can store and process its structured and unstructured data, such as sensor or seismic data, weather, drilling and completions data, geolocation, text files, video, social media, email, and more. These are stored in a data repository.
For instance, an oil and gas company may want to set standards of yield per well to achieve the highest margins. To set the benchmarks, the engineers might use seismic data, pump rates, fluid temperatures, and other factors that influence yield.
The software’s machine learning model would then take these predefined benchmarks, search the HDP database for similar standards, and compare the corresponding well yields. If the algorithms find that the current yields are lower, based on the influencing factors, the system informs the users through the dashboard.
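Hortonworks does not detail how HDP makes this comparison. The sketch below illustrates one plausible approach under stated assumptions: find historical wells with similar influencing factors and flag the current well if its yield falls short of theirs. All figures, features, and the similarity method are invented for illustration.

```python
# Rough sketch of a benchmark comparison: flag a well whose yield is well below
# that of historically similar wells. Not Hortonworks' actual implementation.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Columns: pump rate (bbl/min), fluid temperature (deg C), reservoir depth (m)
historical_factors = np.array([
    [42, 65, 2400], [38, 60, 2300], [45, 70, 2500],
    [40, 63, 2350], [44, 68, 2450],
])
historical_yield_bpd = np.array([1150, 980, 1240, 1050, 1200])  # barrels/day

current_factors = np.array([[43, 66, 2420]])
current_yield_bpd = 890

scaler = StandardScaler().fit(historical_factors)
nn = NearestNeighbors(n_neighbors=3).fit(scaler.transform(historical_factors))
_, idx = nn.kneighbors(scaler.transform(current_factors))

benchmark = historical_yield_bpd[idx[0]].mean()
if current_yield_bpd < 0.9 * benchmark:  # 10% shortfall tolerance (arbitrary)
    print(f"Flag: yield {current_yield_bpd} bpd vs. benchmark {benchmark:.0f} bpd from similar wells")
```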
Below is a short 2-minute video demonstrating how HDP works:
Hortonworks claims to have helped Noble Energy predict and prevent downtime in its hydrocarbon infrastructure. The case study reports that HDP was used to look for potential lost opportunity through downtime, but did not provide details. However, we can infer that the application’s machine learning model reviewed data from sensor streams, drilling reports, location data, engineering notes, technical manuals, and more.
Then, the algorithms may have matched these with historical data related to factors that could affect production, such as equipment failure and decline in the output of existing oil wells, among others. This would have enabled Noble Energy to take measures to maintain equipment and possibly discover new wells.
In the future, Noble Energy intends to use HDP to improve safety and prevent injuries among employees in its locations.
Hortonworks also lists Fuso, Johns Hopkins University, Nissan, Yahoo! Japan, Mayo Clinic, SoftBank, Expedia, and Symantec as some of its clients.
Scott Gnau is CTO at Hortonworks. He holds a BS in Electrical Engineering from Drexel University. Previously, Gnau served as President of Teradata Labs for nine years, where he provided direction for research, development, and sales support activities related to Teradata integrated data warehousing and big data analytics.
SAS
SAS offers predictive software called Enterprise Miner, which the company claims can help oil and gas businesses streamline the data mining process to develop predictive models using deep learning, computer vision, and natural language processing (NLP).
SAS claims that NLP algorithms are capable of extracting business insights and emerging trends from text, speech, and sound, while computer vision algorithms determine the objects inside images and videos. The information extracted by these technologies is then processed and analyzed by deep learning algorithms, which recognize patterns in the data to create predictions and preventive recommendations.
Below is a short 3-minute video demonstrating how SAS artificial intelligence-driven applications work:
SAS does not have oil and gas-related case studies, but one case study claims the company helped Old Dominion Electric Cooperative (ODEC) forecast energy demand and save its utility customers millions in the first year of using SAS Analytics.
ODEC provides wholesale power to 11 distribution cooperatives in Virginia, Maryland, and Delaware that serve 1 million member customers. The cooperative must contract for energy months in advance to ensure an affordable supply at wholesale prices. Inaccurate forecasts could force ODEC to buy energy at higher spot prices.
In the past, ODEC used traditional spreadsheets to create forecasts. SAS enabled ODEC to forecast more accurately using a variety of industry-specific models that support system analysis, hedging, financial forecasts, and future resources for energy and demand.
According to the case study, using SAS allowed ODEC to understand each cooperative’s market while providing a big-picture look. This enabled the client to plan for its markets’ power needs five, 10, or 20 years into the future. The company did not provide further details or specific numbers.
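The case study does not name the models SAS provided, and they are proprietary. As a generic illustration of model-based load forecasting beyond spreadsheets, the sketch below fits a standard Holt-Winters seasonal model to synthetic monthly demand data; it is a stand-in, not SAS’s method.

```python
# Illustration only: a standard Holt-Winters seasonal model on synthetic monthly
# energy demand, to show the general idea of model-based load forecasting.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly demand (GWh) with an upward trend and yearly seasonality.
rng = np.random.default_rng(1)
months = pd.date_range("2012-01-01", periods=60, freq="MS")
seasonal = 80 * np.sin(2 * np.pi * months.month.to_numpy() / 12)
demand = 900 + 2 * np.arange(60) + seasonal + rng.normal(0, 20, 60)
series = pd.Series(demand, index=months)

model = ExponentialSmoothing(series, trend="add", seasonal="add",
                             seasonal_periods=12).fit()
print(model.forecast(12).round(1))  # forecast demand for the next 12 months
```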
SAS also lists Honda, Bank of America, Nestle, Lufthansa, Konica Minolta, and World Wildlife Fund as some of its past clients.
Wayne Thompson is Chief Data Scientist at SAS, where he has served for 26 years. He holds a PhD in Agronomy and Statistics and another in Plant Sciences with a minor in Statistics, both from the University of Tennessee.
Takeaways for Business Leaders in the Oil and Gas industry
Most of the companies covered in this report offer solutions for preventive maintenance and production-related analytics. It is worth noting that all of the companies’ AI efforts, except for Hortonworks’, are spearheaded by PhD-level talent.
That said, only Maana offers solutions exclusively for industrial production and oil and gas companies; the other companies covered in this report service multiple industries. Maana’s specificity could work in its favor, as its machine learning models might be trained exclusively on industrial databases, which could increase its models’ accuracies. This is speculation, however.
Although no company provided firm figures on how long its software would take to integrate, based on one case study, we infer that the integration process could be lengthy.
Overall, predictive analytics applications for the oil and gas industry seem to legitimately involve AI, unlike other sectors where AI adoption is more nascent.
Header Image Credit: Oil and Gas People