Artificial Intelligence at Procter & Gamble

Nicholas DeNittis

Nick DeNittis writes and edits AI industry trends and use-cases for Emerj's editorial and client content. Nick holds an MS in Management from Troy University and has earned several professional analytics certificates, including from the Wharton School.


Procter & Gamble (P&G) is an American multinational consumer goods corporation. The company is well known for its fabric & home care, family care, beauty, healthcare, and grooming products. P&G owns several of the most well-known brand names in the world, including Pampers, Tide, Gillette, Always, Head & Shoulders, and many more.

In 2023, P&G reported net earnings of $14.3 billion on $82 billion in net sales, with approximately 107,000 employees.

Per an article written by CIO Vittorio Cretella and published on P&G’s blog, P&G’s own R&D department researches and implements AI in-house across departments. Regarding strategy, Cretella wrote that P&G is focusing on scaling AI initiatives by clearly articulating their business purposes, building organizational AI fluency and skills, and standardizing AI development throughout P&G enterprises for speed and efficiency.

Specific business areas Mr. Cretella mentions where AI and machine learning have been and are being integrated include distribution and retail, media planning and buying, product and package innovation, and manufacturing and back-office operations.

“The true impact of AI can only be felt when it’s used pervasively across the entire organization,” Cretella wrote on the company blog. “When we step away from one-off initiatives and move towards scaling algorithmic solutions across multiple categories and markets globally.”

This article examines two timely, actionable AI use cases that are of high interest and applicability to executives in the retail or consumer goods sectors:

  • Supply chain optimization: Automating the integration and analysis of vast supply data via predictive analytics and machine learning to decrease inquiry response time and reduce labor costs.
  • Consumer behavior analysis: Utilizing and analyzing real-time usage pattern data via predictive analytics and machine learning models to predict future consumer behavior, inform product development, increase sales, and improve customer satisfaction.

We begin with how P&G uses the KNIME platform to produce supply and demand forecasts, including during the early days of the pandemic.

Use Case #1: Supply Chain Optimization

P&G has not been immune to significant supply chain challenges, with unexpected events such as hurricanes, canal blockages, and the COVID-19 pandemic causing considerable disruptions. Reportedly, the complexity of handling over 5,000 separate products and 22,000 individual components made it difficult to quickly identify the impact of such events on their products, supplier networks, and plant equipment.

The company has traditionally used varied, isolated data systems for supply chain management, making data integration a tedious and likely expensive task. Data handling necessitated experts from five divisions – manufacturing, supply chain, marketing, quality assurance, and laboratory information systems – consuming hundreds of labor hours per project, according to an article by project partner KNIME.

To address the above challenges, the KNIME report states that P&G collaborated with analytics provider phData to develop an AI-powered solution using KNIME’s open-source analytics platform. Reportedly, phData chose KNIME to fully automate the integration of complex data from the five divisions mentioned previously.

With the data fully integrated and accessible, the next step was enabling real-time forecasting and improved visibility throughout the supply chain.


P&G’s data inputs for the KNIME platform included:

  • Bill of materials data for 5,000 products and 22,000 components
  • Supply chain data, including vendors, manufacturing plants, warehouses, and distribution centers
  • Current inventory data to assess supply and demand risks
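A minimal sketch of what consolidating these inputs into one unified view might look like, here using pandas rather than KNIME's visual workflow. All table and column names are invented for illustration and are not P&G's actual schema:

```python
# Hypothetical sketch: integrating bill-of-materials, supplier, and inventory
# records into a single table. Column names are illustrative only.
import pandas as pd

# Bill of materials: which components each product requires
bom = pd.DataFrame({
    "product_id": ["P1", "P1", "P2"],
    "component_id": ["C1", "C2", "C1"],
    "qty_per_unit": [2, 1, 4],
})

# Supplier data: who supplies each component, and how quickly
suppliers = pd.DataFrame({
    "component_id": ["C1", "C2"],
    "vendor": ["VendorA", "VendorB"],
    "lead_time_days": [14, 30],
})

# Current inventory per component
inventory = pd.DataFrame({
    "component_id": ["C1", "C2"],
    "on_hand": [500, 120],
})

# Consolidate the three sources into one unified view per product-component pair
unified = (bom.merge(suppliers, on="component_id")
              .merge(inventory, on="component_id"))
print(unified)
```

At P&G's scale (5,000 products, 22,000 components), the value lies less in the join itself than in automating it end to end, which is what the KNIME workflow reportedly accomplished.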

Users of the KNIME platform appear to be able to perform several vital functions, each producing its own output:

Data integration

  • Consolidation, standardization, and harmonization of disparate data streams into a single repository. 
  • The output of this function is an integrated data set that provides a unified view of information, presumably through dashboards or reports.

Real-time analysis

  • Automation of data collection and integration, combined with data analytic capabilities, enables the analysis of supply, demand, and inventory data in real time. 
  • The output is a live analytical report that can include current inventory levels, supply chain bottlenecks, and other critical metrics. 

Supply projections

  • Machine learning models for predictive analytics are used, presumably, to ensure product availability, manage inventory, and identify potential risks to supply. 
  • The output from this function is a set of supply projection reports that forecast future inventory requirements and highlight potential supply chain disruptions.

Demand forecasting

  • Machine learning algorithms analyze the combined cross-departmental historical data and identify patterns or trends to produce demand projections. 
  • The output is a demand forecast report that projects future product demand.
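The report does not detail which models P&G uses. As a hedged illustration of the underlying idea, the sketch below fits a simple linear trend to invented monthly demand figures and projects the next three months; a production workflow would use richer models and many more signals:

```python
# Minimal demand-forecasting sketch: fit a linear trend to historical monthly
# demand and project the next three months. All figures are invented.
import numpy as np

history = np.array([100, 104, 109, 115, 118, 124], dtype=float)  # past demand
months = np.arange(len(history))

# Least-squares trend line through the historical points
slope, intercept = np.polyfit(months, history, 1)

future_months = np.arange(len(history), len(history) + 3)
forecast = slope * future_months + intercept
print(forecast.round(1))
```

Even this toy version shows the shape of the output: a forward-looking demand projection that can feed the inventory and supply-risk reports described above.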

Reportedly, the solution enhances the company’s supply chain visibility and increases resilience to disruptions. Specifically, KNIME claims that the enterprise achieved the following business outcomes:

  • Reduction in the number of experts needed for data verification from more than 10 to zero
  • Decrease in response time for supply chain inquiries from over two hours to “immediate” results
  • Consolidation of multiple regional meetings into one global meeting, implying a streamlined decision-making process

Use Case #2: Customer Behavior Analysis for Product Development

On its R&D website, P&G cites the challenge of accurately capturing customer behavior to inform product development. Traditional data collection techniques, such as surveys, discussion panels, and focus groups, rely primarily on self-reported information, which is prone to error.

This “gap” between reported and actual consumer behavior meant P&G was basing product development and innovation on skewed data.

To bridge this gap, P&G’s R&D team leveraged AI to analyze real-time usage data from smart products, such as the Oral-B iO toothbrush. P&G’s smart products are equipped with sensors that collect real-time usage data. This data is often used to create new products or customize product lines to customer preferences. 

For example, the Oral-B iO toothbrush was informed by algorithms revealing that the average brushing time was only 47 seconds, compared to the two minutes reported by users. 
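The 47-second finding is, at its core, an aggregation over timestamped sensor sessions. The sketch below illustrates the kind of computation involved, using invented session data; actual Oral-B iO telemetry and its schema are not public:

```python
# Illustrative sketch: computing actual average brushing time from
# timestamped sensor sessions. Session data here is invented.
from datetime import datetime, timedelta

t0 = datetime(2024, 1, 1, 8, 0, 0)

# Each session: (start, end) timestamps recorded by the brush's sensors
sessions = [
    (t0, t0 + timedelta(seconds=45)),
    (t0 + timedelta(hours=12), t0 + timedelta(hours=12, seconds=50)),
    (t0 + timedelta(days=1), t0 + timedelta(days=1, seconds=46)),
]

durations = [(end - start).total_seconds() for start, end in sessions]
avg_seconds = sum(durations) / len(durations)
print(f"average brushing time: {avg_seconds:.0f} s")  # vs. the 2 min users report
```

Measured behavior like this, rather than self-reported behavior, is what closes the data gap described above.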

Other products that may have data collection, analysis, and sharing capabilities similar to those of the Oral-B iO toothbrush include:

  • Olay Skin Advisor
  • Febreze NEW-AIRIA Smart Scent Diffuser
  • Lumi by Pampers (discontinued)

Data inputs included real-world, real-time sensor data from households using P&G’s smart products. AI and machine learning technologies process these “granular, computer-generated data points” through predictive analytics, data mining, and pattern recognition. 

IoT analytics further analyzed the sensor data to understand customer interactions. The outputs of this processing were new ingredient combinations, real-time feedback for R&D engineers, and consumer behavior insights.

According to an article written by CIO Vittorio Cretella, end users apparently access generated insights and other relevant data through platforms such as Google Cloud’s BigQuery. The article also implies P&G’s use of Google Cloud tools such as Dataflow and Vertex AI for building and deploying machine learning models, suggesting a streamlined interface for R&D teams to interact with the data.

Cretella states that the AI-driven modeling and simulation techniques expedited algorithm development from months to weeks. He claims that this rapid iteration led to tangible business outcomes, including enhanced product development and more timely and accurate consumer insights. He offers no specific, quantitative results, however.
