AI Knowledge Retention in the Enterprise – Making the Most of Lessons Learned

Daniel Faggella

Daniel Faggella is Head of Research at Emerj. Called upon by the United Nations, World Bank, INTERPOL, and leading enterprises, Daniel is a globally sought-after expert on the competitive strategy implications of AI for business and government leaders.


Novice AI project leaders measure projects entirely by (unrealistic) near-term financial benchmarks.

Relatively experienced AI project leaders understand that AI initiatives must be measured by well-considered, measurable benchmarks, along with relevant progress toward the organization’s larger strategic goals.

Many expert enterprise AI project leaders would argue that deploying initial AI projects is more about the ROI of learning than it is about the ROI of any first AI application itself.

We might think about a variation of the “teach a man to fish” proverb:

Implement a successful AI initiative in an enterprise, and they’ll derive near-term value.

Level up the AI fluency and understanding of an enterprise team, and they’ll have the ability to discover and deliver AI projects well into the future.

Firms that retain AI knowledge can spin the flywheel of innovation faster than their competition. Digitally native tech firms like Google or Amazon aren’t just more nimble at deploying – and seeing the value from – AI because they have more data scientists and a better data infrastructure (though these factors certainly help).

They’re able to nimbly deploy AI because their teams have a lot of rich experience iterating with AI-related projects, and both the technical and non-technical team members are able to contribute in terms of finding AI opportunities and bringing those projects to life.

In legacy enterprise firms, this AI-related learning won’t happen unless a cross-functional AI team strives for a measurable ROI. Learning for its own sake is an investment few firms will make, but “learning on the job” as part of an existing initiative is an investment most firms can justify. Even if that application produces no financial return, a team can be better off than it was before the project started, and more able to make the most of future projects.

This “learning ROI” (which we refer to as Capability ROI) results from improving an enterprise’s AI maturity, which breaks down into three broad categories: Skills, Culture, and Resources. Refer to our Critical Capabilities model of AI maturity below:

Prerequisites to Successful Enterprise AI Adoption
Source: Emerj Plus Best Practice Frameworks

From the outset of any project, a cross-functional AI team should have:

  1. A strong grasp of the problem to be solved or opportunity to be exploited
  2. A reasonable understanding of how success will be measured, and what kind of tangible return would make the project worthwhile (see our full report on Generating AI ROI)
  3. An outline and initial plan for how to approach the project, phase-by-phase
  4. A process for recording lessons learned from this particular AI project

Each company manages knowledge differently, and the particular system or repository isn’t as important as the habit of tracking and maintaining a categorized list of insights.

The person in charge of retaining these lessons learned will sometimes be the project manager themselves, or a subject-matter expert assigned specifically to the task of tracking lessons learned.

Categories of Retained Learning

Retained insights will generally be categorized according to the data science life cycle step or the phase of AI deployment that they pertain to. Alternatively (or additionally), insights can be sorted or categorized along the lines of the AI maturity element that they relate to.
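As a minimal sketch of how such a categorized record might be kept (the field names and values below are illustrative assumptions, not a prescribed schema), each retained lesson could be captured with its deployment phase, lifecycle step, and maturity element attached:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class RetainedLesson:
        """One retained insight from an AI project (illustrative fields only)."""
        project: str          # e.g. "Fraud detection PoC"
        phase: str            # phase of AI deployment: PoC, Incubation, or Deployment
        lifecycle_step: str   # data science lifecycle step, e.g. "Data Understanding"
        capability: str       # AI maturity element: Skills, Culture, or Resources
        insight: str          # the lesson itself, stated plainly
        action: str           # the reusable checklist, template, or process it produced
        owner: str            # who keeps this lesson current
        recorded_on: date = field(default_factory=date.today)

    # Hypothetical example entry
    lesson = RetainedLesson(
        project="eCommerce recommendation PoC",
        phase="PoC",
        lifecycle_step="Data Preparation",
        capability="Resources",
        insight="Transaction data needed de-duplication before modeling.",
        action="Added a de-duplication step to the standard data prep template.",
        owner="Project manager",
    )

Keeping each record this small and structured makes it easy to filter lessons by phase or maturity element when a new project kicks off.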

3 phases of AI deployment
Source: AI Deployment Roadmap

In the sections below, we’ll explore a number of examples of the kinds of insights that a cross-functional AI team would want to retain over the course of iterating on and implementing AI projects.

The representative examples provided below are intended to serve as jumping-off points for readers.

Phases of AI Deployment

  • PoC – An AI team may discover that, in order to run a PoC successfully, they need to handle data access, legal, and compliance issues upfront, rather than waiting to ask those questions heading into the Incubation phase, when insurmountable issues may surface after months of work on a seemingly viable PoC. The team may devise a specific legal and compliance checklist for all PoC projects, which can be used to screen future vendors and “sandbox” projects of any kind, preventing future waste (a minimal checklist sketch appears after this list).
  • Incubation – An AI team in the Incubation phase may discover that an Incubation project cannot be fully deployed unless a maintenance team is established ahead of time. The team may learn that employees are reluctant to take on maintenance of an AI application (say, a fraud detection system within an eCommerce company) if that isn’t the job they signed up for with the company. Hence, future Incubation periods may involve determining the recruiting needs for the maintenance team, along with a robust process for recruiting subject-matter experts from within the company who flag their interest in supporting the application, well before it can actually be deployed. A process or checklist might be developed for this, to be tweaked and reused across future AI projects.
  • Deployment – An AI team may learn that the deployment of some AI systems requires new types of interoperability with existing technology systems. A machine learning integration map might be created to determine the required integrations and potential risks and challenges of different kinds of deployments, allowing teams to better establish IT and data infrastructures that could sustain a live AI deployment.
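The legal and compliance checklist described in the PoC item above could start as something very simple. The sketch below is a hypothetical example, not a prescribed process; the checklist items themselves are assumptions for illustration:

    # Hypothetical pre-PoC screening checklist, kept as plain data so it can be
    # reused to screen future vendors and "sandbox" projects of any kind.
    POC_COMPLIANCE_CHECKLIST = [
        "Documented access rights exist for every dataset the PoC needs",
        "Legal has reviewed the vendor's data-processing agreement",
        "Personally identifiable information is excluded or anonymized",
        "The PoC stays within regulatory constraints for our industry",
        "A named owner will give compliance sign-off before Incubation",
    ]

    def screen_poc(answers):
        """Return unresolved checklist items; an empty list means the PoC can proceed."""
        return [item for item in POC_COMPLIANCE_CHECKLIST if not answers.get(item, False)]

    # Example: one unresolved item blocks the PoC until it is addressed.
    answers = {item: True for item in POC_COMPLIANCE_CHECKLIST[:-1]}
    print(screen_poc(answers))

A checklist like this costs almost nothing to maintain, but it front-loads the questions that would otherwise surface months into Incubation.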

Steps of the Data Science Lifecycle

  • Step 2 – Data Understanding – An AI team that works continuously with its most important data may discover important patterns or errors within that data. For example, a bank may discover key patterns in its bank transfer data, or common errors that cause some data to be stored differently or incorrectly. Recording these common issues, or repairing them at their source, would help future AI teams who will work with the same data later on (a small data-profiling sketch appears after this list).
  • Step 4 – Data Preparation – An eCommerce company may discover critical insights about how to prepare its transaction and behavior data for recommendation-related AI applications. This data preparation routine could be recorded and documented for future teams and could be used as a template for helping to improve analytics systems in the future.
  • Step 6 – Evaluation – An AI team may determine that the Evaluation phase must include agreement on the project’s next steps from data scientists, subject-matter experts, and executive leadership. Developing a set of Evaluation meeting templates could help the company put the right Evaluation “checkpoints” into its data science lifecycle iterations. This ensures that successes and failures are agreed upon, and that the project continues on the best possible path forward.
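To make the Data Understanding example above concrete, the recurring issues a team finds could be captured by a small profiling routine like the sketch below (the column names, expected currencies, and sample data are assumptions for illustration):

    import pandas as pd

    def profile_transfers(df):
        """Collect recurring data issues so they can be logged as retained lessons."""
        return {
            "missing_amounts": int(df["amount"].isna().sum()),
            "negative_amounts": int((df["amount"] < 0).sum()),
            "unexpected_currencies": sorted(set(df["currency"]) - {"USD", "EUR", "GBP"}),
            "duplicate_rows": int(df.duplicated().sum()),
        }

    # Hypothetical bank-transfer data exhibiting a few of the issues described above.
    transfers = pd.DataFrame({
        "amount": [120.0, None, -15.0, 120.0],
        "currency": ["USD", "EUR", "JPY", "USD"],
    })
    print(profile_transfers(transfers))

Running the same profile before each new project turns one team’s data discoveries into a reusable check for everyone who touches that data later.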

Critical Capabilities

  • Contextual AI Understanding for Business Leaders – A company may learn over time that, without a baseline of AI understanding and use-case knowledge, functional business leaders are unable to properly handle conversations with AI vendors, or to allocate resources (with realistic expectations) for AI projects. For this reason, the firm may develop an internal AI curriculum for any functional business leader (a Head of Compliance, a VP of Marketing, etc.) who will be involved with AI initiatives or with vetting AI vendors.
  • A Culture That Values Data – A company may discover, after a number of AI projects, that the data assets in some departments or silos are more uniform, more reliable, and better kept than the data in other departments or silos. Diagnosing those differences and building a set of protocols for handling data and ensuring data quality could help make future AI projects easier, and could help shift the corporate culture for the better.
  • Data Science Skills and Talent – After even just a few AI initiatives, a company will get a sense of the data science talent mix that is required for different kinds of projects (data science lead, ML engineers, data engineers, etc). By developing a set of norms and standards for AI teams, a company can better estimate its hiring and recruiting needs, and onboard the right kind of staff to contribute meaningfully to existing projects.

Above are a number of hypothetical retained lessons from AI deployments, but they should serve to showcase the kinds of specific frameworks and best practices that can be developed by a team that keeps its eyes open for areas of improvement and for specific insights about AI adoption within its business.

To learn any of the lessons above, a company may first have to make a painful mistake. Many lessons will be learned “the hard way”, and in the nascent world of enterprise AI, it will be hard to avoid all such mistakes.

The ROI of initial AI projects lies, in large part, in a company and team’s ability to better leverage AI into the future. The company with the best Critical Capabilities is more likely to win in the marketplace than a company with one high-ROI AI application but no core foundational skills and resources to support it.

Deliberately turning AI deployments into learning experiences is an advantage that most companies will neglect.

Those who bolster their core skills and retain the lessons of applying AI to their specific data assets and their specific business processes will be setting themselves up to win in the years ahead. This flywheel of learning will ultimately spin the flywheel of innovation and ROI better than any one successful AI project ever will.
