This is a contributed article by Luigi Congedo, and edited by Emerj. Luigi is Venture Capital Principal at San Francisco-based BootstrapLabs, an AI-focused VC firm. To inquire about contributed articles from outside experts, contact firstname.lastname@example.org.
Editor’s note: I first met Luigi over four years ago when I was starting Emerj (then TechEmergence) in San Francisco. There were only so many AI startups, and our paths inevitably crossed – initially through BootstrapLabs’ in-person AI events. Luigi has seen a huge volume of AI firms in the Bay Area and beyond, and as an investor, he’s forced to think more broadly about future trends than most founders are. In this article, Luigi lays out his perspective on four AI trends for the year ahead.
1. Off-the-Shelf or Long-Term Investment
Today’s AI technology owes its existence to public- and private-sector (i.e., Big Tech) initiatives that built its foundations and produced its initial breakthroughs through enormous effort and investment.
Now, the real opportunity lies at the intersection of AI technology and industry. It’s not about machine learning by itself – it’s about applying these algorithmic approaches to business problems. It’s about: AI applied to X.
Two options are open to today’s enterprise firms:
- They can buy generic AI-enabling APIs (from Google, Amazon, etc.), share their data, and lose control of their margins over time.
- They can invest in their own innovation roadmap, build in-house AI/ML talent teams, and work with applied AI vertical solutions. In this case, they own and control their data, their teams can iterate and customize products quickly, and they maintain control of their margins.
Actionable Insight: Off-the-shelf solutions can serve a purpose; there’s no shame in using them for the right point solution, to explore the technology and begin gaining experience solving problems. In the long term, however, smart companies won’t use plug-and-play APIs on their own, but as part of a rich AI ecosystem – including many in-house tools and new data infrastructure – all working to solve problems and unlock bigger capabilities.
Building IP is important, but your differentiation lies in having unique training data. Enterprises will go from understanding to prediction to prescription to autonomous processes – and to unlock that full transition, in-house engineering and proper infrastructure are a must.
2. AI Talent: The Real Global War
Talent is a critical asset for success in today’s extremely competitive economy. As years go by, AI is likely to automate the routine and repetitive aspects of many workflows. But the industry requires more AI/ML talent to build complex expert AI systems, software architectures, and new hardware (AI chips) today. Some of the critical aspects I notice in the industry are the following:
- Contrary to general assumptions, AI research is significantly less open than it appears: only about 15% of papers publish their source code (source).
- Only about 53% of AI projects successfully make it from prototype to full production, according to 2020 research by Gartner.
- AI professors are being hired away by Big Tech companies, creating a critical gap in academia and a consequent shortage of new talent entering the market.
Actionable Insight: Applying AI is not about hiring PhDs; it’s about teams that can work together and get things done – mixing AI expertise with deep subject-matter knowledge. Google and Facebook want to buy teams in their acquisitions, not just individual AI PhDs, and that’s what other enterprises need to understand, too.
Big Tech is generally very good at integrating acquired product teams and letting them keep doing their jobs. Enterprises need to learn how to do AI M&A. Retaining top AI talent is hard, but in an acquisition, acquired talent typically still has equity to vest, so acquirers have time to retain key people – and use their experience to upgrade the existing team while working together on important AI projects.
3. AI & Engineering Strategy
Are AI businesses different from the more traditional software businesses?
For many years, we built software based on code; up until a few years ago, many software companies were based on predefined, rules-based systems. Over time, we went from elaborate, complex software development workflows to the more recent introduction and diffusion of no-code/low-code development.
According to a recent Forrester report, the low-code market is on track for an annual growth rate of 40%, with spending forecast to reach $21.2 billion by 2022. The performance, scalability, interpretability, and reliability of AI models will play a critical role in business growth, speed, and innovation.
Actionable Insight: Large corporations want to work with cutting-edge AI vendors, but they often don’t have the infrastructure in place to use them. Leaders need to think about whether they can even integrate and use AI.
Spinning up an IT or data environment just to use a single AI point solution often means building technical debt. Investing properly in AI maturity allows firms to work well with vendors and integrate solutions into a cohesive IT and data ecosystem, not a series of arbitrary sandboxes.
4. New Data Ecosystems
Both the European Union and the US government are pushing for more policies and regulations to establish new standards and principles for data sharing, privacy, and security. Application programming interfaces (APIs) automate communication between separate applications, creating new opportunities for innovation.
In my opinion, the API is one of the most underestimated elements of the so-called digital transformation, or Fourth Industrial Revolution. Moving data around to serve different parts of a system independently requires abstracting specific, structured data points from complex systems, and APIs created new ways of doing this – some of which were once deemed impossible. APIs themselves are also rapidly evolving, thanks to the advent of cloud computing and the consequent reduction in processing costs. Static, fragmented industry data protocols are being replaced by scalable plug-in solutions.
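The abstraction step described above can be sketched in a few lines of Python. This is a minimal, illustrative example, not a real product's API: the legacy record, its field names, and the `to_api_payload` function are all hypothetical stand-ins for whatever a given enterprise system actually stores.

```python
import json

# Hypothetical record from a legacy internal system. The cryptic field
# names and the audit blob are invented for illustration only.
legacy_order = {
    "ORD_ID": "A-1042",
    "CUST_NM": "Acme Corp",
    "AMT_CENTS": 129900,
    "INTERNAL_AUDIT_BLOB": "opaque internal data the API should not expose",
}

def to_api_payload(record):
    """Abstract only the specific, structured data points an external
    consumer needs, hiding the rest of the complex internal system."""
    return {
        "order_id": record["ORD_ID"],
        "customer": record["CUST_NM"],
        "amount_usd": record["AMT_CENTS"] / 100,
    }

# This JSON body is what a (hypothetical) REST endpoint such as
# GET /orders/A-1042 would return to any consuming application.
body = json.dumps(to_api_payload(legacy_order))
print(body)
```

The key design point is the translation layer: consumers see a stable, clean schema, and the complex system behind it can change independently.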
Transferring data and knowledge to and from different applications will also affect how much data we need to train new models for new services.
This is the idea behind transfer learning: taking an AI algorithm pre-trained on a task for which labeled example data is available (for example, identifying cars in images) and transferring that knowledge to a different application for which there is little data (such as identifying trucks).
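The pattern described above can be sketched with a toy example. This is a deliberately simplified stand-in, not the article's method: the "pre-trained" feature extractor below is a hand-written function playing the role of a network's frozen layers, and only a small linear head is trained on a handful of labeled target examples.

```python
def pretrained_features(x):
    """Stand-in for a frozen, pre-trained feature extractor (in practice,
    the early layers of a network trained on a large labeled task)."""
    return [sum(x) / len(x), max(x) - min(x)]  # mean and range as "features"

def train_head(samples, labels, lr=0.1, epochs=200):
    """Train only a small linear head on the frozen features --
    the essence of transfer learning with limited target data."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            f = pretrained_features(x)
            pred = 1.0 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0.0
            err = y - pred  # perceptron-style update on mistakes only
            w[0] += lr * err * f[0]
            w[1] += lr * err * f[1]
            b += lr * err
    return w, b

def predict(x, w, b):
    f = pretrained_features(x)
    return 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0

# Only a handful of labeled target examples are needed, because the
# feature extractor already encodes knowledge from the source task.
X = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.3], [0.9, 0.8, 1.0], [1.0, 0.9, 0.8]]
y = [0, 0, 1, 1]
w, b = train_head(X, y)
print([predict(x, w, b) for x in X])  # -> [0, 0, 1, 1]
```

The point of the sketch is the division of labor: the expensive, data-hungry part (the feature extractor) is reused, and only the cheap head is fit to the new, data-poor task.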
Actionable Insight: There are challenges with cleaning data, but there are also challenges with moving, storing, and using data – and there’s no one-size-fits-all answer.
Some enterprise leaders presume that once they move their data into the cloud they’re done – but the specific orchestration of data storage and access methods varies greatly depending on the use case. Enterprise firms need to be able to operate across multiple clouds and at the edge; being nimble and open to those options is critical.