Combining LLM Agents to Drive ROI in Business Workflows – with Babak Hodjat at Cognizant


One of the key strategies for future-proofing an organization is recognizing the limits of AI models and filtering out the misconception that AI is a one-solution-fixes-all for businesses. Enterprises should build an AI strategy grounded in concrete use cases and a clear view of how LLMs can change the workforce going forward.

Data-driven strategies, developed over years of strategic planning, lead to more effective and realistic business expectations, which in turn fuel the demand for AI consultants to service those needs. As more enterprises deploy AI across their organizations, the need to customize out-of-the-box large language models (LLMs) grows rapidly.

Emerj CEO Daniel Faggella recently sat down with Babak Hodjat, Chief Technology Officer AI at Cognizant, on the ‘AI in Business’ podcast for a deep dive into the challenges LLMs pose for enterprises and what the future looks like. In the 30-minute interview, they explore the AI strategies that Cognizant and other enterprises are implementing, and how they are experimenting with LLM agents to drive ROI in business workflows.

While legacy thinking may undermine or underestimate straightforward applications built for a specific, narrow use case (e.g., chatbots for customer service), the use of LLMs to enhance everyday tasks and performance is often overlooked until their potential is revealed through use cases applicable to the industry.

So, how can business leaders overcome LLMs’ limitations and apply them in a way that drives real ROI in their workflows? 

The article below explores the following actionable insights for AI consultants and business leaders across industries:

  • Augmenting human workflows with LLMs: Packaging out-of-the-box and bespoke LLMs as knowledge workers that allow customers to more easily navigate various service pipelines.
  • Improving the ‘common sense’ of GenAI-based systems: Refining reliable outputs from end-to-end systems by building multiple iterations of GenAI applications trained on synthetic data from AI use cases based on specific customer segments.
  • How to combine LLM agents to transform business workflows: How “thinking LLM first” improves the building of systems like network design by stringing together agents designing and modifying templates for new workflows.

Listen to the full podcast below:

Guest: Babak Hodjat, Chief Technology Officer AI of Cognizant.

Expertise: Artificial Intelligence, Machine Learning, Technology and Data Sciences.

Brief Recognition: Prior to becoming the Chief Technology Officer AI at Cognizant, Babak gained extensive experience in AI and technology as the Founder, CTO, and acting CEO of Dejima Inc.

Augmenting Human Workflows with LLMs

Babak advises enterprise leaders to clearly identify what the business concerns are and how LLMs will solve those problems before building workflows around them. If adoption teams can do so for systems such as network design, the next logistical requirement is to structure tasks and agents into an automated solution that streamlines workflows and seamlessly speeds up operations.

He believes consultants must follow one of the three methods below to accomplish a successful LLM integration: 

  • Package your offering as a bespoke, out-of-the-box LLM solution specialized for the industry of the client you’re pitching to;
  • Provide examples of use cases showing how this worked for a previous customer, and the ROI the AI yielded;
  • Demonstrate cross-department functionality that streamlines business operations and increases performance output.

As an AI consultant serving clients across industries, you will find that company leadership places much emphasis on previous project successes. One of the many pain points for an AI consultant is, “How do I communicate my value and worth in a manner the client will want to pay for?”

The answer is to focus the AI engagement on business processes that will:

  • (A) grow the business,
  • (B) mitigate and reduce labor costs to help ensure the company’s survival,
  • (C) demonstrate, through before-and-after results from a previous project, how a small case study expanded a company’s LLM capabilities.

Consultants who have cleared this hurdle can proceed to the next step in the balancing act.

Improving the ‘Common Sense’ of GenAI-based Systems

Refining reliable and actionable outputs is a significant step for an organization to undertake. Refinement also builds C-suite confidence in a new and unfamiliar technology that will change the business in the years ahead.

Babak emphasizes organizing outputs by building multiple iterations of GenAI applications, specifically trained on synthetic data, to establish success points for tried-and-tested tasks. These success points, as Babak mentioned, act as testers for the AI integration.

According to Babak, if the testers fire off and perform their tasks accurately on the synthetic data, yielding the desired results without any errors, the system can then be applied to real-time data.
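The gating idea described above can be sketched in a few lines of Python. This is a minimal illustration, not Cognizant's implementation: `classify_ticket` is a hypothetical stand-in for any GenAI task (stubbed here so the example runs without an LLM), and the synthetic cases and function names are all illustrative assumptions.

```python
# Hypothetical GenAI task: route a support ticket to a queue.
# Stubbed with keyword rules so the sketch runs without a model behind it.
def classify_ticket(text: str) -> str:
    if "refund" in text.lower():
        return "billing"
    if "password" in text.lower():
        return "account"
    return "general"

# Synthetic "tester" cases with known expected outputs.
SYNTHETIC_CASES = [
    ("I want a refund for my order", "billing"),
    ("I forgot my password", "account"),
    ("Where is your office?", "general"),
]

def passes_synthetic_gate(task, cases) -> bool:
    """Return True only if the task yields the desired result on every synthetic case."""
    return all(task(text) == expected for text, expected in cases)

if passes_synthetic_gate(classify_ticket, SYNTHETIC_CASES):
    print("Gate passed: safe to point the task at real-time data")
else:
    print("Gate failed: keep iterating on the GenAI application")
```

The design choice mirrors the interview's point: the synthetic suite acts as a go/no-go gate, so a GenAI iteration only touches live data after it clears every tester without error.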

Babak goes on to share some key points on AI implementation and how to build “common sense” into GenAI-based systems:

  • AI has more “common sense” than previously thought and can be used out of the box for many use cases;
  • Babak envisions AI transforming existing workflows in various industries, with tasks automated and AI providing suggestions rather than making decisions on its own;
  • Hodjat shared with Emerj how AI is being used to optimize business processes for clients, with a focus on identifying specific use cases and workflows for each client;
  • Babak expressed how Gen AI is used to identify use cases for clients, interact with users to refine scope, and build end-to-end systems with synthetic data;
  • Babak highlights the importance of common sense data estimates and interactive user feedback in the Gen AI workflow.

Application of an LLM Agent for Growing AI-designed Workflows

Babak and Faggella spoke about the potential of AI agents to streamline research processes by automating tasks such as data collection and analysis. Babak explained how AI agents can scope opportunities, produce synthetic data, and string together ML models, all while considering multiple angles and perspectives.

As an AI consultant, Babak recommends steering the client’s paradigm and approach away from legacy thinking that treats machines and people as two separate entities. Instead, consultants should espouse a view of humans and machine learning working in integration. A foundational structure should be established through an AI roadmap that defines:

  1. Systems and network designs;
  2. How agents are strung together;
  3. Templates to design and modify for new workflows.

Babak mentioned that AI is being used to optimize business processes for clients, with a focus on identifying specific use cases and workflows for each client. He emphasized the need for AI transformation across multiple industries; using AI agents to streamline processes and automate tasks, i.e., leveraging LLMs’ capabilities, is a step towards an “LLM first” approach:

  • Babak stated that leaders should leverage LLMs and GenAI to build more GenAI apps; even the harshest critics acknowledge their powerful natural-language direction and code-writing capabilities;
  • Babak mentioned that generative AI can be used to build systems, such as network design, by stringing together agents and tasks, designing and modifying templates for new network designs;
  • Babak and Faggella mentioned that an “AI-based system” integrating LLMs into workflows can improve efficiency and robustness and contribute to ROI.
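The “stringing together” of agents and templates described in the points above can be sketched as a simple pipeline. This is an illustrative assumption, not a real Cognizant API: each agent here is an ordinary function standing in for an LLM call, and all names (`scope_agent`, `design_agent`, `template_agent`) are hypothetical.

```python
from typing import Callable

# Each agent transforms the previous agent's output; in practice each
# step would be an LLM call rather than a string operation.
Agent = Callable[[str], str]

def scope_agent(request: str) -> str:
    # Scopes the opportunity from the raw business request.
    return f"scope({request})"

def design_agent(scope: str) -> str:
    # Produces a design from the scoped request.
    return f"design({scope})"

def template_agent(design: str) -> str:
    # Fills a reusable workflow template with the design.
    return f"workflow-template[{design}]"

def run_pipeline(agents: list, request: str) -> str:
    """String agents together: each agent's output feeds the next one."""
    result = request
    for agent in agents:
        result = agent(result)
    return result

output = run_pipeline([scope_agent, design_agent, template_agent], "new network design")
print(output)  # workflow-template[design(scope(new network design))]
```

Because the template-filling step is just another agent in the chain, a new workflow (e.g., a new network design) can be produced by swapping or modifying templates rather than rebuilding the whole system.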

These ideas are necessary tools for enterprise AI strategies and explain why “thinking LLM first” is essential for any business that wants to adapt to GenAI in its organization. LLMs enhance the design process by automating and streamlining tasks, improving adaptability and customization, and facilitating natural language interactions.

By integrating GenAI tools like LLMs into network design and other complex systems, organizations can achieve more efficient, scalable, and responsive solutions that are better aligned with evolving needs and contexts.
