Open Platform for Enterprise AI (OPEA) aims to foster collaboration in Enterprise AI

LF AI & Data Foundation recently announced the Open Platform for Enterprise AI (OPEA), a project aimed at fostering collaboration among ecosystem players to set an open standard for developing next-generation AI solutions. In this video recorded at the Open Source Summit North America 2024 in Seattle, Arun Gupta, Vice President and General Manager for Open Ecosystem at Intel, talks about the platform and how it aims to democratize generative AI (GenAI) and reduce the complexity enterprises face when deploying GenAI solutions. He says, “We believe an open ecosystem creates a level playground that allows multiple players, partners, and developers to collaborate and compete together, because it sets up an open standard.”

The launch of OPEA and its open standard

  • Gupta discusses the launch of OPEA, which aims to bring together ecosystem players for collaboration and competition on next-generation AI solutions.
  • Gupta explains that this will help establish an open standard that lets participants determine what the components mean for them and what actual implementations would look like.
  • The conceptual framework or specification defines what it means to have an open platform for enterprise AI along with reference examples.

Open source AI, ethics, and geopolitics

  • Gupta talks about the challenges in open source GenAI, including transparency and ethical considerations such as bias and attribution.
  • Gupta believes open source LLMs are more trustworthy and deployable than closed source ones. He mentions initiatives like the Open Source Initiative (OSI) and its work toward defining open source AI standards.
  • Open source has the potential to build models that cater to a more diverse, global audience. He discusses the advantages of open source in fostering collaboration and addressing ethical concerns in AI development.

Democratizing AI through open source projects, including OPEA, and collaborating with industry partners

  • Gupta discusses the significant compute resources needed to create and fine-tune LLMs. He explains retrieval-augmented generation (RAG) pipelines and the importance of defining a standard architecture for them to help democratize GenAI solutions.
  • OPEA aims to define a set of microservices to address the complexity of deploying GenAI solutions for enterprises. Gupta explains how these microservices will be deployed on a cloud-native platform.
  • OPEA is part of the LF AI & Data Foundation. Gupta emphasizes the importance of neutral governance under LF AI & Data.
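To illustrate the idea of a RAG pipeline composed from interchangeable stages, here is a minimal, self-contained sketch. It is purely illustrative and not part of OPEA: the function names, the word-overlap "retrieval," and the echo-style "generation" are all stand-ins for the embedding services, vector databases, and LLM inference microservices a real deployment would wire together.

```python
# Hypothetical sketch of a RAG (retrieval-augmented generation) pipeline,
# showing the kind of composable stages a standard architecture separates.
# Each function stands in for what would be its own microservice.

def embed(text: str) -> set[str]:
    # Toy "embedding": a bag of lowercase words. Real pipelines call an
    # embedding model served behind an API.
    return set(text.lower().split())

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    # Rank documents by word overlap with the query, a stand-in for a
    # vector-database similarity search.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: len(q & embed(d)), reverse=True)
    return ranked[:top_k]

def generate(query: str, context: list[str]) -> str:
    # Stand-in for an LLM inference service: return a grounded answer string.
    return f"Answer to '{query}' using context: {'; '.join(context)}"

def rag_pipeline(query: str, documents: list[str]) -> str:
    # The stages compose like microservices: retrieve, then generate.
    return generate(query, retrieve(query, documents))

docs = [
    "OPEA defines microservices for enterprise GenAI deployment.",
    "Kubernetes schedules containerized workloads.",
]
print(rag_pipeline("What does OPEA define?", docs))
```

Because each stage sits behind its own interface, any one of them (the retriever, the vector store, the model) can be swapped out without rewriting the rest, which is the kind of flexibility a standard pipeline architecture is meant to enable.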

Open source AI and cloud native AI, with a focus on OPEA and its potential to enable AI workloads

  • Gupta shares OPEA’s current focus areas, stressing that the project will concentrate on composing existing components into reference flows and blueprints for deployment.
  • The Technical Steering Committee (TSC) will be set up within the next few weeks.
  • Gupta discusses the work underway in the cloud-native space to enable AI workloads, sharing examples of AI developers leveraging Kubernetes for deployment.

Guest: Arun Gupta (LinkedIn)
Company: Intel (Twitter)
Show: Let’s Talk

This summary was written by Emily Nicholls.