
CNCF Projects Are Ready For AI Workloads, Says Jorge Castro At AI.dev Conference


Guest: Jorge Castro
Organization: CNCF
Show: Let’s Talk

AI.dev, the premier event for developers in open-source generative AI and ML, was held in San Jose, California earlier this month.

In this episode of TFiR: Let’s Talk, Jorge Castro of the Cloud Native Computing Foundation’s (CNCF) Developer Relations team shares highlights of that conference, as well as his insights on generative AI from the cloud-native perspective.

AI.dev highlights:

  • Castro felt like he learned 6 months’ worth of technology in 1.5 days, not counting the hallway conversations.
  • Large organizations that are doing AI at scale shared their expertise, including interesting talks from NVIDIA, Amazon, and Hugging Face.
  • The scale at which people are running cloud-native deployments with AI on top has been surprising.

On Kubernetes turning 10:

  • Organizations such as Bloomberg, CERN, OpenAI, NVIDIA, and Hugging Face have built their platforms on top of Kubernetes. That is due to Kubernetes’ extensible nature: it is API-driven and can be extended beyond the core primitives.
  • Kubernetes enhancement proposals over the past year have centered on dynamic resource allocation, batch scheduling, and other features that remove much of the complexity of running high-throughput, low-latency workloads. AI is a natural fit for that.
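The extensibility Castro describes works through custom resources: a platform registers a new API type with the cluster, and Kubernetes serves it alongside the built-in primitives. Below is a minimal sketch of what such a registration manifest looks like, expressed as plain Python data; the `InferenceJob` kind and `ai.example.com` group are hypothetical examples, not from the episode, and applying the manifest to a real cluster would require a client such as kubectl or the official Kubernetes Python package.

```python
def make_crd(group: str, kind: str, plural: str) -> dict:
    """Return a minimal CustomResourceDefinition manifest that would
    register a new API type with a Kubernetes cluster (sketch only;
    the resource names here are hypothetical)."""
    return {
        "apiVersion": "apiextensions.k8s.io/v1",
        "kind": "CustomResourceDefinition",
        # CRD names follow the convention <plural>.<group>
        "metadata": {"name": f"{plural}.{group}"},
        "spec": {
            "group": group,
            "scope": "Namespaced",
            "names": {"kind": kind, "plural": plural},
            "versions": [{
                "name": "v1alpha1",
                "served": True,    # served by the API server
                "storage": True,   # persisted in etcd at this version
                "schema": {"openAPIV3Schema": {
                    "type": "object",
                    "properties": {"spec": {
                        "type": "object",
                        "properties": {
                            # Example fields an AI workload might declare
                            "model": {"type": "string"},
                            "gpus": {"type": "integer"},
                        },
                    }},
                }},
            }],
        },
    }

crd = make_crd("ai.example.com", "InferenceJob", "inferencejobs")
print(crd["metadata"]["name"])  # inferencejobs.ai.example.com
```

Once a CRD like this is applied, the cluster exposes `InferenceJob` objects through the same API machinery as Pods or Deployments, which is the "extend outside the core primitives" pattern the AI platforms above rely on.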

On CNCF and GenAI:

  • People are moving from the mindset of web apps to workloads that look more like AI. According to Clayton Coleman, “Inference is the new web app.”
  • AI is pushing cloud native in this manner due to the extensibility of the API. For CNCF, this means trying to grab those common patterns and put them upstream so that end users can get the economies of scale.
  • There are 174 projects in the CNCF landscape and all of them are starting to find new niches and new uses for what AI users are looking for.
  • Many of these projects are being used in production. Kubeflow just had a release in November and is one of the earliest AI projects on top of cloud native.
  • People are bringing an entirely new outlook on how to solve problems. The process involves both end users and producers at every level, from the cluster hardware all the way up to the people consuming the projects.
  • It’s a tremendous opportunity to bring a new, diverse set of skill sets into open source: people who may know little about cloud native but have deep knowledge of AI.
  • It brings in all sorts of different foundations and software projects. Suddenly, PyTorch, Apache, CNCF, and the rest of the Linux Foundation projects have a common thread of AI that ties them all together.
  • Because of the scale AI demands, success requires much more cross-collaboration, with people wearing multiple hats across multiple organizations trying to tie it all together.
  • It comes down to human organization more than the technology itself, and that is the generational challenge. Scaling that across multiple organizations and multiple tech stacks in a sustainable way is the next 10-year goal.

On Castro’s Developer Relations role:

  • AI, in general, is just hitting production in so many places, which is why healthy community processes are important for keeping these projects sustainable from a contribution perspective. The more production users you get, the more issues you’re going to find.
  • Castro’s job is to make sure contributors find places where they can express themselves, contribute, learn, and get what they want out of open source, while also providing value to the end-user companies that want to consume the projects.

Advice for organizations dealing with Kubernetes, cloud-native, and GenAI complexities:

  • Being involved doesn’t mean you have to figure it all out by yourself; simply being in the room to keep track of how things are going is enough.
  • Be smart about where you put your time and where you want to contribute.
  • Find out what’s most important to you and get involved. Learn by reading the release notes of software that you might be interested in.
  • Attend a conference to get up to speed with people who have been in your shoes before. The only way to sustain open source is by ensuring that everybody else is learning from each other.
  • The health of the entire industry and how people consume this technology is part of the open-source model.
  • CNCF has two batch working groups, one at the Kubernetes level and one at the CNCF level, where an AI topic is being discussed every day.

This summary was written by Camille Gregory.