
AWS Powers Up Ecosystem At re:Invent | Rob Hirschfeld


Guest: Rob Hirschfeld (LinkedIn)
Company: RackN (Twitter)
Show: Let’s Talk

With this year’s AWS re:Invent kicking off, it is no surprise that the craze around AI continues to be a key theme. Yet with so much progress happening in the world of AI, it can be difficult to see the forest for the trees. Many companies are working to incorporate AI into their applications, but turning these ideas into real enterprise value can be tricky.

In this episode of TFiR: Let’s Talk, Rob Hirschfeld, Co-Founder and CEO of RackN, talks about this year’s AWS re:Invent and some of the key discussions and themes he saw come out of the event. He talks about the event from RackN’s perspective and what they are seeing in terms of helping organizations accelerate how they consume infrastructure. He goes on to discuss some of the key trends in AI and the challenges many organizations are facing.

Key highlights from this video interview:

  • Hirschfeld discusses this year’s AWS re:Invent, saying he felt the show was even bigger than the 2019 event from a reseller and partner perspective. He talks about the shift from people just talking about Amazon to the conversation being more about the partnerships and the ecosystem.
  • While it is clear AI is disrupting many industries, Hirschfeld feels that nobody is entirely clear how. He talks about Amazon putting AI into many of its products and how he believes this is creating a lot of white noise.
  • Hirschfeld talks about the AWS re:Invent event from RackN’s perspective: they see cloud-native technologies and techniques as a core capability for organizations accelerating how they consume infrastructure. He talks about some of the work Amazon has been doing with hybrid cloud and how compute and training factor in.
  • Many companies are building partnerships to gain access to foundational AI models. Hirschfeld tells us he had many conversations about this approach, saying it is not clear whether companies need to build their own models at this point. He talks about some of the concerns organizations have about building their own models and the potential challenges of running those types of workloads on cloud infrastructure.
  • Hirschfeld discusses some of the hardware strategies coming out of Amazon, saying he feels they are doing a good job with their ARM-based infrastructure. He talks about the cloud providers’ push for alternatives to NVIDIA GPUs and for making inference technology accessible and available.
  • In light of OpenAI’s new AI model called Q* (pronounced Q Star), Hirschfeld gives us his thoughts on what it could mean from the wider perspective of people figuring out how to use AI/ML. He believes that even though a lot of progress is being made, turning those ideas into real enterprise value and integrating them into applications is not so straightforward.
  • When it comes to normalizing how people consume cloud infrastructure, such as cost patterns, controls, compliance, and governance, there are still major challenges for many users, and AWS is working to improve that experience. Hirschfeld expects to see cloud providers and vendors build in governance and compliance pieces next year to help address these challenges.

This summary was written by Emily Nicholls.