
Enterprises Will Start To Get More Value Out Of LLMs | 2024 Predictions By Qarik


Guest: Rob Schmit (LinkedIn)
Company: Qarik Group (Twitter)
Show: 2024 Prediction Series

Qarik Group focuses on providing excellent service and deep technical ability in cloud application and digital transformation, enabling companies to deliver more value to their customers. Rob Schmit, Partner at Qarik, gives us his predictions for 2024.

While this year has seen a lot of people consuming OpenAI's models, the coming year will see companies start to branch out and think about the types of LLMs they are using, where they are using them, and how they are running them. People may start to try smaller open models such as Falcon or Llama, and organizations will need to look at how their teams are using these LLMs in order to get real value out of them.
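As a rough illustration of what trying a smaller open model can look like, here is a minimal sketch assuming the Hugging Face transformers library and the tiiuae/falcon-7b-instruct checkpoint; the model choice, prompt, and generation settings are illustrative assumptions, not recommendations from Qarik.

```python
# Minimal sketch: running a smaller open LLM locally with Hugging Face transformers.
# Assumes `pip install transformers torch` and enough GPU/CPU memory for the model.
from transformers import pipeline

# tiiuae/falcon-7b-instruct is one example of a smaller open model; swap in any
# checkpoint your organization has vetted (e.g. a Llama variant you are licensed to use).
generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b-instruct",
    device_map="auto",  # place weights on a GPU if one is available
)

prompt = "Summarize the main risks of rolling out an internal coding assistant."
result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```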

With so many companies exploring LLMs, we will likely see a return of metadata management as a prerequisite for integrating LLMs into enterprise workflows. The data needs to be usable, well categorized, and understood from a metadata perspective before an LLM can make use of it. Organizations may approach this by refocusing on the management of their data libraries to bring them up to a more efficient standard.
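To make the metadata point concrete, here is a minimal, library-free sketch of how well-categorized metadata can gate what an LLM is allowed to see; the catalog layout, field names, and answer_with_llm helper are hypothetical, purely for illustration.

```python
# Minimal sketch: using document metadata to decide what an LLM may see.
# The catalog structure and answer_with_llm() helper are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str
    owner: str           # which team curates this data
    classification: str  # e.g. "public", "internal", "restricted"
    last_reviewed: str   # ISO date of the last metadata review

catalog = [
    Document("pricing-faq", "Our standard pricing tiers are ...", "sales", "public", "2023-11-02"),
    Document("payroll-runbook", "Payroll batch jobs run nightly ...", "finance", "restricted", "2023-08-19"),
]

def documents_for_llm(catalog, allowed_classifications):
    """Only well-categorized, permitted documents are handed to the model."""
    return [d for d in catalog if d.classification in allowed_classifications]

def answer_with_llm(question, documents):
    # Placeholder for whatever LLM call the organization settles on.
    context = "\n\n".join(d.text for d in documents)
    return f"[LLM would answer '{question}' using {len(documents)} approved document(s)]"

docs = documents_for_llm(catalog, allowed_classifications={"public", "internal"})
print(answer_with_llm("What are the pricing tiers?", docs))
```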

Coding assistants will become more integrated into developers' workflows and will help speed up delivery and drive improvements. The virtual pair programmer may help with a code base the developer is unfamiliar with, locating things and explaining how something works. Coding assistants will also help automate work that was previously manual, and this is where Retrieval-Augmented Generation (RAG) comes in, grounding the assistant's answers in the team's own code and documentation. This can lead to huge productivity gains over time.
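For readers unfamiliar with RAG, here is a minimal sketch of the idea applied to a code base: retrieve the most relevant snippets for a question, then build a prompt for the model. The snippet store, the naive keyword scoring, and the call_llm helper are illustrative assumptions; a production assistant would use embeddings and a vector index rather than keyword overlap.

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG) over a code base:
# retrieve the most relevant snippets for a question, then prompt the LLM with them.

code_snippets = {
    "billing/invoice.py": "def compute_invoice(order): ...  # applies tax and discounts",
    "auth/session.py": "def refresh_session(token): ...  # rotates the session token",
}

def retrieve(question, snippets, top_k=2):
    """Rank snippets by naive keyword overlap with the question."""
    words = set(question.lower().split())
    scored = sorted(
        snippets.items(),
        key=lambda item: len(words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question, retrieved):
    context = "\n".join(f"# {path}\n{code}" for path, code in retrieved)
    return f"Using the code below, answer the question.\n\n{context}\n\nQuestion: {question}"

def call_llm(prompt):
    return "[LLM response would go here]"  # placeholder for the assistant's model call

question = "How does invoicing apply discounts?"
print(call_llm(build_prompt(question, retrieve(question, code_snippets))))
```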

Although generative AI has a lot of potential, there are also challenges associated with it. You cannot just slot it into the business; cultural, structural, and business changes are needed to accommodate it. Qarik is helping customers work out how to integrate generative AI and assess which use cases can be pursued safely. They take a holistic view of these technologies and how to unlock value from them.

Qarik's focus evolves with each year; the company originally started as agile and cloud transformation specialists. Its key focus of helping organizations modernize their infrastructure can take a variety of routes, but it stands the company in good stead for the new LLM/generative AI world. In the coming year, Qarik will be unveiling new packages and programs it has built out to help accelerate this journey.

This summary was written by Emily Nicholls.