
Opaque Systems helps enterprises navigate AI data privacy complexities


While generative AI (GenAI) is rapidly being adopted, many companies are struggling to get their AI projects through to production due to data privacy concerns. In this video, Aaron Fulkerson, CEO of Opaque Systems, discusses the risks associated with generative AI and large language models (LLMs), shedding light on the challenges enterprises are facing. He goes on to share some of the technologies Opaque Systems is using to help solve these problems. He says, “Companies are stalling their LLM implementations because they need to have absolute certainty that their data is kept private, secure.”

AI data privacy and LLM risks

  • Opaque Systems enables companies to build confidential data pipelines on which they can develop AI applications. Fulkerson explains what confidential data pipelines are and how they enable trusted AI.
  • Fulkerson discusses the challenges enterprises face in getting AI projects from pilot to production, and why data privacy concerns are among the top reasons projects stall.

AI data privacy and security challenges and techniques for solving them

  • Fulkerson discusses the risks of data leakage, such as through training data and prompt augmentation. He explains how companies’ proprietary information and customer data, including personally identifiable information (PII), can be leaked.
  • Fulkerson talks about a white paper produced by the founders of Opaque Systems that surveys the techniques people can use to secure their AI implementations, particularly GenAI.
  • Although anonymization is the current standard technique, it is not completely effective, and some data is likely leaking. Fulkerson explains the risks of generative AI being used to de-anonymize data.
  • Techniques outlined in the white paper include fully homomorphic encryption (FHE) and hardware enclaves; Fulkerson covers the pros and cons of each.
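
The summary only names these techniques, but the core idea behind homomorphic encryption is that computation can happen on ciphertexts without decrypting them. As an illustration, here is a toy Paillier-style *additively* homomorphic sketch (a textbook scheme with tiny primes for readability — not Opaque Systems’ implementation, and nowhere near secure for real use):

```python
import math
import random

def keygen(p, q):
    """Toy Paillier key generation with g = n + 1 (insecure tiny primes)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # modular inverse of lambda mod n
    return (n, n + 1), (lam, mu)  # public (n, g), private (lambda, mu)

def encrypt(pub, m):
    n, g = pub
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:   # r must be invertible mod n
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n

pub, priv = keygen(17, 19)
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
# Multiplying ciphertexts adds the underlying plaintexts -- the server
# computing this product never sees 20 or 22 in the clear.
total = decrypt(pub, priv, (c1 * c2) % (pub[0] ** 2))
print(total)  # 42
```

Fully homomorphic schemes extend this to arbitrary computation (both addition and multiplication on ciphertexts), which is what makes them attractive for AI pipelines, at a significant performance cost — one of the trade-offs versus hardware enclaves discussed in the white paper.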

Unlocking the business value of AI with data

  • Fulkerson emphasizes the importance of using private and sensitive data in AI projects, citing the potential for lost opportunities if companies cannot utilize their most valuable data.
  • Fulkerson highlights the importance of ensuring your business is not disrupted because another market player can get their AI projects into production and fine-tuned quicker.
  • Fulkerson believes the success of companies over the next three to five years depends on their ability to adopt AI, which he sees as imperative to business operations.
  • Older techniques are insufficient for today’s AI needs, and Fulkerson discusses how the use of anonymized data fails to meet the requirements of regulations like GDPR and CCPA.

How GenAI compares to past tech supercycles

  • Fulkerson talks about how the GenAI movement compares to earlier technologies like Linux and Docker, noting that each tech supercycle starts with a catalyst: the Mosaic browser for the internet, the iPhone for mobile phones.
  • ChatGPT has acted as the catalyst for GenAI, but it’s still early days for this supercycle. Even so, Fulkerson believes it is the most significant technology supercycle in human history.

Technology innovation and the need for guardrails

  • Fulkerson believes that the erosion of trust is the biggest risk to global society as innovative new technologies like VR and AR proliferate, and he talks about the need for guardrails.

Guest: Aaron Fulkerson (LinkedIn)
Company: Opaque Systems (Twitter)
Show: Let’s Talk About AI

This summary was written by Emily Nicholls.