
D2iQ Partners With Kong To Accelerate Development Of Cloud-Native Applications

Guest: Tobi Knaup (LinkedIn, Twitter)
Company: D2iQ (Twitter)
Show: Let’s Talk
Keywords: Cloud-Native Apps

D2iQ, the provider of enterprise-grade cloud platforms, has partnered with Kong, the cloud connectivity company, to help customers build, deploy, and scale cloud-native applications through Day-2 operations. The joint solution will aid customers as they transition from monolithic to service-based applications across hybrid and multi-cloud environments.

By combining Kong’s service connectivity platform with the D2iQ Kubernetes Platform (DKP), customers will be able to develop cloud-native applications easily, without the complexity of manual cluster and API management. The companies have had previous joint customer engagements; however, they believe a formalized partnership will make it easier for customers to consume both offerings together.

D2iQ believes the partnership will save developers and users a lot of time. Instead of having to assemble different components and do a lot of testing themselves, customers get D2iQ’s certified integration with Kong, which is easy to install, so developers and customers can get to production much faster. The partnership also gives customers another option for connectivity and the ability to leverage an alternative to Istio-based service meshes.
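
As a rough illustration of what that alternative looks like in practice, the sketch below uses the Kubernetes Python client to create a Kuma mesh with mutual TLS enabled, the kind of "encrypt traffic between services from a central control point" capability Knaup describes later in the interview. It is not an official D2iQ or Kong example: the mesh name is arbitrary, and the Mesh resource shape is an assumption based on Kuma's documented mTLS configuration, so the exact schema should be verified against the Kuma version in use.

# Illustrative sketch only, not an official D2iQ or Kong example.
# Assumes a DKP (or any Kubernetes) cluster with Kuma already installed,
# the "kubernetes" Python package, and a working kubeconfig.
from kubernetes import client, config


def create_mesh_with_mtls(mesh_name: str = "demo") -> None:
    """Create a Kuma mesh with a builtin CA and mutual TLS turned on."""
    config.load_kube_config()  # use the current kubeconfig context
    api = client.CustomObjectsApi()

    # Kuma's Mesh is a cluster-scoped custom resource (kuma.io/v1alpha1).
    # The mtls block mirrors Kuma's documented mTLS example; field names
    # and nesting are assumptions and may differ between Kuma releases.
    mesh = {
        "apiVersion": "kuma.io/v1alpha1",
        "kind": "Mesh",
        "metadata": {"name": mesh_name},
        "spec": {
            "mtls": {
                "enabledBackend": "ca-1",
                "backends": [{"name": "ca-1", "type": "builtin"}],
            }
        },
    }

    api.create_cluster_custom_object(
        group="kuma.io",
        version="v1alpha1",
        plural="meshes",
        body=mesh,
    )


if __name__ == "__main__":
    create_mesh_with_mtls()

Once a mesh like this exists, workloads that Kuma injects with its Envoy-based sidecar communicate over encrypted connections without application code changes.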

The two companies are working together to accelerate the development of cloud-native applications as they evolve towards smart cloud-native applications, with organizations building AI capabilities into products that run on cloud-native architecture. Organizations that have already transitioned from stateless to stateful workloads can now collect data and build applications around the data they are collecting. Building AI capabilities into those applications will let them gain insights from that data.

The cloud-native stack is becoming more complex, with many different components needed to build a production-grade stack. Adding an AI pipeline with notebooks, training, hyperparameter tuning, and deployment introduces even more capabilities and more complexity, which Knaup argues AI itself can help tame. D2iQ is also using AI to make its own products better, leveraging it to help operators make better decisions, run the platform more efficiently, and anticipate problems.

About Tobi Knaup: A cloud-native pioneer and evangelist, Tobi Knaup serves as the CEO of D2iQ. Previously, Tobi served as D2iQ’s Chief Technology Officer. As the primary author of the world’s first open-source container orchestrator (Marathon) and co-creator of the KUDO toolkit for building Kubernetes Operators, Tobi has the unique ability to understand an organization’s cloud-native journey at every level: business, technology, and talent. And as the driver behind D2iQ’s next-generation Kubernetes platform, Tobi helps make it possible for organizations to navigate the cost- and time-intensive challenges associated with enterprise-grade container orchestration.

Before co-founding D2iQ, Tobi was one of the first engineers and technology lead at Airbnb, proving the technology’s value at scale in a production environment serving millions of users.

A German native, Tobi holds a Bachelor of Science and a Master of Science from the Technical University of Munich.

About D2iQ: D2iQ is the leading provider of enterprise-grade cloud platforms that enable organizations to embrace open source and cloud-native innovations while delivering smarter Day-2 operations. With unmatched experience driving some of the world’s largest cloud deployments, D2iQ empowers organizations to better navigate and accelerate cloud-native journeys with enterprise-grade technologies, training, professional services, and support.

The summary of the show is written by Emily Nicholls.


Here is the full unedited transcript of the show:

  • Swapnil Bhartiya: Hi, this is your host Swapnil Bhartiya of our newsroom. And today we have with us, once again, Tobi Knaup, CEO and co-founder of D2iQ. Tobi, it’s great to have you back on the show.

Tobi Knaup: Thank you for having me back.

  • Swapnil Bhartiya: Yeah, and today’s topic is going to be your partnership with Kong to kind of simplify deployment of smart cloud native applications. Before I talk about this partnership, can you explain what you folks mean by smart cloud native apps?

Tobi Knaup: Smart cloud native apps is really our view of how cloud native applications are going to evolve in the next few years. And we call it smart cloud native apps because we see this as a shift similar to when we went from mobile phones that couldn’t do a whole lot to the smartphones that we use today, which have all kinds of exciting capabilities built in. And if you look at a modern smartphone, whether you’re using Apple or Android, it has a lot of AI capabilities built in. A lot of the key features that make these phones so strong, like the personal assistants such as Siri, search, even the photos app that organizes your photos intelligently, that’s all driven by AI. And so what we’re seeing is that a lot of organizations that are building cloud native apps are starting to evolve towards building these AI capabilities into the products they’re running on cloud native infrastructure, to be at the head of the pack, to be the leaders in their industry.

We’ve seen this evolution where, when cloud native was new, when Kubernetes first came out and containers became popular, organizations mainly built stateless applications, because frankly that’s all the platforms could do at that point, right? And then over time we added stateful capabilities, right? The Container Storage Interface was only introduced in 2017. So now people were building stateful applications, were starting to collect data and build applications around the data they’re collecting. Well, now the next step that a lot of these guys are taking is they want to gain insights into their data. They want to build products around that. And so that’s why they need to build AI capabilities into their apps. That’s why we call them smart cloud native apps.

  • Swapnil Bhartiya: When you talk about smartness, is it about their own capabilities, what they deliver to their users? Or does smartness also mean making those apps more resilient, more reliable, more secure?

Tobi Knaup: It’s a great question. So I really think that AI and ML is a general-purpose tool that will be built into pretty much all the products we use every day. A few examples: we talked about smartphones, and everybody knows about self-driving cars, the capabilities that folks like Tesla and others are building into their cars. But we work with a medical device manufacturer, for example, and they’re building AI into their MRI scanners and other medical equipment, right? So it is going to be built into those products that are running on top of cloud native platforms. But we ourselves are also using it to make our products better. So we’ll see products like ours, like our Kubernetes platform, also leverage AI to help operators make better decisions, to run the platform more efficiently, and to anticipate problems before they become real problems.

  • Swapnil Bhartiya: I often have this discussion when I’m talking to AI/ML folks: just the way a few years ago we were like, “Hey, every company should be a software company”, and then every company should have a cloud strategy [inaudible 00:03:46] multicloud, I think soon AI/ML will also become an integral part. It will not be a separate thing. It will just be part of how we write, deliver, and deploy applications, because without that we cannot see a way forward. So do you think that AI/ML will be central, just like a lot of other technologies that we rely on?

Tobi Knaup: I really do think it will be built into everything; we’re already seeing it happen, right? Oftentimes we don’t even notice it, right? Because it’s really just built in and it improves those products that we use. Why it makes a lot of sense in the cloud native world to leverage AI for operations is simply because the cloud native stack is becoming more and more complex, right? A lot of different components. We always talk about Kubernetes, which is of course the predominant technology in the space, but the reality is there are over a dozen other open source pieces you need to put together to build a production-grade stack. Now, when you’re adding AI capabilities, an AI pipeline with notebooks and training and hyperparameter tuning and deployment, you’re adding a ton more capabilities. So it’s a complex system. And I think AI is a great way to tame that complexity and to sort of be a sidekick to a human operator that needs to run these complex systems.

  • Swapnil Bhartiya: Let’s change gears and talk about this partnership. First of all, tell us what kind of engagement you folks already had with [00:05:13]. Just reflect on your existing partnership. If it is a new partnership, then we can talk about that as well, but let’s understand what kind of chemistry there is between these two companies.

Tobi Knaup: We’re very excited about the partnership with Kong because, first of all, their philosophy and our philosophy around how we build products and how we bring them to the enterprise is just a fantastic match. We’re both based on the leading open source technologies from the cloud native ecosystem, and we sort of take those and build them into a shrink-wrapped, integrated, easy-to-deploy, and fully supported enterprise offering. So what Kong offers is a connectivity platform for microservice architectures based on the popular Envoy open source project, right? As a sidecar for connectivity, and the Kuma service mesh that Kong built. And before announcing this partnership, we’ve had joint customer engagements. We focus on larger enterprise deployments, and that’s where you see service meshes show up quite often.

So we had those existing customers, and so this partnership really formalizes what we’ve already been doing and just makes it very easy for our customers to consume both offerings together, right? So Kong’s connectivity platform is certified on our platform. It’s really easy to install. They’ve built a Kubernetes operator. So just focusing on that ease of use again, I touched on the complexity of the cloud native stack. And so what this partnership is about, too, is to just tame that complexity and make it really easy for our customers to deploy microservice architectures.

  • Swapnil Bhartiya: How is Kong going to support your mission of providing data-ready Kubernetes in an open, hybrid, and multi-cloud environment? Maybe we could even talk about edge as well, if that is also on the radar.

Tobi Knaup: Yeah, absolutely. So a lot of our customers have advanced Kubernetes use cases. You just mentioned a few, like multi-cloud and multi-cluster and hybrid cloud, but even if you’re just running a fairly complex architecture that consists of dozens of microservices on a single cluster, you really get a lot of benefits from a connectivity platform like Kong’s, right? It becomes this sort of fabric that ties all these services together, that enables you to do things like easily encrypting traffic between your services from a central control point, doing things like tracing, setting policies on how services can talk to each other, and giving you observability capabilities. We talk a lot about Day 2 operations, right? It’s even in our company name. So that observability capability is a really critical one toward successfully running a resilient production offering.

We talk about our DKP platform as a complete and open platform. So that means, when a customer uses DKP, they have all the pieces built in. They can get up and running in production quickly. But it’s also an open platform, because a lot of the value in this cloud native ecosystem is really in the fact that there are so many great technologies to choose from out there. And so this partnership really allows our customers to leverage an alternative to Istio-based service meshes, right? And it’s based on the Envoy proxy and Kuma, so we think it’s a really strong offering, and we’re happy to bring that solution to our customers as another option for connectivity.

  • Swapnil Bhartiya: Okay. One thing that I’m really interested in: whenever these partnerships are happening, what is the actual, direct impact on developers’ lives or customers’ lives? Can you say, “Hey, you know what, this is the difference they will feel because of this partnership”?

Tobi Knaup: So I think the impact on developers or our users is that this integration and partnership will really save people a lot of time. It’ll shorten what we always talk about as the time to value, right? That point where you’re up and running in production, you have your applications on the platform, and as a business you’re getting value from it, right? So instead of having to assemble a whole lot of different pieces and doing a lot of testing themselves, what we’ve done together with Kong is we’ve certified that integration and we’ve made it really easy to install. So it just gets developers and our customers to production so much faster.

  • Swapnil Bhartiya: Are there any additional functionalities, services, or things that users will get because of this partnership, in a nutshell? What I’m trying to ask is: is it just an improved D2iQ experience, or will they get something additional because of this partnership?

Tobi Knaup: Yeah, I would say a connectivity platform like Kong’s is just such a critical component if you’re setting up a microservice architecture. And so the additional piece that developers and customers are really getting here is not just the basic service connectivity, making services talk to each other, but also all the other capabilities that are important on day two that, frankly, people, when they’re starting down this path, don’t always see right away. You want to make sure that your services can talk to each other in a secure way, right? So the Kong platform offers that through encryption and through policies. So there’s a whole lot of day-two capabilities around that, including observability, that this joint offering brings to our customers.

  • Swapnil Bhartiya: Is this partnership just to enhance, once again, D2iQ’s offerings? Or was there some gap that you saw, that you wanted to fill and bridge, where you’re like, “Hey, you know what, this is also bridging that gap, a problem that our customers are facing”?

Tobi Knaup: Yeah. We are a very customer-focused company. Being customer driven is one of our core values, actually. And so how this partnership came to be was simply because our customers asked us to do this, right? We had joint customers with Kong; they were running the connectivity platform together with our product. And so that’s really what drove this. And what drives a lot of our partnerships is just looking at what our customers want and how we can give them access to this whole cloud native ecosystem. Our platform has connectivity built in. It’s a complete and open platform, so it works out of the box. But for these advanced use cases, that’s where we’re bringing this partnership to the table.

  • Swapnil Bhartiya: Tobi, thank you so much for taking time out today, and of course for talking about smart cloud apps; thanks for defining that for me. And also, of course, for talking about the partnership that you have with Kong and how it’s going to help your customers and users. As you said, the focus is always on customers, so thanks for sharing those insights as well. And as usual, I would love to have you back on the show. Thank you.

Tobi Knaup: Absolutely. It’s always a pleasure. Thanks again for having me on.