Buoyant is adding fully managed Linkerd capabilities to Buoyant Cloud, so that developers can treat Linkerd as a managed service even if it is running on their own cluster. Buoyant Cloud can now automate Linkerd upgrades, rollbacks, installations, and trust anchor rotation. The new capabilities aim to make Linkerd easier to use and manage.
“Linkerd is known for its simplicity. In the very crowded market of service meshes, it stands unique because it is focused on reducing what we call operational complexity,” says William Morgan, CEO of Buoyant and co-creator of Linkerd, on this episode of TFiR Let’s Talk.
Key highlights from this video interview are:
- Morgan discusses Buoyant’s goal for open source Linkerd: to be everything developers need to run a fully productionized, fully secure service mesh. He also explains the Buoyant Cloud features that take advantage of what Linkerd provides while making it easier for customers to consume.
- Morgan explains why open source and the open source community are so important to Buoyant.
- Automation is playing an important role in the management of service meshes. Morgan explains why automation is so critical in this area and what problems it is solving for developers.
- Morgan explains what makes Linkerd so unique compared to other service mesh solutions and why Buoyant was so focused on creating a simple system.
- KubeCon EU is being held May 16-20. Morgan explains what to expect at the event and goes into detail about the notable developments and deployments they are seeing with Linkerd.
Connect with William Morgan (LinkedIn, Twitter)
The summary of the show is written by Emily Nicholls.
Here is the automated and unedited transcript of the recording. Please note that the transcript has not been edited or reviewed.
Swapnil Bhartiya: Today we have with us once again William Morgan, CEO of Buoyant and also co-creator of Linkerd. William, it’s great to have you on the show.
William Morgan: Hi, thanks for having me back. It’s great to be here.
Swapnil Bhartiya: Perfect. Let’s talk about the newest, biggest story, which is the fully managed Linkerd capabilities being added to Buoyant Cloud. So talk about it: what it is and why you’re doing it.
William Morgan: Yeah. So Linkerd is known for its simplicity, right? In the very crowded market of service meshes, it stands unique because it is focused on reducing what we call operational complexity. So yes, it’s easy to install and it’s easy to get it working, but once you have it running, what do you have to do as an engineer, or as a team of engineers, to operate it successfully in production? What do you have to do to monitor it, to update it, to just maintain it? In the fast-moving world of the Kubernetes ecosystem, even keeping your software up to date is a non-trivial amount of work. So that’s been great, and in that world Linkerd is far, far simpler than any other option. But we still hear from Linkerd adopters that that work is what SREs like to call toil, right?
It’s work that’s not really rewarding. It’s work that is not directly helpful to the business or to the automation goals. So what we’ve done with this release is we’ve added fully managed Linkerd capabilities to Buoyant Cloud, which means that even if Linkerd is running on your own cluster, you can now treat it as a utility. You can treat it as a managed service, because Buoyant Cloud will upgrade it for you. We will do rollbacks, installs, trust anchor rotation, all these SRE tasks that are associated with toil. We can now take them on for you.
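To make the “toil” concrete, here is a minimal sketch of the manual upgrade loop that this kind of managed service takes over, assuming a CLI-based Linkerd installation. It simply wraps the standard `linkerd` and `kubectl` commands; it is an illustration of the workflow an SRE would otherwise script by hand, not Buoyant Cloud’s implementation.

```python
# Illustration only: the manual upgrade loop a managed service takes over.
# Assumes the standard `linkerd` CLI and `kubectl` are installed and pointed
# at the target cluster; this is not Buoyant Cloud's implementation.
import subprocess
import sys

def run(cmd: str) -> str:
    """Run a shell command and return its stdout, aborting on failure."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if result.returncode != 0:
        sys.exit(f"command failed: {cmd}\n{result.stderr}")
    return result.stdout

def upgrade_linkerd() -> None:
    # 1. Verify the current installation is healthy before touching it.
    run("linkerd check")

    # 2. Render the upgraded control-plane manifests with the local CLI
    #    version and apply them to the cluster (the documented CLI flow).
    manifests = run("linkerd upgrade")
    subprocess.run("kubectl apply -f -", shell=True, input=manifests,
                   text=True, check=True)

    # 3. Re-check after the upgrade; on failure, an operator would roll back
    #    by re-applying the previously rendered manifests.
    run("linkerd check")

if __name__ == "__main__":
    upgrade_linkerd()
```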
Swapnil Bhartiya: When we look at open source, it does solve the day one problem: you can get started with a project easily, but manageability, updates, upgrades, and maintenance can become a big challenge. And once again, all the developer hours you’re putting into that effort are not adding any value to the business. But there are two things. One is, of course, just managing that open source upstream project, but sometimes a lot of commercial products based on open source also add a lot of additional features and functionality that not every member of the community wants, because when your project is maintained by a larger community, everything that everybody wants goes into it. So beyond manageability, were there any additional pain points that you saw customers or users facing and wanted to help them with as well?
William Morgan: Yeah. So here, the way we’ve drawn the line is we want everything in open source Linkerd to be whatever you need to run a fully productionized, fully secure service mesh. And so the features that we’ve added in Buoyant Cloud, which are not part of the open source but are part of the managed product, are, number one, the management layer to reduce that SRE toil, and, number two, features that take advantage of what Linkerd provides and make it a little easier to consume, a little easier to package up and deliver to other functions in the company. So for example, one of the things that Linkerd does, in the security domain, is what’s called mutual TLS. Between every pod, it will create TLS connections. It’ll validate the identity of either side, and it’ll encrypt the traffic. So you get confidentiality and integrity and authenticity, and that’s all in the open source.
Right? And in fact it’s on by default, so you don’t even have to configure anything. And then on the Buoyant Cloud side, we’ll just give you some reports. We’ll say, “Hey, here are all the connections in your cluster that have mutual TLS enabled. Here are the ones that are still in plain text. Here are the ones that the system’s not able to determine.” And we can provide these reports for the security team or for auditors who come in. Right? So the way we’ve divided it is we never hold back any features from the open source, but we want to make the easy consumption available to our customers in Buoyant Cloud.
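As an illustration of the kind of report being described, here is a minimal sketch that summarizes mTLS coverage from the open source `linkerd viz edges` command. This is not Buoyant Cloud’s reporting pipeline, and the JSON field names used below (`src`, `dst`, `client_id`, `server_id`, `msg`) are assumptions about the CLI’s output that may need adjusting for a given Linkerd version.

```python
# Illustration only: summarize which edges in the mesh are mutually TLS'd,
# using the open source `linkerd viz edges` output. The JSON field names
# below are assumptions about the schema and may differ by version.
import json
import subprocess

def mtls_report(resource: str = "deployment") -> None:
    out = subprocess.run(
        ["linkerd", "viz", "edges", resource, "--all-namespaces", "-o", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    edges = json.loads(out)

    secured, plaintext, unknown = [], [], []
    for edge in edges:
        pair = f'{edge.get("src", "?")} -> {edge.get("dst", "?")}'
        if edge.get("client_id") and edge.get("server_id"):
            secured.append(pair)      # both sides presented a TLS identity
        elif edge.get("msg"):
            unknown.append(pair)      # the CLI could not determine identity
        else:
            plaintext.append(pair)

    print(f"mTLS-secured edges: {len(secured)}")
    print(f"plaintext edges:    {len(plaintext)}")
    print(f"undetermined edges: {len(unknown)}")

if __name__ == "__main__":
    mtls_report()
```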
Swapnil Bhartiya: I mean, that’s the best way to do open source, right? Everything is available to everybody, but you also make it easy for those who want that.
William Morgan: I think so. And it’s really important to us because we want to have an open and authentic relationship with our open source community. We never want to hold anything back. We never want to put ourselves in a spot where there’s something with the community that we can’t be wholehearted about in our approach. So this dividing line, drawing it that way, is critical to that.
Swapnil Bhartiya: Now let’s talk about some major trends. One of them is that automation is playing a very important role, not just in cloud native but across the whole IT landscape, even as that landscape is changing. Can you talk about the role of automation in the management of service meshes and, of course, Kubernetes itself?
William Morgan: Yeah, absolutely. I think the reality is, as amazing as open source software is, running any software sucks, right? Pardon my French, but it’s just a pain, because you have to maintain it, you have to keep it up to date, you have to do a bunch of stuff that’s all in that bucket of toil. And it’s great that it’s open source, it’s great that you have full control over the code and you’re not locked into anything, but the fact that you have to do this work is painful, right? And the traditional approach, though it’s not really that traditional, has been to provide hosted services, right?
So if you have a database, you can run it yourself, or you can have a hosted database solution, a hosted MySQL, or whatever it is. For something like a service mesh, we can’t really do that, because the way a service mesh works, the data plane components, those proxies, or in Linkerd’s case these ultralight micro-proxies, have to sit with your code; they have to be next to your application. So we can’t host it entirely for you, but we can do the next best thing: we can automate all the toil, we can keep the management burden off in the cloud, and then just allow you to have those components running on your system without requiring that you do the monitoring and the management and the health checking and the upgrades just in order to use it. So again, you can treat Linkerd as a true utility.
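To ground “they have to be next to your application”: in Linkerd, a workload joins the mesh by annotating its pod template so the proxy injector adds the sidecar on the next rollout. Below is a minimal sketch using the Kubernetes Python client; the deployment and namespace names are placeholders, and a plain `kubectl patch` would accomplish the same thing.

```python
# Illustration only: opting a workload into the mesh by annotating its pod
# template, which tells Linkerd's proxy injector to add the sidecar on the
# next rollout. Deployment and namespace names are placeholders.
from kubernetes import client, config

def mesh_deployment(name: str, namespace: str) -> None:
    config.load_kube_config()  # or load_incluster_config() inside a pod
    apps = client.AppsV1Api()

    patch = {
        "spec": {
            "template": {
                "metadata": {
                    "annotations": {"linkerd.io/inject": "enabled"}
                }
            }
        }
    }
    # Patching the pod template triggers a rolling restart, so new pods come
    # up with the linkerd-proxy container alongside the application container.
    apps.patch_namespaced_deployment(name=name, namespace=namespace, body=patch)

if __name__ == "__main__":
    mesh_deployment("web", "emojivoto")  # placeholder workload
```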
Swapnil Bhartiya: Excellent. Now, since we are talking about broader topics, one other thing that I do want to talk about is how Linkerd compares to other service mesh solutions. It could be performance, it could be security, it could be usability. Of course, we have talked about that before, but I would just want to hear it, especially as you are baking these features into Buoyant Cloud itself.
William Morgan: Yeah, that’s a great question. So we have a very particular approach, and Linkerd is a really unique product in the service mesh space. I think it’s the only one that is not built on top of Envoy, which is a very popular proxy. I wrote a long blog post about why we did this, but we took a very specific approach, which is to build it on top of an ultralight Rust-based proxy just called the Linkerd proxy. It’s not a generic proxy; it’s custom built for this use case. And the reason we did that is because it allows us to be incredibly fast, incredibly lightweight, and, most important, simple. These proxies only do one thing: they only act as a sidecar proxy. So they’re not a general-purpose thing like Envoy is. It’s three or four megs of runtime, so really, really tiny, incredibly fast, built on top of the modern ecosystem.
That’s all happening in the Rust networking space, Tokio and Tower, and all this exciting investment from a lot of companies benefits Linkerd. And the result is that when you are deploying Linkerd, you can end up with 10,000 proxies sitting in your system, but you don’t have to tune them, you don’t have to go in there and tweak their config. That whole situation where the mesh config is different from the proxy config, and some features are only available at certain levels, all that stuff is gone. So you end up with a very, very simple solution, and that is what makes Linkerd so unique in this crowded space. It’s very easy to add more features, add more complexity, and check more checkboxes. It’s very, very difficult, incredibly difficult, to make a simple system, but that’s been our focus.
Swapnil Bhartiya: KubeCon EU is almost here, and a few folks will be there as well. What are the things that we should expect, not only at the event but beyond as well? Because if you look at the service mesh space, it’s evolving, changing, and a lot of movement is happening within the community itself. So talk about that.
William Morgan: Yeah. So as usual, we have a great turnout of end user talks at KubeCon EU from companies that are using Linkerd in production. We have a blog post with the full list up on Buoyant.io. But one talk that I’m particularly excited about is from Microsoft, from their Xbox Cloud Gaming team. So if you’ve ever used Xbox Cloud Gaming, where your Xbox games are delivered through the cloud, there you go. That’s actually being delivered through Linkerd, and they have a massive Linkerd deployment. They’ve deployed it across 22,000 pods in many, many regions around the world, and they’re using it primarily for security, but also for reliability. So it’s a really interesting talk from the folks at Microsoft. And it’s one of those things where you may be using Linkerd without even knowing about it. So if you’re playing the Xbox right now, you might just be using Linkerd.
Swapnil Bhartiya: William, thank you so much for talking about this new capability you’re adding to the [inaudible 00:10:22], but also for talking in general about the problems and challenges that SREs face and how you’re helping them. And of course, the next time the Xbox [inaudible 00:10:30], you can always blame it on Linkerd. So thanks for sharing that story as well. Folks will be looking forward to that talk, but in general it was a great discussion, and I look forward to meeting you at KubeCon. Thank you.
William Morgan: Thank you so much for having me. I look forward to that as well.