
Guest: Owen Nicholson
Company: SLAMcore
Show: Let’s Talk

SLAM stands for Simultaneous Localization and Mapping, and it’s something all humans and animals do by design. The moment we enter an unknown space, we can move about without crashing into things. How? By constantly building a mental model of the world and understanding where we are within that space at any given time.

SLAM is the technical equivalent of what nature has designed, and SLAMcore is a company that spun out of one of the top universities in the world. Its primary mission is to make this technology available to people looking to give their products the ability to understand space and safely navigate through it.

SLAMcore has significant implications for the world of robotics. As Owen Nicholson, co-founder and CEO of SLAMcore, puts it, “Robotics has been a dream and something sci-fi has talked about for decades and decades, and sometimes we may be a bit disappointed that we don’t have fully autonomous robots running around.” Of course, when you take a step back from the sci-fi side of things, you realize that robotics is used extensively within numerous industries. In fact, a lot of manufacturing depends on autonomous robots, especially in logistics and intra-logistics, where robots move materials from A to B or deliver goods to human workers for packing. Nicholson even mentions drones: “There are drones being used within the industry as well, for roof inspection, for inspecting pylons and areas which are actually quite dangerous for humans to go to.”

Pallet-Moving Robot

SLAMcore makes it possible for robots to be more aware. One example Nicholson brings up is the pallet-moving robot, which illustrates the problem with today’s isolated systems: “…so that robot, that pallet-moving robot, if it encounters a problem, it can’t flag that to the drone or to the human operators via the AR headset, to say that there is something going on in space that it should be aware of.”

SLAMcore’s approach to solving the problem, according to Nicholson, is all about mimicking nature. He says, “There’s a reason why nearly every animal on the planet uses vision as the main sensing modality to understand space and move through it.” Cameras are the obvious technical equivalent of vision, and there is more spatial information available in the feed from a cheap camera than from the most expensive LIDAR on the planet. However, according to Nicholson, “The problem has always been, to date, accessing that data in real time, using affordable hardware, so it doesn’t take so long to process the data that the robot has crashed by the time it realizes there’s an object in its way. This is really what SLAMcore brings to the table.” To do this, SLAMcore specializes in vision-based localization and mapping and fuses that with inertial data, which is also inspired by nature.

Another nature-inspired technique is the use of inertial sensors: gyroscopes and accelerometers, the technical equivalent of the inner ear, which let us know how far we’ve moved even with our eyes closed. By combining those two data feeds (vision and inertial sensors), SLAMcore can create a full spatial intelligence solution, which works extremely well for an individual robot. And because SLAMcore’s technology is based on vision, that information can then be “communicated across all of the different robots within that product range and probably more excitingly across other product ranges as well.”
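To make the fusion idea concrete, here is a minimal, hypothetical sketch in Python. It is not SLAMcore’s code; all names and numbers are invented for illustration. It shows the principle Nicholson describes: a high-rate inertial sensor dead-reckons motion between camera frames, while slower, drift-free visual position fixes pull the estimate back on track.

```python
# Hypothetical illustration of visual-inertial fusion -- NOT SLAMcore's code.
# The IMU integrates acceleration at high rate (fast but drifts);
# occasional camera-derived position fixes correct the drift.
from dataclasses import dataclass


@dataclass
class State:
    x: float = 0.0  # position along one axis, in meters
    v: float = 0.0  # velocity, in meters per second


def imu_predict(state: State, accel: float, dt: float) -> State:
    """Dead-reckon with accelerometer data: high rate, accumulates drift."""
    v = state.v + accel * dt
    return State(x=state.x + v * dt, v=v)


def vision_correct(state: State, visual_x: float, gain: float = 0.2) -> State:
    """Blend in a camera-derived position fix: low rate, drift-free."""
    return State(x=(1.0 - gain) * state.x + gain * visual_x, v=state.v)


if __name__ == "__main__":
    state, dt, accel = State(), 0.01, 0.5  # 100 Hz IMU, constant acceleration
    for step in range(1, 501):
        state = imu_predict(state, accel, dt)
        if step % 30 == 0:  # a camera pose fix arrives at roughly 3 Hz
            t = step * dt
            state = vision_correct(state, visual_x=0.5 * accel * t * t)
    print(f"fused position estimate after 5 s: {state.x:.2f} m")  # ~6.25 m
```

A production system would estimate full 6-DoF poses with a Kalman filter or factor graph rather than a scalar blend, but the division of labor between the two sensors is the same.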

But there are limitations with current technology. Take, for instance, the laser, which Nicholson says is “not very good at giving you a general understanding of space. What are the objects? Is that a person? Is that a shelving unit? Is it another robot? Cameras are very good at that side of things because we have much more spatial information, but maybe not as accurate at giving you a hundred percent certainty: is there an obstacle in my way?” For this, SLAMcore provides a full toolkit that allows developers to select the right algorithms and sensors, and then develop, tune, and optimize them to work for their applications.
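As a toy illustration of why the two feeds complement each other (the labels and threshold below are invented for the example, not part of SLAMcore’s toolkit): the laser answers “is something in my path?” with near-certainty, while the camera answers “what is it?”, and the robot’s behavior should depend on both.

```python
# Hypothetical sketch of LIDAR + camera fusion for obstacle handling.
from enum import Enum


class ObjectClass(Enum):
    PERSON = "person"
    ROBOT = "robot"
    SHELF = "shelf"
    UNKNOWN = "unknown"


def plan_action(lidar_range_m: float, camera_label: ObjectClass,
                stop_dist_m: float = 1.5) -> str:
    """Combine a high-certainty proximity cue with a semantic cue."""
    if lidar_range_m > stop_dist_m:
        return "proceed"                     # laser says the path is clear
    if camera_label is ObjectClass.PERSON:
        return "stop and wait"               # never route around a person
    if camera_label is ObjectClass.ROBOT:
        return "yield and renegotiate path"  # coordinate with the other robot
    return "replan around static obstacle"   # shelf or unclassified object


print(plan_action(0.9, ObjectClass.PERSON))   # -> stop and wait
print(plan_action(3.2, ObjectClass.UNKNOWN))  # -> proceed
```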

Of course, there’s a social aspect to robots, where workers believe their very livelihoods are on the line. On this subject, Nicholson says, “One of the missions of SLAMcore is to democratize autonomy for the benefit of humanity, and I would be heartbroken if our technology started to be used for negative consequences and actually in ways which we didn’t approve of.” He continues, “I will fly the flag and fight the corner for robots because I think as long as we are careful and we talk openly and we have a real dialogue about how we want this amazing technology to be used, they are the key to allowing us to live an amazing and fulfilling life well into our old age. And I haven’t even got started on mining asteroids and building homes on other planets, but this is where this technology takes us.”

This summary of the show was written by Jack Wallen.


Here is the lightly edited transcript of the show:

Swapnil Bhartiya: I’m your host, Swapnil Bhartiya, and welcome to TFiR Let’s Talk. SLAMcore is a company that’s creating algorithms that help robots, consumer products, and drones fully understand the space around them so that they can function and operate efficiently and safely. In today’s episode of Let’s Talk, we have with us Owen Nicholson, co-founder and CEO of SLAMcore, and we are going to learn a lot about the company and their technology. Owen, welcome to the show.

Owen Nicholson: Hi, great to be here.

Swapnil Bhartiya: So, let’s start with the basics. Tell me a bit about the company and, of course, SLAM and SLAMcore.

Owen Nicholson: SLAM is the name of the technical product that we build. SLAM is an acronym, and SLAMcore, my company, is designed to productize that technology and make it available to the world. So to tell you about SLAMcore, I need to tell you about SLAM. SLAM stands for simultaneous localization and mapping, and it’s actually something that all humans and animals do, just by design. The moment we enter a space we’ve never been in before, we’re able to get from one side of the room to the other without crashing into things, and the way we do that is by constantly building a mental model of the world and understanding where we are within that space at any one time. Now, SLAM is the technical equivalent of what nature has already designed, and SLAMcore spun out from one of the top universities in the world about five years ago with the primary mission to make this technology available to people looking to make products with the ability to understand space and safely navigate through it.

Swapnil Bhartiya: And realistically, robots in today’s world are a bit different from what we see in science fiction, or what Elon Musk showed us a few days ago with the Tesla [inaudible 00:01:49]. If I ask you, where are robots today? If you look at their evolution curve, what role are they playing in today’s world?

Owen Nicholson: Robotics has been a dream and something sci-fi has talked about for decades and decades, and sometimes we may be a bit disappointed that we don’t have the fully autonomous robots running around. But actually, when you take a step back, robotics is used extensively within industry. If you look at a lot of the factories of today, a lot of the manufacturing is done by autonomous robots. Especially in logistics and intra-logistics, we’re starting to see robots which move materials from A to B, pallet-moving robots that take goods to human workers to allow them to pack them, to stop them having to walk miles and miles every single day. There are drones being used within the industry as well, for roof inspection, for inspecting pylons and areas which are actually quite dangerous for humans to go to.

And we’re also starting to see a new emergence of service robots, ranging from robots that will deliver your pizza in the restaurant to telepresence robots, which allow you to move around an office space even though you’re on the other side of the world. We’re still in the early days. To answer your question of where we are, I’d say we don’t have fully autonomous machines like the Tesla bot, which is planned to be able to do everything a human can, but for specific areas and specific use cases, we are starting to see an increasing amount of autonomy and robots with the ability to understand the space being used. We have a long way to go, and I’m super excited to see where this journey will take us. But I think what we sometimes miss is just how extensive the robotics market is today. It’s already a hundred-billion-dollar industry. So this is something which is here, and it’s growing at over 25% [CAGR 00:03:53]. So just imagine where we’ll be in 10 years.

Swapnil Bhartiya: There are a lot of players who are getting a lot of global [inaudible 00:03:59] technologies there. If I look at SLAMcore, how are you creating your technologies to give you an edge over your competitors? And if I ask you, who are your competitors?

Owen Nicholson: Very good question. I think to answer that, we should first really address the macro problem that we’re looking to tackle as a company. At an individual robot level, getting from A to B safely and reliably is what needs to happen, but at an industrial level, we’re starting to see an explosion of different types of robots: within a warehouse there could be a pallet-moving robot, a stock-inspection drone, and maybe even an autonomous augmented-reality headset, which the operator uses to see where things are relative to them.

And all of these machines are trying to understand the space, but at the moment, most are solving the problem as isolated spatial [silos 00:05:00]. So that robot, that pallet-moving robot, if it encounters a problem, it can’t flag that to the drone or to the human operators via the AR headset, to say that there is something going on in the space that it should be aware of.

And SLAMcore’s approach to solving this problem is very much mimicking nature. There’s a reason why nearly every animal on the planet uses vision as the main sensing modality to understand space and move through it. So vision, with cameras as the obvious technical equivalent: there is more spatial information available in the feed from a cheap camera than from the most expensive LIDAR on the planet. But the problem has always been, to date, accessing that data in real time using affordable hardware, processors which don’t cost the earth or take so long to process the data that the robot has crashed by the time it’s actually realized there’s an object in its way. And this is really what SLAMcore brings to the table.

We are deep experts in the integration of vision-based localization and mapping, which we fuse together with inertial data; that, too, is inspired by nature.

So humans and most animals use cameras, but also our inner ears: the gyroscopes and accelerometers we have that allow us to know, even with our eyes closed, how far we’ve moved within a small amount of movement. By combining those two data feeds together, vision, so cameras, and inertial sensors, gyroscopes and accelerometers, just the cheap ones you find in your mobile phone, we’re able to create a full spatial intelligence solution which works extremely well for an individual robot. But because we’re using vision, and obviously we live in a visual world, that information can then be communicated across all of the different robots within that product range and, probably more excitingly, across other product ranges as well.

So we can create a universal global understanding of space with each of the individual machines feeding into this.

Swapnil Bhartiya: I just want to go a bit deeper into the vision you’re talking about, because when we look at factory floors or any other setup, there are so many variables. So can you talk about the evolution that you see, or the limitations of today’s technology that you are still working on, so that SLAMcore, or SLAM, gets, no pun intended, a very good picture of what’s around it?

Owen Nicholson: So it’s absolutely the case that the robotic systems of today, whilst very capable, struggle in many different environments, and especially in safety-critical applications, such as robots that move around humans. We need to be absolutely certain that if there is a person in the way, that machine is going to stop, because it can be half a ton of machinery moving at any one time. And the way in which we solve that problem is, interestingly, not necessarily just with vision; the best robots on the planet use multiple sensors, and the key then is to fuse that data together in a way which gives you a sum that is greater than the parts.

So, for example, laser-based systems: LIDAR is very popular in the automotive and robotics space. Lasers are very, very good at giving you almost a hundred percent certainty that there is something in your path, because if the laser is blocked, it ultimately acts as a proximity sensor. What they’re not very good at is giving you a general understanding of space. What are the objects? Is that a person? Is that a shelving unit? Is it another robot? Cameras are very good at that side of things, because we have much more spatial information, but maybe not as accurate at giving you a hundred percent certainty: is there an obstacle in my way?

So if we fuse the two different sensor feeds together, you can actually end up with something which works much better than any one single sensor can, allowing you to use cheaper sensors but still deliver that high level of performance. And then the key really is to select the right sensors and the right combination for the application that you are looking to build.

And that’s something SLAMcore is really focused on: it’s not just providing technology, but providing a full toolkit that allows developers to select the right algorithms and the right sensors, and then develop them and tune them and optimize them to really work for their applications.

Swapnil Bhartiya: It’s never this technology or that technology; it always has to be a mix. Also, since you talked earlier about humans and living beings: we judge distance using two front-facing eyes and the distance between them, but a lot of animals have side-facing eyes, and some can have 360° views. We also use hearing; other animals also use infrared. How many ideas have you borrowed from the natural world for SLAMcore and SLAM?

Owen Nicholson: Nature is probably our biggest inspiration in what we do, and ultimately the selection of cameras and vision is what nature has chosen as the primary sensing modality, but also the sensor fusion. Like you said, using inertial sensors, using our ears; then we have other animals, bats use ultrasonics, and all these different elements come together to give spatial intelligence. But really interestingly, one thing you will find is that there are very few animals on the planet that have very accurate distance measurement. I don’t know exactly how far away the wall is on the other side of my room, but I can navigate effectively without crashing into things. So actually, sometimes we get overly focused on absolute accuracy and absolute distance measurements without thinking about the application you’re trying to do.

So if the aim is to move materials from A to B, being within 10 or 20 centimeters, even a meter, might be fine. If you’re looking to do millimeter-scale welding of a joint for an aircraft, you certainly do want to be able to make those accurate measurements. So again, it really comes down to system design and choosing the right sensors for the right application, and everything from, like you said, where the cameras are positioned. There’s a reason why there are hundreds of thousands of different types of animals on the planet: they have selected, or evolved, the sensor suite that makes sense for their application. So although we humans are obviously very capable animals, there are many other animals out there that can outperform us in specific domains and specific areas, and they have very different sensor setups than we do, but nearly always use vision as the core sensor.

Swapnil Bhartiya: You alluded to [inaudible 00:11:44] some applications. But if I ask you, what are the most exciting use cases that you have seen for SLAM and SLAMcore?

Owen Nicholson: We launched our SDK at the start of this year. We already have 100 companies, and growing, using the software, and I’m amazed nearly every day when we have new inbound requests from companies with different types of applications that they’re looking to use SLAM and autonomy for. Probably the biggest commercial growth area is the warehouse and intra-logistics. But if we want to talk about exciting, interesting applications, we’re seeing applications for UV cleaning, using robots which can go into spaces and decontaminate entire hospital wards or offices completely autonomously. I think that’s a very worthwhile technology to be developing.

We’re starting to see things like micro-mobility. I don’t know if it’s the same where you are, but in London there are lots of these micro-scooters lying around, and the ambition is ultimately for these micro-scooters and e-bikes to be able to move themselves around almost on call, but they need autonomy to be able to do that, and we’re working with customers who are looking to provide full autonomy to these machines within the urban landscape.

And then we have some really interesting applications ranging from social robotics that help to communicate with the elderly, everything from reminding them to take their medicine to just giving them engagement and interaction, and we see this just exploding at the moment. So many different types of applications and so many opportunities for us to apply our technology.

Swapnil Bhartiya: Sometimes when we talk about this technology, we tend to become overly ambitious: “Hey, everything should be like that.” But you gave some examples of realistic problems that we should be solving, problems that, in most cases, we don’t even look at or think of. Have you seen a problem where you thought, “Hey, this is a problem we could very easily solve with robotics”? Where do you think robotics should play a more critical role?

Owen Nicholson: I think agriculture, and particularly being more sustainable in the way in which we produce our food. There’s a huge benefit there, and autonomy is a really great way of helping us to create more with less, ultimately.

With vertical farming, having the ability to directly inject pest control instead of just spraying entire crops, actually targeting the individual pests directly: that’s a really interesting area that autonomy is starting to come into. One thing I’m particularly passionate about is that there are a number of jobs out there which are very, very dangerous for individuals to do, from inspecting bridges to high electricity pylons and other large-scale infrastructure, and this is exactly the type of application where autonomy and robots can perform very well, because we can remove the need to put people’s lives at risk and use drones, robots that climb, and robots that swim underwater to inspect this infrastructure, ultimately allowing us to make things safer and last longer.

Swapnil Bhartiya: There is a social problem that people do see with robotics. Have you ever thought about that problem? It’s very easy for us to say, “Hey, you know what?” and then just move on, but there are a lot of people whose livelihoods, and whose families, rely on these kinds of jobs. Have you ever looked at the social aspect? And if yes, what do you think about it?

Owen Nicholson: [inaudible 00:15:36] and this is such an important question, and it’s one where we cannot afford just to say, “Okay, well, let’s see where we end up.” The good news is we have a long way to go, and this is going to take time; I’m talking decades and decades. But now is the right time to have this conversation. A couple of really important things to take away, though: a number of studies are now showing that autonomy actually increases jobs as opposed to decreasing jobs, and it’s really down to the way in which you use it, making sure that around the jobs that do get replaced there are additional jobs, and there are places for people to go. Another really important point to mention is that, from a global perspective, we have a real crisis in productivity and in the ability to look after our aging population.

So at the end of the day, if we want to sustain this level of lifestyle, we do need to find ways of being able to do more with fewer people, because ultimately there are going to be more older people in the future who need to be looked after. So autonomy has the ability to augment humans and actually increase the amount of work we can do. And we do need to be super careful that this technology isn’t used for negative things.

And one of the missions of SLAMcore is to democratize autonomy for the benefit of humanity, and I would be heartbroken if our technology started to be used for negative consequences, in ways which we didn’t approve of.

But actually, I think when you really look at the big challenges we face as a species, things like global warming, sustainable agriculture, looking after the elderly, robots are not the problem; they’re actually the answer. And I will fly the flag and fight the corner for robots, because I think as long as we are careful and we talk openly and have a real dialogue about how we want this amazing technology to be used, they are the key to allowing us to live an amazing and fulfilling life well into our old age. And I haven’t even got started on mining asteroids and building homes on other planets, but this is where this technology takes us.

Swapnil Bhartiya: Now, if I just zoom out of the discussion, go to a higher level, and look at the company: of course, there are a lot of things that you cannot share at this moment, but what kind of roadmap do you have? What plans do you have in your pipeline for the future?

Owen Nicholson: Sure. So we’ve been relatively stealthy up to now, with four years of building out the core technology. We launched at the start of this year, and we’ve had an absolutely huge amount of inbound. We’ve now got over a hundred companies using it.

There’s a relatively slow lead time; it takes 12 to 18 months to get these onto robots which are deployed, so I can’t give names of actual commercial deployments, but they’re coming very soon. And really, the future for us is first to get this technology into as many people’s hands as possible, and we’re looking to do that over the next 18 months. We have built a system which allows us to do that very simply; you can be up and running in less than a minute with our software. That really will help us to enable companies to start using vision in their autonomy stack, but it will also allow us to keep on innovating and adding new features to the system, because the more companies that use our solution, the better we can make it, as we can start to optimize across all these different verticals and all these different application spaces.

We already have three million meters of external data, and we did that in just one quarter. In the future, we’ll have hundreds of millions and billions of meters of data traveled by our system, and that means we’ll be able to provide very robust and reliable performance. So really, if I could summarize it in three words: grow, grow, grow. That’s the plan.

Swapnil Bhartiya: Well, thank you so much for taking time out today to talk about the company and SLAM technology, and also how robotics is helping a lot of industries and sparing us from those dangerous jobs. You also gave us an insight into what the world could look like and shared some exciting use cases, like the scooters. Sometimes we focus so much on the big problems that we forget the small problems, which are the real problems. So thanks for sharing all those insights. I would love to have you back on the show. Thank you.

Owen Nicholson: It’s been a pleasure. Thank you so much.
