
oneAPI Rendering Toolkit — From Avengers: Endgame To Automotive Industry


Guest: Jim Jeffers
Company: Intel
Show: Let’s Talk

In this episode of TFiR Let’s Talk, Jim Jeffers, Senior Principal Engineer and Senior Director of Advanced Rendering and Visualisation at Intel, joins me to talk about the open-source oneAPI Rendering Toolkit and how it’s used not only to create blockbusters like Avengers: Endgame, but also in industries like automotive.


Here is the edited transcript of the discussion.

Swapnil Bhartiya: Hi, I’m your host, Swapnil Bhartiya. Welcome to Let’s Talk. Today we are going to talk about Intel’s oneAPI Rendering Toolkit, and we have with us Jim Jeffers, Senior Principal Engineer and Senior Director of Advanced Rendering and Visualisation at Intel. Jim, can you talk a bit about the oneAPI initiative from a broader perspective, since it’s an industry-wide initiative?

Jim Jeffers: It’s an open-source, open API. That’s why it’s called oneAPI: libraries, components, and a usage model that cross all platforms from all vendors. It’s also what we call cross-architecture, meaning CPUs and GPUs (not only ours, but the industry’s central processing units and graphics processing units) as well as FPGAs and, now, network processing units. The idea is to have an initiative. It’s not an overnight thing; we’ve been working on it for a couple of years already. But the movement is a single codebase that runs everywhere. That’s the simple thing.

Secondly, it’s fully open source. The compilers are open source. There’s a compiler called Data Parallel C++, which adds more parallelism capabilities but abstracts away whether you’re targeting a GPU, an FPGA, or a CPU, all with one development toolchain and libraries that cross architectures. Now, we need our vendor partners, AMD, NVIDIA, and the many other partners out there, to join up. That won’t be easy, because it’s an initiative that can feel like Intel is driving it. But we truly mean openness. One quick example: Apple’s M1 Arm-based devices already run almost all of the Rendering Toolkit’s libraries. So we are deeply into enabling Apple and others who have the products, to ensure the marketplace isn’t left behind. That’s the initiative: cross-vendor, cross-architecture, open tools and elements. We apply it every day to our own products, and we’re supportive of others joining to make the world easier for developers. The biggest complaint we see is, “I have to detect whether I have your vendor’s product and then write different code.” We’re trying to solve that problem.
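To make the “single codebase runs everywhere” idea concrete, here is a minimal sketch of a SYCL/Data Parallel C++ kernel; the vector-add workload and buffer sizes are illustrative placeholders, not code discussed in the interview:

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
  // The default selector picks whatever device is available at run
  // time: a GPU, an FPGA emulator, or the host CPU.
  sycl::queue q{sycl::default_selector_v};

  std::vector<float> a(1024, 1.0f), b(1024, 2.0f), out(1024, 0.0f);
  {
    sycl::buffer<float> bufA(a.data(), sycl::range<1>(a.size()));
    sycl::buffer<float> bufB(b.data(), sycl::range<1>(b.size()));
    sycl::buffer<float> bufOut(out.data(), sycl::range<1>(out.size()));

    q.submit([&](sycl::handler& h) {
      sycl::accessor A(bufA, h, sycl::read_only);
      sycl::accessor B(bufB, h, sycl::read_only);
      sycl::accessor O(bufOut, h, sycl::write_only, sycl::no_init);
      // The same kernel source runs on any supported architecture.
      h.parallel_for(sycl::range<1>(1024),
                     [=](sycl::id<1> i) { O[i] = A[i] + B[i]; });
    });
  }  // buffer destructors copy results back to the host vectors

  std::cout << "ran on: "
            << q.get_device().get_info<sycl::info::device::name>() << "\n"
            << "out[0] = " << out[0] << std::endl;
}
```

Only the device selected at run time changes; the kernel source and toolchain stay the same.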

Swapnil Bhartiya: Jim, can you tell us a bit about the oneAPI Rendering Toolkit?

Jim Jeffers: It’s a collection of libraries optimized for rendering and the compute behind rendering, and it has continued to grow over the years. The most well-known one, which has been in the market for close to 10 years or more, is the Intel Embree ray tracing kernel library. That was the heart, or the beginning, of the collection. The overwhelming majority of the libraries have some relationship to ray tracing or what’s called physically based rendering (PBR), where we use the physics of light and mechanisms around it to take a virtual 3D world and render it, up to photorealistically. So the oneAPI Rendering Toolkit is more than just Embree; Embree is simply the most well known. Embree won an Academy Award this past year because of its impact on the film industry. It has always been an open-source project, so everybody can see what Embree is; a graphics researcher can look at the code and see what we do. The goal is optimized, high-speed rendering with a wide array of features to deliver the highest fidelity.
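For a sense of what the Embree kernel library looks like to a developer, here is a minimal sketch against the Embree 3 C API, tracing one ray against one triangle; it’s a toy example, not code from any production renderer:

```cpp
#include <embree3/rtcore.h>
#include <cstdio>
#include <limits>

int main() {
  RTCDevice device = rtcNewDevice(nullptr);
  RTCScene scene = rtcNewScene(device);

  // One triangle sitting in the z = 1 plane.
  RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
  float* verts = (float*)rtcSetNewGeometryBuffer(
      geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3,
      3 * sizeof(float), 3);
  const float v[9] = {-1.f, -1.f, 1.f,  1.f, -1.f, 1.f,  0.f, 1.f, 1.f};
  for (int i = 0; i < 9; ++i) verts[i] = v[i];
  unsigned* tri = (unsigned*)rtcSetNewGeometryBuffer(
      geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3,
      3 * sizeof(unsigned), 1);
  tri[0] = 0; tri[1] = 1; tri[2] = 2;
  rtcCommitGeometry(geom);
  rtcAttachGeometry(scene, geom);
  rtcReleaseGeometry(geom);
  rtcCommitScene(scene);  // builds the acceleration structure (BVH)

  // One ray from the origin, straight down +z.
  RTCRayHit rayhit{};
  rayhit.ray.dir_z = 1.0f;
  rayhit.ray.tfar = std::numeric_limits<float>::infinity();
  rayhit.ray.mask = ~0u;
  rayhit.hit.geomID = RTC_INVALID_GEOMETRY_ID;

  RTCIntersectContext context;
  rtcInitIntersectContext(&context);
  rtcIntersect1(scene, &context, &rayhit);

  if (rayhit.hit.geomID != RTC_INVALID_GEOMETRY_ID)
    std::printf("hit triangle at t = %f\n", rayhit.ray.tfar);
  else
    std::printf("miss\n");

  rtcReleaseScene(scene);
  rtcReleaseDevice(device);
}
```

Everything above the rtcIntersect1 call is scene setup; the library’s job is making that intersection query, repeated billions of times per frame, as fast as the hardware allows.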

Other elements have come to fruition with the toolkit. One that has really gotten a lot of pickup in the last couple of years is our AI-based denoiser, called Intel Open Image Denoise. Again, everything I’m going to talk about is a fully open-source project; we write most of the code, and we do accept contributions, but we’re really trying to put out the highest-performing, highest-quality set of components. We use AI-based mechanisms to filter out the noise that is inherent in the early phases, and throughout the phases, of ray tracing for rendering. This lets you produce a photoreal image while cutting the computation off early. If you want an image to be truly photoreal, what we call 100% converged, then depending on how complex the 3D model is, you’re talking hours of rendering on CPUs or even GPUs. When you add a denoising filter as high quality as Open Image Denoise, you get to that solution much faster.
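In practice, applying the denoiser is a short filter pipeline. Here is a minimal sketch using the Open Image Denoise C++ API (as of the 1.x releases); the image dimensions and the placeholder input buffer are assumptions for illustration:

```cpp
#include <OpenImageDenoise/oidn.hpp>
#include <vector>
#include <cstdio>

int main() {
  const int width = 1920, height = 1080;

  // In a real renderer these pixels would be the noisy path-traced
  // output; here they are just a placeholder buffer.
  std::vector<float> color(width * height * 3, 0.5f);
  std::vector<float> output(width * height * 3);

  oidn::DeviceRef device = oidn::newDevice();  // default device
  device.commit();

  // "RT" is the generic filter for ray-traced images. Optional
  // "albedo" and "normal" auxiliary images further improve quality.
  oidn::FilterRef filter = device.newFilter("RT");
  filter.setImage("color",  color.data(),  oidn::Format::Float3, width, height);
  filter.setImage("output", output.data(), oidn::Format::Float3, width, height);
  filter.set("hdr", true);  // treat the input as high dynamic range
  filter.commit();
  filter.execute();

  const char* message;
  if (device.getError(message) != oidn::Error::None)
    std::printf("OIDN error: %s\n", message);
  return 0;
}
```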

Now, for a final movie you might still want to wait, because the more you compute on a ray-traced image, the closer it is to the physics of light and how your eyes receive it. So we have Embree, which handles geometric rendering, and Open Image Denoise, which provides a deep-learning-based denoising algorithm that is actually the highest quality in the industry. We’re going to talk about that quality in a blog post that should go up right around when this episode airs.

The next thing is what’s called volumetric rendering. The Intel Open Volume Kernel Library, which we tend to call Open VKL, is hitting 1.0. It’s been available in the market for well over a year, but we’ve worked with our partners to push it to the point that it’s fully featured and ready to go. We like to call it the Embree for volume rendering.

What is volume rendering? Volume rendering applies when you have a space of samples or objects that effectively fill a 3D cube, if you will, in some form or another. It is heavily used in movies in particular, but it’s also used in scientific visualization. In movies, explosions aren’t really geometric objects; they’re clouds, fog, and things like that, which you want to look realistic. Volume rendering is important for that, and it’s really picking up across a wide variety of areas. So, we have that product.
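To make that concrete: the core of volume rendering is marching a ray through a density field and accumulating absorption along the way. The toy sketch below shows that principle generically; it is not the Open VKL API, whose actual entry points handle structured and unstructured volume formats, sampling, and interval iteration:

```cpp
#include <cmath>
#include <cstdio>

// Toy density field: a fuzzy sphere of "fog" centered at the origin.
float density(float x, float y, float z) {
  float r = std::sqrt(x * x + y * y + z * z);
  return r < 1.0f ? 1.0f - r : 0.0f;
}

// March a ray from (ox,oy,oz) along (dx,dy,dz), accumulating opacity.
float marchRay(float ox, float oy, float oz,
               float dx, float dy, float dz,
               float tMax, float step, float absorption) {
  float transmittance = 1.0f;
  for (float t = 0.0f; t < tMax; t += step) {
    float d = density(ox + t * dx, oy + t * dy, oz + t * dz);
    // Beer-Lambert attenuation over one step.
    transmittance *= std::exp(-absorption * d * step);
    if (transmittance < 0.01f) break;  // early out: effectively opaque
  }
  return 1.0f - transmittance;  // opacity seen by the eye
}

int main() {
  float opacity = marchRay(0, 0, -3, 0, 0, 1, 6.0f, 0.05f, 2.0f);
  std::printf("opacity through the fog sphere: %.3f\n", opacity);
}
```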

And then we have a product that brings it all together in an easier-to-use fashion than the individual libraries. It’s called Intel OSPRay, and it provides a rendering infrastructure that consumes Embree, Open Image Denoise, and the Open Volume Kernel Library as libraries. Now you have a higher-level API that a traditional graphics programmer can make sense of; you don’t have to be a ray tracing expert to use it.
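As a rough sketch of how that higher-level API reads, here is the skeleton of an OSPRay 2.x C program; the scene is left empty and parameters are abbreviated, so treat it as an outline of the flow rather than a complete, verified setup:

```cpp
#include <ospray/ospray.h>
#include <cstdio>
#include <cstdint>

int main(int argc, char** argv) {
  ospInit(&argc, (const char**)argv);  // loads the CPU device by default

  // The API speaks in scene-level objects: camera, world, renderer.
  OSPCamera camera = ospNewCamera("perspective");
  ospSetFloat(camera, "aspect", 16.0f / 9.0f);
  ospCommit(camera);

  // A real application would attach geometry (traced via Embree) and
  // volumes (sampled via Open VKL) to the world as instances.
  OSPWorld world = ospNewWorld();
  ospCommit(world);

  OSPRenderer renderer = ospNewRenderer("pathtracer");
  ospCommit(renderer);

  OSPFrameBuffer fb =
      ospNewFrameBuffer(1920, 1080, OSP_FB_SRGBA, OSP_FB_COLOR | OSP_FB_ACCUM);

  // Each call accumulates one more frame of samples; denoising can be
  // chained onto the framebuffer as an image operation.
  ospRenderFrameBlocking(fb, renderer, camera, world);

  const uint32_t* pixels = (const uint32_t*)ospMapFrameBuffer(fb, OSP_FB_COLOR);
  std::printf("first pixel (RGBA8): 0x%08x\n", pixels[0]);
  ospUnmapFrameBuffer(pixels, fb);

  ospRelease(camera); ospRelease(world);
  ospRelease(renderer); ospRelease(fb);
  ospShutdown();
}
```

Note how none of this mentions BVHs, rays, or denoising networks; that is the point of the higher-level API.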

Those are the four main libraries. There are a couple of other components: one is called OSPRay Studio, and the other is an OSPRay plugin for the Hydra API. Hydra is an API defined by Pixar within an open-source project called Universal Scene Description (USD), which is also getting dramatic uptake in the industry, because it allows interchange of the same data format between the multiple studios working on a movie. With today’s movies, even though it says Pixar on the poster, there have probably been 20 contributors, and USD makes that easier to do. So, we support the industry in that way.

Finally, I just want to say that the domains ray tracing is used in, and that our products are used in, are actually quite wide. We’ve broken them into four domains, though there are plenty of areas in between. The four are: film, studio, media, and entertainment; scientific visualization, meaning high-performance computing and large-scale visualization; product design; and architectural design, often called architectural engineering.

Swapnil Bhartiya: We have already witnessed transitions from black boxes and proprietary technologies to open-source hardware and software in many industries. The telco and energy sectors are going through the same transition and transformation. What kind of transformation is happening in the studios?

Jim Jeffers: Let’s take the film industry in particular, though we now see it across other domains too, as you said. In the film industry, you’re completely right. Somewhere between 5 and 10 years ago, everybody had their own renderer. Everybody did their own thing, all the way down to the metal. One of the visions and goals of Embree, from an open-source perspective, came from studios not getting the performance out of the hardware. They spend a lot of money on processors for 3D processing, for the ray tracing and the other elements, so that you get Thanos looking like a real character, not some cartoon thing. That has just continued to grow dramatically, and the consumers’ appetite for realism, even when things aren’t real, for that realistic view, is very high.

Now you have two levels of cost involved. It boils down to, “Hey, we’re a capitalist society. We need to reduce costs or increase revenues, right?” You want to do both. So for a movie studio, you’ve got multiple things going on, but the key driver we’ve seen behind the shift to open source is the work that everybody has to do, that is cross-industry, and that doesn’t really affect what we would call the special sauce: the look and feel of a Pixar movie, or a DreamWorks movie, or an Illumination movie. I don’t want to keep naming only Pixar; these are all customers and users of Embree. They do have a goal that you can say, “Oh, that’s an Illumination movie,” without even having to see the title. That’s almost the ultimate goal.

That look is their special sauce. But they want to put out more movies per year, or put them out at a lower cost. So when it comes to the lowest-level access to the computing hardware, Intel and some of our competitors know their own hardware far better and can make it shine; when you buy a new processor, our software is already tuned to take advantage of it and maximize that capability. We’ve watched how our open source is used, and there are non-Intel versions of this too. The bottom line is that where technology crosses the industry, and it isn’t low technology, it’s high technology, but it doesn’t provide competitive differentiation, open source is just exploding. From the Intel perspective, the value is our ability to keep our products tuned as they launch; because of Moore’s law and all that, we keep making things better, but it’s complex, and my team are effectively ninja programmers who are among the world’s experts in ray tracing.

We provide that, these layers, and there are similar things for other software-only processing elements. There’s something called OpenColorIO. Think about what that means; it’s very simple. The colors in a movie had better look the same throughout. If multiple people are contributing to a movie, you use OpenColorIO to ensure the end-result color, which is a continuum in floating-point space, matches what you want. If one company does a scene and another company comes to the next scene and the colors don’t match, that’s actually visible. It’s things like that. There’s also OpenVDB, a volume format for rendering, which we support in Open VKL. Again, the industry is standardizing on open environments.

Then there’s another notion in open source, which is both a benefit and a risk: usually a visionary, or a small visionary team, delivers the code, and then people start picking it up. If it’s not curated and managed as a community project, it could die, and people could actually stop being able to make their movies. I’ll stop there.

It’s that type of thing, and it is important. That doesn’t mean the customized or commercial applications aren’t important; it’s this mix of commercial software and foundational open source that is really exploding.

Swapnil Bhartiya: Let’s talk a bit more about rendering. It’s being used well beyond the film industry. What other industries are leveraging it, and is oneAPI playing a role there?

Jim Jeffers: Absolutely. I’m going to pick automotive. We have a close partnership with Bentley Motors in the VW Group, one of the spearheads of technology for VW. There’s a multi-level use case in automotive that a lot of people don’t understand. The first one is pretty simple: designing cars, and then reflecting those designs to the marketplace in a way that makes people want to buy a car, or know exactly what car they’re buying. Color, right? What color car do you want? We all love that. To give buyers that flexibility, manufacturers effectively need to make 3D movies of their cars to do commercials. A lot of it is real, a lot of it is the environment, but a lot of it is manufactured imagery that is appealing yet real and physically based. They don’t want to fool anybody, but they want to be efficient. So automotive is there.

The other twist we’re working on, which I think is super cool, is autonomous driving: a completely different rendering mechanism for validating that your car will successfully drive around the world. How do you do that? How do you prove your AI is smart enough? There’s a whole other project where we emulate real-world cameras, LiDAR, and other sensors that need a highly accurate, physically correct basis, so that our future autonomous cars are actually ready to go. I heard, maybe five years ago from Tesla or whoever, that in order to get their software to work they would have to drive around and experience 150,000 miles’ worth of situations. That’s not economically feasible. But if a computer feeds the system with realistic data, and you can randomize it, so the bird flies by while the kid’s ball rolls out and there’s a big white truck in front of you…that is going to make it better. Those are two uses in automotive that I find very exciting, and we’re actively involved in both.

Swapnil Bhartiya: Let’s go back to the example of the film industry. How are they leveraging Intel’s rendering technologies and also oneAPI?

Jim Jeffers: Not everybody knows this, but people in the industry do: movie rendering is a high-performance computing act. Studios have the equivalent of supercomputers, some in the cloud, some on-premises, as we call it, at Pixar in California and at DreamWorks in LA, in big installations called render farms. They’re called farms for a reason; they have a big scope. What we have done, and our ongoing goal, is to link our optimized software with their smart rendering and look-and-feel software, if you will, to optimize the delivery of movies, making them, say, twice as fast to render out of the hours and hours it takes to render a movie. Even if we can add a 10% improvement, that’s a cost benefit.

Interestingly, our code runs anywhere on a render farm. If a farm is running 24×7 with Embree, Open VKL, or Open Image Denoise in the pipeline, that class of code accounts for 50% to 80% of the actual execution time, because that’s where all the compute happens. It’s the more generic part of the work: once they get the answer from our code, they can do what’s called shading and deliver the pixel their own way. One major use of oneAPI, and one of its benefits, is the flexibility that, compute platform after compute platform, it just works. We make it so it just works; all they have to do is integrate our latest code.

So that’s one of the major uses: visual effects and all of that work involve this enormous amount of compute, and we try to make it as fast as possible.

Swapnil Bhartiya: Jim, thank you so much for talking to me, not only about the oneAPI Rendering Toolkit and all those exciting use cases, but also for sharing how it goes beyond film, with automotive as one example. I would love to have you back on the show, not only to talk about this initiative but also about the other projects out there that have an open-source perspective. Thank you for your time today.

Jim Jeffers: Oh, I loved it. It was great talking to you. I’d be happy to come back when the time is right. Thanks to everybody listening, and we’ll see you later.

Swapnil Bhartiya
Swapnil Bhartiya is a seasoned journalist and media personality. He is the founder, show-host and CEO of TFiR.io.