Physically-Based Rendering on PowerVR – Part 1 – The Model

ImageBasedLighting is one of the most visually exciting examples included with the PowerVR SDK. For those of you who are unaware, this demo makes use of physically-based rendering combined with image-based lighting to produce a result that looks stunning but still runs smoothly, particularly on mobile PowerVR GPUs.

(Image: the ImageBasedLighting demo)

We thought ImageBasedLighting was such an interesting demo that it deserved a bit more of an in-depth discussion, so we put our heads together to produce our guide to this example, Physically-Based Rendering with Image-Based Lighting for PowerVR, which is available now on our new documentation website.

This guide gives you an overview of how this example works, including:

  • a breakdown of the basic ingredients of the demo
  • a discussion of the assets used
  • how these assets were processed to prepare them for use in the application
  • why certain image formats were chosen
  • how the large value ranges of high dynamic range images were managed
  • how the various shaders used in the demo work
  • how the demo was optimised to make physically-based rendering possible in real-time
  • and why PVRTexTool is awesome

Phew! That’s a lot, but it shows how much work was put into this example to get it right.

This post will give you a bit of a taste of what’s in store in this document, focussing on the assets used in the demo and how they were processed. In later posts, we’ll take a look at some of the other assets as well as how the shaders were optimised to get the application running smoothly.

But before we go any further it is important to mention that the implementation used in ImageBasedLighting is largely based on the Epic Games publication, Real Shading in Unreal Engine 4 by Brian Karis. This publication provided the basis of the demo and Epic has our thanks for this.

So, without further ado, let’s dive in!

Wait, wait, wait… Physically-Based Rendering?

Right, of course. I’m sure many of you reading this have heard of physically-based rendering (PBR) and image-based lighting (IBL), but to make sure everybody is on the same page:

Physically-based rendering is a set of techniques which attempt to model how light interacts with objects based on the real material properties of those objects. These material properties include albedo, emissivity, roughness, and metalness, all of which can change how light bounces off an object. These properties are often encoded in texture maps which are sampled by the shaders at runtime. This is great for artists and designers because they can describe objects in a much more intuitive way by tweaking these material properties.
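
To make this concrete, here is a rough sketch of how the start of a PBR fragment shader might fetch these properties. The sampler and variable names are our own for illustration, not the demo’s actual source:

```glsl
#version 310 es
precision mediump float;

// Illustrative material bindings - in practice several properties
// are packed into one texture, as described later in this post.
uniform sampler2D albedoMap;
uniform sampler2D roughnessMap;
uniform sampler2D metalnessMap;

in vec2 vUV; // texture coordinates from the vertex shader
layout(location = 0) out vec4 fragColour;

void main()
{
    // Each material property is just a texture fetch at runtime.
    vec3  albedo    = texture(albedoMap, vUV).rgb;
    float roughness = texture(roughnessMap, vUV).r;
    float metalness = texture(metalnessMap, vUV).r;

    // The lighting model combines these values to shade the pixel;
    // here we simply output the albedo as a stand-in.
    fragColour = vec4(albedo, 1.0);
}
```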

This is also great for the rest of us, because it produces very realistic object lighting, meaning we get to see great results like this:

This is taken from the Dwarf Hall demo created by the PowerVR Demo Team.

Image-Based Lighting

Image-based lighting (IBL) is focussed on trying to simulate something called global illumination. Global illumination is based on the idea that everything in a 3D scene (and the real world) is at least a little bit reflective and the light falling on an object doesn’t just come from direct, explicit light sources like the Sun, but from all objects in the scene. In IBL, global illumination is approximated using a set of images to capture the light coming from all directions. In our demo, this takes the form of an environment map which encodes all of the light information from a scene in the real world. The results from this approach look great when the light is coming from a faraway environment, however, it is currently too expensive to be anything other than static. The great advantage of using static image-based lighting is that it makes physically-based rendering viable across a broad range of GPUs, even ones on the lower end.
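
As a sketch of how that environment map gets used in a shader, assume a prefiltered cubemap whose higher mip levels store progressively blurrier versions of the scene (the scheme from the Karis paper mentioned above). The names here are ours:

```glsl
// Assumed prefiltered environment cubemap: mip 0 holds a sharp,
// mirror-like view of the scene; higher mips hold blurrier ones.
uniform samplerCube environmentMap;
uniform float maxMipLevel; // typically log2 of the cubemap face size

vec3 specularIBL(vec3 normal, vec3 viewDir, float roughness)
{
    // The environment captures light from every direction, so we
    // look it up along the reflection vector; rougher surfaces read
    // blurrier (higher) mip levels.
    vec3 reflectDir = reflect(-viewDir, normal);
    return textureLod(environmentMap, reflectDir, roughness * maxMipLevel).rgb;
}
```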

Physically-based rendering and image-based lighting are widely seen across the gaming industry, with Unity and Unreal supporting physically-based rendering pipelines by default.

Now that we’re all up to speed on the basics, we can take a look at some of the assets which made our physically-based lighting demo possible.

The Assets

Assets are those pieces of content which are created prior to runtime – think textures, models, audio, and so on. If you were reading the paragraphs above carefully, you may have already spotted some references to important assets.

The assets used in the demo can be divided into four different categories:

  1. A model and its associated textures
  2. The bidirectional reflectance distribution function (BRDF) stored in a look-up table
  3. The environment map
  4. Global illumination maps

Today we’re going to focus on the first of these categories.

The Model and its Textures

The model file describes the mesh of the object that is going to be rendered in the demo. In ImageBasedLighting we used the Damaged Helmet model, which is included in the glTF Sample Models GitHub repository managed by Khronos.

(Image: the Damaged Helmet model)

This model is great for demonstrating physically-based rendering, as it has a fantastic mix of different types of materials and surfaces, including a gently curved glass faceplate, more angular metal surfaces (some polished and shiny and others scratched and battle-damaged), duller non-metallic tubes, and a light-emitting HUD. Combining all of these different types of surfaces in one compact object shows how accurately physically-based rendering can model the diffuse and specular reflection effects of light.

If you want to have a play with this model in glorious 3D, take a look at it on Sketchfab. Or download the PowerVR SDK and run the ImageBasedLighting demo.

The information about how reflective, metallic, or rough different areas of the model are is stored in texture maps which are helpfully provided with the model. This model ships with basically all of the textures you might need to produce a realistic result.

These are:

The Albedo Map

The albedo is the base colour of the object, the colour of its diffuse reflection, meaning this texture doesn’t contain any shadow or directional lighting information. You can think of it as the colour of the object under neutral, white light.

The Metalness and Roughness Maps

The roughness is self-explanatory: it describes how rough or smooth the surface is. In PBR, the roughness determines how specular reflections occur on a surface. On a perfectly smooth surface like a mirror, light is reflected at the same angle at which it arrived, meaning the outside world is reflected perfectly in the surface. On very rough surfaces, the incoming light is scattered at various angles, almost approximating diffuse light and leading to a very fuzzy reflection. Values in the roughness map vary between 0.0 and 1.0.

Without going too far into the physics of materials, the metalness determines whether the material should be interpreted as an insulator or a conductor. Insulators like wood or ceramics tend to absorb and scatter light, producing diffuse reflections while conductors like metals produce specular reflections. The metalness, therefore, determines how the calculations for diffuse and specular light should be combined. For a metalness of 0.0, the material is an insulator, so we’ll exclusively use the diffuse contribution to the colour. For a metalness of 1.0, the material is considered a conductor, so we’ll only use the specular contribution. Now, while a material can’t be half a conductor or half an insulator, objects can be layered with different materials, so for metalness values between 0.0 and 1.0, the diffuse and specular contributions are combined.
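
In shader terms, this blend is commonly written along the following lines. This is a convention from the wider PBR literature (including the Karis paper), not necessarily the demo’s exact code:

```glsl
// Derive the diffuse and specular inputs from albedo and metalness.
void metalnessBlend(vec3 albedo, float metalness,
                    out vec3 diffuseColour, out vec3 specularF0)
{
    // Insulators reflect roughly 4% of light specularly, while
    // metals tint their specular reflection with the albedo.
    specularF0 = mix(vec3(0.04), albedo, metalness);

    // Metals have no diffuse term, so it fades out with metalness;
    // in-between values model layered materials.
    diffuseColour = albedo * (1.0 - metalness);
}
```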

The metalness and roughness each only take values ranging from 0.0 to 1.0, which means each one requires only a single channel of a texture to store all of its information. It makes sense, then, to pack these two maps together into a single texture, with each colour channel representing a different property. You can see this in the image below: the metalness is stored in the blue channel while the roughness is stored in the green.

The Emissive Map

This texture contains information on any self-emitting light sources on the surface. These can be features like LEDs or LCD displays. After all of the fancy PBR maths in the shaders, the contribution from the emissive map is added to the final pixel colour.
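
In the shader, that final step might look something like this sketch (the function and sampler names are ours):

```glsl
// The emissive contribution skips the lighting maths entirely and
// is simply added onto the lit colour at the end.
vec3 addEmissive(vec3 litColour, sampler2D emissiveMap, vec2 uv)
{
    return litColour + texture(emissiveMap, uv).rgb;
}
```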

In the image below you can see that there are only a few small areas which emit their own light. These are mainly on the faceplate where there is a HUD.

The Normal Map

The normal map describes the medium-scale structure of the surface. Information about any bumps or dents on the object is captured in this map.

The surface normals sampled from this map will be used when calculating the diffuse contribution to the lighting.
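
A typical tangent-space normal map decode looks something like the sketch below; the demo’s own vertex inputs and names may differ:

```glsl
// Turn a normal map sample into a world-space surface normal.
vec3 decodeNormal(sampler2D normalMap, vec2 uv, mat3 tbn)
{
    // Channels are stored in [0, 1]; remap them to [-1, 1].
    vec3 tangentNormal = texture(normalMap, uv).xyz * 2.0 - 1.0;

    // Transform from tangent space to world space using the
    // interpolated tangent/bitangent/normal basis.
    return normalize(tbn * tangentNormal);
}
```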

The Ambient Occlusion Map

The ambient occlusion map tells the renderer how bright different areas of the object will be in ambient light.

The ambient occlusion can be precalculated based on the geometry of the model. If light of equal intensity is hitting the object from all directions, the more exposed parts of the object will be brighter while the more enclosed or sheltered areas will be darker. The amount of shading is generally calculated by projecting rays out in all directions from points on the surface of the object. The greater the proportion of those rays which hit another part of the object, the darker that point will appear on the ambient occlusion map.
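
Because all of that ray casting happens ahead of time, applying the result at runtime is cheap: the sampled occlusion value simply scales the ambient lighting. A sketch, with our own names:

```glsl
// Darken sheltered areas of the surface using the baked map.
vec3 applyAmbientOcclusion(vec3 ambientLight, sampler2D aoMap, vec2 uv)
{
    float occlusion = texture(aoMap, uv).r; // 1.0 = fully exposed
    return ambientLight * occlusion;
}
```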


Processing the Textures

We had to do a little bit of preparation before these textures were ready for the demo. Most of the work simply involved compressing the original JPEG files to Imagination’s proprietary texture compression format, PVRTC. This was done in only a couple of clicks using PVRTexTool and meant that the textures took up less disk space while maintaining high image quality.

We also did a bit of texture packing. As mentioned before, the metalness and roughness maps had already been packed together into a single texture, stored in that texture’s blue and green channels, respectively. This is good, but there’s still room for improvement, as the texture can be packed further by using the empty red channel. We chose to pack the ambient occlusion map into this texture because, like metalness and roughness, it only has values between 0.0 and 1.0, so all of its data fits into a single channel. Again, it was PVRTexTool to the rescue, as this could be accomplished using a single command in the PVRTexTool CLI (command-line interface). Packing like this reduces the number of textures that have to be retrieved from memory at runtime, which helps overall performance.
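
The payoff in the shader is that a single texture fetch now retrieves all three surface properties. A sketch using the channel layout described above (the sampler name is ours):

```glsl
// One fetch instead of three: occlusion, roughness, and metalness
// packed into the red, green, and blue channels respectively.
uniform sampler2D occRoughMetalMap;

void sampleSurfaceProperties(vec2 uv, out float occlusion,
                             out float roughness, out float metalness)
{
    vec3 orm = texture(occRoughMetalMap, uv).rgb;
    occlusion = orm.r;
    roughness = orm.g;
    metalness = orm.b;
}
```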

Conclusion

That’s all you get for now on physically-based rendering. If you want to read more about how these textures were used to create the ImageBasedLighting demo check out the full guide, Physically-Based Rendering with Image-Based Lighting for PowerVR, at docs.imgtec.com.

Next time we’ll take a look at the BRDF, environment, and global illumination maps and see how they were used in the demo.


Before you go…

Our new documentation website isn’t just for physically-based lighting; we’ve got plenty of interesting documents whatever your level of knowledge.

So, if you’re interested, please take a look at our new, regularly updated website.

Do feel free to leave feedback through our usual forums or ticketing systems.

You can also follow @tom_devtech on Twitter for Developer Technology-related news, or @powervrinsider for the latest on PowerVR!

Tom Lewis

Tom Lewis is a graduate technical author in the PowerVR Developer Technology team at Imagination. He is responsible for producing documentation to support the PowerVR SDK and Tools, including user manuals and guides. Outside of this, you will probably find him cycling up a hill that is far too steep or catching up on the latest PC game releases.
