Physically-Based Rendering with PowerVR – Part 2


A few weeks ago we introduced Physically-Based Rendering with Image-Based Lighting for PowerVR, which is published on docs.imgtec.com. This document is a guide to the PowerVR SDK demo, ImageBasedLighting, which implements physically-based rendering (PBR) with image-based lighting (IBL). If these terms don’t mean a lot to you, why not take a look at part one of this blog post series, which covered the basics of PBR and IBL, as well as some of the assets used in the demo.

Today we’ll be looking at some of the more complex assets which we had to generate ourselves: the bi-directional reflectance distribution function (BRDF) lookup texture, the environment map, the irradiance map, and the prefiltered reflection map.

These are vital for image-based lighting. If you remember from last time, the approach of IBL is to approximate all of the light coming from a scene (global illumination) by using a series of images which encode the intensity of light from a real-world scene.

Let’s start with the first of these images, the environment map…

The Environment Map

The environment map stores the light energy captured from a real-world scene. In ImageBasedLighting, we used an unclipped HDR photo from HDRI Haven. This image gives a 360-degree view of the scene around an object, so it captures both the intensity and direction of light. This is great because we can simply sample this texture at runtime to determine the light hitting our object.
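To make "sampling by direction" concrete: for an equirectangular image like this one, a world-space direction maps to a pixel as in the C++ sketch below. This is our own illustration rather than the demo's code, and the Vec3 type and the UV convention are assumptions we'll reuse in the later sketches in this post.

```cpp
#include <cmath>

// Minimal 3-component vector used throughout these sketches.
struct Vec3 { float x, y, z; };

const float kPi = 3.14159265358979f;

// Map a world-space direction (unit length) to UV coordinates on an
// equirectangular 360-degree image. The exact convention (which axis
// is "up", where u = 0 sits) varies between tools.
void directionToEquirectUV(const Vec3& dir, float& u, float& v)
{
    u = 0.5f + std::atan2(dir.z, dir.x) / (2.0f * kPi);
    v = 0.5f - std::asin(dir.y) / kPi;
}
```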

We did a bit of processing on the original image to get it ready for the demo, including converting it to a cubemap, transcoding it to a PVRTC compressed image format, and saving it as a PVR texture. This is covered in much more detail in Physically-Based Rendering with Image-Based Lighting for PowerVR, but let’s just say you should download PVRTexTool (it saves a lot of hassle!).

So, we’ve got the information about the light hitting our object. Next, we need to look at a little bit of physics to figure out how the surface of the object interacts with the incoming light.

The BRDF and Diffuse Reflection

In simple terms, the bidirectional reflectance distribution function (BRDF) determines how light reflects off a surface. More specifically, this function tells you what proportion of the light hitting a surface from one (incident) direction is reflected in another (viewing) direction.

The simplest possible BRDF is a constant. This is the case for diffuse reflection (Lambertian reflectance) where the apparent reflected brightness from a surface is the same in all directions.

The BRDF in this case is simply given by:

$$f_{\mathrm{Lambert}} = \frac{\rho}{\pi}$$

where ρ is the albedo, which we mentioned in the previous post.

In PBR, the albedo is considered the base colour of the object, or the colour under normal, white light. The amount of reflected diffuse light from a point on the surface is then given by:

$$L_o = \frac{\rho}{\pi}\,(\mathbf{n}\cdot\mathbf{l})\,L_i$$

where $\mathbf{n}$ is the normal vector at the point on the surface, and $\mathbf{l}$ and $L_i$ are the direction and intensity of the incoming light.
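As an illustration, here's the same diffuse term as a minimal C++ sketch for a single incoming light direction, extending the Vec3 helpers from the earlier sketch (all names are ours):

```cpp
#include <algorithm>

// Componentwise and scalar operations for the Vec3 type above.
Vec3 operator+(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(const Vec3& a, const Vec3& b) { return {a.x * b.x, a.y * b.y, a.z * b.z}; }
Vec3 operator*(const Vec3& a, float s)       { return {a.x * s, a.y * s, a.z * s}; }
float dot(const Vec3& a, const Vec3& b)      { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Lambertian diffuse for one incoming light direction l carrying
// radiance Li: (albedo / pi) * max(n . l, 0) * Li.
Vec3 lambertDiffuse(const Vec3& albedo, const Vec3& n,
                    const Vec3& l, const Vec3& Li)
{
    float nDotL = std::max(dot(n, l), 0.0f);
    return albedo * Li * (nDotL / kPi);
}
```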

So, calculating the diffuse colour of a particular pixel on the surface of an object should be simple, right? You just need to sample the environment map and add up the light contributions from all directions. Well, unfortunately, since we’re trying to approximate global illumination, every texel in a hemisphere of the environment map can contribute as a light source to the colour of a single pixel on the object. The environment map used in ImageBasedLighting is 2048×1024, so that’s a lot of texels to sample. It gets even worse, because we have to sample the entire environment map for every single point on the object. This would be impractical to do in real-time.

Instead, we pre-calculated this and stored the result in a texture called an irradiance map. Like the environment map, this map was converted to a cubemap – the images below show each of its faces.

More information on how this map is actually calculated can be found in the main document; the sketch below shows the general idea. At runtime, the irradiance map can simply be sampled with the surface normal (from the normal map) of a particular pixel to determine the diffuse contribution to the colour of that pixel.
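Here's a hedged C++ sketch of that baking step (our own illustration, not the demo's actual code), reusing the Vec3 helpers from the sketches above: for each output direction, average cosine-weighted samples of the environment map.

```cpp
#include <cmath>
#include <random>

// Two more helpers for the Vec3 type from the earlier sketches.
Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
Vec3 normalize(const Vec3& v) { return v * (1.0f / std::sqrt(dot(v, v))); }

// Hypothetical stand-in for fetching HDR radiance from the environment
// cubemap in a given world-space direction.
Vec3 sampleEnvironment(const Vec3& dir)
{
    (void)dir;
    return {1.0f, 1.0f, 1.0f}; // placeholder: replace with a real cubemap fetch
}

// One texel of the irradiance map: a Monte Carlo estimate of the
// cosine-weighted integral of environment radiance over the hemisphere
// around the normal n.
Vec3 convolveIrradiance(const Vec3& n, int sampleCount)
{
    // Build an orthonormal basis (t, b, n) around the normal.
    Vec3 up = std::fabs(n.z) < 0.999f ? Vec3{0.0f, 0.0f, 1.0f}
                                      : Vec3{1.0f, 0.0f, 0.0f};
    Vec3 t = normalize(cross(up, n));
    Vec3 b = cross(n, t);

    std::mt19937 rng(1234);
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);

    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < sampleCount; ++i)
    {
        // Cosine-weighted hemisphere sample: pdf = cos(theta) / pi,
        // which cancels the cos(theta) factor in the integrand.
        float u1 = u01(rng), u2 = u01(rng);
        float phi = 2.0f * kPi * u1;
        float sinTheta = std::sqrt(u2);
        float cosTheta = std::sqrt(1.0f - u2);
        Vec3 local{ sinTheta * std::cos(phi),
                    sinTheta * std::sin(phi),
                    cosTheta };
        Vec3 dir = t * local.x + b * local.y + n * local.z;
        sum = sum + sampleEnvironment(dir);
    }
    // Monte Carlo estimate of E = integral of Li * cos(theta) d-omega.
    return sum * (kPi / static_cast<float>(sampleCount));
}
```

Because all of this happens offline, the shader only pays for a single texture fetch per pixel at runtime.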

The BRDF and Specular Reflection

We’ve got our diffuse lighting, which will give some nice flat, even lighting on the model. This is fine, but it’s not that interesting, and it doesn’t take into account all of the material textures we mentioned in the previous post. For that, we need specular reflections and a much more complicated BRDF.

We could assume the surface is perfectly smooth, like a mirror, so light rays would simply be reflected around the surface normal rather than being diffused.

[Figure: Mirror specular reflection]

However, most objects are rougher than a perfect mirror. We need a BRDF which can modify the specular reflection depending on the roughness of the surface. In ImageBasedLighting, we used a Cook-Torrance microfacet specular model. We won’t be looking at the specific details of this model today (there are many great resources online already), but in short, it is based around the idea that a rough surface can be approximated as a series of microfacets. These can be thought of as very small bumps and dents in the surface that reflect light perfectly about their local surface normals, like mirrors. The directions of the surface normals of these microfacets vary, meaning incoming light rays are reflected in slightly different directions depending on which microfacet they hit.

[Figure: Microfacet reflections]

This leads to the specular reflection spreading out slightly, forming a lobe and becoming a bit more blurry.

[Figure: The specular lobe]

For very smooth surfaces, the surface normals of the microfacets pretty much line up, all pointing in the same direction. The distribution of microfacet surface normals varies with the roughness of the surface: the rougher the surface, the greater the variation in surface normals and the more diffused the specular component becomes. For really rough surfaces, the specular reflection begins to approximate diffuse light. Microfacet models simulate the cumulative effect of a large number of microfacets on the specular reflection. Calculating the full Cook-Torrance model at runtime would not normally be possible on mobile hardware; luckily, we can use an approach called the “split-sum approximation” to push much of the calculation offline.

Unsurprisingly, this approach splits the calculation into two sums which can be computed individually and then multiplied together to get the final specular colour. The results of these two parts of the BRDF can be stored in two textures, which can simply be sampled at runtime, avoiding a set of expensive calculations. As we mentioned in the last post, this method is based on Real Shading in Unreal Engine 4 (PDF) by Brian Karis, who suggested this “split-sum approximation” approach.
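Schematically, the factorisation from Karis's notes looks like this, where the specular integral over incoming light directions $\mathbf{l}$ (with importance-sampling pdf $p$) is approximated by the product of two sums:

$$\int_{\Omega} L_i(\mathbf{l})\, f(\mathbf{l},\mathbf{v})\, \cos\theta_l \; d\mathbf{l} \;\approx\; \left(\frac{1}{N}\sum_{k=1}^{N} \frac{f(\mathbf{l}_k,\mathbf{v})\,\cos\theta_{l_k}}{p(\mathbf{l}_k,\mathbf{v})}\right) \left(\frac{1}{N}\sum_{k=1}^{N} L_i(\mathbf{l}_k)\right)$$

The first sum becomes the BRDF lookup table described next; the second becomes the prefiltered reflection map.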

The BRDF Lookup Table

The first sum in the split-sum doesn’t depend on either the environment or the model, so it is the same for any implementation. It is stored in the 2D lookup texture shown below.

 

[Figure: The BRDF lookup table, with labels]

This texture essentially encodes how the roughness and viewing angle (θv) modify the value of F0. F0 is a property of the reflecting surface called the specular reflectance at normal incidence. This is a fancy term for the proportion of light which is reflected back as specular when incoming light hits the surface perpendicularly. It can be thought of as the specular equivalent of the albedo in the diffuse case.

The lookup table can be sampled with a roughness and a viewing angle to find a scale and a bias for F0. In the shader, F0 is multiplied by the scale and the bias is added afterwards.
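In code, that last step is tiny. A sketch, with scale and bias standing in for the two channels sampled from the lookup table (the function name is ours):

```cpp
// Combine F0 with the scale and bias sampled from the BRDF lookup
// table at (cos(theta_v), roughness).
Vec3 applyEnvBRDF(const Vec3& F0, float scale, float bias)
{
    return { F0.x * scale + bias,
             F0.y * scale + bias,
             F0.z * scale + bias };
}
```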

The Prefiltered Reflection Map

The second part of the split-sum approximation uses the environment map to model the effect of increasing roughness on the specular reflection. As we’ve mentioned before, as you increase the roughness, the specular reflection gets more and more scattered. This means smooth surfaces reflect the surrounding environment perfectly, while very rough surfaces produce a blurry (almost diffuse) image of the environment.

So how did we capture this behaviour in a texture? First, we took a range of roughness values between 0.0 and 1.0 and fed them into the maths of the model. The output told us exactly how to blur the environment map for each roughness, so it was simply a matter of sampling the environment map and blurring appropriately. The images below show the result of this process in one direction of the map, but the actual texture is a cubemap.

Now here’s the clever bit. You may notice that as the roughness increases in the images above, the resolution gets progressively smaller. What graphics technique deals with progressively smaller versions of a texture (hint)?

Yes, we stored the different roughnesses in different mipmap levels. This is an elegant solution, as at runtime we can simply sample this mipmapped texture using the roughness value as the level-of-detail parameter. There are potentially a few issues with this approach, but these are covered in the main document.
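Assuming the roughness range is spread linearly across the mip chain (the demo's exact mapping is in the main document), the level-of-detail lookup might be sketched as:

```cpp
// Map a roughness in [0, 1] to a mip level of the prefiltered
// reflection map, assuming a linear spread across numMipLevels levels.
float roughnessToLod(float roughness, int numMipLevels)
{
    return roughness * static_cast<float>(numMipLevels - 1);
}
```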

Putting it all together

All of this work preparing these textures means that, at runtime, all the shader has to do is the following (there’s a code sketch of these steps after the list):

  • Find the first sum by calculating the specular reflectance and applying the scale and bias from the BRDF lookup table
  • Sample the prefiltered environment map to get the second sum
  • Multiply these two together to obtain the specular colour
  • Sample the irradiance map to get the diffuse light from the environment
  • Multiply this value by the albedo to find the reflected diffuse colour
  • Multiply by a factor related to the metalness. The metalness determines the balance between diffuse and specular reflections: completely metallic surfaces will only produce specular reflections, while totally non-metallic surfaces will only produce diffuse reflections. You can read more about metalness in the previous post or in the main document.
  • Add the diffuse colour and specular colour to get the final pixel colour
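Here's a hedged C++ sketch of that composition, reusing the helpers from the earlier sketches. The inputs are assumed to have already been sampled from the three textures, and the 4% dielectric F0 and the (1 - metalness) diffuse fade are common PBR conventions rather than necessarily the demo's exact factors:

```cpp
// Per-pixel composition mirroring the steps above. 'prefiltered',
// 'irradiance', 'scale' and 'bias' are assumed to have been fetched
// from the prefiltered reflection map, the irradiance map and the
// BRDF lookup table respectively.
Vec3 shadePixel(const Vec3& albedo, float metalness,
                const Vec3& prefiltered, const Vec3& irradiance,
                float scale, float bias)
{
    // A common convention: dielectrics reflect roughly 4% at normal
    // incidence, while metals tint the reflection with their albedo.
    Vec3 F0 = Vec3{0.04f, 0.04f, 0.04f} * (1.0f - metalness) + albedo * metalness;

    // Specular: (first sum) x (second sum).
    Vec3 specular = prefiltered * applyEnvBRDF(F0, scale, bias);

    // Diffuse: environment irradiance x albedo, faded out for metals.
    Vec3 diffuse = irradiance * albedo * (1.0f - metalness);

    return diffuse + specular;
}
```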

As you can see, there are no major calculations in any of these steps. Much of the work has already been done by pre-baking calculations into textures, which makes this implementation of PBR possible even on lower-end hardware.

Conclusion

So, there we have it. This and the previous post on physically-based rendering with PowerVR give a very brief overview of all of the assets used in ImageBasedLighting. However, if you want the full picture of how all of the assets were created and processed, as well as how the shaders were optimised for PowerVR GPUs, you really need to check out Physically-Based Rendering with Image-Based Lighting for PowerVR and take a look at the ImageBasedLighting demo itself in the PowerVR SDK.

Before you go…

Our new documentation website isn’t just for physically-based rendering with PowerVR; we’ve got plenty of interesting documents whatever your level of knowledge.

So, if you’re interested, please take a look at our new, regularly updated website, and do feel free to leave feedback through our usual forums or ticketing systems. You can also follow @tom_devtech on Twitter for Developer Technology-related news, or @powervrinsider for the latest on PowerVR!

Tom Lewis

Tom Lewis is a graduate technical author in the PowerVR Developer Technology team at Imagination. He is responsible for producing documentation to support the PowerVR SDK and Tools, including user manuals and guides. Outside of this, you will probably find him cycling up a hill that is far too steep or catching up on the latest PC game releases.
