So colors, brightness, shadows, and lighting effects have to change dynamically, and to do this, your graphics card uses a special program called a pixel shader. Also Read: What Is Ray Tracing Technology and How It Works in GPUs
Pixel Shaders
Pixel shaders can perform many different functions in a system, but their main job is to make sure that each and every pixel is lit properly. They do this by first taking into account how much light a surface should be reflecting, then factoring in texture, transparency, and any nearby objects that could be casting shadows. Because they are responsible for so many graphical functions, shaders typically take a good bit of processing power to run. Applying a shading calculation to every single pixel has also become more and more computationally expensive, largely due to the continued rise of display resolutions. This is especially true for Virtual Reality (VR) gaming, where headsets require high resolutions and refresh rates to provide an immersive experience; otherwise, VR players would feel sick quite often. That is where variable rate shading comes in. Also Read: DirectX 12 Ultimate: All You Need To Know
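As a rough illustration of the work a pixel shader does for each pixel, here is a minimal C++ sketch of a simple diffuse (Lambertian) lighting calculation. The types and names are hypothetical, and real pixel shaders are written in GPU shading languages such as HLSL or GLSL, but the per-pixel idea is the same: combine the surface's texture color with how much light it should reflect and whether anything is shadowing it.

```cpp
#include <algorithm>
#include <cmath>

// Minimal 3-component vector; real shaders use built-in float3/vec3 types.
struct Vec3 {
    float x, y, z;
};

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// What a simple diffuse pixel shader computes for one pixel:
// scale the texture color by how directly the surface faces the light,
// then dim it by any shadowing.
Vec3 ShadePixel(const Vec3& surfaceNormal,   // unit normal at this pixel
                const Vec3& lightDirection,  // unit vector toward the light
                const Vec3& textureColor,    // color sampled from the texture
                float shadowFactor)          // 0 = fully shadowed, 1 = fully lit
{
    // Lambert's cosine law: reflected light falls off with the angle
    // between the surface normal and the light direction.
    float diffuse = std::max(0.0f, Dot(surfaceNormal, lightDirection));
    float lighting = diffuse * shadowFactor;
    return { textureColor.x * lighting,
             textureColor.y * lighting,
             textureColor.z * lighting };
}
```

Running this once per pixel of a 4K display, every frame, is what makes per-pixel shading so expensive.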
Variable Rate Shading
This exciting technique really helps lighten the system load, leading to very smooth VR experiences without necessarily requiring expensive hardware. Here's how it works. When you wear a VR headset, the image has to be lens corrected because you're viewing it through curved lenses instead of looking directly at a flat screen as you would on a traditional display. This means that some areas of the image don't have to be as detailed as others, since some of them will always sit in your peripheral vision or even be discarded entirely as part of the lens correction process. So instead of treating every individual pixel with equal weight, variable rate shading varies how much processing power is used for each part of the image. Game developers can instruct the GPU to provide as much detail as possible to the parts of the image directly in front of your eyes, while shading the parts in your peripheral vision in a quicker and dirtier fashion. Developers can also choose a single shading output for a block of up to 16 pixels: one pixel near the middle of the block is evaluated and the result is applied to the entire chunk of the image, which saves a significant amount of processing power. And if a pixel is going to be discarded completely by the time it reaches the headset, it doesn't have to be rendered at all. Also Read: What is Virtual Reality (VR) and how different is it from Augmented Reality (AR)
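To make the 16-pixel-block idea concrete, here is a small, self-contained C++ sketch of coarse shading done in software: one sample near the centre of each 4x4 block (16 pixels) is shaded, and the result is copied to the whole block. This is only an illustration of where the savings come from; the function names are invented, and real variable rate shading is performed by the GPU hardware (exposed through APIs such as DirectX 12 and Vulkan) rather than in application code like this.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// A trivial stand-in for an expensive per-pixel shading function.
uint32_t ShadeExpensively(int x, int y) {
    // Imagine lighting, texturing and shadow work happening here.
    return static_cast<uint32_t>((x * 31 + y * 17) & 0xFF) * 0x010101u;
}

// Coarse shading: evaluate one sample per 4x4 block and reuse the result
// for all 16 pixels, cutting shading work by up to 16x in that region.
void ShadeCoarse(std::vector<uint32_t>& framebuffer, int width, int height) {
    const int block = 4;  // 4x4 = up to 16 pixels share one shading result
    for (int by = 0; by < height; by += block) {
        for (int bx = 0; bx < width; bx += block) {
            // Evaluate a single pixel near the middle of the block.
            int sx = std::min(bx + block / 2, width - 1);
            int sy = std::min(by + block / 2, height - 1);
            uint32_t color = ShadeExpensively(sx, sy);

            // Apply that one result to the entire block.
            for (int y = by; y < by + block && y < height; ++y)
                for (int x = bx; x < bx + block && x < width; ++x)
                    framebuffer[y * width + x] = color;
        }
    }
}
```

In a real game, only the peripheral or lens-corrected regions would be shaded this coarsely, while the centre of the image keeps one shading result per pixel.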
Eye Tracking
More improvements are on the way as eye tracking finds its way into VR headsets. Eye tracking is set to be combined with variable rate shading so that your computer no longer has to dedicate full processing power to the parts of the image directly in front of your eyes at all times. With eye tracking, your console or PC can tell where you are looking at each moment and assign the most power to that spot, leaving your peripheral vision a little blurrier, as it would be anyway. The technique can still be useful for gaming without an eye-tracking VR headset. For instance, variable rate shading also helps in games with lots of uniform or redundant textures, such as large-scale simulation games where much of your screen's real estate is taken up by giant patches of land. Of course, this feature only started getting support last year (2019), so it might be a little while before we see tons of game developers implement it. However, if you're the type of person who gets nauseated when the frame rate on your Oculus Rift drops, I'm sure you feel like it can't come soon enough. Also Read: Facebook launches Oculus Quest, an All-in-One wireless VR headset
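As a hedged sketch of how gaze-driven rate selection could look, the C++ snippet below assigns each screen tile a coarser shading rate the farther it sits from the tracked gaze point. The tile size, thresholds, and enum values are illustrative assumptions rather than any particular headset's or API's actual numbers, though they mirror the full, half, and quarter rates that variable-rate-shading hardware typically exposes.

```cpp
#include <cmath>
#include <vector>

// Illustrative shading rates: how many pixels share one shading result.
enum class ShadingRate { Full1x1, Half2x2, Quarter4x4 };

// Pick a per-tile shading rate based on distance from the gaze point.
// Tiles under the gaze get full detail; the periphery is shaded coarsely.
std::vector<ShadingRate> BuildRateMap(int tilesX, int tilesY,
                                      float gazeX, float gazeY)  // in tile units
{
    std::vector<ShadingRate> rates(tilesX * tilesY);
    for (int ty = 0; ty < tilesY; ++ty) {
        for (int tx = 0; tx < tilesX; ++tx) {
            float dx = tx + 0.5f - gazeX;
            float dy = ty + 0.5f - gazeY;
            float dist = std::sqrt(dx * dx + dy * dy);  // distance in tiles

            // Illustrative thresholds: sharp near the gaze, coarser farther out.
            ShadingRate rate = ShadingRate::Quarter4x4;
            if (dist < 4.0f)       rate = ShadingRate::Full1x1;
            else if (dist < 10.0f) rate = ShadingRate::Half2x2;

            rates[ty * tilesX + tx] = rate;
        }
    }
    return rates;
}
```

Each frame, the gaze position from the eye tracker would be fed into a map like this, so the sharpest shading always follows your eyes rather than staying fixed at the centre of the lens.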