HDRR is currently prevalent in games. Though these games are mostly developed for the PC, it is also possible to render scenes with high dynamic range on Microsoft's Xbox 360 and Sony's PlayStation 3, and the effect has been simulated on the PlayStation 2, GameCube, and Xbox. In desktop publishing and gaming, color values are often processed several times over. Because this processing includes multiplication and division, it is useful to have the extended accuracy and range of 16-bit integer or 16-bit floating-point formats, irrespective of the hardware limitations mentioned above.
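As a rough illustration of why that extra precision matters, the following self-contained C++ sketch (the starting value and scale factor are invented for illustration) runs a color value through repeated scale-down/scale-up passes, once with every intermediate result stored in 8 bits and once kept in floating point:

```cpp
#include <cstdint>
#include <cstdio>

// Repeatedly scale a color value down and back up, as can happen when a
// pipeline applies and later undoes exposure- or gamma-like adjustments.
// With 8-bit storage every intermediate result is truncated to an integer,
// so error accumulates; the floating-point chain returns to its start.
int main() {
    const double scale  = 0.37;  // arbitrary illustrative factor
    const int    passes = 8;

    uint8_t quantized = 57;      // value stored in 8 bits after every step
    double  precise   = 57.0;    // value kept in floating point throughout

    for (int i = 0; i < passes; ++i) {
        quantized = static_cast<uint8_t>(quantized * scale);  // truncates
        quantized = static_cast<uint8_t>(quantized / scale);  // truncates

        precise = (precise * scale) / scale;
    }

    std::printf("8-bit pipeline:  %d\n", quantized);  // drifts well below 57
    std::printf("float pipeline:  %.2f\n", precise);  // stays at 57.00
    return 0;
}
```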
Development of HDRR through DirectX
Complex shader effects began with the release of Shader Model 1.0 in DirectX 8. Shader Model 1.0 illuminated 3D worlds with what is now called standard lighting. However, standard lighting had two problems:
1. Lighting precision was confined to 8-bit integers, which limited the contrast ratio to 256:1. Using the HSV color model, the value (V), or brightness, of a color has a range of 0–255. This means the brightest white (a value of 255) is only 255 levels brighter than the darkest shade above pure black (i.e., a value of 1).
2. Lighting calculations were integer-based, which didn't offer much accuracy because the real world is not confined to whole numbers. “Nature isn't clamped to [0..1], neither should CG” [4]. (See the sketch after this list.)
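To make these two problems concrete, here is a minimal, self-contained C++ sketch (the light values are invented for illustration) showing how an 8-bit, clamped lighting result discards brightness information that a floating-point result preserves:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Two overlapping light contributions hitting the same pixel.
// Real scene luminances can add up well past the brightest storable value.
int main() {
    const double lightA = 180.0;   // e.g. bounced sunlight
    const double lightB = 140.0;   // e.g. a nearby lamp

    // 8-bit "standard lighting": the sum is clamped into [0, 255],
    // so the pixel can never be recorded as brighter than 255.
    const uint8_t clamped =
        static_cast<uint8_t>(std::min(lightA + lightB, 255.0));

    // Floating-point lighting: the true sum (320) survives and can be
    // mapped to the display later without losing relative brightness.
    const double unclamped = lightA + lightB;

    std::printf("8-bit, clamped:  %d\n", clamped);     // 255
    std::printf("floating point:  %.1f\n", unclamped); // 320.0
    return 0;
}
```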
Before HDRR was fully developed and implemented, games would create an illusion of HDR by using light blooming and, in some cases, an option called "Enhanced Contrast Settings".
On December 24, 2002, Microsoft released a new version of DirectX. DirectX 9.0 introduced Shader Model 2.0, which offered one of the components necessary for high dynamic range rendering: lighting precision was no longer limited to just 8 bits. Although 8 bits remained the minimum in applications, programmers could choose up to 24 bits of lighting precision. However, all calculations were still integer-based. One of the first graphics cards to support DirectX 9.0 natively was ATI's Radeon 9700, though the effect wasn't programmed into games for years afterwards. On August 23, 2003, Microsoft updated DirectX to DirectX 9.0b, which enabled the Pixel Shader 2.x (Extended) profile for ATI's Radeon X series and NVIDIA's GeForce FX series of graphics processing units.
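As a hedged sketch of what taking advantage of this looked like in practice (it assumes an already-created IDirect3D9 interface and Direct3D 9 device, uses an X8R8G8B8 display mode purely for illustration, and support for the format shown varied between SM 2.0-era GPUs), a renderer could ask the DirectX 9 runtime for a render target with more than 8 bits per channel and fall back to the standard format otherwise:

```cpp
#include <d3d9.h>

// Try to create a 16-bit-per-channel floating-point render target
// (D3DFMT_A16B16G16R16F). Returns nullptr if the adapter cannot render
// to that format, in which case the caller would fall back to 8-bit.
IDirect3DTexture9* TryCreateHdrTarget(IDirect3D9* d3d,
                                      IDirect3DDevice9* device,
                                      UINT width, UINT height)
{
    const D3DFORMAT hdrFormat = D3DFMT_A16B16G16R16F;

    // Ask whether the format is usable as a render target with an
    // X8R8G8B8 display mode (assumed here for illustration).
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,
                                        D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE,
                                        hdrFormat);
    if (FAILED(hr))
        return nullptr;

    IDirect3DTexture9* target = nullptr;
    hr = device->CreateTexture(width, height, 1,
                               D3DUSAGE_RENDERTARGET,
                               hdrFormat,
                               D3DPOOL_DEFAULT,
                               &target, nullptr);
    return SUCCEEDED(hr) ? target : nullptr;
}
```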
On August 9, 2004, Microsoft updated DirectX once more to DirectX 9.0c, which also exposed the Shader Model 3.0 profile for the High Level Shading Language (HLSL). Shader Model 3.0's lighting precision, according to Dr. Sim Dietrich Jr., has a minimum of 32 bits, as opposed to 2.0's 8-bit minimum, and all lighting-precision calculations are now floating-point based. NVIDIA states that contrast ratios using Shader Model 3.0 can be as high as 65535:1 using 32-bit lighting precision.
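To connect that 65535:1 figure to something concrete, the following self-contained C++ sketch (luminance values invented for illustration) computes the contrast ratio of a small floating-point lighting buffer; an 8-bit integer buffer could never hold a ratio beyond the roughly 256:1 described above:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Floating-point luminance values for a handful of pixels:
    // deep shadow, interior light, and direct sunlight in the same frame.
    const float luminance[] = { 1.0f, 12.5f, 340.0f, 65535.0f };
    const int count = sizeof(luminance) / sizeof(luminance[0]);

    float darkest   = luminance[0];
    float brightest = luminance[0];
    for (int i = 1; i < count; ++i) {
        darkest   = std::min(darkest, luminance[i]);
        brightest = std::max(brightest, luminance[i]);
    }

    // With floating-point lighting the ratio is limited by the format,
    // not by 8-bit storage, so a 65535:1 scene remains representable.
    std::printf("contrast ratio: %.0f:1\n", brightest / darkest); // 65535:1
    return 0;
}
```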