You’ve probably been playing games or watching content in HDR via your PC, while missing a critical component – 10-bit video.

By default, both NVIDIA and AMD GPUs are configured to output RGB 8-bit. Windows 10, unlike game consoles, does not automatically switch the bit depth of the outgoing video when you launch an HDR game.

You might be wondering, “But my TV turns on its HDR mode and games look better.” That is indeed true – HDR is a collection of different pieces that, working together, create the HDR effect. Your PC is sending WCG (Wide Color Gamut)/BT.2020 metadata and other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.

How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which pixel formats. If you want to output true 10-bit, you’ll need to step down to a YUV422 signal. If you do require RGB/YUV444, you can send a 12-bit signal to the TV instead, but that signal still only contains 10-bit data in a 12-bit container; the TV will convert it back down to 10-bit. What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit.
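To see why stepping down to YUV422 makes room for higher bit depths, it helps to look at the raw numbers. The sketch below is a simplified back-of-the-envelope calculation, not a model of actual HDMI signaling: it ignores blanking intervals and link-level encoding overhead, and the resolution/refresh figures are just illustrative. The key idea is that RGB/YUV444 carries three full-resolution samples per pixel, while YUV422 carries full-resolution luma but half-resolution chroma, averaging two samples per pixel.

```python
# Rough uncompressed video data rates. This deliberately ignores HDMI
# blanking and TMDS/FRL encoding overhead -- it only compares raw pixel data.

def bandwidth_gbps(width, height, fps, bits_per_sample, samples_per_pixel=3.0):
    """Raw pixel-data rate in Gbit/s for the given format."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# Illustrative 4K @ 60 Hz figures (assumed, not from the article):
rgb_8      = bandwidth_gbps(3840, 2160, 60, 8)        # RGB/YUV444 8-bit
rgb_10     = bandwidth_gbps(3840, 2160, 60, 10)       # RGB/YUV444 10-bit
yuv422_10  = bandwidth_gbps(3840, 2160, 60, 10, 2.0)  # YUV422 10-bit: ~2 samples/pixel

print(f"RGB/YUV444  8-bit: {rgb_8:.1f} Gbit/s")
print(f"RGB/YUV444 10-bit: {rgb_10:.1f} Gbit/s")
print(f"YUV422     10-bit: {yuv422_10:.1f} Gbit/s")
```

Note that 10-bit YUV422 actually needs less raw data than 8-bit RGB, which is why the GPU can offer true 10-bit output in that mode over the same link.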