March 28, 2024

Over this past weekend the internet has been flooded with a lot of misinformation regarding AMD's ability to display HDR-ready content using HDMI 2.0, particularly in the few HDR-ready gaming titles that are available.

Contrary to what some online publications may be telling you, AMD is NOT limiting HDR colour depth when using HDMI 2.0; the HDMI standard itself is the limiting factor. Right now we assume that this release of false information is due to a lack of research on the part of writers working at these publications, and we will not mention them any further.

Now let's set the record straight: the problem here lies with the HDMI standard itself and not with AMD, as even Nvidia faces the same limitation when using HDMI to display HDR content at 4K.


The original story can be read here; it claimed that Radeon graphics cards were reducing the colour depth to 8 bits per channel (16.7 million colours) or 32-bit if the display was connected over HDMI 2.0 rather than DisplayPort 1.2, something that piqued my interest. 10 bits per channel (1.07 billion colours) is a much more desirable target for HDR TVs, but the original article made it seem like this was a limitation of AMD's, rather than one inherent to HDMI 2.0. Heise.de said that AMD GPUs reduce output sampling from the "desired full YCbCr 4:4:4 color scanning to 4:2:2 or 4:2:0 (color sub-sampling / chroma sub-sampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD 'Polaris' GPUs, including the ones that drive game consoles such as the PS4 Pro," reports TechPowerUp.
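For reference, those colour counts fall straight out of the bit depth: three channels at n bits each give 2^(3n) representable colours. A quick sketch of the arithmetic:

```python
# Where the colour counts in the claim come from: three channels
# (R, G, B or Y, Cb, Cr) at n bits each give 2**(3 * n) colours.
for bits in (8, 10, 12):
    print(f"{bits} bits per channel -> {2 ** (3 * bits):,} colours")

# 8 bits per channel  -> 16,777,216     (~16.7 million)
# 10 bits per channel -> 1,073,741,824  (~1.07 billion)
# 12 bits per channel -> 68,719,476,736 (~68.7 billion)
```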

Antal Tungler, AMD's Senior Manager of Global Technology Marketing, said: "we have no issues whatsoever doing 4:4:4 10b/c HDR 4K60Hz over DisplayPort, as it's offering more bandwidth".

Tungler added: "4:4:4 10b/c @4K60Hz is not possible through HDMI, but possible and we do support over DisplayPort. Now, over HDMI we do 4:2:2 12b 4K60Hz which Dolby Vision TVs accept, and we do 4:4:4 8b 4K60Hz, which TVs also can accept as an input. So we support all modes TVs accept. In fact you can switch between these modes in our Settings".


To put things simply, HDMI 2.0 was designed for 4K and was never intended to fully support HDR, at least not at a YCbCr 4:4:4 colour sampling with anything higher than 8-bit colour. This makes things problematic for HDR content, as HDMI 2.0 does not have enough bandwidth to support HDR at a full YCbCr chroma sampling of 4:4:4 and instead needs to subsample the signal to YCbCr 4:2:2 or 4:2:0.
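To see why the bandwidth runs out, here is a rough sketch of the arithmetic, assuming the standard CTA-861 4K60 timing (4400 x 2250 total pixels, a 594 MHz pixel clock) and HDMI 2.0's roughly 14.4 Gbit/s of usable data rate (18 Gbit/s raw, less 8b/10b encoding overhead). This is a simplified model, as the exact TMDS packing differs slightly, but the conclusions hold:

```python
# Back-of-the-envelope check of which 4K60 modes fit within HDMI 2.0.
PIXEL_CLOCK_HZ = 594_000_000     # 4K60 with standard CTA-861 blanking
HDMI20_EFFECTIVE_GBPS = 14.4     # 18 Gbit/s raw x 8/10 (8b/10b encoding)

# Effective bits carried per pixel for each chroma format.
BITS_PER_PIXEL = {
    "4:4:4": lambda bpc: 3.0 * bpc,   # full Y, Cb and Cr per pixel
    "4:2:2": lambda bpc: 2.0 * bpc,   # chroma halved horizontally
    "4:2:0": lambda bpc: 1.5 * bpc,   # chroma quartered
}

for fmt, bpc in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 12), ("4:2:0", 12)]:
    gbps = PIXEL_CLOCK_HZ * BITS_PER_PIXEL[fmt](bpc) / 1e9
    verdict = "fits" if gbps <= HDMI20_EFFECTIVE_GBPS else "exceeds HDMI 2.0"
    print(f"{fmt} {bpc}-bit @ 4K60: {gbps:5.2f} Gbit/s -> {verdict}")
```

This lines up with Tungler's statement: 4:4:4 8-bit (14.26 Gbit/s) and 4:2:2 12-bit (14.26 Gbit/s) both squeeze into HDMI 2.0's budget, while 4:4:4 10-bit (17.82 Gbit/s) does not.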

We will need to wait for HDMI 2.1 or another future standard before we get to play HDR content at 4K with full chroma sampling; for now, HDR standards like HDR10 and Dolby Vision rely on chroma sub-sampling to play back HDR content over HDMI 2.0.

To be clear, AMD does support HDR standards like HDR10 and Dolby Vision, with these standards relying on chroma sub-sampling to play 10-bit and 12-bit HDR content at a chroma sub-sampling of 4:2:2. HDMI 2.0 does not support 10-bit or 12-bit output with a full chroma sampling of 4:4:4, as it simply does not have enough bandwidth to do so.
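If you are wondering what chroma sub-sampling actually does, here is a toy sketch; the plane names and sizes are made up for the demonstration, not any real decoder API. Luma keeps full resolution while each chroma plane is averaged down into blocks:

```python
import numpy as np

# Toy illustration of chroma sub-sampling on a frame already split into
# Y/Cb/Cr planes. Luma (Y) always keeps full resolution; sub-sampling
# only thins out the two chroma planes.
h, w = 4, 8                               # a tiny stand-in "frame"
y  = np.random.randint(0, 256, (h, w)).astype(float)
cb = np.random.randint(0, 256, (h, w)).astype(float)
cr = np.random.randint(0, 256, (h, w)).astype(float)

def subsample(plane, rows, cols):
    """Average each rows x cols pixel block into one chroma sample."""
    return plane.reshape(h // rows, rows, w // cols, cols).mean(axis=(1, 3))

for name, (rows, cols) in {"4:2:2": (1, 2), "4:2:0": (2, 2)}.items():
    cbs, crs = subsample(cb, rows, cols), subsample(cr, rows, cols)
    total = y.size + cbs.size + crs.size
    print(f"{name}: {total} samples vs {3 * h * w} for 4:4:4"
          f" ({total / (3 * h * w):.0%} of the data)")
# 4:2:2 keeps two thirds of the samples, 4:2:0 keeps half.
```

This is also why 12-bit 4:2:2 fits in the same envelope as 8-bit 4:4:4: two samples per pixel at 12 bits works out to the same 24 bits per pixel as three samples at 8 bits.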

What should be taken away from this article is that HDMI is holding modern display technology back, first limiting 4K TVs to 30Hz while we waited for HDMI 2.0 and now limiting 4K HDR to chroma-subsampled content while we wait for the next iteration of the standard.

HDMI has long held back displays, while DisplayPort has always been there to offer additional bandwidth for the displays of tomorrow rather than just the displays of yesterday. It is baffling that the display community has not abandoned HDMI in favour of DisplayPort, especially since DisplayPort is a royalty-free standard.