# Bit of a weird one. Ultra deep color (enabling HDR) causes Nvidia GTX1080 to show green spots???



## Darkstar82

Hi all. The joy of ever more complex technology is ever more bizarre bugs.

After a painful PC build:








The Tale of the Computer Build of Woe — www.overclock.net

Followed by an excruciating several months trying to get a GPU:








After 25 years, I'm about a whisker away from... — www.overclock.net





And after being conned by Chinese sellers on Ali (I got a refund in the end),

I ended up buying a second-hand GTX 1080 (for $510) because I live in the UK, where anyone who works for their wages is regularly bent over a desk with every purchase, and there is literally nothing new available from anywhere, despite me having the majority of a thousand dollars sat on a Visa card ready to go. I bought the card from CEX UK, so it has a 2-year warranty, which is decent, and it is good enough for now.

I have a new TV, an LG NanoCell 86 50-inch 4K 120Hz HDMI 2.1, which has a setting called "HDMI Ultra Deep Colour" that, when turned on, lets me enable wide colour gamut and HDR in Windows. But when I do this with the GTX 1080, I get little flickering green dots on the desktop background.

Now, my first thought was "card is borked, artifacts..." but it blew through Fire Strike and Time Spy with many a black surface and not a green dot in sight. I loaded up Red Dead Redemption 2, jacked up all the settings to max @ 1080p for the first time (very pretty) and played the game, day and night: no green dots, no artifacts, silky smooth FPS, very happy. Then I exited the game back to the black desktop background, and there were all my flickering green dots.

The problem goes away entirely if I disable Ultra Deep Colour on my TV, which disables HDR in Windows, so it is obviously related to this feature. I left the card overnight at 100% load on stability tests, got up 8 hours later, and it was still going fine, no issues or errors. All happy. Just with green dots on the desktop background and on some dark-theme websites like YouTube.

What am I missing here? Does the 1080 not support this at all? Could it be an issue with my HDMI cable and how Nvidia sends data vs AMD, hence why it didn't show up on the RX580? (Although it was faulty and had to be returned in the end, I originally got the RX580 in order to enable HDR so I could watch "The Boys" season 2 on Amazon with its HDR features.)

As my first look at 4K with HDR, it was visually impressive. So I really want to have this feature turned on; indeed, I spent good money on this TV for such features when connected to a PC, and I don't know why it is causing me this case of green measles. I would grab you a screenshot, but when I take a snip, the measles are not present.

Whether you reply or not, my thanks for your time nonetheless. Best wishes.


----------



## shilka

I had the same problem with an RTX 2070, so the problem is not your 1080; the problem is that Windows doesn't work with the Ultra Deep Colour thing.


----------



## UltraMega

Almost nothing supports "deep color". It's sort of a gimmick, or a feature that you will almost never be able to use correctly, because its intended use case is very rare today. Just turn it off. It's not the same as HDR; you can still use HDR without it.


----------



## Darkstar82

UltraMega said:


> Almost nothing supports "deep color". It's sort of a gimmick, or a feature that you will almost never be able to use correctly, because its intended use case is very rare today. Just turn it off. It's not the same as HDR; you can still use HDR without it.


I wish that were the case, friend, but when I turn Ultra Deep Colour off on the TV it disables HDR and wide colour gamut in Windows; they cannot be enabled in Windows until this is turned back on on my TV. I presume it has to do with "NanoCell" and the funky colour tech that makes this TV so incredibly vibrant when it's enabled and working. It really does make a hell of a difference to the vibrancy of the colours in every situation to have this enabled on my TV, especially reds and blues. They just "pop" out and my eyeballs love it. I can run without it and yeah, it's nice enough, but it's miles away from as good as it can be. And as mentioned, it works fine under an RX580, hence my head-scratching.


----------



## Juicin

Damn, 500 for a 1080...?

I have a 1080 just sitting in a closet.


----------



## Offler

Juicin said:


> Damn 500 for a 1080....?
> 
> I have a 1080 just sitting in a closet.


1. Keep the card and use it if possible.
2. Try to look for WCG color profiles for your display/tv. It might solve the issue in some cases.

As mentioned above, almost nothing supports Deep Color/WCG or HDR.

a) There is HDR10, HDR12 and HDR14.
All three are listed as WCG/Deep Color, but each is in fact a different color depth (30-bit, 36-bit and 48-bit). My GPU and display both support only HDR10.

b) HDCP
Even when your display and GPU support HDR, they have to support HDCP on top of it. It's a certification/encryption scheme. My Samsung display supports the HDR10 colorspace, but NOT HDCP for deep colors. That means the colorspace will revert to standard 24-bit, i.e. 8 bits per RGB channel.

This certification is required for the display, the GPU, the cable between them, and eventually the UHD Blu-ray drive, the USB connection and even the player/browser.

c) HEVC and HEVC_main_10 hardware decoding
HEVC is a next-gen codec for 4K. HEVC10 is a version with support for HDR10, i.e. a 10-bit-per-channel (30-bit) colorspace. For example, my Fury X supported only HEVC, but NOT HEVC10.

When it works, it looks really amazing, but in practice it's a little more complicated to get working.

With the issue you described, I would guess the card/Windows desktop attempted 12- or 14-bit depth per channel, but the display supports only 10-bit. The good news is that UHD Blu-rays and deep color streams support 10-bit max.
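As a side note on the arithmetic, the 30/36/48-bit totals mentioned above are just three RGB channels multiplied out. A minimal sketch of my own (assuming plain packed RGB with no alpha on the wire):

```python
# Hypothetical illustration: bits-per-pixel totals for packed RGB come from
# 3 colour channels per pixel, so the 30/36/48-bit "deep colour" modes
# correspond to 10/12/16 bits per channel (standard 24-bit is 8 per channel).
for bits_per_channel in (8, 10, 12, 16):
    bits_per_pixel = 3 * bits_per_channel
    print(f"{bits_per_channel} bits/channel -> {bits_per_pixel}-bit colour")
```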


----------



## Beagle Box

Juicin said:


> Damn 500 for a 1080....?
> 
> I have a 1080 just sitting in a closet.


And I'm looking at my watercooled 1080 sitting unused on my desk. Strange times.


----------



## UltraMega

Offler said:


> As mentioned above almost nothing supports Deep Color/WCG or HDR.


Deep color is a weird, rare thing to have support for, but plenty of things support HDR.

OP, it seems like you have a TV with crappy settings.


----------



## Slaughtahouse

UltraMega said:


> Almost nothing supports "deep color". It's sort of a gimmick, or a feature that you will almost never be able to use correctly because it's intended usage case is very rare today. Just turn it off. It's not the same as HDR, you can still use HDR without it.


Is this true? I thought "Deep Colour" was required to be activated for HDR to work on LG TVs *for external devices*. At least with my C9 OLED, when I have my computer hooked up, HDR will not "activate" with the little HDR logo unless "Deep Colour" is activated.

When I activate it, the feed cuts out for about 5 seconds (black screen) and then HDR kicks in. If I attempt to play an HDR-compatible game without "Deep Colour", I'll just get weird, overcorrected gamma and incorrect colour.

How do you enable HDR without "Deep Colour"?



Spoiler: HDR guide from Rtings

> *HDR*
> 
> HDR is automatically enabled for the native apps. Once you start playing HDR content, some of the settings change automatically, including *OLED Light*, which increases to '100.' We recommend leaving these settings to their default settings in HDR.
> For HDR to work from external devices, the *Ultra HD Deep Color* option usually has to be enabled from the 'Additional Settings' menu for the input you are using. Older devices may have compatibility issues if this option is left enabled, so it is recommended to only enable this setting for devices that require it.

LG C9 OLED Calibration Settings — www.rtings.com

----------



## UltraMega

Slaughtahouse said:


> Is this true? I thought "Deep Colour" was required to be activated for HDR to work on LG TVs *for external devices*. At least with my C9 OLED, when I have my computer hooked up, HDR will not "activate" with the little HDR logo unless "Deep Colour" is activated.
> 
> When I activate it, the feed cuts out for about 5 seconds (black screen) and then HDR kicks in. If I attempt to play an HDR-compatible game without "Deep Colour", I'll just get weird, overcorrected gamma and incorrect colour.
> 
> How do you enable HDR without "Deep Colour"?


I don't know the specifics of your TV, but when I googled deep color, I found that it's a tech meant to upscale the color depth of older video sources.


----------



## Slaughtahouse

There is a good deep dive here, OP: HDMI Deep Colour. On or Off?



> Yes as 4k UHD (3840x2160) @ 60hz with uncompressed 32-bit RGB color (8-bit per component [R,G, and B] ) requires 17.82Gbps of the 18Gbps that HDMI 2.0 provides.
> 
> What is frustrating is that they don't tell you that audio may be troublesome when using this feature as there wouldn't be enough remaining bandwidth for the audio portion. What I had to do with my gtx 980 is directly use the HDMI 2.0 port to my LG UHD TV and a separate DVI-HDMI connector to my AV receiver for sound.
> 
> This way the HDMI port only needed to worry about video and wouldn't flicker/drop audio. If you rely on your TV speakers for audio, then the solution is to get separate speakers or wait for HDMI 2.1 displays (revision announced at the beginning of January). FYI, I had to disable the audio on the HDMI port of the graphics card to make it stop flickering.


@OP, your device only supports HDMI 2.0, and potentially you're running into this issue (tapping into all the available bandwidth). If you disable audio, does the issue (the green spots) persist?

Also, do you have any other HDMI cables?
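For what it's worth, the 17.82 Gbps figure in the quote above can be reproduced from the standard 4K60 timing, assuming HDMI's TMDS signalling (10 bits on the wire per 8-bit symbol, across 3 data channels). A rough back-of-the-envelope sketch:

```python
# Rough check of the quoted 17.82 Gbps figure for 4K60 8-bit RGB over HDMI 2.0.
# The CTA-861 timing for 3840x2160@60 has a total raster of 4400x2250
# including blanking intervals.
pixel_clock_hz = 4400 * 2250 * 60           # 594 MHz pixel clock

# TMDS encoding puts 10 bits on the wire per 8-bit symbol, on 3 channels.
wire_gbps = pixel_clock_hz * 3 * 10 / 1e9
print(f"{wire_gbps:.2f} Gbps of the 18 Gbps HDMI 2.0 provides")  # 17.82 Gbps
```

This is why there is almost no headroom left at 4K60 RGB on HDMI 2.0, which fits the quote's point about audio becoming troublesome.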


----------



## alexp247365

Just a random 2080 Ti and 3800X sitting around... Could be for sale, or may go to the wife's rig.


----------



## Asmodian

@Darkstar82, did you turn on Wide Color Gamut and Windows HDR before you got the green spots?

It likely isn't a problem with Deep Color itself, but with a color option, Windows' HDR, or WCG.

What output mode are you using in the Nvidia control panel? Try setting YCbCr 444 10 bit, if you can. Also try other options.











UltraMega said:


> Deep color is a weird rare thing to have support for but plenty of things support HDR.


I don't understand all this talk of Deep Color being weird and rare?!

Deep Color is very standard; there isn't really any content for it that isn't HDR, but almost everything supports it today. It is simply high bit-depth color, supported by Photoshop for decades.


----------



## UltraMega

Asmodian said:


> @Darkstar82
> 
> I don't understand all this talk of Deep Color being weird and rare?!
> 
> Deep Color is very standard; there isn't really any content for it that isn't HDR, but almost everything supports it today. It is simply high bit-depth color, supported by Photoshop for decades.


When I googled it, what I found may have been incorrect. Perhaps Deep Color is just LG's way of saying "HDR".


----------



## oobymach

The name describes what it does: it preserves black/dark and white/bright detail better, which you'll notice if you watch movies/TV shows with and without it (they tend to look washed out without it and vivid/bright with it).

You can find a similar setting among the video player settings in the Nvidia control panel, which only affects video content.


----------



## Asmodian

UltraMega said:


> When I google it, what I found may have been incorrect. Perhaps Deeps color is just LG's way of saying "HDR".


Not HDR, but higher-bandwidth (>8-bit) display modes. Windows might limit HDR to displays that support >8-bit color, but that is on Microsoft, not LG or the HDR specs in general. I can disable Deep Color and still send HDR to my LG CX; the TV still switches into HDR mode and I get an HDR image.

Disabling Deep Color disables the HDMI 2.1 display modes, but I can still send 10- or 12-bit data, just limited to 4Kp30 4:2:2 (not needed for HDR; 8-bit HDR is great, with dithering).

There is a lot of mixing of ideas when it comes to the new HDR standards, but they are really mostly independent concepts and standards.



oobymach said:


> It's self descriptive what it does, you'll notice if you watch movies, it preserves black/dark and white/bright detail better which you'll notice if you watch movies/tv shows with and without it (they tend to look washed out without it and vivid/bright with it).


That is not a mode you can change on a whim! It is a configuration option that needs to match between the TV and the computer. Please do not use it to adjust the image; use the color and/or gamma options if you have no way of doing a proper calibration and need to adjust the image. The advanced video setting is only a way to force it if you have a weird display or cannot simply set it properly in the normal resolution menu.

On LG TVs this setting is called Black Level. High means full range, 0-255; Low means limited range, 16-235. Auto does seem to work reasonably well in my experience.

Using the wrong mode simply displays the image incorrectly, in two different ways:

If you send the TV full-range video but it is using Black Level Low, the shadow and highlight details are clipped. It looks higher contrast, in a way, because 16-235 was mapped to pixel 0%-100%, but the image contained data below 16 and/or above 235.

If you send the TV limited-range video but it is using Black Level High, the image looks washed out, because the 16 that is supposed to be 0% black is instead shown as a shade of gray, since the TV is mapping 0-255 to pixel 0%-100%.

Windows ALWAYS renders in full-range RGB, so if you set any output option to limited range (16-235) the GPU will be doing a range compression, mapping 0-255 to 16-235 (with dithering). With hardware-decoded video this may not be true, depending on how it is rendered; that pathway is pretty much a black box, so it is hard to determine exactly what happens to the video data in that case.

For this reason you can get double range-compression steps, giving washed-out video even with the TV using Black Level Low (or Auto), but you cannot get the reverse: double expansion clipping shadows and highlights even with the TV using Black Level High.
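To make the two failure modes concrete, here is a minimal sketch of my own (hypothetical helper names, not anyone's actual driver code) of the full-range/limited-range mappings described above:

```python
def full_to_limited(v):
    """Range compression: map full-range 0-255 onto limited-range 16-235."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Range expansion: map limited-range 16-235 back onto 0-255.
    Values below 16 / above 235 are clipped, which is exactly the lost
    shadow and highlight detail described above."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / (235 - 16))

print(full_to_limited(0), full_to_limited(255))   # 16 235: one correct pass
print(limited_to_full(16), limited_to_full(235))  # 0 255: one correct pass

# Double compression (the washed-out case): black ends up well above 16.
print(full_to_limited(full_to_limited(0)))        # 30: black has turned grey
```

Running a frame through `full_to_limited` twice is the double range compression mentioned above, and the TV's expansion pass can only undo one of them.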


----------



## geriatricpollywog

Juicin said:


> Damn 500 for a 1080....?
> 
> I have a 1080 just sitting in a closet.


I found some stuff laying around.


----------



## Asmodian

Taunting people isn't nice.


----------



## Juicin

0451 said:


> I found some stuff laying around.


I game at 60 fps, brah, I have no use for any of that lol

I used that 1080 for mining


----------

