Forza Motorsport: Comparison between the old and new DLSS

This is what currently bothers me. I can’t play races at night because of the shadow bug.

The game looks like this for me at the moment. With the settings above, this is 200% resolution scale on top of an already native 2160p resolution.

It’s super smooth for me, not like in this video lol.

I have an LG OLED as well and a similar Denon receiver. I had issues getting G-Sync to work after upgrading to an Nvidia 4090, so I eventually stopped running everything through the receiver. I now run everything directly into the TV: the PC, an Xbox Series X, and a PS5. Then I use the eARC channel on the TV to send the audio back to the receiver. This has eliminated a lot of frame stuttering and G-Sync/V-Sync issues; you may want to try it if you have the cabling for it. I also replaced all my HDMI cables with certified Ultra High Speed (8K, 48 Gbps) cables, and now I am able to run 4K 120 FPS 12-bit RGB HDR from my 4090 to the TV. Previously, when routing through the receiver, I was limited to 10-bit and YCbCr.
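For anyone curious about why the cable certification matters, here is a rough back-of-the-envelope sketch (my own numbers, not from the post above) of the raw pixel data rates involved. It only counts active pixel data; a real HDMI link also carries blanking intervals and FRL encoding overhead, so the actual requirements are higher still.

```python
# Rough raw data-rate estimate for 4K 120 Hz RGB at different bit depths.
# Active pixels only: blanking and HDMI FRL encoding overhead are NOT included,
# so real link requirements are higher than these figures.

def raw_data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed active-pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

for bpc in (8, 10, 12):
    rate = raw_data_rate_gbps(3840, 2160, 120, bpc)
    print(f"4K 120 Hz RGB at {bpc}-bit per channel: ~{rate:.1f} Gbit/s raw")

# Prints roughly 23.9, 29.9 and 35.8 Gbit/s respectively, which is why the
# older 18 Gbps cables fall over and the 48 Gbps certified ones are needed.
```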

12-bit RGB isn’t a thing. Just set it to 10-bit full RGB or you are actually stuck with 8-bit, BELIEVE ME!

I also have an OLED, an LG B9 from 2020, and a Denon AVR. My 5090 is plugged into the Denon and the Denon is plugged into the B9. In the Nvidia Control Panel I activated G-Sync for the Denon and it works perfectly fine. It’s an extra setting YOU HAVE TO ENABLE, or else it doesn’t work right.
See this ticked box? That only exists if you have a G-Sync capable AVR connected to your GPU.


Everything is correct; I have the same settings.

You can believe that 12-bit RGB is not a thing, but the truth is 12-bit RGB does exist, and there is a difference in banding and color depth capability when running 12-bit RGB versus 10-bit. 12-bit RGB is not somehow stuck at 8-bit. I’m not sure where you got that information.

I’m a professional photographer and use calibrated 12-bit RGB displays for editing; if I switch them to 10-bit I can see a difference in heavy HDR editing. The newer LG OLED panels are 10-bit panels, but they use 12-bit processing before downsampling to the panel, so they will definitely accept a 12-bit signal and use it to drive the 10-bit panel output.
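To put rough numbers on the banding difference being argued about, here is a small illustrative sketch (mine, not from either post): quantize an ideal smooth gradient at each per-channel bit depth and count how many distinct steps survive.

```python
# Quantize a smooth 0..1 ramp at different per-channel bit depths and count
# the distinct steps that remain. Fewer steps across a wide gradient means
# coarser transitions, i.e. more visible banding.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)  # ideal smooth ramp

for bits in (8, 10, 12):
    levels = 2 ** bits
    quantized = np.round(gradient * (levels - 1)) / (levels - 1)
    steps = len(np.unique(quantized))
    print(f"{bits}-bit: {levels} levels per channel, {steps} distinct steps in the ramp")

# 8-bit gives 256 steps, 10-bit gives 1024, 12-bit gives 4096 — each extra
# pair of bits makes the gradient 4x finer per channel.
```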

If you’ve got a scientific article that proves otherwise, please feel free to share it; I would be curious to see it.


You must have bought one of those newer professional monitors then, because regular monitors that offer a 12-bit option aren’t really 12-bit capable. It’s just fake.

8-bit = SDR
10-bit = HDR
12-bit = Dolby Vision

If you don’t have an Apple laptop or something else with the license to run Dolby Vision in your OS, then you are most probably using that fake pseudo option that is really just 8-bit.

I can set my TV to 12-bit RGB too, and no, it doesn’t work; it’s just 8-bit. But hey, if you know better, then good for you…