This was an issue I also discovered on Xbox 360 in 2008. TVs have overscan, and depending on that setting, your resolutions will be off.
However, at the time, we couldn’t create render targets that matched the overscan-safe area. XNA added a TitleSafeArea rect to help guide people, but it was still an issue you had to consciously develop for.
Now, we can create any back buffer size we want. It’s best to create one 1:1 with the safe area, or use DLSS with a 1:1 target to it, for best results. I’m glad the author went and reported it, but ultimately it’s up to developers to know that Screen Resolution != Render Resolution.
Anyone using wgpu/vulkan/AppKit/SDL/glfw/etc needs to know this.
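A minimal sketch of the "back buffer 1:1 to the safe area" idea above. The 5% overscan figure and the function name are hypothetical, not any particular console or TV API; real safe-area rects come from the platform (e.g. a SafeArea query), not from a hard-coded percentage:

```python
def title_safe_rect(screen_w, screen_h, overscan=0.05):
    """Return (x, y, w, h) of the title-safe area, assuming the TV
    crops `overscan` of each dimension, split evenly across both edges.
    The 5% default is a common rule of thumb, not a standard value."""
    inset_x = int(screen_w * overscan / 2)
    inset_y = int(screen_h * overscan / 2)
    return (inset_x, inset_y,
            screen_w - 2 * inset_x,
            screen_h - 2 * inset_y)

# A "1080p" TV cropping 5% actually shows roughly this region,
# so that's the back buffer size you'd want to render 1:1:
x, y, w, h = title_safe_rect(1920, 1080)
print(w, h)
```

Sizing the back buffer to `(w, h)` instead of `(1920, 1080)` is what makes "Screen Resolution != Render Resolution" concrete: you render only the pixels the viewer will actually see.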
If I understood you correctly... you wanted to render to a slightly smaller surface to avoid wasting GPU time, but that's still going to be upscaled to 1080p for the HDMI scanout, and then mangled again by the TV's overscan - which, to me, introduces problems more severe than whatever you were trying to solve in the first place.
(Besides, TV overscan is a solved problem: instead of rendering a smaller frame, games should let users set a custom FoV and a custom HUD/GUI size, solving three problems at once without compromising anything.)
No. Your TV says it’s 1080 but it’s not; it’s 1074… This is a solved issue now, but it wasn’t when HDMI was first introduced. The Xbox 360 suffered from red rings of death. Microsoft hated Linux. And C# was cool.
Basically, if you rendered an avatar image in the top-left of the screen, perfectly placed on your monitor, on the TV its head would be cut off. So you switch to the safe-area resolution and it’s perfect again (on your monitor, safe area and screen resolution are the same - except on Apple hardware, apparently). Make sense?
You can see how, if your screen says it’s 4K but is really 40 pixels short, rendering at 4K means the screen shrinks the image by 40 pixels and introduces nasty pixel artifacts. TV overscan goes the other way. Interesting find by the author about the notch.
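A quick sketch of why that 40-pixel mismatch causes artifacts. The numbers and the function are illustrative (a hypothetical 3840-wide frame squeezed into 3800 visible columns), not any real display's behavior:

```python
def dest_to_src(x_dst, src_w=3840, dst_w=3800):
    """Map a visible (destination) column back to the source frame
    when the display shrinks src_w rendered columns into dst_w
    visible ones. Hypothetical numbers for illustration."""
    return x_dst * src_w / dst_w

# With a 1:1 mapping this would return whole numbers. Here the scale
# factor is 3840/3800, so most destination columns fall between two
# source pixels and must be resampled - blurring or shimmering
# single-pixel details like text and thin UI lines.
sample = dest_to_src(100)
```

Only the endpoints line up exactly (`dest_to_src(0)` and `dest_to_src(3800)`); everything in between gets interpolated, which is where the "nasty pixel artifacts" come from.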
Read up on G-buffers and deferred rendering. Usually you don't do everything at full resolution until the final output, and even then it's often better these days to use fancy upscaling.
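A rough sketch of that idea: the expensive passes (G-buffer, lighting) run at a lower internal resolution, and only the upscaled result is produced at output size. The 0.67 ratio and the names here are made up for illustration; real render-scale ratios vary by upscaler and quality mode:

```python
# Hypothetical pass layout: internal render resolution vs. output.
OUTPUT = (3840, 2160)
RENDER_SCALE = 0.67  # illustrative internal ratio, not a real preset

def internal_res(output, scale):
    """Resolution the G-buffer and lighting passes would run at."""
    return (int(output[0] * scale), int(output[1] * scale))

internal = internal_res(OUTPUT, RENDER_SCALE)
# Geometry and shading fill buffers at `internal`; an upscaler then
# reconstructs the frame at `OUTPUT` for presentation, so only the
# final blit pays full-resolution cost.
```

The point is that "render resolution" is just one knob among several; nothing forces it to equal the presentation resolution.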
Many games do let users set the things you mention, but it's not always so simple. For example, handling rounded corners and notches is a huge pain.