3liteDragon
Member
RDNA 2 is more power efficient than Turing and can push higher clocks while consuming way less power than a 2080 Ti, and it's built on a superior process node (7 nm vs. Turing's 12 nm), so I'm not surprised to see the PS5 handling ray-tracing at native 4K better than a 2080 Ti.

You need high bandwidth for true 4K and ray-tracing. Even the 2080 Ti can be bandwidth-starved in today's games. Imagine trying to achieve the same without dedicated RT cores (or tensor cores for denoising) and with less bandwidth. No amount of SSD can change this, as you still need to render that data.
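To put rough numbers on the bandwidth point, here's a quick back-of-envelope sketch. Every figure in it (G-buffer bytes per pixel, BVH/triangle bytes touched per reflection ray, one ray per pixel) is an assumed illustrative value, not a measured number for any real GPU or game:

```cpp
// Rough back-of-envelope sketch of why native 4K plus ray-traced reflections
// is bandwidth hungry. All per-pixel and per-ray figures below are
// illustrative assumptions, not measurements.
#include <cstdio>

int main() {
    const double pixels        = 3840.0 * 2160.0;   // native 4K
    const double fps           = 60.0;
    const double gbuffer_bytes = 40.0;   // assumed G-buffer traffic per pixel
    const double ray_bytes     = 200.0;  // assumed BVH node + triangle data touched per reflection ray
    const double rays_per_px   = 1.0;    // assumed one reflection ray per pixel

    const double bytes_per_frame = pixels * (gbuffer_bytes + rays_per_px * ray_bytes);
    const double gb_per_second   = bytes_per_frame * fps / 1e9;

    std::printf("Rough traffic estimate: %.1f GB/s for shading + reflection rays alone\n",
                gb_per_second);
    // For scale: a 2080 Ti has roughly 616 GB/s of memory bandwidth, and the
    // PS5 has roughly 448 GB/s shared with the CPU, so cache-friendly BVH
    // layouts and denoising from few rays matter a lot.
    return 0;
}
```

With those assumed numbers it lands around 120 GB/s just for that slice of the frame, before textures, geometry, or anything the CPU is doing, which is why the bandwidth argument isn't something an SSD can paper over.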
The R&C demo ran at native 4K with ray-traced reflections, which even took me by surprise as someone who at first really didn't have any faith in ray-tracing on consoles. We saw Spider-Man: MM running at native 4K with ray-traced reflections as well. If developers are already using ray-tracing like this, this early, imagine how the feature's gonna be used once we get improved BVH data structures and more advanced denoising algorithms down the line.
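For anyone wondering what a BVH actually is: it's the bounding-box tree that rays walk to find which triangles they hit. Below is a minimal, hypothetical node layout just to illustrate the idea; real console and driver BVH formats differ and are usually opaque to the developer:

```cpp
// Minimal sketch of a binary BVH node, for illustration only.
#include <cstdint>
#include <cstdio>

struct AABB {
    float min[3];
    float max[3];
};

// Leaves index into a triangle list; interior nodes point at their children.
// Node size matters because every ray walks this tree, so bytes per node
// translate directly into memory bandwidth.
struct BVHNode {
    AABB     bounds;        // box enclosing everything below this node
    uint32_t left_or_first; // interior: index of left child; leaf: first triangle index
    uint16_t tri_count;     // 0 for interior nodes, >0 for leaves
    uint16_t pad;           // keeps the node at a tight 32 bytes
};

static_assert(sizeof(BVHNode) == 32, "expect a compact 32-byte node");

int main() {
    std::printf("BVHNode is %zu bytes\n", sizeof(BVHNode));
    return 0;
}
```

"Improved BVH data structures" in practice means things like wider trees, compressed bounds, and better build quality, all aimed at touching fewer bytes per ray.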