Some of you must be living under a rock. A bunch of games already run with temporal upscaling solutions; the advantage here is for developers that don't have a good solution of their own.
The GTX 1080 is only 20-25% more powerful than the GTX 1070, yet it enables them to double the suggested resolution?
AMD FidelityFX Super Resolution 2.0 GDC 2022 Announcements
Last week was a big week for our AMD FidelityFX™ Super Resolution (FSR) upscaling technology – after FSR launched in 2021 and became the fastest adopted software technology in AMD history, we announced FidelityFX Super Resolution 2.0, the next generation of our open-source upscaling... (community.amd.com)
Here's a table.
> They're literally reiterating what AMD said directly lol. No one asked about anything, AMD announced this themselves.

Still, it supports RDNA 1 and 2 cards, so we're good, yeah...
> The GTX 1080 is only 20-25% more powerful than the GTX 1070, yet it enables them to double the suggested resolution?

Target output resolution. The internal upscaled amount isn't that significant.
I wonder what enables that. More cores?
You still on a 1070? Jesus lol.
Time to upgrade, broda. A 3060 at least will do you wonders, and they're everywhere now.
> Good to see that Xbox can support this. I wonder if that is due to the additional machine learning modifications MS made to the hardware?

PS5 supports this too, just needs a port.
> Good to see that Xbox can support this. I wonder if that is due to the additional machine learning modifications MS made to the hardware?

Interesting take, considering it specifically mentions nothing like that is used.
> Target output resolution. The internal upscaled amount isn't that significant.

AMD lists the native resolutions on that page.
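For reference, AMD's published per-axis scale factors for the FSR 2.0 modes (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x) let you work out the internal render resolution from the target output resolution. A quick sketch (the function name is just illustrative):

```python
# Per-axis scale factors AMD published for FSR 2.0's quality modes.
SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal (pre-upscale) render resolution for a given mode."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

# A 4K output in Quality mode renders internally at 1440p:
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
# Performance mode renders at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So the "double the resolution" comparison above is about the output target; the actual rendered pixel counts differ far less between cards.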
In that under-1.5ms period, FSR 2.0 does all sorts of things, though — AMD says it replaces a full temporal anti-aliasing pass (getting rid of a bunch of your game's jagged edges) by calculating motion vectors; reprojecting frames to cancel out jitter; creating "disocclusion masks" that compare one frame to the next to see what did and didn't move so it can cancel out ghosting effects; locking thin features in place, like the barely-visible edges of staircases and thin wires; keeping colors from drifting; and sharpening the whole image, among other techniques.
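As a rough sketch of how those steps fit together per frame — note these are illustrative pseudocode names, not AMD's actual API:

```
# Illustrative pseudocode for one FSR 2.0-style temporal upscale pass.
# Step names follow the description above; none of these are real API calls.
function upscale_pass(color_lowres, depth, motion_vectors, history):
    reprojected  = reproject(history, motion_vectors)    # cancel motion + camera jitter
    disocclusion = compare(reprojected, color_lowres)    # mask what newly appeared/moved
    accumulated  = accumulate(color_lowres, reprojected,
                              reject=disocclusion)       # suppresses ghosting
    locked       = lock_thin_features(accumulated)       # stairs, wires, fences
    clamped      = clamp_color(locked, color_lowres)     # keep colors from drifting
    output       = sharpen(upsample(clamped))            # final output-resolution image
    history      = output                                # feeds the next frame
    return output
```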
> As expected but good to see confirmation. As with FSR 1, this should also run on PS5 as well.

So what are the framerates (and thus the differences) for these 4 modes on a system?
You would think so, it's weird that AMD went out of their way to name drop Xbox but not PS5, they're both running on AMD hardware.
Unless this is a case of full RDNA 2.
> From the stills it looks really impressive. However, I will wait to see it in motion before saying it's the second coming of DLSS.

Remember FSR 1.0 announced for Xbox Series and not PS5? PS5 had a game using it first.
It's a bit of a head-scratcher why it's available on XSX/S and not PS5. AMD would have announced it on PS5 if it was.
So either it requires a bit of programming work to bring it to PS5, which hasn't been done yet; or MS played a part in its development, so it's not going to be available on PS5; or, for some unknown reason, it won't be able to run on the PS5.
> So what are the framerates (and thus the differences) for these 4 modes on a system?

That we haven't seen yet.
> That RDNA 2 RX 590.

No, it is because it was created using HLSL and Xbox supports HLSL. Vulkan support is not there yet, but any developer can port it to Vulkan or PSSL if they want, as it is open source.
It's because they are showcasing it with an Xbox studios game.
This looks really impressive.
You can compare native to the different FSR modes here:
FSR 2.0 - IMG 1 - Imgsli (imgsli.com)
FSR 2.0 - IMG 2 - Imgsli (imgsli.com)
> That we haven't seen yet.

lol then it's useless info so far. Only point of reducing IQ is gaining frames. Need the real info.
Good news for the Series S.
Who needs native anymore?
> Remember FSR 1.0 announced for Xbox Series and not PS5? PS5 had a game using it first.

You have to understand that FSR isn't some crazy new technology; it's just temporal reconstruction with some AMD algorithms and sharpening filters.
> lol then it's useless info so far. Only point of reducing IQ is gaining frames. Need the real info.

This information is coming out of GDC, where developers are being briefed on the technology; there are a few more months before the first game to use it is released to the general public. The point is to reduce render load to gain more FPS while retaining similar IQ, which is what they are showing here. This is not useless information. It is a given that it will lead to increased FPS; what has not been shown is how much, but the technology itself costs less than 1ms to run at the highest quality level on a high-end GPU.
> This information is coming out of GDC where developers are being briefed on the technology... It is a given that it will lead to increased FPS, what has not been shown is how much...

If I tell you I can save you money by using these 3 coupons, but I never tell you how much you'll save, then which coupon is best?
> According to that table above, < 1ms on a high-end GPU in performance mode, not quality mode.

I misread. The cost would more than likely pay for itself, otherwise it would be pointless. As far as I remember, DLSS can cost up to 10ms on some hardware, so 3ms wouldn't be bad at all.
Would it be terrible if quality mode was 3ms or more on a 6800XT?
> 'Fraid so, good man. I bought a prebuilt which is now hampered by too low a wattage power supply and insufficient PCI Express lanes to make full use of the newer GPUs. I need to build a whole new system.

Sadly that's what happens when you leave your PC without the usual upgrades every now and then. For example, if I upgrade my GPU this year to a 4000 series, I might have to upgrade my power supply to a new one that has the weird new connector the new GPUs require.
> I misread. The cost would more than likely pay for itself otherwise it would be pointless. As far as I remember DLSS can cost up to 10ms on some hardware so 3ms wouldn't be bad at all.

The example was Deathloop. Yet no FPS listed. Useless example.
> If I tell you I can save you money by using these 3 coupons, but I never tell you how much you'll save, then which coupon is best?

1. It's on a game-by-game basis, so if you are told it provides X% performance uplift in game Y, it won't be the same for game Z.
2. Download the SDK and implement it, or wait for actual games to release so you can see how much performance uplift it provides. It is one more optional feature you can use or not use.
3. Give me the fucking coupon, I'll try all 3 for myself and find out.
Would like to add that MS is using drivers in the Xbox OS directly from AMD, whereas Sony uses just a basic BIOS and builds on top of that themselves (they use a different business model than MS, even when it comes to architecture IP licensing: AMD supplies everything for MS, but Sony licenses the IP for usage/modification/etc.).
So that's why the difference between "True RDNA 2" and "RDNA 2 based technology".
And that's also why AMD can state the functionality on Xbox in their marketing materials, but not on PS5, because ultimately that's not their word to say.
Hope that clears up some confusion.
> According to that table above, < 1ms on a high-end GPU in performance mode, not quality mode.

> Would it be terrible if quality mode was 3ms or more on a 6800XT?

Quality mode on a 6800XT takes less than 1.1ms @4K.
> PS5 supports this too, just needs a port.

And you know this how?
> Quality mode on a 6800XT takes less than 1.1ms @4K.
>
> The FSR 2.0 overhead difference between quality modes is seemingly negligible. What changes is mostly performance and output IQ.

Nice framerate boost! Native 4K on a 6800 is 56.8 FPS.
> The example was Deathloop. Yet no FPS listed. Useless example.

All you need to do is take Deathloop's performance and add the overhead.
For example, to get FSR 2.0 Quality mode with 4K output, the game will render at 1440p natively.
Then you just take a Deathloop benchmark at 1440p: on an RX 6800 you get 97.7 FPS, which is 1/97.7 = 10.2ms per frame.
With FSR 2.0 Quality it takes 1.1ms more, so it's 11.3ms.
1/0.0113 = 88.5 FPS for a 4K output with better-than-native IQ.
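That frame-time arithmetic generalizes to any base framerate and overhead; a small sketch using the numbers from the post above (the ~0.3 FPS difference from the quoted 88.5 comes from not rounding the intermediate frame time):

```python
def fps_with_overhead(base_fps: float, overhead_ms: float) -> float:
    """Convert FPS to frame time, add a fixed per-frame cost, convert back."""
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + overhead_ms)

# RX 6800 at 1440p in Deathloop: 97.7 FPS -> ~10.2 ms/frame.
# Adding ~1.1 ms of FSR 2.0 Quality overhead gives ~11.3 ms -> ~88 FPS at 4K output.
print(round(fps_with_overhead(97.7, 1.1), 1))  # 88.2
```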
> And you know this how?
>
> PS5 supports this too, just needs a port.

It runs on Polaris and Pascal GPUs from 2016. Insomniac has been using similar tech since the PS4.
This is HUGE news, these consoles are in desperate need of this kind of tech. I hope it works well on them.
Was this at some Microsoft/AMD event or something? Why wouldn't AMD just mention both Xbox and PS5? The tech supports loads of GPUs, so surely they could just mention both current consoles?