
AMD announces FSR4, available “only on Radeon RX 9070 series”

CrustyBritches

Gold Member
We've seen the leaked benchmarks that put it around 2% faster than a 7900 GRE, which retails for ~$550. The primary reason to get a 9070 XT around the same price would be FSR4, but they didn't even demonstrate it. It's fucking crazy, man.
I think AMD will want to hold their cards close to their chest. The only thing they can beat Nvidia on is pricing; other than that, I imagine the 5070 will smoke AMD's 9070, especially with DLSS 4 and RT.
 

phant0m

Member
RADEON is cooked. Seriously. Feels like they have no confidence in their product; I'd wager FSR4 probably isn't even ready.
I have more faith/confidence in Intel at this point, and they don’t even bother playing in enthusiast/high end.
 
I have more faith/confidence in Intel at this point, and they don’t even bother playing in enthusiast/high end.
As funny as that is, those GPUs don't work with older CPUs, which is precisely the market the B580 is targeted at.

No, PCMR bros, it's over. You've got the monopoly you've always wanted.
 

SolidQ

Member
Could you maybe rank your priorities for RDNA 4, why you picked them, why you ranked them so high?

I would say our number one priority is focusing on improving performance in the areas that gamers care about most. In this generation, you'll see big ray tracing improvements, big MLOps improvements for things like FSR4 and ML Super Resolution.
 

Buggy Loop

Gold Member
I don't know how this guy is still a source of news on technology websites.



This was in one of MLID's thumbnails, I think before RDNA 3?





 

llien

Member
They pick the worst price points every time: just cheaper than Nvidia, but not by enough to be a good value.
The 6600 was a full tier faster than the 3050, even faster at the RT gimmick, ran cooler and was cheaper.
The 3050 outsold the 6600 four to one.

Dumdums are buying green no matter what.

As far as I can see, at least in Germany and at least in the DIY segment, AMD is doing well.
 

llien

Member
AI AI AI, like who uses Copilot locally?
I run AIs locally on my ASUS AMD Advantage with a 6800M.
Stable Diffusion on Amuse AI (funny that AMD owners get this sleek experience while NV guys need to cobble together half-baked Python crap to get to it).

Text stuff on the AMD version of Ollama. It is insane what one can download and run with a single command line nowadays.
 

llien

Member
AMD might as well not bother announcing the 9000 series later on in q1, because most people who were going to buy a GPU will just buy a 5000 series
No card shortages, no overpricing.

Nobody would buy the 5000 series in Q2 either, as everyone would have got theirs in Q1.

Yeah. I mean, totally.
 

SolidQ

Member
AMD's marketing was pathetic in Raja's time.
There also seems to have been a spy at work who got the top RDNA 4 die cancelled. They thought the 5090 would be much faster, but the 5090 is like RDNA 3 with slower clocks. They could have easily killed the current 5090 with N4C, but some pathetic guy decided to cancel it.
 

llien

Member
There also seems to have been a spy at work who got the top RDNA 4 die cancelled. They thought the 5090 would be much faster, but the 5090 is like RDNA 3 with slower clocks. They could have easily killed the current 5090 with N4C, but some pathetic guy decided to cancel it.
I view it like this:

Bigger dies are huge investment.
NV sells top cards at ridiculous price points no matter what.

AMD, even if rolling a card that is faster at raster and even faster at RT, would still be pinned down with "buh DLSS 7?", "buh why does it run slower RT in <insert NV sponsored game>".
(see how dumdums are orgasming over marketing slides of DLSS4, as if DLSS1 marketing didn't exist)

Very poor ROI prospects. Not wasting time on it and polishing other products instead is the better move.

Heck, AI AI AI and the effing 150+ notebook design wins are the best strategic move imaginable.
PyTorch and TensorFlow already run natively on AMD, Amuse AI is amazing, and this has the potential to greatly increase AMD's footprint in the ML software world.

The entire ML market was at about $200 billion last year. Next year, Microsoft alone is pouring in $80 billion.
 

SolidQ

Member
But why not even an xx80 competitor?
Because N42 was tiled, they decided to make only monolithic dies. For UDNA, as far as we know for now, they will make something like an x080. UDNA is also going to get Matrix Cores from CDNA.
Maybe after seeing that the 5090 is like RDNA 3, they'll decide to make a halo card with UDNA 2/3.

They calculate: OK, we can make a halo card, but the question is "will it make the money back for us?"

The lineup was:
N44
N43 aka N48
N42
N41
N40 aka N4C

The thinking was: "they just modelled a huge uplift based on 50% more SMs, a new uarch, GDDR7 and 33% more memory width", but the 5090 underdelivered.
 

Radical_3d

Member
Which being faster at raster is not, lol.


Buying a $1k card (the 7900 XTX is under $1k, why $1,200?) to... upscale stuff is not very reasonable, or is it?
It is, because with raster alone you can't move the heaviest games without resorting to scaling. If AMD's high end could run some unoptimised shit like Wukong at 4K 120fps+ on Ultra, or some path-traced game at the same level, without any scaling, then sole raster power would matter. But it seems we're heading toward a future of fake pixels.
 

llien

Member
Radical_3d

Wukong screenshots proudly shared by fellow GAFers look... terribly low-resolution to me.
Which is admitted by the posters themselves with "but in motion you don't notice it".

Well, if I don't notice resolution in motion, I can simply play at a lower resolution, eh?


That's been everyone's goal with the AI stuff. Using gaming GPUs isn't the most efficient way to crunch those numbers - it was just the best thing they had on hand at the moment AI became a thing. In a few years all the AI stuff will run on dedicated hardware that's very different from gaming GPUs. Then our side of the market can go back to normal.
Google's cloud has TPU hardware, but it hasn't taken off all that well, for whatever reason.
 
Oh, does it? And dafuq does that mean, exactly?

Does it have an AI Turbo version of Poopybutthole too?

Does it do other amazing things, like opening chakras?


Exactly what needs to happen to prevent gamer brains from vibrating when they come across meaningless marketing buzzwords?
Reconstruction, my fucking bottom. DLSS1 did that. No, it didn't quite work.
Nothing hints at glorified TAA derivatives using NNs for anything besides denoising.

DLSS 4.0 has super mega blastoid frame insertion slitting your eyes lightning the inside of your poopy butthole globally illuminating the lemmings that opens up the chocolate starfish so you can smell the farts real time megaton.

AMD (and Intel) are still going to play catch up. The best thing they can offer is price to performance ratio with decent raster, RT, and upscaling. I would not fall for the NVIDIA trap of 5070 = 4090ti that is indeed bait and would wait for real world performance.
 
This is a dumb decision. The only thing AMD had going for them was being generous with VRAM and not locking AI advancements behind a new GPU paywall.

That said, I don't even use FSR for most games and my 7900XT plays in 4K no sweat. Fluid Motion Frames is very useful to hit 120 for anything that isn't an FPS.
 

llien

Member
"GPU estimated at 390mm^2" (9070) - should be around $127 just for the die


[image: NAVI48.jpg]
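That ~$127 figure is roughly what a standard dies-per-wafer plus Poisson-yield back-of-envelope gives. A minimal sketch, where the wafer price and defect density are my own illustrative assumptions, not known AMD/TSMC numbers:

```python
import math

WAFER_DIAMETER_MM = 300
DIE_AREA_MM2 = 390             # estimate quoted above
WAFER_COST_USD = 13_000        # assumed N4-class wafer price (illustrative)
DEFECT_DENSITY_PER_CM2 = 0.09  # assumed defect density (illustrative)

def gross_dies(diameter_mm: float, die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation: wafer area over die area, minus edge loss."""
    r = diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Simple Poisson yield model: Y = exp(-A * D0), with A in cm^2."""
    return math.exp(-(die_area_mm2 / 100) * d0_per_cm2)

gross = gross_dies(WAFER_DIAMETER_MM, DIE_AREA_MM2)
good = int(gross * poisson_yield(DIE_AREA_MM2, DEFECT_DENSITY_PER_CM2))
cost_per_die = WAFER_COST_USD / good
print(f"{gross} gross dies, {good} good dies, ${cost_per_die:.0f} per good die")
```

With those assumed inputs this lands around $125-130 per good die, i.e. the same ballpark as the quoted estimate; the result is very sensitive to the assumed wafer price and defect density.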


AMD is about to take an additional 10% of the market share from Nvidia, at the very least.
Bench for waitmarks, but much more importantly, the price.
B*tch later.
 

llien

Member
To address market share b*tching.
Here are stats from Mindfactory, the biggest German DIY retailer, with about 25% of the market. (They cover Q1-Q3; I could not find a Q4 article.)



Things are likely much different on the pre-built/OEM side, but it shows what people who build their own PCs buy. At least in Germany. :)
 

Wolzard

Member
To address market share b*tching.
Here are stats from Mindfactory, the biggest German DIY retailer, with about 25% of the market. (They cover Q1-Q3; I could not find a Q4 article.)



Things are likely much different on the pre-built/OEM side, but it shows what people who build their own PCs buy. At least in Germany. :)

Market share depends a lot on how you look at it. If we only look at desktop GPUs, Nvidia dominates with more than 90% of the market.
However, none of the manufacturers will look at just that; they will look at all segments (desktop, mobile, embedded, datacenter, HPC, etc.).

[image: PR_MW_Q324_002.png]


 

llien

Member
However, none of the manufacturers will look at just that; they will look at all segments (desktop, mobile, embedded, datacenter, HPC, etc.).
Intel is understandable. But AMD is simply missing from the 2024 generation of laptops, and a lot of laptops are shoveled out by OEMs. So what gives?
 

Wolzard

Member
Intel is understandable. But AMD is simply missing from the 2024 generation of laptops, and a lot of laptops are shoveled out by OEMs. So what gives?

Apparently this has been addressed: at CES 2025 they presented partnerships with several laptop manufacturers, such as Dell.





 

Wolzard

Member
Yeah, but how does that change 2024???


I didn't really understand your point. As I said, it's not just desktop GPUs or laptops. There are GPUs for data centers, for HPC, for workstations.
Ultimately, manufacturers will look at their market share as a whole, and not just at a specific segment.
 

llien

Member
I didn't really understand your point. As I said, it's not just desktop GPUs or laptops. There are GPUs for data centers, for HPC, for workstations.
I thought of laptops as the only non-desktop GPUs.

If we include datacenter... I doubt AMD is at 16% there, so AMD should be at a lower, not higher, market share.
The numbers simply do not add up.
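The "numbers do not add up" objection is a weighted-average argument: a vendor's blended share across segments has to land between its lowest and highest per-segment share, so a 16% blended figure requires at least one segment above 16%. A minimal sketch with entirely made-up segment sizes and shares:

```python
# Hypothetical segments: (units shipped in millions, AMD share).
# All numbers are illustrative, not real market data.
segments = {
    "desktop": (10.0, 0.10),
    "laptop": (30.0, 0.01),
    "datacenter": (5.0, 0.05),
}

total_units = sum(units for units, _ in segments.values())
# Blended share = unit-weighted average of per-segment shares.
blended = sum(units * share for units, share in segments.values()) / total_units

lo = min(share for _, share in segments.values())
hi = max(share for _, share in segments.values())
# The blended figure can never exceed every per-segment share.
assert lo <= blended <= hi
print(f"blended share: {blended:.1%}  (segments range {lo:.0%}-{hi:.0%})")
```

With these made-up inputs the blended share comes out around 3%, well inside the 1%-10% range of its segments, which is the whole point of the objection.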
 

Wolzard

Member
I thought of laptops as the only non-desktop GPUs.

If we include datacenter... I doubt AMD is at 16% there, so AMD should be at a lower, not higher, market share.
The numbers simply do not add up.

The data is literally there. If you doubt it, that's another story, but Jon Peddie Research is very reliable.
 

llien

Member
This chart is price: so they want to price the 9070 XT like the current 7900 XT.
$650?

With the 5070 Ti at $750, there isn't much room left for an xx70-branded card, or is there?
If perf is "higher", why not call it a 9080?

The data is literally there. If you doubt it, that's another story, but Jon Peddie Research is very reliable.
I'm confused. So in which category is it 10% vs 90%? Notebooks? There it's probably 99% vs 1%.
 

Zuzu

Member
It's looking pretty good!



Here's an AI summary of the vid:

Overview of AMD's FSR 4 Technology

- The video begins with a demonstration of AMD's FSR 4 upscaling technology, which was not prominently featured in their presentation.
- Two monitors driven by prototype RX 9070 XT cards are used to showcase the performance of FSR 3.1 and FSR 4 in the game Ratchet and Clank: Rift Apart at 4K resolution.
- The focus is on comparing the visual quality between FSR 3.1 and FSR 4, particularly in performance mode.

Visual Quality Comparisons

- Initial observations highlight significant improvements in image quality with FSR 4 compared to FSR 3.1, especially in handling transparent effects and particle details.
- The FSR 3.1 performance mode exhibits blurred and garbled visuals in areas with transparency, whereas FSR 4 presents clearer definitions of individual particles.
- The video illustrates how FSR 4 resolves issues related to garbling in effects such as fire and particle movements, leading to a smoother visual experience.

Performance Mode Insights

- The discussion emphasizes that FSR 3.1 struggles in performance mode, resulting in lower quality visuals, particularly in fast-moving or complex scenes.
- In contrast, FSR 4 demonstrates a marked improvement, providing a cleaner and more defined appearance even when upscaling from 1080p to 4K.
- The presenter notes that while FSR 4 shows promise, it is still not without limitations, particularly in maintaining high detail across all elements.
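The 1080p-to-4K relationship mentioned above is just FSR's standard Performance-mode ratio (2.0x per axis). A quick sketch of the render resolutions behind each mode at a 4K output, assuming FSR 4 keeps the established FSR scale factors:

```python
# Per-axis scale factors of the established FSR quality modes.
# Assumption: FSR 4 keeps the same ratios as FSR 2/3.
OUTPUT = (3840, 2160)  # 4K output resolution
MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

for mode, scale in MODES.items():
    w, h = round(OUTPUT[0] / scale), round(OUTPUT[1] / scale)
    pixel_fraction = 1 / scale**2  # share of output pixels actually rendered
    print(f"{mode:12s} renders {w}x{h}  ({pixel_fraction:.0%} of output pixels)")
```

Performance mode renders only a quarter of the output pixels (1920x1080 for 4K), which is why it is the harshest test of an upscaler and why the FSR 3.1 vs FSR 4 gap is most visible there.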

Detailed Observations of Specific Effects

- The video showcases specific examples of improvements in particle effects, noting that FSR 4 handles confetti and other small details with less ghosting and streaking.
- The analysis includes a comparison of how fur details are rendered, revealing that FSR 4 reduces pixelation artifacts significantly compared to FSR 3.1.
- The presenter highlights the overall stability of images with FSR 4, especially in scenes where fine textures are critical, such as carpet details in the game.

General Impressions and Future Considerations

- Overall, the presenter expresses a positive impression of FSR 4, noting it as a substantial upgrade over its predecessor, particularly in visual fidelity.
- Despite some remaining issues, the technology shows potential for enhancing gameplay experiences in visually demanding titles.
- The video concludes with a mention of the broader implications of FSR 4 for future gaming, suggesting further exploration and testing will be necessary to fully assess its capabilities.
 

ghairat

Member


Here's an AI summary of the vid:


Off topic, what AI program do you use to summarize?
 