
AMD Radeon RX 9070 expected to start at $479, benchmarks now released (Update: Radeon 9000 series goes on sale in early March)

They delayed because Blackwell is much slower than expected and they saw an opportunity to raise prices if the 9070 XT ended up faster than the 5070 Ti. But the 5080 reviews were so negative that they probably won't do it now.
Digital Foundry are claiming that RDNA 4's RT performance is lacklustre. I'm curious to see how the 9070 XT will stack up against the 5070 in that regard, let alone the 5070 Ti.
 

iQuasarLV

Member
Glad you found a solution, but this guy especially would shit on that solution, dude. It's a solution that is not even competing against FG or FSR 3 in image quality. Lossless Scaling is a post-process effect and produces a lot more errors because it has no access to motion vectors. Imagine people's criticisms of fake frames, but then putting Lossless Scaling on a pedestal.
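To illustrate the motion-vector point, here is a rough, self-contained sketch (hypothetical code, not any product's actual pipeline): with engine-supplied motion vectors, interpolating a frame is mostly a warp, while a pure post-process tool has to guess motion from the two finished images, which is where the extra artifacts come from.

```python
# Rough illustrative sketch (hypothetical code, not any product's pipeline):
# why engine motion vectors make interpolated frames cleaner than a pure
# post-process approach that only sees the finished images.
import numpy as np

def warp_with_motion_vectors(prev_frame, motion_vectors, t=0.5):
    """With per-pixel motion vectors from the engine (in pixels), a midpoint
    frame is mostly a warp: move each pixel a fraction t along its vector.
    Holes from occlusion still need filling, but the motion itself is known."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dst_x = np.clip(np.rint(xs + t * motion_vectors[..., 0]).astype(int), 0, w - 1)
    dst_y = np.clip(np.rint(ys + t * motion_vectors[..., 1]).astype(int), 0, h - 1)
    out = np.zeros_like(prev_frame)
    out[dst_y, dst_x] = prev_frame[ys, xs]
    return out

def guess_motion_block_matching(prev_frame, next_frame, block=16, search=8):
    """A post-process tool has no vectors, so it must estimate motion from the
    two images alone (here: naive block matching). The guesses break down on
    occlusion, transparency, particles and HUD elements, which is where the
    extra interpolation errors come from."""
    h, w = prev_frame.shape[:2]
    vectors = np.zeros((h, w, 2), dtype=np.float32)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev_frame[by:by + block, bx:bx + block].astype(np.int32)
            best_err, best_v = np.inf, (0.0, 0.0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = next_frame[y:y + block, x:x + block].astype(np.int32)
                        err = np.abs(ref - cand).sum()
                        if err < best_err:
                            best_err, best_v = err, (dx, dy)
            vectors[by:by + block, bx:bx + block] = best_v
    return vectors
```

Real frame-generation solutions do far more than this, but the asymmetry is the point: known motion versus guessed motion.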

Threat Interactive wants you to run native or supersampled, with no upscaling technologies of any sort. In his world DLSS is the devil and FSR is even lower on his tier list. No Lumen. No RT of any form. No Nanite.

Who really wants to side with this guy fully lol? He’s too much.
Well, you do what you need to when you play 20-year-old games that are not supported anymore. I cannot fault anyone for searching for solutions that neither the developer nor the hardware companies want to embrace. I can fault people who drop $5k every 8-12 months on upgrades, because they are no different from gambling addicts who just have to chase that payoff.
 

Buggy Loop

Member
Well, you do what you need to when you play 20-year-old games that are not supported anymore. I cannot fault anyone for searching for solutions that neither the developer nor the hardware companies want to embrace. I can fault people who drop $5k every 8-12 months on upgrades, because they are no different from gambling addicts who just have to chase that payoff.

He wants to go back to 2015

If you watch his videos you’ll understand

Nobody is going back there. Cerny didn't make a Pro console with more ML and RT because evil Nvidia set a master plan that everyone has to follow. Dafuq is this?

That video is his most unhinged so far. Criticism of bad optimization, or of badly using the tools you're given, I give props to, but now he's a chihuahua barking at every tree. The attacks he received and getting kicked out of the UE forums took a toll on him; I expected a better return than just wildly swinging in a desperate crowdfunding call. That's a weak-ass call to arms.
 

iQuasarLV

Member
He wants to go back to 2015

If you watch his videos you’ll understand

Nobody is going back there. Cerny didn't make a Pro console with more ML and RT because evil Nvidia set a master plan that everyone has to follow. Dafuq is this?

That video is his most unhinged so far. Criticism of bad optimization, or of badly using the tools you're given, I give props to, but now he's a chihuahua barking at every tree. The attacks he received and getting kicked out of the UE forums took a toll on him; I expected a better return than just wildly swinging in a desperate crowdfunding call. That's a weak-ass call to arms.
Haha, bro, the mental image I got when reading this is awesome. What's better, as a former chihuahua owner, is that it rang home =D
 
Sure, but I bet 99-100% of demanding AAA games will support DLSS. Whether indies have DLSS hardly matters, since they are performant regardless.
I don't doubt that most newer AAA games will support it. But who knows what happens if AMD has a software solution that outmuscles Nvidia in a few years? You would still need a set of computational metrics taken from frames of a natively rendered image as a baseline for comparisons, to know what the hardware was capable of.
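As a rough sketch of what that kind of baseline comparison could look like (hypothetical frame captures and file names, PSNR as the example metric):

```python
# Rough sketch: scoring an upscaled frame against a natively rendered baseline
# of the same scene. File names and captures are hypothetical.
import numpy as np
from PIL import Image

def psnr(native_frame: np.ndarray, test_frame: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the native render."""
    diff = native_frame.astype(np.float64) - test_frame.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((255.0 ** 2) / mse)

if __name__ == "__main__":
    native = np.asarray(Image.open("frame_native_4k.png"))      # ground-truth render
    upscaled = np.asarray(Image.open("frame_upscaler_4k.png"))  # upscaler output
    print(f"Upscaled vs native: {psnr(native, upscaled):.2f} dB")
```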
 
Coreteks is almost as bad as MLID.

This guy just outright spouts his headcanon as the truth in some of his AMD roadmap videos on RDNA, and the AMDrones eat it up because of his ASMR voicetrap.

There's also another guy named AdoredTV or w/e, IIRC; don't know if he still does YT, and he does almost the same thing, but at least he got some things right, like release dates, and has sources within AMD.
 

Crayon

Member
This guy just outright spouts his headcanon as the truth in some of his AMD roadmap videos on RDNA, and the AMDrones eat it up because of his ASMR voicetrap.

There's also another guy named AdoredTV or w/e, IIRC; don't know if he still does YT, and he does almost the same thing, but at least he got some things right, like release dates, and has sources within AMD.

Seriously though they both have great voices lol. I've never seen them once show anything to back their rumors up, but I'd hire either to narrate a dark documentary.
 

AGRacing

Member
This is usually the norm with a node shrink. AMD had one, Nvidia didn't. The die size of the 5080 is 378 mm²; Navi 48 is 390 mm².
If anything, AMD probably thought the 5080 would be the 5070 Ti instead (which would have been great), but was blindsided by Nvidia launching basically the same cards as last gen. So their 9070 XT is now not a 5070 Ti competitor, but is close to the 5080.

I think this is the explanation that makes the most sense. I think they fucked themselves over by copying Nvidia's naming. If the rumors are true (and I think they are), we are in this situation:

RTX 5080 close in power to the 9070 XT
5070 Ti close in power to the 9070
5070 close in power, probably, to the 9060 XT
5060 Ti close in power to the 9060

Because Nvidia is selling us the same cards again, AMD is now forced to price its products at a much lower price than before. I now understand what their problem is.
I like how it took a mistake to get them to do the correct thing and be competitive. They SHOULD be selling a 5080-tier card in raster (inferior in upscaling and RT) as a competitor to the 5070 class. It's the only way anyone other than die-hard AMD guys (of which there are too few to justify a business) was ever going to buy the thing.

Will they price it by its name, take the W, and grow their market share a little? Let's see if they can manage to fuck it up in the final minutes.
 
Digital Foundry are claiming that RDNA 4's RT performance is lacklustre. I'm curious to see how the 9070 XT will stack up against the 5070 in that regard, let alone the 5070 Ti.

We will know from the pricing - supply and demand. That was the reason for the delay I think - to avoid having to cut the price soon after launch and the message that sends.
 

Marlenus

Member
Dude just stripped AMD buck naked and spanked them on the ass.

Not really. AMD have decided that taking advantage of Intel's continual missteps is far more profitable than taking on NV, who have been a pretty damn good execution machine.

Also, a 9800X3D for $500 is going to have a far higher margin than any $500 GPU, and when that part is in such high demand, why use wafer capacity for the less profitable product?

That is the dGPU dilemma for AMD. If Intel were more competitive, AMD would have lower prices and probably lower volume, so the dGPUs, while lower margin, would still be adding value; whereas now, when AMD can't make the 9800X3D fast enough, dGPUs just compete with their own wafer capacity.

Also, NV gained market share through reliable execution, with only a few missteps in their history: FX, Fermi, Turing kinda, and Blackwell kinda. AMD have been far more hit and miss, which has allowed NV to take up the dominant market position. Now that NV have it, they have been trying to leverage it with walled gardens, the same way they leverage CUDA in the pro space, and it seems to be working.
 

winjer

Member
They delayed because Blackwell is much slower than expected and they saw an opportunity to raise prices if the 9070 XT ended up faster than the 5070 Ti. But the 5080 reviews were so negative that they probably won't do it now.

AMD did officially say it's because of improvements to stock and their software stack.
But I agree that seeing Blackwell being just a rebrand of Ada Lovelace made AMD rethink their whole strategy.

 

Rosoboy19

Member
Not really. AMD have decided that taking advantage of Intel's continual missteps is far more profitable than taking on NV, who have been a pretty damn good execution machine.

Also, a 9800X3D for $500 is going to have a far higher margin than any $500 GPU, and when that part is in such high demand, why use wafer capacity for the less profitable product?

That is the dGPU dilemma for AMD. If Intel were more competitive, AMD would have lower prices and probably lower volume, so the dGPUs, while lower margin, would still be adding value; whereas now, when AMD can't make the 9800X3D fast enough, dGPUs just compete with their own wafer capacity.

Also, NV gained market share through reliable execution, with only a few missteps in their history: FX, Fermi, Turing kinda, and Blackwell kinda. AMD have been far more hit and miss, which has allowed NV to take up the dominant market position. Now that NV have it, they have been trying to leverage it with walled gardens, the same way they leverage CUDA in the pro space, and it seems to be working.
I feel like he did make a good point on AMD's lack of innovation, though. At this point they may have just conceded that their custom APU designs for console and handheld devices are their best chance for financial success, so instead of putting in all the R&D for desktop GPU innovation, they simply let Nvidia do that work and then copy their technologies (albeit a generation late and with inferior quality) to use in future APU designs.

Or maybe they really DO want to beat Nvidia in desktop GPUs and they just suck at it. /shrug
 
It's 2 days before my daughter's birthday and I was thinking of having a party and inviting you all. ❤️
Excuse Me What GIF
 

Zathalus

Member
That Threat Interactive video is something else. Alan Wake 2 apparently now has garbage graphics. What a clown.
 

SolidQ

Member
In Gaming Graphics, revenue declined year over year, as we accelerated channel sellout in preparation for the launch of our next-gen Radeon 9000 series GPUs.

Our focus with this generation is to address the highest-volume portion of the enthusiast gaming market with our new RDNA 4 architecture. RDNA 4 delivers significantly better ray tracing performance and adds support for AI-powered upscaling technology that will bring high-quality 4K gaming to mainstream players when the first Radeon 9070 series GPUs go on sale in early March.

Dr. Lisa Su - AMD CEO (Q4 2024 Earnings Call)

We will see how "highest volume" it ends up being.
 

DenchDeckard

Moderated wildly
Would be amazing if we saw some really good competition in the mid range from Intel, AMD and NVIDIA. We deserve it!
 

Crayon

Member
March 6th would be nice. I hear "March" and just assume it's going to be the last weekday of the month.
 

Gamer79

Predicts the worst decade for Sony starting 2022
I am hoping this is a card that gives killer price to performance. I would jump ship to AMD if they made it appealing enough.
 

MikeM

Member
I may sidegrade from my 7900 XT if the 9070 XT is on par raster-wise but upscaling is vastly better. Cost depending, of course.
 

Crayon

Member
If it's good, I'm waiting a year for a price drop on that non-XT.

very-tight-with-the-dollar-cheap.gif


Only thing is the question mark on those tariffs. I already did a hedge-bet upgrade for that so if I miss the boat I'm not completely up shit creek.
 

Ev1L AuRoN

Member
With FSR4 finally being hardware accelerated and improvements to the RT performance, that might be the disruptive card we are waiting for. At the very least, I hope this card kicks NVIDIA out of their comfort zone of ripping us off with their mid-range offerings.
 

kiphalfton

Member
Man, knowing what we know, AMD has literally no blockers in front of them.

I CAN'T WAIT to see how they fuck it up.

Considering some stores allegedly already got stock that's just been sitting there for like a month at this point... the drivers must suck really bad.

Otherwise, why wouldn't AMD do everything in their power to beat Nvidia to market?
 

iQuasarLV

Member
Considering some stores allegedly already got stock that's just been sitting there for like a month at this point... the drivers must suck really bad.

Otherwise, why wouldn't AMD do everything in their power to beat Nvidia to market?
Uh, because AMD literally follows at Nvidia's heels in business decisions. Since Nvidia fucked up the launch of the 5000 series, AMD has no lead to follow. They have to absolutely nail the FSR4 feature launch to coincide with the 9070 or they are fucked for an entire generation.

Pricing is not the only hurdle AMD has to clear to actually sell cards. Especially since they announced losses in that department last year, which console sales can no longer hide from the investors. They need to make up Radeon $$$ from somewhere now that console sales are slowing.
 

Topher

Identifies as young
Considering some stores allegedly already got stock that's just been sitting there for like a month at this point... the drivers must suck really bad.

Otherwise, why wouldn't AMD do everything in their power to beat Nvidia to market?

Sometimes it seems like that's the same as me saying I'm going to do everything in my power to beat LeBron James in a basketball game.

No Way Reaction GIF by CBS
 

DenchDeckard

Moderated wildly
Considering some stores allegedly already got stock that's just been sitting there for like a month at this point... the drivers must suck really bad.

Otherwise, why wouldn't AMD do everything in their power to beat Nvidia to market?

It's because they get one shot at this launch.

I can confirm that the company I work for has had stock for some time.

I have close relationships within the industry, and as of the last time I spoke to them, not even staff at AMD knew the true reasons for the delay.

Rumours are it's to combat the 50 series and to ensure the software was in a better place.
 
I understand people don't like fake frames, but for me the experience I had with it was great, and I'm sure it was one of the worst implementations in a video game (I'm referring to BMW). Even so, it felt much better to play the game this way than in the 30 or 40 fps modes. I had some really great moments playing it and finished the game in the 60 fps mode, and now I'm a big advocate for this technology in single-player games. I only wish more developers would implement it.
 
With FSR4 finally being hardware accelerated and improvements to the RT performance, that might be the disruptive card we are waiting for. At the very least, I hope this card kicks NVIDIA out of their comfort zone of ripping us off with their mid-range offerings.
Disruptive? To do that it needs to beat - and not only be comparable to - a good and expensive Nvidia card.

If it ends up being comparable in raster alone, it won't do anything for anyone.

Since Nvidia's new offerings are basically refreshes with some unproven new tech, these cards might fit in nicely.

But to sway Nvidia users, or change Nvidia's behaviour, even on the lower end, they need to either come out handily on top, or have comparable extra features (RT, frame gen and a much better FSR) at a much, much lower price.

Not really optimistic about it, but curious nonetheless.
 
How? It outperforms a 4080 Super in synthetics:


… for less than half the price….

This is an amazing offering.

Raytracing performance will suck on AMD as always, don't worry.
 

Ev1L AuRoN

Member
Disruptive? To do that it needs to beat - and not only be comparable to - a good and expensive Nvidia card.

If it ends up being comparable in raster alone, it won't do anything for anyone.

Since Nvidia's new offerings are basically refreshes with some unproven new tech, these cards might fit in nicely.

But to sway Nvidia users, or change Nvidia's behaviour, even on the lower end, they need to either come out handily on top, or have comparable extra features (RT, frame gen and a much better FSR) at a much, much lower price.

Not really optimistic about it, but curious nonetheless.
The RX 480 wasn't competing in the high-end segment; it was a powerful GPU in the $200 to $300 range, and it pushed NVIDIA to launch one of its most legendary generations with the 10xx series, which disrupted the market and benefited gamers on both teams. If the 9000 series can do the same in the xx70 class, I'll be all for it. I live in Brazil and high-end products are prohibitively expensive around here, so I live and die in the mid-range segment, and in this line NVIDIA got a bit too comfortable; the mid-range today costs as much as an xx80-tier card.
 