
AMD Radeon RX 9070 expected to start at $479, benchmarks now released (Update: Radeon 9000 series goes on sale in late March)

SolidQ

Member
Sorry, but MLID has been caught lying so many times that he now has no credibility.
It's real, from AMD slides (December). Navi 48 XTX was projected to compete with the 7900 XTX ±5%, and Navi 48 XT with the 7900 XT ±5%. The rest of the story is just drivers catching up to that performance.
 

Radical_3d

Member
It's real, from AMD slides (December). Navi 48 XTX was projected to compete with the 7900 XTX ±5%, and Navi 48 XT with the 7900 XT ±5%. The rest of the story is just drivers catching up to that performance.
So you're saying that a $650 card is supposed to compete with what until last week was a $1,000 card, while also giving up die space to AI cores to run an acceptable upscaler…

Does Lisa Su have photos of the TSMC CEO with a horse or something?
 

Kataploom

Gold Member
I'm sorry, but who the fuck is gonna buy a 9070XT without even considering or waiting for 5070 reviews first?
The answer is nobody.

If AMD launches first, the immediate response has historically always been "Wait for Nvidia". So what's the fucking point?
I probably will, and I don't think the 5070 is gonna make it for me... I'd be surprised if they end up at roughly the same level; AMD would have to fuck it up way too much for that.
 

SolidQ

Member
You're saying that a $650 card
We don't know the price. To compete with the 5080 they just need to slap GDDR7 onto Navi 48 XTX, but will they do it?
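For reference, the back-of-the-envelope math behind that idea (a rough sketch; the memory speeds here are the commonly reported figures, not confirmed specs):

# Peak memory bandwidth = bus width (bits) x per-pin data rate / 8.
# Assumed figures: Navi 48 on a 256-bit bus with 20 Gbps GDDR6,
# versus the same bus with GDDR7, and the RTX 5080's 30 Gbps GDDR7.
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

print(bandwidth_gb_s(256, 20))  # 640.0 GB/s - Navi 48 with GDDR6
print(bandwidth_gb_s(256, 28))  # 896.0 GB/s - same die with 28 Gbps GDDR7
print(bandwidth_gb_s(256, 30))  # 960.0 GB/s - RTX 5080, for comparison

So a straight GDDR7 swap would be roughly a 40-50% bandwidth uplift on the same bus, which is the whole argument in one line.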

I don't think the 5070 is gonna make it for me.
The vanilla 5070 could be worse than the 4070S, or about the same.
Add lower cache bandwidth and worse cache latency for Blackwell
[image: cache bandwidth/latency benchmark chart]
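(For anyone wondering how numbers like that chart's are produced: cache latency microbenchmarks usually chase a randomized chain of dependent loads, so the hardware can't prefetch and each access pays the full latency. Here's a toy pure-Python sketch of the technique, not the actual tool behind the chart; interpreter overhead dominates the absolute numbers, but the step-ups as the working set spills each cache level still show:)

import random
import time

def chase_latency_ns(size_bytes: int, iters: int = 1_000_000) -> float:
    # Roughly 8 bytes per list slot; the real footprint is larger in Python.
    n = max(1, size_bytes // 8)
    perm = list(range(n))
    random.shuffle(perm)            # randomized order defeats prefetching
    nxt = [0] * n
    for i in range(n):              # link the permutation into one big cycle
        nxt[perm[i]] = perm[(i + 1) % n]
    idx = 0
    t0 = time.perf_counter()
    for _ in range(iters):
        idx = nxt[idx]              # each load depends on the previous one
    t1 = time.perf_counter()
    return (t1 - t0) / iters * 1e9  # average ns per dependent access

for kib in (16, 256, 4096, 65536):  # sweep from L1-sized to RAM-sized sets
    print(f"{kib:>6} KiB: {chase_latency_ns(kib * 1024):6.1f} ns/access")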
 

FingerBang

Member
So you're saying that a $650 card is supposed to compete with what until last week was a $1,000 card, while also giving up die space to AI cores to run an acceptable upscaler…

Does Lisa Su have photos of the TSMC CEO with a horse or something?
This is usually the norm with a node shrink. AMD had one, Nvidia didn't. The die size of the 5080 is 378 mm²; Navi 48 is 390 mm².
If anything, AMD probably thought the 5080 would be the 5070 Ti instead (which would have been great), but was blindsided by Nvidia launching basically the same cards as last gen. So their 9070 XT is now not a 5070 Ti competitor, but is close to the 5080.

I think this is the explanation that makes the most sense. I think they fucked themselves over by copying Nvidia's naming. If the rumors are true (and I think they are), we are in this situation:

RTX 5080 close in power to the 9070 XT
5070 Ti close in power to the 9070
5070 close in power, probably, to the 9060 XT
5060 Ti close in power to the 9060

Because Nvidia is selling us the same cards again, AMD is now forced to price its products much lower than before. I now understand what their problem is.
 

FingerBang

Member
I could only find about 390 online


Obviously chip size doesn't mean much in itself.
 

FingerBang

Member
Then they'd probably price it around $900 (10% less price for 10% less performance).
No one would buy that card at that price. Nvidia still has superior upscaling and ray tracing. AMD has now positioned their card as an xx70 competitor, so it needs to compete on price as well. I think it will be $499 for the 9070 and $650 for the 9070 XT.

They really fucked up with their naming, but that's great news for us.
 

Radical_3d

Member
No one would buy that card at that price. Nvidia still has superior upscaling and ray tracing. AMD has now positioned their card as an xx70 competitor, so it needs to compete on price as well. I think it will be $499 for the 9070 and $650 for the 9070 XT.

They really fucked up with their naming, but that's great news for us.
That price for a 4080 tier card would be bananas. I don’t see it happening just because nothing good ever happens.
 

Kataploom

Gold Member
That price for a 4080 tier card would be bananas. I don’t see it happening just because nothing good ever happens.
What I'm fearing is that AMD thinks it has way too good a deal and is debating whether to price-match Nvidia. I can see AMD execs being divided, probably because AI data centers pay whatever they ask anyway.
 

Buggy Loop

Member
Add lower cache bandwidth and worse cache latency for Blackwell

haha



Do explain to everyone, SolidQ, the impact of that Twitter benchmark. Do so, please.

New power gating and RISC distribution of tasks in the pipeline

Now read this


And come back to us with those huge dealbreaker differences after taking those values as a true benchmark without understanding all the hurdles of microbenchmarking these architectures. I'll grab the popcorn.
 
I think this guy is full of shit, usually, but this explanation is the one that makes the most sense to me.
What that should tell you is that you just like what he's saying.

Having watched this video, this is just pure speculation.
Jensen Huang rebranded the 5070 Ti into the 5080? To force AMD's prices lower? Lmao. When the fuck would Nvidia ever sacrifice their 80-class GPU getting embarrassed by a cheaper "70"-class GPU? That didn't happen with Ampere, where their costs were significantly lower and they would have had way more room to clip their own margins, so why the hell would it happen now?
 

SpokkX

Member
Why are they comparing the 9070 XT to the 5080 without taking DLSS into account? Native rendering is basically theoretical; in practice everyone uses DLSS, since it performs better and usually even looks better (less shimmer, etc.).
 

Kataploom

Gold Member
Why are they comparing the 9070 XT to the 5080 without taking DLSS into account? Native rendering is basically theoretical; in practice everyone uses DLSS, since it performs better and usually even looks better (less shimmer, etc.).
FSR4 is allegedly very, very good, per the media outlets that tried it. Also, upscalers and other proprietary tech are not in every game, so you have to rely on other upscalers, mods, etc.
 

FingerBang

Member
What that should tell you is that you just like what he's saying.

Having watched this video, this is just pure speculation.
Jensen Huang rebranded the 5070 Ti into the 5080? To force AMD's prices lower? Lmao. When the fuck would Nvidia ever sacrifice their 80-class GPU getting embarrassed by a cheaper "70"-class GPU? That didn't happen with Ampere, where their costs were significantly lower and they would have had way more room to clip their own margins, so why the hell would it happen now?
You're focusing on ONE thing in that video. I agree that Nvidia did not rebrand anything.

But I believe AMD expected the performance of the 5070 Ti to be in line with the 4080.
 

Clear

CliffyB's Cock Holster
Dude just stripped AMD buck naked and spanked them on the ass.

When this guy ships something to back up his claims, I'll give him some credit.

But gotta be real, right now it smells like a grift to me! He's mobilizing support based on grievances and gish galloping through points so as to disguise the fact that he's really got no answers and absolutely zero ability to influence driver architecture going forwards.
 

Bitmap Frogs

Mr. Community
They finally have some decent hardware and are instead fucking up on simply getting it to market? What is this shit show? This is all Frank Azor's doing, goddamn.
 

winjer

Member
They finally have some decent hardware and are instead fucking up on simply getting it to market? What is this shit show? This is all Frank Azor's doing, goddamn.

Quite the opposite: they delayed it to improve the drivers and software stack. And to build up stock rather than do a paper launch, like Nvidia did with the RTX 5080 and 5090.

The question now is whether AMD is going to screw up the price.
 

Crayon

Member
I'm more inclined to think bumping it back two months is a good thing, one way or another.

One thing that crossed my mind: they might be better off launching against the reality of the 5070 than against the promise of a 5070 as fast as a 4090. Or maybe launching against a short-stocked 5070 with fucked-up street prices.

Postponing the reveal on the day of, just for software improvements, doesn't seem right. It was some kind of reaction to Nvidia's reveal. The 50 series was a little cheaper and a little slower than expected. That could genuinely have changed the game between before and after Nvidia's reveal, which would explain AMD's seemingly snap decision.

If it turns out to be a good decision, it would at least be good for AMD. It might be good for everyone if the price is right and some good competition doesn't get kneecapped by a bad launch and the resulting bad reviews that get referenced for years on end.
 

KeplerL2

Member
Quite the opposite: they delayed it to improve the drivers and software stack. And to build up stock rather than do a paper launch, like Nvidia did with the RTX 5080 and 5090.

The question now is whether AMD is going to screw up the price.
They delayed because Blackwell is much slower than expected, and they saw an opportunity to raise prices if the 9070 XT ended up faster than the 5070 Ti. But the 5080 reviews were so negative that they probably won't do it now.
 

Arsic

Loves his juicy stink trail scent
I've never used AMD anything. I think if I were upgrading I'd consider it, but giving up Nvidia's new feature suite just isn't worth the savings…
 

Buggy Loop

Member


Timestamped to the conclusion, but the whole video is a sad reminder of how easily the PC gaming sphere has been manipulated.


You can find my long list of replies in the previous graphics thread about this guy's channel and modern lighting. I mainly agree with him that most of the time devs are not using Lumen/RT/path tracing correctly, and that these make the most sense when a game has a very dynamic scene with building/destruction.

But Nvidia / AMD / Intel gave devs the tools to make the above, including for the games where it makes sense. Blaming the vendors for bad implementations just takes the cake lol. Worthless video this time imo. Devs are the ones holding the keys.

His crusade against any AA is getting tiring too. Sure bud, go make your own solution, the one you need funding for.

Keeping a 1060 as some benchmark for how games should run these days is completely ridiculous. It's 9 years old. Even purely rasterized games left that card behind; I know, I had one.
 
So which upcoming games won't?

The only big title in recent years I've seen is Helldivers 2, but that is more of a CPU-bound game.
I don't know?!? That's a pretty wide question. You can't assume all future games will, though. Heck, even announced games don't always get it lol. Destiny 2 was announced when the first RTX cards rolled out and never got it. You can't assume anything, or that even older games will get DLSS 4 with frame-gen options, so it's not a static baseline for apples to apples.
 

iQuasarLV

Member
You can find my long list of replies in the previous graphics thread about this guy's channel and modern lighting. I mainly agree with him that most of the time devs are not using Lumen/RT/path tracing correctly, and that these make the most sense when a game has a very dynamic scene with building/destruction.

But Nvidia / AMD / Intel gave devs the tools to make the above, including for the games where it makes sense. Blaming the vendors for bad implementations just takes the cake lol. Worthless video this time imo. Devs are the ones holding the keys.

His crusade against any AA is getting tiring too. Sure bud, go make your own solution, the one you need funding for.

Keeping a 1060 as some benchmark for how games should run these days is completely ridiculous. It's 9 years old. Even purely rasterized games left that card behind; I know, I had one.
All I can say is that for $5 I got a program that does the same thing a $2,000 GPU is being marketed as having (MFG). I feel bad for the people who buy a 5000 series card because it makes more frames. Just another in a long line of bullshit buzzwords people gobbled up with shit-eating grins.
 

Crayon

Member
Wait - that sounds affordable. How does something like that perform on something new, like KCD II?

If you mean the price in the thread title, that's probably way too low a guess. Realistically, $600 for the XT and $500 for the non-XT would be enough to win over a lot of people who are willing to look at both brands to begin with. Could be more, though! There's really no way to know, and I can't even remember where this $479 rumor came from.
 

Buggy Loop

Member
All I can say is that for $5 I got a program that does the same thing a $2,000 GPU is being marketed as having (MFG). I feel bad for the people who buy a 5000 series card because it makes more frames. Just another in a long line of bullshit buzzwords people gobbled up with shit-eating grins.

Glad you found a solution, but this guy especially would shit on that solution, dude. It's not even competing against FG or FSR 3 in image quality. Lossless Scaling is a post-process effect and gets a lot more errors because it has no access to motion vectors. Imagine people's criticisms of fake frames, but putting Lossless Scaling on a pedestal...
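To make the motion-vector point concrete, here's a toy numpy sketch (my own construction, not how Lossless Scaling or DLSS FG actually work internally): blind blending of two frames doubles a moving object into two half-intensity ghosts, while warping along a known motion vector places it once, at the midpoint.

import numpy as np

W = 16
frame_a = np.zeros(W); frame_a[4] = 1.0   # object at x=4 in frame N
frame_b = np.zeros(W); frame_b[8] = 1.0   # object at x=8 in frame N+1

# Post-process style (no motion vectors): blend the two frames.
blend = 0.5 * frame_a + 0.5 * frame_b
print(np.nonzero(blend)[0])               # [4 8] -> object ghosted twice

# Engine-integrated style: the renderer knows the object moved +4 px,
# so the generated midpoint frame can place it at x=6, full intensity.
motion = 4
warped = np.roll(frame_a, motion // 2)
print(np.nonzero(warped)[0])              # [6] -> one sharp object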

Threat Interactive wants you to run native or supersampled, with no upscaling technologies of any sort. In his world DLSS is the devil and FSR is even lower on his tier list. No Lumen. No RT of any form. No Nanite.

Who really wants to side with this guy fully lol? He’s too much.
 

SpokkX

Member
I don't know?!? That's a pretty wide question. You can't assume all future games will, though. Heck, even announced games don't always get it lol. Destiny 2 was announced when the first RTX cards rolled out and never got it. You can't assume anything, or that even older games will get DLSS 4 with frame-gen options, so it's not a static baseline for apples to apples.
Sure, but I bet 99-100% of demanding AAA games will support DLSS. Whether indies have DLSS hardly matters, since they are performant regardless.
 