
A Switch dataminer from Famiboards suggests the Switch 2's portable GPU clock will be 561MHz. He also said 1.8GHz for the CPU is "hopium".

I just want them to show this thing already. These rumors and the amateur analysis that's come along with them are getting tiring.

I don't care about theoretical limits. I just want to see how it performs in practice and how that balances out with price and battery life.
 

Buggy Loop

Gold Member
Correct me if i am wrong but this is taking the best of all scenarios isn't? I mean with all the possible cache and everything

If that's the case I am not sure Nintendo will go this way, i wouldn't be surprised if actually trade blows with the jaguar

You watch the DF video with Richard demolishing the Jaguar CPU: a $200 mobile Chromebook-class trash CPU from 2012, at the worst IPC in AMD's entire history. It almost bankrupted AMD because it was so bad. Jaguar would not even compete with Intel's Atom, and as Richard says, even the 2008 Intel QUAD-core (not 8-core) Q6600 scores almost equal to Jaguar.

While the A78, a standardized core from ARM, is above Zen 2 in IPC, and your conclusion is that you wouldn't be surprised if it even traded blows with Jaguar?



WHAT IS GOING ON? Is this a reverse-logic universe?

Tegra Orin is a KNOWN product. T239 is based on T234. We know most of what there is to know from it. Not some rule-of-thumb guess. You can go buy the dev kit and have fun downclocking it, measuring power, etc.

A78 is a known quantity, in MANY products:

Code:
Google Tensor G2
MediaTek MT6877, MT6878, MT6879, MT6891, MT6893, Dimensity 7020, 7025 (Ultra), 7030, 7050, 7300 (Energy/Ultra/X), 8000, 8020, 8050, 8100, 8200, Kompanio 900T, 1200, 1380, 1300T
Qualcomm Snapdragon 4 Gen 1, 4(s) Gen 2, 695, 6 Gen 1, 6(s) Gen 3, 778G(+), 780G, 782G, 888(+)
Samsung Exynos 1080, 1280, 1330, 1380, 2100

It's in many chipsets. It's benched. There's no further tweaking of those clusters by either Nvidia or Nintendo. IPC is pretty much the best benchmark of a CPU. Jaguar was not clocked high; it's easy for the A78 to surpass it. Easy.
 

Landr300

Neo Member
You watch the DF video with Richard demolishing the Jaguar CPU: a $200 mobile Chromebook-class trash CPU from 2012, at the worst IPC in AMD's entire history. It almost bankrupted AMD because it was so bad. Jaguar would not even compete with Intel's Atom, and as Richard says, even the 2008 Intel QUAD-core (not 8-core) Q6600 scores almost equal to Jaguar.

While the A78, a standardized core from ARM, is above Zen 2 in IPC, and your conclusion is that you wouldn't be surprised if it even traded blows with Jaguar?



WHAT IS GOING ON? Is this a reverse-logic universe?

Tegra Orin is a KNOWN product. T239 is based on T234. We know most of what there is to know from it. Not some rule-of-thumb guess. You can go buy the dev kit and have fun downclocking it, measuring power, etc.

A78 is a known quantity, in MANY products:

Code:
Google Tensor G2
MediaTek MT6877, MT6878, MT6879, MT6891, MT6893, Dimensity 7020, 7025 (Ultra), 7030, 7050, 7300 (Energy/Ultra/X), 8000, 8020, 8050, 8100, 8200, Kompanio 900T, 1200, 1380, 1300T
Qualcomm Snapdragon 4 Gen 1, 4(s) Gen 2, 695, 6 Gen 1, 6(s) Gen 3, 778G(+), 780G, 782G, 888(+)
Samsung Exynos 1080, 1280, 1330, 1380, 2100

It's in many chipsets. It's benched. There's no further tweaking of those clusters by either Nvidia or Nintendo. IPC is pretty much the best benchmark of a CPU. Jaguar was not clocked high; it's easy for the A78 to surpass it. Easy.
Wow, I just asked a question: isn't that benchmark showing the best of all scenarios?

In the test video DF made with PC hardware similar to the Switch 2, they didn't even bother to choose an equivalent CPU; they said something along the lines of "there's no real-world comparison possible".

You guys need to lower your expectations. This thing will have a bad CPU; even MLID has said that.
 

Klosshufvud

Member
The fact that Switch 2 docked is even being compared to the PS4 is absolute insanity. Even back in 2013 (yes, almost TWELVE YEARS AGO) the PS4 was called out as an underpowered shitbox. I was one of the people calling it that. It got outperformed by even low-end PCs, and its shortcomings weren't fixed until the PS5, which is also entering its fourth year since release.

Of course Ampere is better than trash GCN, and of course the Jaguar CPU was a complete failure. But the discussion even being about the PS4 is a Nintendo failure. Handheld chips like the Z1E decisively outperform the PS4, and the Switch 2 is still way behind the Z1E. You can easily get hold of a ROG Ally for $500 and you'll have a handheld a generation ahead in specs. Now it all boils down to how much heavy lifting DLSS can do. At that low a power budget, I am sceptical it will be as transformative as desktop DLSS. We'll see.
 

Buggy Loop

Gold Member
the guy literally leaked everything

This part? The part where he regurgitated the T239 specs that had leaked way before him? Even slow-ass DF had time to make a video before him.



He never mentions it outside of the spec listing. Do point out where he says the A78 is bad 🤷‍♂️ In fact his threshold is up to the Series S. That's not even a fucking threshold on my radar :messenger_tears_of_joy:
 

Landr300

Neo Member
This part? The part where he regurgitated the T239 specs that had leaked way before him? Even slow-ass DF had time to make a video before him.



He never mentions it outside of the spec listing. Do point out where he says the A78 is bad 🤷‍♂️ In fact his threshold is up to the Series S. That's not even a fucking threshold on my radar :messenger_tears_of_joy:

He has said that many times.

One of them is here, at 16:23.

 

SonGoku

Member
As you can see in the video, it scales based on the difference between input and output resolutions, so dropping down to 540p as the base resolution for the 1440p upscale would incur a further cost.

Based on the titles tested in the video, it seems that Series S-level settings are only going to be viable at 1080p DLSS Balanced/Performance. So it's a question of whether developers stick to a 1080p output or go with 1440p at reduced (e.g. Steam Deck) settings.

So I finally watched the entire video, and based on it I can say that Series S settings are only viable by halving the framerate or resolution, and in some cases going even lower. For example, a 30fps/1080p DLSS Performance target is possible where the Series S targets 60fps at 1080p, as seen in Cyberpunk 2077: the Series S runs at 60fps on a 1080p target (800p at the lowest), while in the DF test the RTX 2050 runs console settings at half the framerate, 30fps at 1080p DLSS Performance (540p rendered resolution).

For games where the Series S targets 30fps/1080p, or very low resolutions at 60fps (540p to 720p), the Switch 2 would have to scale back settings for a 1080p DLSS Performance/Balanced reconstruction. That is the case with A Plague Tale: Requiem, which on Series S targets settings similar to XSX/PS5 at 900p/30fps, while DF's RTX 2050 targeted 1080p DLSS Balanced (626p) at the lowest possible settings with a shaky 30fps that it could not lock even when dropping to 900p DLSS Balanced (522p), despite rendering only 33% of the Series S's pixels at one step lower settings (low vs medium).

Besides Cyberpunk, the closest game to Series S settings tested was Control, which on Series S runs at 900p/60fps; the RTX 2050, even on DLSS Performance (540p), could not match Series S performance, resulting mostly in 50 to 60fps "that can go lower" while rendering 36% of the Series S's pixels (540p vs 900p).

Fortnite again shows the Series S is twice as performant, averaging 73% of 1080p while maintaining a locked 60fps. The RTX 2050, on the other hand, targeted 1080p DLSS Performance, rendering 50% of the pixels of 1080p while targeting half the refresh rate at 30fps.

A 1440p DLSS Performance target seems largely unviable in anything but less demanding games where the Series S targets 1440p/60fps.

I found it interesting how the one PS4-optimized game engine tested (Death Stranding) showed raw performance comparable to the base PS4 before DLSS kicks in, giving credence to the TPU relative performance charts I posted earlier that pegged the 7870 at 26% of the raster performance of the RTX 3060.
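The pixel-count ratios above are easy to sanity-check. A quick sketch in Python, using the resolutions quoted in this post (1080p DLSS Performance is assumed to render internally at 960x540):

```python
# Pixel-count comparison for the Control example: Series S at 900p vs the
# RTX 2050 rendering internally at 540p (1080p DLSS Performance).
def pixels(width, height):
    return width * height

series_s = pixels(1600, 900)   # 900p on Series S
rtx_2050 = pixels(960, 540)    # 540p internal resolution on the 2050

ratio = rtx_2050 / series_s
print(f"RTX 2050 renders {ratio:.0%} of the Series S pixel count")  # 36%
```

Which matches the 36% figure above; the 33% figure for A Plague Tale works out the same way from 522p vs 900p.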
 

Buggy Loop

Gold Member
He has said that many times.

One of them is here, at 16:23.



That's compared to the Steam Deck CPU, you doofus.

Word for word: "even if they cheap out on clock speeds and cooling here, I would still say it's about 20-30% faster than the steam deck's GPU, but the CPU is a bit weaker for gaming"

The Steam Deck's Zen 2 runs @ 2.4-3.5GHz.

Tegra Orin will not even clock above 2GHz on the CPU, which is fine, because the PS4 is at about 1.6GHz.

A world of difference from your statement that it would not surpass the PS4's Jaguars. You can't even comprehend what you're fed by this pile-of-shit "leaker", who has a dedicated subreddit for all the times he fucked up. Don't even reply to me. You've wasted enough of my time in this thread.
 

Landr300

Neo Member
This part? The part where he regurgitated the T239 specs that had leaked way before him? Even slow-ass DF had time to make a video before him.



He never mentions it outside of the spec listing. Do point out where he says the A78 is bad 🤷‍♂️ In fact his threshold is up to the Series S. That's not even a fucking threshold on my radar :messenger_tears_of_joy:

He has gone on record many times saying how bad the CPU of the Switch 2 is.

Here at 28:57 he says the console will probably run into CPU issues.

 

Buggy Loop

Gold Member
He has gone on record many times saying how bad the CPU of the Switch 2 is.

Here at 28:57 he says the console will probably run into CPU issues.



CPU issues against a Series S.

Do you realize we have been TALKING ABOUT THE PS4 for how many posts now?

WHO IN THIS THREAD IS EVEN SAYING THIS IS GOING AGAINST THE SERIES S? ARE THOSE PEOPLE IN THE ROOM WITH YOU RIGHT NOW?
 

Landr300

Neo Member
That's compared to the Steam Deck CPU, you doofus.

Word for word: "even if they cheap out on clock speeds and cooling here, I would still say it's about 20-30% faster than the steam deck's GPU, but the CPU is a bit weaker for gaming"

The Steam Deck's Zen 2 runs @ 2.4-3.5GHz.

Tegra Orin will not even clock above 2GHz on the CPU, which is fine, because the PS4 is at about 1.6GHz.

A world of difference from your statement that it would not surpass the PS4's Jaguars. You can't even comprehend what you're fed by this pile-of-shit "leaker", who has a dedicated subreddit for all the times he fucked up. Don't even reply to me. You've wasted enough of my time in this thread.
i said that i would not be surprised if dont surpass the jaguars, i never said that wont be FIXED
 

Landr300

Neo Member
CPU issues against a Series S.

Motherfucker.

Do you realize we have been TALKING ABOUT THE PS4 for how many posts now?

WHO IN THIS THREAD IS EVEN SAYING THIS IS GOING AGAINST THE SERIES S? ARE THOSE PEOPLE IN THE ROOM WITH YOU RIGHT NOW?
you said that a78 surpass the zen 2, i asked if that benchmark you posted was the best scenario possible, you didnt answer, i them procede to say that i would not be surprised if the a78 INSIDE the switch will trade blows with jaguar

I am not comparing it with the Series S.
 

Buggy Loop

Gold Member
you said that a78 surpass the zen 2

in IPC. Instructions per clock.

i asked if that benchmark you posted was the best scenario possible,

It's IPC/PPC.

If you take a 3GHz Zen 2, then no, a 1.8GHz A78 will not perform as well; the IPC advantage of the A78 is 7%. If you go with a 24-core Zen 2, then no, an 8-core A78 cannot compete. You need to compare at the same core count.

Notice the benchmark I gave: the A78 scores lower on Geekbench, but at much lower clocks, thus better PPC.

Take a Zen 2 processor with the same core count, downclock it to Jaguar's 1.6GHz, and you have:
+ an extra margin, since Jaguar was not even Excavator-level IPC
+ 40% from Excavator to Zen 1
+ 29% from Zen 1 to Zen 2

as per AMD's own slides.

That's how Intel and AMD announce IPC improvements generation to generation, and we have the info directly from AMD here. The A78's IPC is again an extra margin over that, because it has better IPC. A Steam Deck at 2.4GHz or above 3GHz will still compute more instructions even with lower IPC, because it simply trounces the A78's clock; the A78's 7% PPC advantage over Zen 2 cannot make up a 33% clock difference. Although Van Gogh has 4 Zen 2 cores vs the Switch 2's 8, I still think it's heavily dependent on what kind of CPU utilization devs make of it. MLID is worried about CPU clocks. He's aiming at Series S; it won't happen. Totally out of the ballpark of the discussion about the Jaguar CPU.
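Writing the compounding out as a sketch (the +40%, +29%, and ~7% figures are this post's claims; the Jaguar-to-Excavator margin is a placeholder I made up, since no number is given for it):

```python
# IPC chain argued above: Jaguar -> Excavator (margin) -> Zen 1 (+40%)
# -> Zen 2 (+29%), with the A78 a further ~7% over Zen 2.
jaguar_to_excavator = 1.05  # placeholder margin, not a quoted figure
excavator_to_zen1 = 1.40
zen1_to_zen2 = 1.29
zen2_to_a78 = 1.07

ipc_gain = (jaguar_to_excavator * excavator_to_zen1
            * zen1_to_zen2 * zen2_to_a78)

# Per-core throughput is roughly IPC x clock, so at equal clocks
# (e.g. both at 1.6GHz) the whole IPC advantage carries over.
print(f"A78 vs Jaguar, clock for clock: ~{ipc_gain:.2f}x")
```

Even with a conservative placeholder for the first step, the chain compounds to roughly double Jaguar's per-clock throughput.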

i would not be surprised if the a78 INSIDE the switch will trade blows with jaguar

Man, I am not a native English speaker, so can anyone on this forum translate this?:

" i wouldn't be surprised if actually trade blows with the jaguar"
" i would not be surprised if dont surpass the jaguars,"

How am I not supposed to interpret this as "I don't think A78 will surpass the Jaguar CPU"?
 

Landr300

Neo Member
in IPC. Instructions per clock.



It's IPC/PPC. If you take a 3GHz Zen 2 then no, a 1.8 GHz A78 will not perform as good. Instructions per clock. You take that Zen 2 processor with same core count and downclock it to Jaguar's 1.6GHz and you have the +40% +29% +extra margin from bulldozer as per AMD's own slide. That's how Intel & AMD announce IPC improvements generation to generation. We have the info directly from AMD here. A78 IPC is again extra margin % over that because it has better IPC. A Steam deck @ 2.4GHz or above 3GHz will still have more instructions being computed even with lower IPC because it just trounces an A78 clock. The 7% PPC from A78 compared to Zen 2 is not leveraging a 33% clock difference. Although Van Gogh has 4-cores Zen 2 vs 8 core switch 2, I still think that's heavily dependent on what kind of CPU utilization devs will make of it. MLID is worried for CPU for clocks. He's aiming Series S. Totally out of ballpark of the discussion of Jaguar CPU.



Man, I am not a native English speaker, so can anyone on this forum translate this?:

" i wouldn't be surprised if actually trade blows with the jaguar"
" i would not be surprised if dont surpass the jaguars,"

How am I not supposed to interpret this as "I don't think A78 will surpass the Jaguar CPU"?
i think is very clear that i never nail that will be worse, what i said is that i would not be surprised if it wasn't
 

Buggy Loop

Gold Member
i think is very clear that i never nail that will be worse, what i said is that i would not be surprised if it wasn't

Let me have a go at the wording then; I'm just a French speaker who's been typing on this forum for 20 years...

"I would be surprised that it doesn't surpass Jaguars"

or

"i would not be surprised if dont surpass the jaguars"

I'll let a native English speaker decide.
 

Landr300

Neo Member
Let me have a go at the wording then; I'm just a French speaker who's been typing on this forum for 20 years...

"I would be surprised that it doesn't surpass Jaguars"

or

"i would not be surprised if dont surpass the jaguars"

I'll let a native English speaker decide.
"I would be surprised that it doesn't surpass Jaguars"

where did you read that? I literally tried to find it and did not see it anywhere.

I said either "would not be" or "wouldn't be".
 

Buggy Loop

Gold Member
"I would be surprised that it doesn't surpass Jaguars"

where did you read that?

MY wording of what you are trying to say.

"i would not be surprised if dont surpass the jaguars" just doesn't make sense to me given what you're trying to say two posts above, at least in my interpretation of it.
 

Landr300

Neo Member
Let me have a go at the wording then; I'm just a French speaker who's been typing on this forum for 20 years...

"I would be surprised that it doesn't surpass Jaguars"

or

"i would not be surprised if dont surpass the jaguars"

I'll let a native English speaker decide.
OK, my bad, I did not see the error; English is not my native language either.
 

Sgt.Asher

Member
Let me have a go at the wording then; I'm just a French speaker who's been typing on this forum for 20 years...

"I would be surprised that it doesn't surpass Jaguars"

or

"i would not be surprised if dont surpass the jaguars"

I'll let a native English speaker decide.
It's a double negative; it can be strange to parse and is generally frowned upon in English writing.

I'll rephrase how I read it:

"I would not be surprised if it is worse than the jaguar"

Correct me if I'm wrong
 

BlackTron

Member
If the rumors we are bickering about are true, it means the NS2 would be about 10 times more powerful in handheld mode and 3-4 times in docked mode, before any other boosts, mostly from DLSS and RAM.

Given that titles like Zelda, Doom, Xenoblade, The Witcher, etc. ran on 0.15TF without exploding, and the NS2 has a baseline of 1.7 in handheld mode, we'd be unshackled from some royal jank.

Yeah, that means the raw TF number is like a PS4's. It doesn't mean it will actually run like a real PS4, or that the chip is anywhere near as flawed as Jaguar.

With the Switch I think even casual users were aware of the performance tradeoff and were simply willing to make it. But with the 2, I don't believe it will be as apparent to begin with. Of course a console will be better, but people won't be bothered to worry about it that much unless the Switch 2 encounters jank, and it won't. It'll likely just have toned-down resource-hog effects that largely go unnoticed without comparing versions.
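For the handheld figure, the ratio of the rumored numbers checks out (both TF values are rumors circulating in this thread, not confirmed specs):

```python
# "About 10 times more powerful in handheld mode": rumored Switch 2
# portable ~1.7 TF vs the original Switch's ~0.15 TF portable profile.
switch1_portable_tf = 0.15
switch2_portable_tf = 1.7

ratio = switch2_portable_tf / switch1_portable_tf
print(f"~{ratio:.0f}x raw FP32 throughput in handheld mode")  # ~11x
```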
 

Buggy Loop

Gold Member
Isn't Switch 1 CPU better than Jaguar but only weaken by limited clock and cores count?

Almost everything outside AMD from 2008 onward has better IPC, core for core, clock for clock. I'm not kidding when I say this architecture almost bankrupted AMD.

As you say, the Switch's X1 had a limited core count and low clocks.

Nvidia TX1 - Switch SoC - A57 @2GHz - 280
AMD A9-9820 - Xbox One APU - Jaguar @2.35GHz - 294

Since Switch was @ 1GHz and PS4 Jaguar was 1.6GHz, by extrapolation, single core Geekbench 5 would be

Switch - A57 @1GHz - 140
PS4 - Jaguar @1.6GHz - 200

So weaker because of clocks and cores, but better IPC.

A 3GHz A78 single core performance
Mediatek Dimensity 1200 - A78 @3GHz - 975

At a comfortable Tegra Orin 1.6GHz (I think it's more like 1.8GHz):
A78 @1.6GHz - 520
A78 @1.8GHz - 585

In this case, though, it does not have fewer cores or lower clocks. It would have to downclock to 615MHz to match the PS4 Jaguar's 1.6GHz score.
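The extrapolation above assumes Geekbench 5 single-core scores scale linearly with clock, which is only a rough approximation, but as a sketch it reproduces the numbers:

```python
# Linear clock scaling of Geekbench 5 single-core scores (rough; real
# scores don't scale perfectly linearly with clock).
def scale_score(score, clock_from_ghz, clock_to_ghz):
    return score * clock_to_ghz / clock_from_ghz

A78_3GHZ_SCORE = 975  # Dimensity 1200 figure quoted above

a78_at_1p6 = scale_score(A78_3GHZ_SCORE, 3.0, 1.6)  # -> 520
a78_at_1p8 = scale_score(A78_3GHZ_SCORE, 3.0, 1.8)  # -> 585

# Clock at which the A78 would fall to the PS4 Jaguar's ~200 score:
match_ghz = 3.0 * 200 / A78_3GHZ_SCORE
print(round(a78_at_1p6), round(a78_at_1p8), f"{match_ghz * 1000:.0f} MHz")
```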
 

Kataploom

Gold Member
Isn't Switch 1 CPU better than Jaguar but only weaken by limited clock?

Almost everything outside AMD from 2008 onward has better IPC, core for core, clock for clock. I'm not kidding when I say this architecture almost bankrupted AMD.

As you say, the Switch's X1 had a limited core count and low clocks.

Nvidia TX1 - Switch SoC - A57 @2GHz - 280
AMD A9-9820 - Xbox One APU - Jaguar @2.35GHz - 294

Since Switch was @ 1GHz and PS4 Jaguar was 1.6GHz, by extrapolation, single core Geekbench 5 would be

Switch - A57 @1GHz - 140
PS4 - Jaguar @1.6GHz - 200

So weaker because of clocks and cores, but better IPC.

A 3GHz A78 single core performance
Mediatek Dimensity 1200 - A78 @3GHz - 975

At a comfortable Tegra Orin 1.6GHz (I think it's more like 1.8GHz):
A78 @1.6GHz - 520
A78 @1.8GHz - 585

In this case, though, it does not have fewer cores or lower clocks. It would have to downclock to 615MHz to match the PS4 Jaguar's 1.6GHz score.
I can believe you; IIRC AMD stock went as low as $2 a share.

"Doomers" tend to believe Nintendo can always get weak tech "just because Nintendo", yet it's really hard to find something as weak as Jaguar; they'd have more trouble finding something like that, unless they really tried, IMO.

The same applies to those comparing PCs with GPUs similar to the PS5's. I see many complaining that the CPU is always too powerful to be "fair", but it would be hard to find a current PC CPU that weak unless you go for the very low end (I think even a Ryzen 3 is better than the consoles' CPUs these days, so you'd have to get what? An Athlon? Do they still sell those?)
 

Zathalus

Member
The fact that Switch 2 docked is even being compared to the PS4 is absolute insanity. Even back in 2013 (yes, almost TWELVE YEARS AGO) the PS4 was called out as an underpowered shitbox. I was one of the people calling it that. It got outperformed by even low-end PCs, and its shortcomings weren't fixed until the PS5, which is also entering its fourth year since release.

Of course Ampere is better than trash GCN, and of course the Jaguar CPU was a complete failure. But the discussion even being about the PS4 is a Nintendo failure. Handheld chips like the Z1E decisively outperform the PS4, and the Switch 2 is still way behind the Z1E. You can easily get hold of a ROG Ally for $500 and you'll have a handheld a generation ahead in specs. Now it all boils down to how much heavy lifting DLSS can do. At that low a power budget, I am sceptical it will be as transformative as desktop DLSS. We'll see.
The ROG Ally has a battery life of under one hour at its maximum speed. It's also very noisy. In a configuration that actually gives you usable battery life (silent mode), the GPU performance is below the original Xbox One's.

I'm sure if Nintendo wanted terrible battery life they could beat the PS4, like it does in docked mode, where battery life isn't a concern.
 

James Sawyer Ford

Gold Member
It'll likely just have toned-down resource-hog effects that largely go unnoticed without comparing versions.

Nowhere close to being true. The difference between the SW2 and the modern consoles is enormous; the XSS has been the red-headed stepchild of the consoles all gen long, and the SW2 will be lucky to compete with that console.
 

llien

Member
... and of course Jaguar CPU was a complete failure...

I think Jaguar gets sh*t on for no reason.
2013 was pre-Zen, failed-Bulldozer times, when Intel's CPUs could run circles around AMD's while consuming less power.

The Jaguars were an exception to that rule. They stood out from the entire lineup, having competitive perf/$.
Anandtech (RIP) had an excellent article:


Bobcat was a turning point for AMD. The easily synthesized, low cost CPU design was found in the nearly 50 million Brazos systems AMD sold since its introduction. Jaguar improves upon Bobcat in a major way. The move to 28nm helps drive power even lower, which will finally get AMD into tablet designs with Temash. Despite being lower power, Jaguar also manages to increase performance appreciably over Bobcat. AMD claims up to a 22% increase in IPC compared to Bobcat. Combine the IPC gains with a more multi-core friendly design and Jaguar based APUs should be appreciably faster than their predecessors.

 

Panajev2001a

GAF's Pleasant Genius
I'm not kidding when I say this architecture almost bankrupted AMD.
No, I think you are wrong here. Jaguar got them the console design wins and massively improved their perception after Bulldozer (which seems to have been a costly mindshare and money pit). Jaguar (and Bobcat before it) were the ones that inverted the trend and gave them time to come up with and launch Zen. AMD's fall in those days was not started by Jaguar.

Sure, it was not as much of a comeback story as Banias / Pentium M was for Intel, but it helped a lot.

For what game consoles needed, it did its job. The GPU was again the focus of the transistor budget and overall cost. No wonder both console manufacturers pivoted to the same CPU and created semi-custom SoCs to hide some of its weaknesses and max out the strengths they had.
 

FireFly

Member
The fact that Switch 2 docked is even being compared to the PS4 is absolute insanity. Even back in 2013 (yes, almost TWELVE YEARS AGO) the PS4 was called out as an underpowered shitbox. I was one of the people calling it that. It got outperformed by even low-end PCs, and its shortcomings weren't fixed until the PS5, which is also entering its fourth year since release.

Of course Ampere is better than trash GCN, and of course the Jaguar CPU was a complete failure. But the discussion even being about the PS4 is a Nintendo failure. Handheld chips like the Z1E decisively outperform the PS4, and the Switch 2 is still way behind the Z1E. You can easily get hold of a ROG Ally for $500 and you'll have a handheld a generation ahead in specs. Now it all boils down to how much heavy lifting DLSS can do. At that low a power budget, I am sceptical it will be as transformative as desktop DLSS. We'll see.
At what power level? DF found the Ally was only up to 33% faster than the Deck at 15W, with many games showing significantly smaller improvements. In GoW it delivered 86% of the PS4's frame rate in this mode. And bear in mind this is with 15W dedicated to the SoC alone, whereas we would expect the Switch 2 to be <15W for the entire system. A more reasonable comparison would probably be the Ally in its 9W/10W mode. From what I can see online, 800 MHz is the maximum speed in that mode, which would give ~1.2 RDNA 2 TF.
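That ~1.2 TF figure follows from the usual FP32 formula of 2 ops per shader per clock; the 12-CU (768-shader) count for the Z1 Extreme and the 800 MHz cap are this thread's figures, not official specs:

```python
# FP32 throughput estimate for the Ally's 9W/10W mode, using the post's
# RDNA 2-style counting: 2 FP32 ops per shader per clock.
shaders = 12 * 64    # 12 CUs x 64 shaders each (assumed)
clock_hz = 800e6     # ~800 MHz GPU cap reported for the low-power mode

tflops = 2 * shaders * clock_hz / 1e12
print(f"~{tflops:.1f} TF")  # ~1.2 TF
```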
 

Klosshufvud

Member
The ROG Ally has a battery life of under one hour at its maximum speed. It's also very noisy. In a configuration that actually gives you usable battery life (silent mode), the GPU performance is below the original Xbox One's.

I'm sure if Nintendo wanted terrible battery life they could beat the PS4, like it does in docked mode, where battery life isn't a concern.
The Rog Ally at 15W (2h battery) is decisively superior to Xbone.
 

Zathalus

Member
The Rog Ally at 15W (2h battery) is decisively superior to Xbone.
The Switch 2 will be under 10W, and even 15W does not always get you 2 hours in a demanding title on the ROG Ally; most AAA games would run around 90-100 minutes at 15W. Nintendo probably wants 2.5 to 3 hours at minimum for a handheld. Sure, you can get the ROG Ally X and get decent battery life at 15W, but that's because it packs a massive 80Wh battery and probably costs double what the Switch 2 will.
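Those runtimes amount to battery energy divided by total system draw. A sketch, where the ~25W total (15W SoC plus roughly 10W for display and the rest) is my assumption, not a measured figure:

```python
# Rough handheld runtime = battery energy (Wh) / total system draw (W).
def runtime_minutes(battery_wh, system_draw_w):
    return battery_wh / system_draw_w * 60

# Assumed: ~40 Wh Ally battery, 80 Wh Ally X, ~25 W total system draw.
print(runtime_minutes(40, 25))  # Ally: 96 minutes, in the 90-100 range
print(runtime_minutes(80, 25))  # Ally X: 192 minutes
```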
 

Klosshufvud

Member
The Switch 2 will be under 10W, and even 15W does not always get you 2 hours in a demanding title on the ROG Ally; most AAA games would run around 90-100 minutes at 15W. Nintendo probably wants 2.5 to 3 hours at minimum for a handheld. Sure, you can get the ROG Ally X and get decent battery life at 15W, but that's because it packs a massive 80Wh battery and probably costs double what the Switch 2 will.
Maybe the bigger problem here is Nintendo's stubbornness in sticking with a terribly low power budget? 10W was feasible in handheld hardware a decade ago, but with today's high standards of performance I would argue that 15-18W should be standard. And yes, this means elaborate cooling and a 60Wh+ battery. The way graphics computing has developed, absolute low power is not as feasible as it once was.

The issue is of course the Joy-Cons, which mean Nintendo can't properly use all the handheld space for cooling and battery.
 

Zathalus

Member
Maybe the bigger problem here is Nintendo's stubbornness in sticking with a terribly low power budget? 10W was feasible in handheld hardware a decade ago, but with today's high standards of performance I would argue that 15-18W should be standard. And yes, this means elaborate cooling and a 60Wh+ battery. The way graphics computing has developed, absolute low power is not as feasible as it once was.

The issue is of course the Joy-Cons, which mean Nintendo can't properly use all the handheld space for cooling and battery.
Elaborate cooling and a 60Wh+ battery would drive up cost, noise, and weight. Nintendo is making a mass-market, kid-friendly device; those tradeoffs would most certainly hurt its appeal for a lot of people.
 