Panajev2001a
GAF's Pleasant Genius
> But the experts here told me I was wrong when I said PS3 lost a lot of money. Something something... worldwide consoles sold... blah blah.

Making things up.
> It's not that great compared to today's standards. The way modern multi-core, multi-threaded processors work is way more efficient than what Cell tried to do. It had a raw floating-point calculation advantage over PC processors at the time, but it was way less versatile and not well suited for the direction CPUs were going overall. That's why it never made it out of specialized applications.

DevEx maybe, but then again modern software is not screaming "efficiency" either - not even the ML/AI training these days.
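For rough context on that floating-point advantage, here is a back-of-envelope sketch of peak single-precision throughput - illustrative ballpark figures, not benchmarks. The PC line assumes a ~2.4GHz dual-core of the era doing 8 flops per cycle per core (4-wide SSE add plus multiply), which is an assumption for illustration:

```python
# Rough peak single-precision FLOPS behind the "raw floating point
# advantage" point above - illustrative ballpark figures, not benchmarks.

def peak_gflops(cores, flops_per_cycle, clock_ghz):
    return cores * flops_per_cycle * clock_ghz

# Cell's 8 SPUs: a 4-wide single-precision FMA = 8 flops/cycle each, at 3.2 GHz.
cell = peak_gflops(8, 8, 3.2)
# An assumed contemporary desktop dual-core: ~2.4 GHz, 8 flops/cycle per core.
pc = peak_gflops(2, 8, 2.4)

print(f"Cell SPUs: {cell:.1f} GFLOPS peak")   # ~204.8
print(f"PC CPU:    {pc:.1f} GFLOPS peak")     # ~38.4
print(f"On paper:  ~{cell / pc:.0f}x - but only with hand-tuned SIMD fed by DMA")
```

That ~5x gap on paper is the advantage being described, and the catch in the last line is the DevEx point: you only got it with hand-tuned SIMD code feeding the SPU local stores via DMA.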
> He did assist Insomniac tho, didn't he? I guess that's why I remember it being Insomniac and other studios showing that they could barely get past PS2 graphics on a Cell.

But the shader fragment work is ROPs, which were the responsibility of the Toshiba RSX 2D GPU, so that level of performance was completely expected when the job of the CELL SPUs would have been to simplify the operations down to 2D brute-force acceleration, rather than do ROPs on SPUs.
> Yeah. Clearly the numbers Sony have shared, convincing their fans that they are highly successful, are bullshit.
> Doesn't surprise me that they say they need to increase margins. They are probably still fudging the numbers to hide the real facts.

Dude, he's taking the piss out of Xbox taking losses for 20 years, hiding it, and using Windows/Office money to fund it. PlayStation posts their profits and losses and it was well known - Aaron Greenberg was even celebrating them "hemorrhaging at retail" during the PS3 era. You suggesting Sony is not highly successful is crazy talk.
> The original design was supposed to be 64MB eDRAM on the GPU and 128 on the Cell (no physical memory chips were planned at first). That's back when the 360 was a 256MB machine also, though - but yea, they'd basically designed a deferred shading accelerator with that thing. It would have been - interesting - to see the results had those early machines come to pass...

Really? If so, then that was destined for failure imo. I was thinking more like 16MB of restricted GPU eDRAM and 512MB of unified XDR.
> That's overstating things - if a game ran at 720p and upscaled to 1080p, you lost about 4MB of memory. In the end, every bit of memory matters in a console, yes - but it wasn't some large premium; we're talking 0.8% of the RAM available. And indeed that's why some games upscaled to 1080 vertical and let the upscaler do the rest (getting a further 2MB back).
> I think the fact lots of games actually opted for native resolutions (giving perf/quality modes) was more interesting - and later in the gen that only became more common on the 360 as well. Obviously that had far larger memory ramifications - and plenty of software juggled it just fine on both consoles.

Again, it might seem small on its own, but combined with the inflated OS RAM usage, it does add up. I don't have exact figures for PS3, but looking at the range of PS3 RAM usage figures reported around launch (84MB-120MB), the 360 consistently had a 50MB advantage. The lack of hardware upscaling made that even worse.
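The 4MB / 2MB / 0.8% figures in the quote above are easy to sanity-check. A quick back-of-envelope, assuming a 32-bit (4 bytes/pixel) colour target and ignoring depth buffers and double-buffering:

```python
# Framebuffer size math behind the "about 4MB" claim quoted above.
# Assumes 4 bytes/pixel colour; depth and double-buffering are ignored,
# so these are illustrative rather than exact.

MB = 1024 * 1024

def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / MB

fb_720p = buffer_mb(1280, 720)      # ~3.5 MB
fb_1080p = buffer_mb(1920, 1080)    # ~7.9 MB
fb_1080v = buffer_mb(1440, 1080)    # "1080 vertical", scaler does the rest

print(f"720p:      {fb_720p:.1f} MB")
print(f"1080p:     {fb_1080p:.1f} MB (+{fb_1080p - fb_720p:.1f} MB over 720p)")
print(f"1440x1080: {fb_1080v:.1f} MB (saves ~{fb_1080p - fb_1080v:.1f} MB vs full 1080p)")
print(f"Premium vs 512 MB total: {(fb_1080p - fb_720p) / 512 * 100:.1f}%")
```

The extra 1080p target comes out to ~4.4MB, a bit under 1% of 512MB, and rendering 1440x1080 gets roughly 2MB of that back - in line with the numbers quoted.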
> That also wasn't PS3 exclusive. The 360 upscaler had broken gamma, and it was also programmable, so results varied from game to game. And games running differently at different resolutions was the result of not simply upscaling, as I mention above - lots of software did this by late in the gen, including titles that specifically did 30fps 1080p and 60fps 720p targets.
> But indeed - on PS3 there were some games doing this from day 1 - and sometimes the HD modes were rather broken - there was that Marvel game I remember in particular...

The example I mentioned was likely a result of the upscaling though. Sonic Unleashed ran at a constant 880x720 internally on both platforms, and the only difference I could notice between the two modes was the output resolution.
> Dench trying to concoct a narrative that current-day Sony is cooking the books is hilarious.

We went from "show me the numbers" to "the numbers are wrong." Looking forward to the saga's next chapter.
> Really? If so, then that was destined for failure imo. I was thinking more like 16MB of restricted GPU eDRAM and 512MB of unified XDR.

You have to remember these are timelines well ahead of both console launches (with launch planned much earlier than 2006), and things did change with time. But when these designs were worked on, we'd be looking at a 192MB vs 256MB console - but with the former having 20-30x the bandwidth, which makes up for the memory deficit pretty well.
> Do you have a source for that info? I'd love to read it since PS3 hardware stuff is so interesting.

Nothing that is publicly available, sadly. A lot of that early GPU work was pretty hidden stuff.
> Again, it might seem small on its own, but combined with the inflated OS RAM usage, it does add up. I don't have exact figures for PS3, but looking at the range of PS3 RAM usage figures reported around launch (84MB-120MB), the 360 consistently had a 50MB advantage. The lack of hardware upscaling made that even worse.

Yea I know, that definitely didn't help matters - just saying the scaler was less of an issue than some other elements.
> With later games, I'd say that's more down to developers becoming accustomed to the hardware plus Sony's OS optimisations. It's kinda like how PS2 games initially looked underwhelming in comparison to the Dreamcast, but eventually far exceeded that console as the gen continued.

There was that - but like I mentioned, multiple resolution targets were a thing from the early days on PS3; even some of the launch window titles had two (or three) modes.
> That entire generation was a financial loss for both companies tbh. I recall the RROD incident also costing M$ billions to address. Only Nintendo came out unscathed, and even then, their choices that gen ultimately led to the Wii U disaster.

Not just unscathed for Nintendo but a glorious triumph over their competitors - they made $22 billion with the Wii and DS during the PS3/PSP/360 era. That was a golden age for Nintendo financially, before the Switch era.
> You have to remember these are timelines well ahead of both console launches (with launch planned much earlier than 2006), and things did change with time. But when these designs were worked on, we'd be looking at a 192MB vs 256MB console - but with the former having 20-30x the bandwidth, which makes up for the memory deficit pretty well.
> But it's entirely possible the change to XDR would have happened later anyway - and you'd end up with something closer to your example - although I don't see any scenario where the GPU would have less than 32MB of eDRAM. Sony had done a fair bit of work with HD resolutions by then, and 64MB wasn't chosen by accident. The FB-only approach to eDRAM that the 360 used had a ton of other limitations beyond requiring tiling - i.e. the Sony/Toshiba approach to rasterization was to actively leverage high bandwidth in the rendering pipeline (also the design mindset behind the PS2, though that got limited a bit by cutting the console to 4MB from the original 8MB plan) - MS just put it there to mitigate the perceived weakness of the original Xbox's UMA.

Surely that was a pre-STI Group design? Before they'd settled on the POWER architecture/instruction set for the SPUs, yes? Because I can't see any scenario where the IBM-hosted CELL BE documentation about its design could ever have worked for IBM and their needs for the STI Group project.
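To make the tiling point in the quote above concrete: the 360 shipped with 10MB of framebuffer-only eDRAM, and a quick size check shows why that forces tiling at 720p with MSAA while a 64MB pool like the one described would not. This is an illustrative model assuming 4 bytes/pixel for colour and depth each, with MSAA multiplying both:

```python
# Why a small FB-only eDRAM pool forces tiling: the 360's shipped 10 MB
# vs the 64 MB pool described above. Assumes 4 bytes/pixel colour plus
# 4 bytes/pixel depth, with MSAA multiplying both - illustrative only.

import math

MB = 1024 * 1024

def target_mb(width, height, msaa=1, bpp_colour=4, bpp_depth=4):
    return width * height * msaa * (bpp_colour + bpp_depth) / MB

def tiles_needed(width, height, pool_mb, msaa=1):
    return math.ceil(target_mb(width, height, msaa) / pool_mb)

for msaa in (1, 2, 4):
    need = target_mb(1280, 720, msaa)
    print(f"720p {msaa}xMSAA: {need:5.1f} MB -> "
          f"{tiles_needed(1280, 720, 10, msaa)} tile(s) in 10 MB, "
          f"{tiles_needed(1280, 720, 64, msaa)} tile(s) in 64 MB")
```

Under these assumptions a 720p 4xMSAA target needs roughly 28MB, i.e. three tiles in 10MB, which matches how 360 titles actually had to split the frame - while the whole thing would fit a 64MB pool untiled.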
> Not just unscathed for Nintendo but a glorious triumph over their competitors - they made $22 billion with the Wii and DS during the PS3/PSP/360 era. That was a golden age for Nintendo financially, before the Switch era.

Yeah, the Wii was on top that gen by far. The only problem I can think of is its comparatively low software sales, but selling over 100m hardware units without taking a loss clearly offsets that.
Profits ranking
1. Switch era - $25 billion
2. Wii/DS era - $22 billion
3. PS5 era - $12 billion
4. PS4/Vita era - $9 billion
> The Cell is a great piece of technology even by today's standards. Even the medical field used the Cell processor, and it was banned from export to China due to military implications that the tech might be stolen. And many developers, especially third parties, complained so much about it, even though 1st and 2nd parties did amazing work with the Cell processor. I hope for its comeback on PS6, maybe a hybrid with AMD Zen. This would really help, especially with the PSSR ML.

Nah, IBM discontinued Cell in 2009... They probably are not going to dig up the grave to bring it back :-/
> Surely that was a pre-STI Group design? Before they'd settled on the POWER architecture/instruction set for the SPUs, yes?

As far as I'm aware no - this was built parallel to Cell; it was 'the' GPU for PS3 (until it wasn't).
> As far as I'm aware no - this was built parallel to Cell; it was 'the' GPU for PS3 (until it wasn't). Anyway, there were other missed targets - Cell was supposed to be more than just 8-way SPU; Ken apparently really wanted that 1TFlop machine originally.

I thought the "more than 8 SPUs" I read about in 2011-ish was a twin 16-core Cell BE2 for either a PS3 Pro prototype or a PS4 prototype long after Ken - I hadn't realised he was pushing that design from the beginning. I'm guessing they must have predicted a lower node for the Cell launch chips than what they got, if they thought they could fit twice the SPUs in a Cell BE at the design stage.
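For scale on that 1TFlop target quoted above: the early Cell patents described four processor elements of eight APUs each at 4GHz, and each SPU/APU issues a 4-wide single-precision FMA (8 flops) per cycle. The arithmetic below shows how that layout reaches a nominal teraflop, and how far the shipped 8-SPU part landed from it:

```python
# Peak-FLOPS arithmetic behind the "1TFlop machine" target vs what shipped.
# An SPU executes a 4-wide single-precision FMA per cycle = 8 flops/cycle.

FLOPS_PER_CYCLE = 4 * 2  # 4 lanes x (multiply + add)

def peak_gflops(spus, clock_ghz):
    return spus * FLOPS_PER_CYCLE * clock_ghz

print(f"32 APUs @ 4.0 GHz: {peak_gflops(32, 4.0):6.1f} GFLOPS")  # patent-era target, ~1 TFLOP
print(f" 8 SPUs @ 3.2 GHz: {peak_gflops(8, 3.2):6.1f} GFLOPS")   # shipped Cell BE
print(f" 6 SPUs @ 3.2 GHz: {peak_gflops(6, 3.2):6.1f} GFLOPS")   # usable by PS3 games
```

The shipped part peaks around 205 GFLOPS across its SPUs (with one reserved for the OS and one disabled for yield, games saw about 154), so the teraflop target really did hinge on both the higher clock and a lot more SPUs.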
> Sony lost the TV business to LG and Samsung. Pretty sad really - they invented the LCD TV.

Sony TVs are much more expensive; they let LG & Samsung dominate the casual/cheap TVs everyone buys. Bravia is the best TV ever made.
The thermal design on early PS3s was great, but it was ultimately the fan curve that saved the day. Afaik, the 360 relied on a static temperature target rather than a fan curve like the PS3's, which meant that 360s were essentially locked at GPU-killing temps from day one. I think the 360's cooling system was generally good enough to support the console if the GPU hadn't had the underfill defect. And even with the defect included, a significantly lower temperature target would have gone a long way. The original Xenons had the GPU running at 97C!
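A minimal sketch of the difference between the two strategies described above, with made-up curve points and duty cycles purely for illustration: a fan curve ramps continuously with temperature, while a static target only reacts around a fixed setpoint (like the 97C figure), so the chip just sits at that temperature.

```python
# Fan curve vs static temperature target - a toy model of the two cooling
# strategies described above. All curve points and duty values are
# hypothetical, chosen only to illustrate the control behaviour.

def fan_curve_duty(temp_c):
    """Linearly interpolate fan duty from a temp -> duty curve."""
    points = [(40, 0.25), (55, 0.40), (70, 0.65), (80, 0.90), (90, 1.00)]
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

def static_target_duty(temp_c, target_c=97, low=0.30, high=1.00):
    """Bang-bang control: stay quiet until the fixed setpoint is crossed."""
    return high if temp_c >= target_c else low

for temp in (50, 70, 85, 97):
    print(f"{temp}C -> curve: {fan_curve_duty(temp):.0%}, "
          f"static target: {static_target_duty(temp):.0%}")
```

With the static target, the fan gives the silicon no extra margin below the setpoint - which is exactly the "locked at GPU-killing temps" behaviour described.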