Gavin Stevens
Formerly 'o'dium'
Just noise. White noise.
You’re talking to the wrong person, champ. Believe me.
Is that not something to do with making it easier to determine the hitboxes rather than framerates? I understand that they turn down the foliage and smoke effects to make other players easier to see as well.

Some Pro Counter Strike players play in 720p or even SD to get a higher framerate.
Framerate > Resolution. Always.
How does that make any sense whatsoever? The One X blows the “Pro” out of the water!
**Waits for someone to complain about his comment**
Edit: Already beaten LMAO so predictable!!!
More like 720 to 900p on the Xsad machine; it will not get any better, this machine has bottlenecks up the wazoo.

It kind of makes perfect sense if the XSX is going to have cross-gen games for a few years. XSX shoots for ~4K 60fps (which would be great) while the Xbone X shoots for 30fps at 4K and the Xbonesad shoots for sub (I dunno, guessing) 1440p 30fps?
Just like your lord and saviour Phil "good guy" Spencer.
30fps is only encouraged by Sony for all their “cinematic” third person experiences.
Not always. Sitting a few feet from my 43 inch computer monitor, I can instantly tell 1440p from 4k. It's all about use case (of which, mine is admittedly niche).
A more accurate statement would be that between 1800p and 4k it becomes really hard to tell. On a 2080 level card and above, you can still push out some really nice 100fps+ framerates around 1800p if you're willing to sacrifice a few graphical settings. Again, for sure, I do agree that past 1440p is diminishing returns for a lot of people.
Most people won't even have a use for over 1440p because A. their monitor is limited to 1440p, or B. they sit too far from their TV to notice the difference (or their TV doesn't truly support 120hz at over 1440p).
With a 4k 120hz monitor (of which I'm only aware of two bigger than 27 inches), there are a lot more combinations where both the resolution going above 1440p and the framerate going above 100 could be almost equally important.
Well yes, they are one of the worst offenders. For me a 30fps game shouldn’t even qualify in any kind of ‘best graphics’ or ‘technical achievement’ category. It’s like gimping the game to make it look good and it needs to end!
Oh for the love of God! Their marketing has been all about fidelity, 4k and 8k so far.
I don't disagree with him, I'm going to be gaming at 1440p 60-120hz for the foreseeable future (until 4k 60-120hz is possible without breaking the bank) but considering everything else they've been saying, it comes off as disingenuous.
It also makes you wonder, does he have information that the resolutions on series X might be lower than that on PS5, hence the new messaging?
Is that not something to do with making it easier to determine the hitboxes rather than framerates? I understand that they turn down the foliage and smoke effects to make other players easier to see as well.
Buy a powerful PC.

I prefer both. So what?
Oh for the love of God!
It's not new messaging. Phil has said this, many times in the past.
What one person likes, for themselves, doesn't mean it's the best point to use for marketing. Marketing is making things tangible. Marketing is playing on the lowest common denominator. Marketing is appealing to the masses; 4k, graphics and high fidelity do just that, and have been doing just that for well over a decade.
Gaf really is a comedic goldmine. Are some of you people serious?! Give me a break
Yeah, it’s inherently subjective and bigger displays or sitting closer will make the resolution more apparent, but when I’m thinking bang for my buck on a display for gaming it’s always going to be stuff like response time, variable refresh rate (gsync preferred) and other factors like an IPS over TN. Resolution is probably the least important factor (though admittedly still a factor, sure) for me because I feel like I can just adjust to whatever it is and get used to it. 60FPS 16 bit games still feel good to play on an old CRT because they’re incredibly snappy and responsive, regardless of the 480i display. Sub 30FPS just feels worse to me and always feels sluggish in comparison. For my own tastes, I’ll always adjust PC settings to hit a rock solid 60FPS over maximizing other options
Making something lower res doesn't make it blurrier. If you have a low res texture that is upscaled and filtered then it becomes blurry. If you are playing at a low res and everything is rendered at that native res then it will be sharp, ugly as fuck but still sharp.

Wait, are you suggesting that someone floated the idea that making the game blurrier makes the hit boxes easier to make out? How would that work? Definitely turning down effects that create environmental obstructions could be a positive, but in something like COD, CS, etc., where shooting first and accurately is key, some baseline of visibility at range would be impaired by going under 1080p. I would think that would be pretty important.
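A toy sketch of that point (plain Python; the "image" here is just a made-up 1D scanline of black/white samples, not anything from a real game): nearest-neighbour upscaling keeps hard edges, while a filtered (interpolated) upscale is what introduces the in-between values we read as blur.

```python
# A low-res "scanline" with hard black/white edges (0.0 = black, 1.0 = white).
low_res = [0.0, 1.0, 0.0, 1.0]

# Nearest-neighbour 2x upscale: just duplicate each sample.
# No new values appear, so edges stay perfectly sharp (and blocky).
nearest = [v for v in low_res for _ in range(2)]

# Linear-filtered 2x upscale: insert the average of each adjacent pair.
# The averaged samples (0.5 here) are exactly where the "blur" comes from.
linear = []
for a, b in zip(low_res, low_res[1:]):
    linear += [a, (a + b) / 2]
linear.append(low_res[-1])

print(nearest)  # only 0.0 and 1.0 -> sharp but blocky
print(linear)   # 0.5s appear between edges -> softened
```

Same idea in 2D: render at native low res and you get aliased-but-sharp pixels; stretch that framebuffer to the screen with bilinear filtering and you get the blur.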
Seriously, you people thinking that anything he says is from the bottom of his heart is the biggest con since Don Mattrick attempted to convince us that the deadly combination of always on DRM and Kinect was somehow going to be good for us.
Everything this man says is PR/marketing. It's literally his job and you're a fool to believe otherwise.
The only thing comedic is the cringe-worthy hero worshipping of these well paid gaming executives.
Well yes, they are one of the worst offenders. For me a 30fps game shouldn’t even qualify in any kind of ‘best graphics’ or ‘technical achievement’ category. It’s like gimping the game to make it look good and it needs to end!
What about 900p games?
What about them?
If it was up to me then all next gen games would be locked 60fps and some kind of reconstructed 4K. Then the priority can be on great gameplay design and not just making average games look good.
Making something lower res doesn't make it blurrier. If you have a low res texture that is upscaled and filtered then it becomes blurry. If you are playing at a low res and everything is rendered at that native res then it will be sharp, ugly as fuck but still sharp.
I'm talking about this gen games.
Well then the same applies. Frame rate > game mechanics > graphics > resolution
So... Phil should have been saying he never wanted to play on the Xbone.
Developers are not always going to reduce resolution for framerate... some games don't even need it, imo. Which is why Gears of War 4 wasn't 600p on the Xbone to get to 60fps.

I'm telling you what I want, not what somebody else wants. Capisce?!
I don't understand the change in PR. Phil and Co have been beating the pixel drum since the first rumours of scorpio and now he prefers frame rate? Something seems off about that. Perhaps the next console is "less powerful" or the games they're bringing to the fore are simply not high in fidelity because they also need to be streamed.
I'm not a "I believe in Phil Spencer" type of gamer. He seems to enjoy pre-emptive PR and contradicts himself way too often imo.
The OG Xbox was the first console with an internal HDD. Sony had to follow suit.
L34rn ab0u7 c0n50l35
Point to the part of the post where I said xbox 360.
Not all 360s had an HDD, so they couldn't make it mandatory.
I prefer HFR when it's available, but let's be honest: most of the stigma around this is games chugging at 23/24fps, or performance modes that stutter around 45-58fps, giving the impression that 30fps is bad or 60fps is not enough. I'd prefer that when they say a game is 30fps, it remains 30fps for the entire game.
Having said that, if 4K/60fps is the standard then fair enough. In fact, I'd take 1080p/60fps as the standard.
Keep waiting, revisionist...
I'll wait...
It’s really incredibly simple. The One X still had a garbage CPU, so the focus was always on higher res. The next gen consoles will have far better CPUs, so they will be capable of higher frame rates as well. Why are you acting like it’s some crazy mystery?
Designers are lazy and will go the easiest route. At some point we have to set the standard to 60fps or we will never see 120fps.
A higher base framerate is better for VR as well.
Get PSVR.

This echoes the rumors of 3.2GHz vs 3.5GHz; I'd prefer 300MHz over 1TF.
Make 60FPS mandatory already, someone with balls needs to do it, enough with the realism crap.