
Is ray-tracing worth it?

  • Yes. I turn it on for all games

    Votes: 72 16.8%
  • Yes. But only for some games that make good use of it.

    Votes: 130 30.3%
  • No. The performance impact is not worth it for most games. Only for a few games

    Votes: 141 32.9%
  • No. It's never worth turning on, because the performance drops too much.

    Votes: 73 17.0%
  • I don't know / Don't care.

    Votes: 13 3.0%

  • Total voters
    429

proandrad

Member
Ray tracing's main purpose is to reduce the workload for game makers, but marketing has tried to sell it as a gamer-facing feature in order to move hardware. A talented artist, given enough time, will always be able to make a scene look better than ray tracing.
 

Viruz

Member
The cost of RT is already not worth it on PS5, and because of RT, PS6 games will still be stuck at a horrible 30fps in 2030.
It's a never-ending cycle.
 
No.
When I turn it on in Cyberpunk and then off again, the only thing I notice is the frame rate plummeting.

Maybe if I could see them side by side I would notice the difference. Otherwise I can't.
 

Topher

Identifies as young
Only if you have a high-end, expensive Nvidia GPU, so for the vast majority, probably not.
 
Last edited:

Bojji

Member
No.
When I turn it on in Cyberpunk and then off again, the only thing I notice is the frame rate plummeting.

Maybe if I could see them side by side I would notice the difference. Otherwise I can't.

In Cyberpunk RT is VERY obvious.

Software raytracing like Lumen is raytracing against a voxel representation of the world.

It was the first kind of real-time GI; the video is from 2015. It only shows that great results were always possible; developers just didn't care enough (and consoles were too weak).
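
For anyone wondering what raytracing against a voxel grid actually looks like, here's a minimal sketch (Python, every name here is hypothetical, and real solutions like Lumen trace far more elaborate scene representations with cached lighting per cell): rays step from cell to cell through a coarse grid instead of testing the real triangle geometry, which is what makes software RT affordable.

```python
import math

def trace_voxel_ray(grid, origin, direction, max_steps=256):
    """March a ray through a coarse voxel grid (Amanatides & Woo style DDA).

    `grid` maps (x, y, z) cell coords to an occupancy flag; a real engine
    would store lighting or distance-field data per voxel instead.
    Returns the first occupied cell hit, or None on a miss.
    """
    cell = [int(math.floor(c)) for c in origin]
    step = [1 if d > 0 else -1 for d in direction]

    # Ray distance to the next cell boundary per axis, and the distance
    # needed to cross one full cell on that axis.
    t_max, t_delta = [], []
    for i in range(3):
        if direction[i] == 0:
            t_max.append(float("inf"))
            t_delta.append(float("inf"))
        else:
            boundary = cell[i] + (1 if direction[i] > 0 else 0)
            t_max.append((boundary - origin[i]) / direction[i])
            t_delta.append(abs(1.0 / direction[i]))

    for _ in range(max_steps):
        if grid.get(tuple(cell)):
            return tuple(cell)          # hit: sample this voxel's lighting
        axis = t_max.index(min(t_max))  # cross the nearest cell boundary
        cell[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None                         # miss: fall back to sky lighting

# Tiny demo: one occupied voxel at (3, 0, 0), ray marching along +X.
print(trace_voxel_ray({(3, 0, 0): True}, (0.5, 0.5, 0.5), (1.0, 0.0, 0.0)))
```

The trade-off is visible in the sketch: the coarser the grid, the cheaper the trace and the blockier the result, which is why software GI looks softer and lower-frequency than hardware RT against real geometry.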
 
Last edited:

Clear

CliffyB's Cock Holster
Until mass-market hardware is powerful enough across the board for it to become the universal default, replacing bakes and traditional techniques, it's going to remain a flashy gimmick.
 

HeWhoWalks

Gold Member
Until mass-market hardware is powerful enough across the board for it to become the universal default, replacing bakes and traditional techniques, it's going to remain a flashy gimmick.
But, like every other taxing feature that we can now run relatively easily, it has to start somewhere.
 

Bojji

Member
Begs the question: shouldn't the Pro be getting an RTGI patch for Cyberpunk if it's 45% more powerful and supposedly better at handling raytracing?

I think the best they will be able to do is use RT reflections and maybe medium RT lighting at 30fps, like AW2. Path tracing is probably too demanding for the Pro.

Of course, that's if they even want to patch this game.
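
To put rough numbers on that 30fps guess, here's a back-of-the-envelope frame-budget sketch (the per-effect costs are invented placeholders, not measurements from the Pro or any real game):

```python
# Illustrative frame-budget math; every cost below is an assumed number.
budget_30fps = 1000 / 30   # ~33.3 ms per frame
budget_60fps = 1000 / 60   # ~16.7 ms per frame

raster_frame   = 14.0      # assumed ms for the frame without any RT
rt_reflections = 6.0       # assumed ms for RT reflections
rt_gi          = 9.0       # assumed ms for "medium" RT lighting

total = raster_frame + rt_reflections + rt_gi   # 29.0 ms
print(total <= budget_60fps)   # False: blows the 60fps budget
print(total <= budget_30fps)   # True: fits at 30fps, hence the AW2 comparison
```

Under those made-up costs the math only closes at 30fps, which is the pattern nearly every console RT mode follows.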
 

bbeach123

Member
Might I chime in as a professional photographer and photo teacher?

Realistic doesn't mean good or even pretty. Half of my job consists of faking/altering light sources or colors. Cutscene lighting is always fake through and through, and even in-game lights are usually heavily modified for mood, artistic or navigational purposes.

Ray-tracing is solving the lightmap baking problem, yes. It can speed up the process if you lack the artists or resources to re-bake the lights often. But full RT/PT is too problematic for both the art and level design departments, because 100% accurate lighting is usually dull, flat and just boring. That's why games, just like movies or photoshoots, will never drop fake light. And it's easier to fake the whole model with occasional inclusion of RT for contact shadows or reflections than to go full-on RT. Even CP2077, a champion of the tech, while looking good most of the time with RT/PT, suffers in that mode, in some scenes and locations quite heavily, because the art is not adjusted properly and realistically cannot be fully adjusted to full PT regardless. For example, dark locations with PT/RT are just... well, too dark. You need to either add a lot of subtle fake lights, mess with the contrast, or just turn off the roof for cutscenes in darker locations. Plus static scenes tend to 'pop' more, because they are adjusted by hand and not by the laws of nature.

There is also a huge problem with faking lights when RT is always on, because of the great strain on hardware. You basically need to add fake reflectors, fake lighting rigs and even fake color panels, all to negate the effects of realistic shadows and realistic color bleeding. PT/RT basically brings real-life on-set problems into videogames, and nobody wants to deal with that, especially in big-ass open-world games. That's why, for example, Horizon Forbidden West doesn't use RT, and very few games even on PC (outside of sims) rely on RTGI for a complete lighting model.

So yeah, RT will be used, but sparingly. It's not always on even in pre-rendered animated films because there is an artistic intention that can be ruined by the light being 'too real'.
Ignoring the stupid intro, the RE4 cutscene was kinda like this.
A bunch of invisible lightbulbs flying around.

 

Clear

CliffyB's Cock Holster
But, like every other taxing feature that we can now run relatively easy, it has to start somewhere.

Totally. But there's a huge difference between a high-end feature and a standard feature. From a design standpoint you can't lean too hard on a feature/capability that not everyone can access.
 

HeWhoWalks

Gold Member
Totally. But there's a huge difference between a high-end feature and a standard feature. From a design standpoint you can't lean too hard on a feature/capability that not everyone can access.
That’s what I’m saying though. 11-12 years ago, tessellation was a high-end feature. Now? Hardly anyone mentions it. Of course raytracing is far more complex and impacts more things, but a baby doesn’t walk before it crawls, 9 times out of 10. These things have to have an origin so developers can gauge what works and what needs improvement.
 
Last edited:
Nowadays with DLSS + AMD frame gen mods, I can sometimes hold 60fps with my 3080.

But my experience typically goes: turn on RT, see how good it looks but how choppy the framerate is, then turn it off because the performance hit isn't worth it.
 
It was; it's a PC game first and foremost, and it was its biggest selling point since they first showed it. I remember all the videos showing how they designed the RTGI.
The original PC version had RT support, but I think they added RT GI at a later stage of development, and perhaps only because Nvidia paid them to do so.

Have you played Metro Exodus Enhanced Edition? They added RT GI in all indoor locations, but the artistic intent was ruined because of it. The original game was very dark, but in the enhanced edition even the dark metro tunnels became too bright, so people who played the original game complained about raised blacks, etc.
 
I picked yes, but I didn't turn it on for all games even when I was PC gaming. I think it's worth it, though; we just need to wait for the hardware to catch up.
 

danklord

Gold Member
People who "don't notice it" are the same people who would tell you there isn't a difference between 1080p and 4k. "Imperceptible to the human eye"

There is a difference. It's huge. Games look better. Buy a 4090. The end.

And here's a bonus one for you, 8k is going to look better than 4k. Especially with raytracing.
 
Last edited:
The original PC version had RT support, but I think they added RT GI at a later stage of development, and perhaps only because Nvidia paid them to do so.

Have you played Metro Exodus Enhanced Edition? They added RT GI in all indoor locations, but the artistic intent was ruined because of it. The original game was very dark, but in the enhanced edition even the dark metro tunnels became too bright, so people who played the original game complained about raised blacks, etc.

I am pretty sure it was GI on release, and later, with one of the DLCs, they added emissive lighting, which I believe was pretty much the first iteration of path tracing. But since Metro Exodus wasn't a huge blockbuster game, afaik it didn't get that much attention, and most likely they implemented that in the EE, which, no, I haven't replayed, though I was planning to before Stalker 2's release. The devs of Metro have always pushed graphics to the limit, and with every Metro release they had something new tech-wise; shame they didn't get much attention. Excited for their new game, I'm sure it'll be impressive visually.
 

Mister Wolf

Member
I am pretty sure it was GI on release, and later, with one of the DLCs, they added emissive lighting, which I believe was pretty much the first iteration of path tracing. But since Metro Exodus wasn't a huge blockbuster game, afaik it didn't get that much attention, and most likely they implemented that in the EE, which, no, I haven't replayed, though I was planning to before Stalker 2's release. The devs of Metro have always pushed graphics to the limit, and with every Metro release they had something new tech-wise; shame they didn't get much attention. Excited for their new game, I'm sure it'll be impressive visually.


I'd say they got a lot of attention where it counts.
 

yogaflame

Member
People who "don't notice it" are the same people who would tell you there isn't a difference between 1080p and 4k. "Imperceptible to the human eye"

There is a difference. It's huge. Games look better. Buy a 4090. The end.

And here's a bonus one for you, 8k is going to look better than 4k. Especially with raytracing.
A 4090 graphics card? If only I had $1,999.00 :messenger_loudly_crying:
 

yogaflame

Member
That's what credit cards are for.
I know, but I've got to pay my bills first, plus expenses on my house repair, etc. Well, it is not just the graphics card; I also need to upgrade my CPU and get new memory and an SSD. My CPU and GPU are embarrassingly obsolete. For now my priority in gaming is to get the PS5 Pro next year, since it will at least still have decent RT. And hopefully some PSSR ML magic will do its trick so there is, at least, barely any noticeable difference from a high-end PC, if not equal or overtake one when it comes to RT and frame rate. But I still plan to upgrade my PC. It's just not a priority yet.
 
Last edited:

Kataploom

Gold Member
Today? No, unless you just can't stand games not looking like real life (in which case I recommend waiting until 2056 or so). RT is too expensive and is put on top of a rasterized lighting model, which makes it even more expensive than it should be. The reason for making RT viable is mostly easing the development pipeline, which will result in games being made easier/quicker/cheaper. But right now it's just a gimmick on top of an already-built lighting system. Once we have hardware that can do RT cheaply and the rasterized lighting model gets ditched, that's when RT will "shine" (accidental pun lol).
 

Msamy

Member
Yes, very much worth it, and any game that uses it has a huge graphics advantage over other prebaked bullshit solutions. But in the end those 60fps crying people hate it; they don't mind playing games with PS2 graphics if it's 60fps.
 

EverydayBeast

ChatGPT 0.001
Two things and graphic whores will get mad here:

#1 When did people care so much about graphics?

#2 People are walking around judging graphics in games nobody cares about
 

Kataploom

Gold Member
You're missing out. HDR (when done correctly) can take your breath away. You need a good HDR monitor too. Even most OLEDs suck here.

RT is definitely not worth it. I usually save the grunt for more fps.
The problem with HDR is that it's too dependent on many factors: your screen, your cable, the device running it, the in-game implementation (which is often more of a miss than a hit), the actual HDR standard, etc. You need to tinker even more than regular PC gaming requires (mostly setting a preset and being done with it) just to finally sit down and play... Then you'll have to check if it was even worth it in that specific game lol.

At least RT is something more straightforward. Not currently viable imo, but it is the actual future for many reasons, especially if game visuals start stagnating for a while and costs/times to develop continue to grow.
 

iorek21

Member
RT is pretty and a good benchmark tool for hardware enthusiasts, but that's it. It doesn't affect gameplay and rarely improves overall game enjoyment.

The future use of AI in games will be much more impressive and impactful.
 
This and HDR get turned off when I play games.
 
You don't know what you're missing. I live in an area where I can gaze at virtually perfect water reflections every time the wind calms down.

SSR reflections would completely break the immersion in a similar scene, because as soon as you move the camera, the screen-space reflections disappear. X360 / PS3 era games typically used planar water reflections, which looked absolutely amazing, so when I saw SSR for the first time I wasn't happy :p.
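
A minimal sketch of why that happens, assuming a toy 2D depth buffer (the function and numbers are made up, but real SSR shaders march the same way in principle): the ray can only ever sample pixels currently on screen, so a reflection of anything off-screen, or hidden behind a nearer surface in the depth buffer, simply vanishes.

```python
def ssr_trace(depth, px, py, dx, dy, ray_z, dz, max_steps=64):
    """Toy screen-space reflection march over a 2D depth buffer.

    Steps the reflected ray in pixel space and reports a hit when the
    ray passes behind the stored depth. The bounds check is the whole
    story: the moment the ray leaves the frame, there is no data left
    to reflect, so the reflection silently disappears.
    """
    h, w = len(depth), len(depth[0])
    x, y, z = float(px), float(py), ray_z
    for _ in range(max_steps):
        x += dx; y += dy; z += dz
        if not (0 <= int(x) < w and 0 <= int(y) < h):
            return None               # ray left the screen: nothing to sample
        if z >= depth[int(y)][int(x)]:
            return (int(x), int(y))   # hit: reuse this pixel's color
    return None

# 4x4 depth buffer: everything at depth 10 except a nearer object in row 0.
depth = [[10.0] * 4 for _ in range(4)]
depth[0][2] = 1.0
print(ssr_trace(depth, 2, 3, 0, -1, 0.5, 0.25))  # (2, 0): on-screen hit
print(ssr_trace(depth, 0, 3, -1, 0, 0.5, 0.25))  # None: ray exits the frame
```

Planar reflections sidestep this entirely by re-rendering the scene mirrored about the water plane, which is why those old X360/PS3 water reflections never had the cutoff artifact.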

I have studied the rules of light and I love what RT does. Even software RT (Lumen, SVOGI) grounds objects and characters in the scene, taking realism to a whole new level. You can pre-bake GI in games with amazing results if the game has a static TOD, but character models will always stand out without dynamic GI, because their lighting will never match the scene.

When I saw The Witcher 3 with RT I was amazed, even though the game ran at 7fps with RT on my old PC. Now I get 70-80fps with RT, and up to 170fps with the help of DLSS and FG. I would feel really bad playing this game without RT. Once you've experienced this game with RT, it's impossible to go back to raster lighting.
 
Last edited:

Bojji

Member
Two things and graphic whores will get mad here:

#1 When did people care so much about graphics?

#2 People are walking around judging graphics in games nobody cares about

1. Since always, the whole point of console upgrades is to provide better graphics and/or performance.

From gen 1 to gen 7 we had graphics and gameplay mechanics upgrades, gen 8 and 9 are just graphics upgrades.
 
If it's only noticeable and game-changing in a few games, and only with mandatory high-end hardware, that means it still has a long road to walk before it is truly revolutionary for most of the gaming userbase. At least on consoles, devs should focus more on pushing rasterization tech to its very limits, instead of implementing taxing RT tech just for a few reflections or some better shadows and lighting. The console manufacturers should focus on delivering next-gen consoles really meant to take advantage of RTGI, path tracing and newer RT tech. Basically, with the next gen, RT as it was meant to be should become mainstream and available to most gamers.
 

Zacfoldor

Member
Most people would say no if they had the implementation of RT on PS5 or even on a 3080 like myself.

However, there are two things you aren't considering if only going by personal experience:

1. Ray tracing in some games, given unlimited resources, is a generational leap in graphics. Want me to prove this one? There was an early PS5 Pro video from DF showing a racing game, I believe one by Codemasters. They show the original lighting vs the RT implementation live on screen, and without a doubt it is at least a generational leap vs the washed out and muted colors of the bad (standard) lighting. It's huge. The video proves it and they clearly state it is a generational leap in that video. I won't be looking it up though, so don't ask.

2. With each new hardware generation, RT becomes more and more offloaded from the main GPU/CPU tasks and more concentrated in dedicated hardware units that can perform RT without needing to use general system resources. This will only get more pronounced until eventually RT becomes a low/no-cost implementation.

RT is the future, hardware is holding it back right now but that won't always be the case.
 
Last edited:

Knightime_X

Member
It'll be good when hardware is stronger.
I remember this same conversation but with antialiasing.
AA used to be very expensive; now nobody really gives it a second thought.
RTX will be no different over time.
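
For the record, here's the kind of napkin math behind that AA comparison (the ratios below are assumed for illustration, not benchmarked): brute-force supersampling multiplied the shading work per pixel, while MSAA and post-process AA shrank the overhead to almost nothing, and the bet is that dedicated RT hardware follows the same curve.

```python
# Back-of-the-envelope AA cost comparison; all ratios are assumptions.
width, height = 3840, 2160
pixels = width * height

costs = {
    "4x SSAA": pixels * 4,      # brute force: shade every pixel four times
    "4x MSAA": pixels * 1.3,    # assume ~30% extra shading on triangle edges
    "post AA": pixels * 1.05,   # assume an FXAA/TAA-style pass adds ~5%
}

for name, cost in costs.items():
    print(f"{name}: {cost / pixels:.2f}x shading work")
```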

Should have added this:
 
Last edited: