
Microsoft is preparing DirectX for Neural Rendering

winjer

Gold Member

Microsoft announced updates to the DirectX API that would pave the way for neural rendering. Neural rendering is a concept where portions of a frame in real-time 3D graphics are drawn by a generative AI model working in tandem with the classic raster 3D graphics pipeline, along with other advancements such as real-time ray tracing. This is different from AI-based super resolution technologies; here, the generative AI is involved in rendering the input frames that a super resolution technology then upscales. One of the nuts and bolts of neural rendering is cooperative vectors, which enable an information pathway between the conventional graphics pipeline and the generative AI, telling the model what the pipeline is doing, what the model needs to do, and what its ground truth is.
Microsoft says that its HLSL team is working with AMD, Intel, NVIDIA, and Qualcomm to bring cross-vendor support for cooperative vectors in the DirectX ecosystem. The very first dividends of this effort will be seen in the upcoming NVIDIA GeForce RTX 50-series "Blackwell" GPUs, which will use cooperative vectors to drive neural shading. "Neural shaders can be used to visualize game assets with AI, better organize geometry for improved path tracing performance and tools to create game characters with photo-realistic visuals," Microsoft says.


Neural Rendering: A New Paradigm in 3D Graphics Programming

In the constantly advancing landscape of 3D graphics, neural rendering technology represents a significant evolution. Neural rendering broadly defines the suite of techniques that leverage AI/ML to dramatically transform traditional graphics pipelines. These new methods promise to push the boundaries of what’s possible in real-time graphics. DirectX is committed to cross-platform enablement of neural rendering techniques, and cooperative vectors are at the core of this initiative.

We are excited to share our plans to add cooperative vector support to DirectX, which will light up cross-platform enablement of the next generation of neural rendering techniques.

What are Cooperative Vectors, and why do they matter?

Cooperative vector support will accelerate AI workloads for real-time rendering, which directly improves the performance of neural rendering techniques. It will do so by enabling multiplication of matrices with arbitrarily sized vectors, optimizing the matrix-vector operations that are required in large quantities for AI training, fine-tuning, and inferencing. Cooperative vectors also enable AI tasks to run in different shader stages, which means a small neural network can run in a pixel shader without consuming the entire GPU. Cooperative vectors will enable developers to seamlessly integrate neural graphics techniques into DirectX applications and light up access to AI-accelerator hardware across multiple platforms. Our aim is to provide game developers with the cutting-edge tools they need to create the next generation of immersive experiences.
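To make that concrete, here is a minimal sketch, in plain numpy rather than the actual HLSL API, of the kind of tiny per-pixel network whose matrix-vector products cooperative vectors are meant to hand off to the GPU's matrix units. The layer sizes and input features below are invented for illustration.

```python
# Minimal sketch of a per-pixel "neural shader" workload (illustration only;
# this is numpy, not the cooperative vectors HLSL API, and all sizes are
# hypothetical).
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer MLP: 8 input features -> 16 hidden units -> 3 outputs (RGB).
W1, b1 = rng.standard_normal((16, 8)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((3, 16)), rng.standard_normal(3)

def shade(features: np.ndarray) -> np.ndarray:
    """Evaluate the network for one pixel: two matrix-vector products."""
    h = np.maximum(W1 @ features + b1, 0.0)  # ReLU hidden layer
    return W2 @ h + b2                       # linear RGB output

# One "pixel" with 8 arbitrary input features (e.g. normal, view dir, UVs).
print(shade(rng.standard_normal(8)))
```

Inference here is nothing but a chain of matrix-vector multiplies, which is exactly the primitive that cooperative vectors expose to shader code.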

What’s Next For Neural Rendering?

The HLSL team is working with AMD, Intel, NVIDIA, and Qualcomm on bringing cross-vendor support for cooperative vectors to the DirectX ecosystem. Stay tuned for more updates about cooperative vectors and their upcoming Preview release!

Cooperative vectors will unlock the power of Tensor Cores with neural shading in NVIDIA’s new RTX 50-series hardware. Neural shaders can be used to visualize game assets with AI, better organize geometry for improved path tracing performance and tools to create game characters with photo-realistic visuals. Learn more about NVIDIA’s plans for neural shaders and DirectX here.


The future of graphics might not be polygons and shaders, but just a bunch of neural networks placing assets on our screens.

 

winjer

Gold Member

I asked the other day what you thought of Nvidia's conference, but did you see the more interesting tidbits that weren't even present in the show?

The new BVH technique with neural networks that they call Mega Geometry is insane.

They've changed almost everything after the vertex stage to something neural.

I didn't see the presentation, as it was too late for me. I only saw the GPU news the next day.
Can you post that video?
 

Buggy Loop

Gold Member
I didn't see the presentation, as it was too late for me. I only saw the GPU news the next day.
Can you post that video?





Alan Wake 2 is getting Mega Geometry on day 0, so it'll be interesting to see.

It'll work with UE5's Nanite.

« RTX Mega Geometry intelligently updates clusters of triangles in batches on the GPU, reducing CPU overhead and increasing performance and image quality in ray traced scenes. RTX Mega Geometry is coming soon to the NVIDIA RTX Branch of Unreal Engine (NvRTX), so developers can use Nanite and fully ray trace every triangle in their projects. For developers using custom engines, RTX Mega Geometry will be available at the end of the month as an SDK to RTX Kit »

Path trace all triangles of a project??? No more dumbed down geometry for BVH? Insanity!?
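For intuition, here is a toy CPU-side sketch of the cluster idea the quote describes; it is not NVIDIA's actual SDK, and every size is invented. Instead of maintaining a bounding box per triangle, triangles are grouped into clusters and whole clusters are refit in one batch, so the top of the acceleration structure only sees a handful of cluster AABBs.

```python
# Toy sketch of batched cluster refitting (conceptual only, not the RTX
# Mega Geometry SDK; triangle counts and cluster sizes are hypothetical).
import numpy as np

rng = np.random.default_rng(1)

# 1024 triangles (3 vertices, xyz each), grouped into 8 clusters of 128.
triangles = rng.random((1024, 3, 3))
clusters = triangles.reshape(8, 128, 3, 3)

def refit_clusters(clusters: np.ndarray) -> np.ndarray:
    """Batch-refit one AABB per cluster: min/max over all its vertices."""
    lo = clusters.min(axis=(1, 2))     # (8, 3) lower corners
    hi = clusters.max(axis=(1, 2))     # (8, 3) upper corners
    return np.stack([lo, hi], axis=1)  # (8, 2, 3) cluster AABBs

# After the geometry deforms, only 8 cluster boxes feed the top of the
# acceleration structure instead of 1024 per-triangle boxes.
print(refit_clusters(clusters).shape)  # (8, 2, 3)
```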
 

onQ123

Gold Member
Can't wait for this, because it's going to get crazy (but hopefully not too crazy). AI is going to be the Wild Wild West for game development.
 

Buggy Loop

Gold Member
How about making games fun first?

So many special marketing terms and technologies, yet we're still stuck in the late 2000s in terms of gameplay logic.

Still waiting for DirectStorage to revolutionize gaming… it came out 3 years ago, only a handful of games use it, and Forspoken is one of them.

Game concepts and mechanics are established well before you get serious about pushing an engine, so it's not really related.
 

adamsapple

Or is it just one of Phil's balls in my throat?
So, now we're inferring physics effects instead of calculating them, which seems to be a good thing (so the GPU can compute other things, I guess): just guide the entities being rendered to be inferred and let the tensor cores do their magic... seems fine, as long as games still get the artistic touch and don't just become another interactive and boring "real life"-like movie.
 

Buggy Loop

Gold Member

We're not that far away from Nvidia's vision.

When, in the future, games are neural shaded, neural radiance cache path traced, neural geometry BVH'd, neural texture decompressed, with neural faces and skin on characters, neural lip syncing, and neural AI… oh, and of course a neural upscaler and frame gen…

I think they'll have been right all along.
 
Textures take up a decent chunk of game storage, and if something isn't done we'll soon be approaching storage size bottlenecks.

We've already seen examples of neural texture compression in games like GOWR: basically storing a low-resolution texture and then upsampling it via ML in real time. This is going to be really exciting stuff once more game studios start to embrace it, but I guess the hardware will have to come first. I'm really curious to see how neural rendering will impact geometry and other things like material shaders.
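As a rough sketch of the storage idea (not GOWR's actual technique; a real system runs a small trained network where this example substitutes plain bilinear interpolation to stay self-contained): ship a low-resolution texture and reconstruct full resolution at runtime.

```python
# Sketch: store a 512x512 texture, reconstruct 2048x2048 at runtime.
# A trained upsampling network would replace upsample() in a real system.
import numpy as np

full_res, low_res = 2048, 512
low = np.random.default_rng(2).random((low_res, low_res, 3)).astype(np.float32)

def upsample(tex: np.ndarray, size: int) -> np.ndarray:
    """Stand-in for the ML upsampler: plain bilinear resize to size x size."""
    h, w, _ = tex.shape
    ys, xs = np.linspace(0, h - 1, size), np.linspace(0, w - 1, size)
    y0, x0 = ys.astype(int), xs.astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    fy, fx = (ys - y0)[:, None, None], (xs - x0)[None, :, None]
    top = tex[y0][:, x0] * (1 - fx) + tex[y0][:, x1] * fx
    bot = tex[y1][:, x0] * (1 - fx) + tex[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy

print(upsample(low, full_res).shape)                             # (2048, 2048, 3)
print(f"raw texels saved: {1 - (low_res / full_res) ** 2:.0%}")  # 94%
```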
 

Buggy Loop

Gold Member
And watch this never get used like DirectStorage because devs can’t be bothered to change their development environment.

DirectStorage was an answer that all along was not really needed on PC with fast SSDs 🤷‍♂️

Nor is it even flexed that much on consoles that built hype for I/O streaming

Ray tracing all Nanite geometry without simplifications for the BVH, having cinema-quality offline-rendered shaders in real time... for sure it will be used. And it's not Nvidia-only; all vendors are participating with DirectX on it.

Alan Wake 2 will be patched with Mega Geometry at the 5000-series launch later this month.
 

Gaiff

SBI’s Resident Gaslighter
DirectStorage was an answer that all along was not really needed on PC with fast SSDs 🤷‍♂️
Not true. Your CPU is still needed for decompressing data. A fast SSD alone won’t cut it.
Nor is it even flexed that much on consoles that built hype for I/O streaming
Yeah, a dev problem, as I said. When you look at Spider-Man 2 and Demon's Souls loading levels in less than 3 seconds, but then you get Black Myth: Wukong taking 45 seconds, you know someone isn't using the hardware properly.
Ray tracing all Nanite geometry without simplifications for the BVH, having cinema-quality offline-rendered shaders in real time... for sure it will be used. And it's not Nvidia-only; all vendors are participating with DirectX on it.

Alan Wake 2 will be patched with Mega Geometry at the 5000-series launch later this month.
It remains to be seen if devs will use it. There's still a lot of new tech that goes unused, such as mesh shaders.
 

nemiroff

Gold Member
I have quite a few "told you"-worthy posts explaining how this was going to be the future. I've received quite a few laughing faces, and silence, even here, lol. I do understand, though; it's not easy to grasp the scope of this vision, since it's not something people were accustomed to processing until it became more practical. Aside from the inevitable death by robots, I love this neural rendering part of the future.
 

Ozriel

M$FT
How about making games fun first?

So many special marketing terms and technologies, yet we're still stuck in the late 2000s in terms of gameplay logic.

Still waiting for DirectStorage to revolutionize gaming… it came out 3 years ago, only a handful of games use it, and Forspoken is one of them.

The DirectX development teams aren't responsible for making fun games.

Let’s optimize for logical comments, please.
 

Buggy Loop

Gold Member
I have quite a few "told you"-worthy posts explaining how this was going to be the future. I've received quite a few laughing faces, and silence, even here, lol. I do understand, though; it's not easy to grasp the scope of this vision, since it's not something people were accustomed to processing until it became more practical. Aside from the inevitable death by robots, I love this neural rendering part of the future.

Yeah, me too. I have too many receipts to count.

But this is NeoGAF. You just have to bring up the Switch prediction thread to know what I mean.
 

Gaiff

SBI’s Resident Gaslighter


Your comment: DirectStorage isn’t needed because of fast SSDs.

My reply: It’s not true. You still need a fast CPU.

Your retort: Showing Rift Apart with a fast SSD+CPU.

Yeah, no shit?

Here is Black Myth: Wukong running on a 12900K + NVMe SSD and taking a whopping 27 seconds to load on PC, versus 26 seconds on PS5.



Contrast that with PS5 exclusives that seldom take more than 5 seconds to load, no matter what. You need more than a fast SSD alone on PC. You still need a fast CPU, and the developer actually needs to put in the legwork not to bottleneck the I/O. One of the reasons Rift Apart loads so fast is also that the work was already done on PS5 and ported to PC. DirectStorage has its shortcomings; hell, it's not even needed in Rift Apart. However, it has many use cases, but most developers don't bother with it despite it being an easy win a lot of the time. It's not always good, such as in HFW where the dev opted not to use it, but it can produce excellent results. My point is that just because you see these technologies being announced doesn't mean you'll see them being used.
 

Buggy Loop

Gold Member
Your comment: DirectStorage isn’t needed because of fast SSDs.

My reply: It’s not true. You still need a fast CPU.

Your retort: Showing Rift Apart with a fast SSD+CPU.

Yeah, no shit?

Here is Black Myth: Wukong running on a 12900K + NVMe SSD and taking a whopping 27 seconds to load on PC, versus 26 seconds on PS5.



Contrast that with PS5 exclusives that seldom take more than 5 seconds to load, no matter what. You need more than a fast SSD alone on PC. You still need a fast CPU, and the developer actually needs to put in the legwork not to bottleneck the I/O. One of the reasons Rift Apart loads so fast is also that the work was already done on PS5 and ported to PC. DirectStorage has its shortcomings; hell, it's not even needed in Rift Apart. However, it has many use cases, but most developers don't bother with it despite it being an easy win a lot of the time. It's not always good, such as in HFW where the dev opted not to use it, but it can produce excellent results. My point is that just because you see these technologies being announced doesn't mean you'll see them being used.


And you would have a slow CPU and slow SSD... because?

 

Buggy Loop

Gold Member
Because SSDs cost peanuts? You could have a 2600X with a relatively fast SSD.

A 2600X has a lot more to worry about than DirectStorage in Wukong, dude.

Oh look, a Ryzen 2700



It loaded even faster than with DirectStorage on.

You're right, it's on devs.

As we see here, even without requiring RTX IO or DirectStorage, on a slow CPU, it's the same fucking thing.

As I said earlier.

The game was BUILT around streaming assets rather than putting everything in local memory. It has really little to do with fancy tech. Wukong will stream the world on an HDD at almost SSD-equivalent speed because it bloats the memory.
 

Ozriel

M$FT
We should refrain from making disingenuous ones as well. All these technologies have added very little to core gameplay elements.

APIs like DirectX, Vulkan, etc. are immensely important to developers. New features benefit gamers, as they can bring polish and improved visuals.

Again, none of these people are the ones you should be appealing to for fun games or new gameplay elements. Why are you struggling with something so simple?

Next you’ll be calling for Nvidia and AMD to fire the folks working on improving drivers and dedicate them to ‘gameplay research’.
 

Gaiff

SBI’s Resident Gaslighter
A 2600X has a lot more to worry about than DirectStorage in Wukong, dude.
Yes, hence my point that just a fast SSD won't solve your problems.
Oh look, a Ryzen 2700



It loaded even faster than with DirectStorage on.

My previous comment: Hell, it's not even needed in Rift Apart
You're right, it's on devs.

As we see here, even without requiring RTX IO or DirectStorage, on a slow CPU, it's the same fucking thing.

As I said earlier.

The game was BUILT around streaming assets rather than putting everything in local memory. It has really little to do with fancy tech. Wukong will stream the world on an HDD at almost SSD-equivalent speed because it bloats the memory.
Yes, it's on the developers. They have to optimize their loading and storage algorithms to take advantage of fast storage. I think it was Star Citizen that got massive improvements to load times just by changing the storage algorithm. I wish I could find the video, but I completely forgot. My point isn't that DirectStorage is necessary. My point is that there are technologies that could be easy wins, such as Mesh Shaders or DirectStorage in some instances, yet most developers don't bother. The same could happen with this.
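For a sense of why a storage-algorithm change alone can matter so much, here is a hedged toy benchmark (a hypothetical file layout, not any particular game's loader): coalescing thousands of small scattered reads into one big sequential read.

```python
# Toy comparison: many small seek+read calls vs. one sequential read.
# File name, sizes, and layout are all hypothetical.
import os
import tempfile
import time

path = os.path.join(tempfile.gettempdir(), "assets.pak")
with open(path, "wb") as f:                 # fake 64 MB asset pack
    f.write(os.urandom(64 * 1024 * 1024))

offsets = range(0, 64 * 1024 * 1024, 4096)  # 16384 scattered 1 KB assets

def load_scattered() -> list:
    """One seek+read per asset: thousands of small scattered I/O requests."""
    out = []
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            out.append(f.read(1024))
    return out

def load_coalesced() -> list:
    """One big sequential read, then slice the assets out of the blob."""
    with open(path, "rb") as f:
        blob = f.read()
    return [blob[off:off + 1024] for off in offsets]

for fn in (load_scattered, load_coalesced):
    start = time.perf_counter()
    fn()
    print(fn.__name__, f"{time.perf_counter() - start:.3f}s")
```

On a warm OS cache the gap is modest; on a cold cache or a spinning disk, the coalesced path wins dramatically, which is the kind of improvement being described.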
 

DoubleClutch

Gold Member
APIs like DirectX, Vulkan, etc. are immensely important to developers. New features benefit gamers, as they can bring polish and improved visuals.

Again, none of these people are the ones you should be appealing to for fun games or new gameplay elements. Why are you struggling with something so simple?

Next you’ll be calling for Nvidia and AMD to fire the folks working on improving drivers and dedicate them to ‘gameplay research’.

I'm not appealing to anybody. All I'm saying is these features will add little and won't make games fun. If you polish a turd, it's still a turd.

You’re just taking some moral high ground and arguing for the obvious.
 

Buggy Loop

Gold Member
I'm not appealing to anybody. All I'm saying is these features will add little and won't make games fun. If you polish a turd, it's still a turd.

So we stop progress because 99% of directors haven't evolved gameplay since the PS3 days?

What is this?

They are not related.

Publishers have been playing it safe with new coats of paint for over a decade. Go make a thread and whine about that. It has nothing to do with the DirectX API being updated for new tech.
 

DoubleClutch

Gold Member
So we stop progress because 99% of directors haven't evolved gameplay since the PS3 days?

What is this?

They are not related.

Publishers have been playing it safe with new coats of paint for over a decade. Go make a thread and whine about that. It has nothing to do with the DirectX API being updated for new tech.




No Kathy Newman, that’s not what I’m saying at all.
 

Myths

Member
It’s a beautiful thing to see.

People fail to understand that tensor cores are very much a part of the hardware, so this misconception that performance or enhancements are purely "software"-driven underscores a lack of understanding, as usual. Engineers and scientists are finding alternate ways to approach problems granularly and form efficient solutions to said problems, not just brute-forcing by sheer clock-speed increases every 2 years.
 

darrylgorn

Member
How about making games fun first?

So many special marketing terms and technologies, yet we're still stuck in the late 2000s in terms of gameplay logic.

Still waiting for DirectStorage to revolutionize gaming… it came out 3 years ago, only a handful of games use it, and Forspoken is one of them.

Don't hold your breath.

We're in for 20 more years of lumen shadowing or some bullshit.
 

Buggy Loop

Gold Member
If given the option to relive 2000 to 2015 in all its glory, I would absolutely take it over what modern games are.

Remember how Halo Infinite was the greatest Halo ever and how the franchise was taken to new heights because of all its technical feats?

So you derailed this thread just to say "modern games suck" when we had so many good years in the 2000s, and even recently. 2025 is looking like straight fire. GTFO. Go make your own whining bitch-ass thread.

 

amigastar

Gold Member
The funny thing is most early HDTVs had frame interpolation. It's the same function today (adding fake frames in between), but it sounds so much cooler with neural networks, AI, machine learning, and the like.

It’s BS.
I remember frame interpolation from TVs; it looked hideous. Frame gen looks much more pleasant to me.
 

DoubleClutch

Gold Member
So you derailed this thread just to say "modern games suck" when we had so many good years in the 2000s, and even recently. 2025 is looking like straight fire. GTFO. Go make your own whining bitch-ass thread.

You asked me a question and I answered it. I never said "modern games suck"; you're just making stuff up.

It's bewildering why you're so enraged by an opposing view. Why does this upset you so greatly? It can't be good for your health if you get this riled up over nothing, to the point where you have to make stuff up to paint someone as a villain so you can play white knight against it.

You need to calm down.
 

DoubleClutch

Gold Member
I remember frame interpolation from TVs; it looked hideous. Frame gen looks much more pleasant to me.

It was hideous because it would ruin movies shot at 23.976 fps. Visually, it made games look super smooth and high-framerate, but the latency was way too high to play with. Frame gen is ultimately the same thing, just with better latency (which I still find unacceptable).
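For anyone curious what "adding fake frames in between" means at its most basic, here is a toy sketch. Real TV interpolation and DLSS-style frame generation use motion estimation (and, in the latter case, neural networks) rather than a plain blend, but the principle of synthesizing an intermediate frame between two rendered ones is the same.

```python
# Crudest possible frame interpolation: linearly blend two rendered frames.
# Frame contents here are stand-ins; real interpolators are motion-aware.
import numpy as np

frame_a = np.zeros((4, 4, 3), dtype=np.float32)  # rendered frame N
frame_b = np.ones((4, 4, 3), dtype=np.float32)   # rendered frame N+1

def interpolate(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
    """Synthesize an in-between frame at time t in [0, 1]."""
    return (1.0 - t) * a + t * b

mid = interpolate(frame_a, frame_b, 0.5)         # the inserted "fake" frame
print(mid[0, 0])                                 # [0.5 0.5 0.5]
```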
 