
Nvidia's Random-Access Neural Compression just changed the texture compression world

Buggy Loop

Member
Kind of missed this, but I saw the annoying-voice dude from Two Minute Papers talking about it today.

If you can tolerate his voice




If you want Nvidia's video on it.





Compression techniques even nowadays are still based on S3 Graphics' iteration (remember that company? One of the many in the early days alongside 3dfx, etc.), later renamed BC in DirectX and then iterated through many BCx versions. It's basically block truncation coding with color cell compression; the later versions were extended to store alpha, normal maps, HDR, etc.
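For reference, this is roughly how the classic scheme works: a BC1 block packs a 4×4 texel tile into 8 bytes, two RGB565 endpoint colors plus a 2-bit palette index per texel. A minimal Python sketch of the decode side (illustrative only, not production code):

```python
import struct

def rgb565_to_rgb(v):
    # Expand a packed 5-6-5 value to 8-bit-per-channel RGB.
    r, g, b = (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_bc1_block(block: bytes):
    """Decode one 8-byte BC1 block into a 4x4 grid of RGB texels."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block)
    c0, c1 = rgb565_to_rgb(c0_raw), rgb565_to_rgb(c1_raw)
    if c0_raw > c1_raw:
        # Four-color mode: two extra colors interpolated between endpoints.
        palette = [c0, c1,
                   tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))]
    else:
        # Three-color mode: midpoint color plus a transparent-black slot.
        palette = [c0, c1,
                   tuple((a + b) // 2 for a, b in zip(c0, c1)),
                   (0, 0, 0)]
    # 2 bits per texel, 16 texels, packed from the low bits upward.
    return [[palette[(indices >> (2 * (4 * y + x))) & 0x3] for x in range(4)]
            for y in range(4)]
```

Every 4×4 tile gets a fixed 8 bytes (0.5 bytes/texel), which is why the format is random-access friendly: the GPU can fetch and decode any block independently.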

[paper teaser and comparison screenshots]



Everything will be AI in the future.

Imagine if Switch 2 uses this, motherfucking game changer. Matching the previous technique's quality, you would just save so much space: smaller downloads, cartridge friendly, and it saves a huge amount of hardware bandwidth.
 
Last edited:

omegasc

Member
Damn, release that shit, and VRAM issues suddenly are a lot less of a problem.
I believe this is what they thought as well when they went with 10GB for the 3080 at the time. I thought we were going to be way ahead in texture compression by now because of that move. Turns out I was wrong because all the talk ended up being about ray tracing. Bummer. :p
 

analog_future

Resident Crybaby
Raster performance is rapidly becoming meaningless.


It's crazy to think about how graphics/fidelity/performance is going to be approached in 10 years' time.
 

Buggy Loop

Member
I saw it before. Nvidia has a lot of great tech, but it's not coming very soon.

Depends. They implemented their ReSTIR GI quite rapidly from papers to Cyberpunk 2077 Overdrive. I guess it always depends on the implementation deal: when it's tied to a game release, it can take time, as developers have their own pace.

Could be implemented at API level for Switch 2 though.
 
Dang, that's super impressive. We sure are seeing some crazy leaps in GPU tech.

Provided this isn't all just hype, Nvidia is absolutely blowing their competition out of the water in ways that dwarf their previous levels of blowing the competition out of the water.
 
Pretty sure Sony Santa Monica were doing something similar with God of War Ragnarok on PS5: they used neural texture up-sampling via FP16 compute, and the result was higher-resolution textures on the PS5 with the same storage footprint as the PS4 textures.

There's an entire thread on this.

 
Last edited:

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Uncompressed still looks way better. Hopefully it remains an option in the future, at least for PC.
Are you looking at the right pictures? The difference is tiny and wouldn't be noticeable in a game.
 

nikos

Member
Are you looking at the right pictures? The difference is tiny and wouldn't be noticeable in a game.
Look how zoomed in it is. You would never see the difference. This has big repercussions for VR, where unlike most games you can typically put your face really close to objects, grabbing items and holding them inches away.

Yeah, the difference may not actually be noticeable in-game. Something like a ground or wall texture may be more noticeable than the page of a book, but the tech could get to a point where it's either not noticeable, negligible, or maybe even better with NVIDIA's wizardry.
 

LiquidMetal14

hide your water-based mammals
I really love this stuff. I'm sure glad we get to see advancements like this, which make things more optimized and extend our hardware further.
 

Buggy Loop

Member
Pretty sure Sony Santa Monica were doing something similar with God of War Ragnarok on PS5: they used neural texture up-sampling via FP16 compute, and the result was higher-resolution textures on the PS5 with the same storage footprint as the PS4 textures.

There's an entire thread on this.


Neat, hadn't seen that one before, must have skipped that thread. Without knowing the compression bitrate it's hard to compare, but ultimately they still relied on the old BC 4×4-pixel blocks in the end and had a 9ms computational cost (which they could afford for that game). The technique seems to have problems, but they preferred taking the computational hit on the GPU to relieve VGPR pressure. And it's comparing the PS4 vs PS5 versions, with no idea about the reference sample. But Nvidia's 16x texel rate for the same footprint as BCx at ~1ms cost is, as far as I know, unheard of.

No doubt many devs will delve into ML over the coming years, so it'll be interesting to see different solutions. All consoles have ML capability now; it's a matter of time.
 

zeroluck

Member
Pretty sure Sony Santa Monica were doing something similar with God of War Ragnarok on PS5: they used neural texture up-sampling via FP16 compute, and the result was higher-resolution textures on the PS5 with the same storage footprint as the PS4 textures.

There's an entire thread on this.

Like I said in that thread, this only saves disk space; the textures take the same amount of memory on the GPU. Nvidia's neural compression is entirely different: it compresses a material (multiple textures) into a neural network (a very small footprint on the GPU) and decompresses it using tensor cores when sampling it in a shader.
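Conceptually it's something like this sketch: a small latent grid plus a tiny MLP stand in for the full-resolution maps, and each sample is decoded on demand. All the sizes and the untrained network here are made up for illustration; this is the shape of the idea, not Nvidia's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, for illustration only. A coarse grid of latent
# features replaces the full-resolution texture data; a tiny MLP turns a
# sampled latent vector into all material channels at once.
GRID, LATENT, HIDDEN = 256, 8, 32
CHANNELS = 9  # e.g. albedo (3) + normal (3) + roughness/metalness/AO

latents = rng.standard_normal((GRID, GRID, LATENT)).astype(np.float32)
w1 = rng.standard_normal((LATENT, HIDDEN)).astype(np.float32)
w2 = rng.standard_normal((HIDDEN, CHANNELS)).astype(np.float32)

def sample_material(u: float, v: float) -> np.ndarray:
    """Random access: decode one sample's worth of material on demand."""
    # Nearest-neighbour fetch from the latent grid (a real system would
    # filter between texels and combine multiple resolution levels).
    x = min(int(u * GRID), GRID - 1)
    y = min(int(v * GRID), GRID - 1)
    z = latents[y, x]
    # The tiny MLP below is the part tensor cores would run in-shader.
    h = np.maximum(z @ w1, 0.0)  # ReLU hidden layer
    return h @ w2                # every material channel in one inference

print(sample_material(0.5, 0.25))  # 9 channel values for that sample
```

In the real technique the latents and weights are optimized per material to reconstruct the source textures, which is where the compression comes from: the grid plus the weights are far smaller than the maps they replace.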
 
Last edited:

Buggy Loop

Member
At 40x - 50x the size, I'd expect no less.

Then we have threads like these...


Pure raw uncompressed assets would make game install sizes skyrocket. I don't know, 1TB if not more per game? A lot more, I would assume. There are no games with raw texturing that I'm aware of.
 
Last edited:

GymWolf

Member
Not sure if I give a fuck about making textures look worse to save space, tbh.

I know it's not gonna be feasible to have 8K textures in the future, but still...
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Uncompressed still looks way better. Hopefully it remains an option in the future, at least for PC.

You're missing the big picture. Even if the uncompressed version looks better, this way you can fit more textures at a higher quality into the same amount of graphics RAM than you can today. So when you zoom in on any single texture you'll notice a slight loss in quality, but as a whole the game will look much better.
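Back-of-envelope on what that buys you, with assumed bits-per-texel figures (BC7's 8 bpp is real; the neural rate is a placeholder, since actual rates depend on the quality setting):

```python
# How many 4K textures fit in a fixed VRAM budget at a given rate?
VRAM_BUDGET_GB = 2            # slice of VRAM reserved for textures
TEXELS = 4096 * 4096          # one 4K texture

def textures_that_fit(bits_per_texel: float) -> int:
    budget_bits = VRAM_BUDGET_GB * (1024 ** 3) * 8
    return int(budget_bits // (TEXELS * bits_per_texel))

print(textures_that_fit(8.0))  # BC7, 1 byte/texel -> 128 textures
print(textures_that_fit(0.5))  # placeholder neural rate -> 2048 textures
```

Same budget, an order of magnitude more (or higher-resolution) textures.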
 

GymWolf

Member
You're missing the big picture. Even if the uncompressed version looks better, this way you can fit more textures at a higher quality into the same amount of graphics RAM than you can today. So when you zoom in on any single texture you'll notice a slight loss in quality, but as a whole the game will look much better.
So better average quality but lower highs basically?
 
Last edited:

Neilg

Member
The performance overhead for this is huge. It's not being used until there are dedicated chips to handle it.
Somewhat of an issue when Nvidia can add those to GeForce cards, but AMD would have to invent their own method to get this into consoles. We're a long way off from this transforming games in any meaningful way.
 

SABRE220

Member
So better average quality but lower highs basically?
I think it's not a downgrade at all. Currently, textures are heavily compressed compared to the original; this would allow textures to be much closer to the original while using the same space as the compressed texture. E.g. if the original is 1.0 in quality, the neural version would be ~0.7 and the standard method ~0.3 at equal sizes.
 
Last edited:

GymWolf

Member
I think it's not a downgrade at all. Currently, textures are heavily compressed compared to the original; this would allow textures to be much closer to the original while using the same space as the compressed texture. E.g. if the original is 1.0 in quality, the neural version would be ~0.7 and the standard method ~0.3 at equal sizes.
I was confused because I thought the reference was actually used in-game and this was a downgrade from that.
 

DonkeyPunchJr

World’s Biggest Weeb
I think it's not a downgrade at all. Currently, textures are heavily compressed compared to the original; this would allow textures to be much closer to the original while using the same space as the compressed texture. E.g. if the original is 1.0 in quality, the neural version would be ~0.7 and the standard method ~0.3 at equal sizes.
So basically it allows a much higher quality texture while being roughly the same size as an ugly highly-compressed texture?
 
So better average quality but lower highs basically?
I think we will get the same low/med/high settings for textures, but now the AI decides how to compress and decompress them rather than it being done manually by devs. Should improve IQ and/or VRAM usage. Great news if games can take advantage of it.
 

SF Kosmo

Al Jazeera Special Reporter
I saw it before. Nvidia has a lot of great tech, but it's not coming very soon.
This is always the issue. New tech comes out and we get support (often watered down) from a handful of titles in less-than-ideal ways, and it isn't until five years later, when most of the market has the needed hardware, that we see games really leverage it (often with a chorus of complaints from people with 6-year-old cards).
 

nemiroff

Gold Member
Uncompressed still looks way better. Hopefully it remains an option in the future, at least for PC.

A compressed texture can have all of the details of the uncompressed reference texture and more (by making the "dpi" larger) and still be smaller. It's the compression ratio and decompression time (together with artist and tech requirements) that are important here (thus the comparison to the old technique). I guess they forgot to explain that context properly.
 
Last edited:

Buggy Loop

Member
You are right, but unfortunately RT uses a fugging load of VRAM, so cheaper GPUs will be VRAM constrained anyway.

And frame gen also gobbles up VRAM. To the guy above saying Nvidia is doing everything to keep selling us low-VRAM cards: it's mostly been the opposite so far, most of their tech eats way more VRAM than native-res rasterization with native frames.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Uncompressed still looks way better. Hopefully it remains an option in the future, at least for PC.
What games have you played with uncompressed textures? A single 4K texture can easily be a gig; even with 24GB in the 4090 you'd fill that VRAM in two seconds.
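Rough arithmetic behind that figure, assuming an RGBA8 layout with a full mip chain and counting all the maps in one material (the numbers are illustrative, not from any shipped game):

```python
# One uncompressed 4K RGBA8 texture, then a whole material's worth of maps.
base = 4096 * 4096 * 4              # 64 MiB, RGBA8, no mips
with_mips = base * 4 // 3           # full mip chain adds ~1/3 on top
material_maps = 8                   # albedo, normal, roughness, AO, ...
total = with_mips * material_maps
print(total / 2**20)                # ~683 MiB for a single material
```

So "a gig" is the right ballpark once you count every map in a material, and a scene has hundreds of materials.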
 

Buggy Loop

Member
This isn't about shrinking texture size.
It's about increasing quality within the same texture size.

So the savings are actually minimal in terms of space.

But if you only aimed to match the quality of BCx, you'd definitely save on space too. I doubt this is as low as they can go, nor as high; they just took the same footprint as a reference point. It can go either way.
 
Last edited:
So better average quality but lower highs basically?
Not even just that, it's a small, small minority of people who will ever even notice, either by:

a) knowing what the raw uncompressed texture looks like,

or

b) stopping to inspect the textures closely enough.

The vast majority will just enjoy the game, and the tech will continue to get better; compressed textures in 2033 will look better than raw textures do today.
 
Last edited: