
NVIDIA DLSS Source Code Leaked

winjer

Gold Member



The mother of all cyberattacks hit NVIDIA over the weekend, exposing critical driver source code, the ability to disable LHR for mining, and even insights into future NVIDIA hardware, such as the Blackwell architecture. An anonymous tipster sent us this screenshot showing a list of files they claim are the source code of DLSS.

The code includes C++ files, headers, and assets that make up DLSS. There is also a super-convenient "Programming Guide" document to help developers make sense of the code and build correctly. The tipsters who sent this screenshot are examining the code to see the inner workings of DLSS, and whether there's any secret sauce. This leak could hold the key for the open-source Linux driver community to bring DLSS to the platform, or even for other GPU makers to learn from its design.

If this is true, and these hackers release the source code, it could mean DLSS being hacked into a lot of things.

I doubt other companies will touch it, for fear of a lawsuit. But the community might do a lot with this.

EDIT: The version leaked is for DLSS 2.2
 
Last edited:

Teslerum

Member





If this is true, and these hackers release the source code, it could mean DLSS being hacked into a lot of things.

I doubt other companies will touch it, for fear of a lawsuit. But the community might do a lot with this.
Even the community can't. The use of source code would mean an immediate C&D, if not lawsuits up the ass.

This is so stupid.
 

lh032

I cry about Xbox and hate PlayStation.
AMD

 

Majukun

Member





If this is true, and these hackers release the source code, it could mean DLSS being hacked into a lot of things.

I doubt other companies will touch, for fear of a lawsuit. But the community might do a lot with this.
nobody in his right mind will touch those files and info with a ten foot pole...unless they wanna be sued to death
that includes the community
 

winjer

Gold Member

AMD can't do that without risking a huge lawsuit.
I bet they are telling their employees not to get anywhere near this leak, so as not to jeopardize the company.

On the other hand, I wouldn't be surprised if Chinese companies were to copy DLSS outright.

nobody in his right mind will touch those files and info with a ten foot pole...unless they wanna be sued to death
that includes the community

Plenty of people are willing to risk it. And a lot of people won't advertise what they are doing.
Besides, there are many countries where patent law is not well enforced, or where US law carries no weight.
 
Last edited:

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I wonder if this can be injected into VR games for better performance in titles like Half-Life: Alyx.
 

FStubbs

Member
Yeah Chinese companies will copy it wholesale. They may even have been behind the hack.

AMD already has their own tech.
 
Last edited:

zeomax

Member
If this is true, and these hackers release the source code, it could mean DLSS being hacked into a lot of things.

The DLSS SDK is available for free, and you can already implement it wherever you want.

 


JCK75

Member
I have a 1080 Ti; it seems like AMD is killing it, because I can use their solution but not the one from the company that actually made my GPU.
 

winjer

Gold Member
OK. But if I were a developer, I want to use the Tensor cores to free up bandwidth.

Tensor cores are also heavily dependent on memory bandwidth. A great deal is being done with memory caches to improve AI performance.
The difference is that tensor cores are faster than DP4A.
Regardless, DP4A is still very fast. Here is an excerpt from nVidia's presentation on it.

Floating point numbers combine high dynamic range with high precision, but there are also cases where dynamic range is not necessary, so that integers may do the job. There are even applications where the data being processed has low precision so very low-precision storage (such as C short or char/byte types) can be used.

Figure 2: New DP4A and DP2A instructions in Tesla P4 and P40 GPUs provide fast 2- and 4-way 8-bit/16-bit integer vector dot products with 32-bit integer accumulation.
For such applications, the latest Pascal GPUs (GP102, GP104, and GP106) introduce new 8-bit integer 4-element vector dot product (DP4A) and 16-bit 2-element vector dot product (DP2A) instructions. DP4A performs the vector dot product between two 4-element vectors A and B (each comprising 4 single-byte values stored in a 32-bit word), storing the result in a 32-bit integer, and adding it to a third argument C, also a 32-bit integer. See Figure 2 for a diagram. DP2A is a similar instruction in which A is a 2-element vector of 16-bit values and B is a 4-element vector of 8-bit values, and different flavors of DP2A select either the high or low pair of bytes for the 2-way dot product. These flexible instructions are useful for linear algebraic computations such as matrix multiplies and convolutions. They are particularly powerful for implementing 8-bit integer convolutions for deep learning inference, common in the deployment of deep neural networks used for image classification and object detection. Figure 3 shows the improved power efficiency achieved on a Tesla P4 GPU using INT8 convolution on AlexNet.

Figure 3: Using INT8 computation on the Tesla P4 for deep learning inference provides a very large improvement in power efficiency for image recognition using AlexNet and other deep neural networks, when compared to FP32 on previous generation Tesla M4 GPUs. Efficiency of this computation on Tesla P4 is up to 8x more efficient than an Arria10 FPGA, and up to 40x more efficient than an Intel Xeon CPU. (AlexNet, batch size = 128, CPU: Intel E5-2690v4 using Intel MKL 2017, FPGA is Arria10-115. 1x M4/P4 in node, P4 board power at 56W, P4 GPU power at 36W, M4 board power at 57W, M4 GPU power at 39W, Perf/W chart using GPU power.)
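The DP4A semantics described in the excerpt can be sketched in plain Python. This is only an illustration of what the instruction computes, not NVIDIA code and not the actual CUDA intrinsic: a 4-way dot product of signed 8-bit values with 32-bit integer accumulation.

```python
# Illustrative sketch (not NVIDIA code): emulate the semantics of the
# DP4A instruction described above -- a 4-way dot product of two vectors
# of signed 8-bit integers, added to a 32-bit integer accumulator.

def dp4a(a_bytes, b_bytes, c):
    """Emulate DP4A: dot(a, b) + c, where a and b are four signed bytes
    (conceptually packed into one 32-bit word) and c is a 32-bit int."""
    assert len(a_bytes) == 4 and len(b_bytes) == 4
    acc = c
    for a, b in zip(a_bytes, b_bytes):
        # each element is a signed 8-bit value in [-128, 127]
        assert -128 <= a <= 127 and -128 <= b <= 127
        acc += a * b
    # wrap to signed 32 bits, as the hardware accumulator would
    return (acc + 2**31) % 2**32 - 2**31

# Example: one output element of an INT8 convolution
print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 10))  # 1*5 + 2*6 + 3*7 + 4*8 + 10 = 80
```

On real hardware this entire loop is a single instruction, which is where the power-efficiency gains for INT8 inference in Figure 3 come from.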
 
Last edited:

Rudius

Member
What is the possibility of modders getting this to work on older GPUs lacking the tensor cores?
 

Tarnpanzer

Member
AMD can't do that without risking a huge lawsuit.
I bet they are telling their employees not to get even close to this leak, as not to jeopardize the company.

Of course they can't use anything out of this leak for their own work. But I bet they will indirectly look into it, to get some "impressions" for their own efforts in this regard.
 

winjer

Gold Member
Of course they can't use anything out of this leak for their own work. But I bet they will indirectly look into it, to get some "impressions" for their own efforts in this regard.

One of the fundamental issues when dealing with legal matters like patent or copyright infringement is "access".
If AMD accesses this leak, it's pretty much an instant loss in any potential lawsuit between nVidia and AMD.
 

Rentahamster

Rodent Whores
How impactful is this, really? For any company to actually utilize this to their own advantage, wouldn't they also need access to Nvidia's internal hardware too? It's sort of like Twitch's source code being leaked. Twitch as an entity is so much more than just code, which is why you didn't see clones popping up left and right after their leak.
 

IntentionalPun

Ask me about my wife's perfect butthole
Of course they can't use anything out of this leak for their own work. But I bet they will indirectly look into it, to get some "impressions" for their own efforts in this regard.
Being caught looking into it at all would be a big no-no.

So unless some engineer does it on their own time, never mentions it, and then utilizes that knowledge to improve something, and is never caught... AMD won't be using it.

The above scenario isn't impossible, but it's still unlikely IMO. Keep in mind that the code most likely relies on loads of patented hardware tech, too.
 

IntentionalPun

Ask me about my wife's perfect butthole
How impactful is this, really? For any company to actually utilize this to their own advantage, wouldn't they also need access to Nvidia's internal hardware too? It's sort of like Twitch's source code being leaked. Twitch as an entity is so much more than just code, which is why you didn't see clones popping up left and right after their leak.
The only "Server side" aspects would be the building of the data models that are then used by DLSS itself.

It's definitely an important aspect of what DLSS is, but it's not quite the same as something like Twitch. It's not really "internal hardware" but the work of data scientists.

And it's possible information about the server side "training" of the data model used by DLSS could be included in this leak.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
PS5 and XSX saved.

After playing HFW in checkerboarding, I can't wait for something like DLSS on consoles.
 

Rentahamster

Rodent Whores
The only "Server side" aspects would be the building of the data models that are then used by DLSS itself.

It's definitely an important aspect of what DLSS is, but it's not quite the same as something like Twitch. It's not really "internal hardware" but the work of data scientists.

And it's possible information about the server side "training" of the data model used by DLSS could be included in this leak.
Sure, it's not the same, but I was pointing to the similarity in that the source code is just one piece of a very complicated puzzle that forms the overall end product. Competitors might be able to glean a few insights here and there by examining the code, but that's about it.
 

IntentionalPun

Ask me about my wife's perfect butthole
Sure, it's not the same, but I was pointing to the similarity in that the source code is just one piece of a very complicated puzzle that forms the overall end product. Competitors might be able to glean a few insights here and there by examining the code, but that's about it.
In thinking more about this, the data model itself would be a part of the source code.

So just like the source code they’d get a full snapshot of working DLSS.

They just wouldn’t get any improvements in time to the data model but the same can be said of the source code.
 

Rentahamster

Rodent Whores
In thinking more about this, the data model itself would be a part of the source code.

So just like the source code they’d get a full snapshot of working DLSS.

They just wouldn’t get any improvements in time to the data model but the same can be said of the source code.
That's possible, but I don't know enough one way or the other to feel like I can guess what specific things can or can't be possible, since we also don't even know what specific files and documentation are in the leak either.
 

IntentionalPun

Ask me about my wife's perfect butthole
That's possible, but I don't know enough one way or the other to feel like I can guess what specific things can or can't be possible, since we also don't even know what specific files and documentation are in the leak either.
In order for DLSS to execute, the data model has to be there. It would be part of any DLSS DLL or library, for instance; it would likely be checked into the same source control system as the code itself, and it would be part of the build process used by local build tools for devs to make changes and test.

Only DLSS 1.0 used a data model per game; the newer DLSS uses a common data model, so one data model across all games.

It'd be highly strange for that common data model to not be included in the source code.
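A hypothetical sketch of the point above (plain Python, nothing from the actual leak): when trained weights are bundled with the inference code, leaking the source tree leaks the model too, since nothing is fetched from a server at runtime.

```python
# Hypothetical illustration (not DLSS code): an ML library that ships its
# trained weights ("data model") in the same package as the inference
# code. In a real build the weights would be a binary blob checked into
# the repo and embedded into the DLL at build time.
import json

# Stand-in for the bundled weights blob.
WEIGHTS_BLOB = json.dumps({"layer0": [0.5, -0.25], "bias": [0.1]})

def load_model(blob):
    """Parse the bundled weights; no server round-trip at runtime."""
    return json.loads(blob)

def infer(model, x):
    # Toy "network": a dot product plus a bias term.
    w = model["layer0"]
    return sum(wi * xi for wi, xi in zip(w, x)) + model["bias"][0]

model = load_model(WEIGHTS_BLOB)
print(infer(model, [2.0, 4.0]))  # 0.5*2 - 0.25*4 + 0.1 = 0.1
```

Because the weights travel with the code, anyone holding the source snapshot holds a working copy of the model as of that snapshot.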
 

.Pennywise

Banned
""""""""""Leaked"""""""""

Imagine someone breaking into your house and taking your belongings. I highly doubt you'd say your stuff got leaked instead of stolen.
 

Rentahamster

Rodent Whores
In order for DLSS to execute, the data model has to be there. It would be part of any DLSS DLL or library, for instance; it would likely be checked into the same source control system as the code itself, and it would be part of the build process used by local build tools for devs to make changes and test.

Only DLSS 1.0 used a data model per game; the newer DLSS uses a common data model, so one data model across all games.

It'd be highly strange for that common data model to not be included in the source code.
OK, but even if so, how would that contribute to a DLSS "clone" popping up anytime soon? I assume the end user would still need Nvidia tensor capable hardware to even make sense of that common data model.
 

Skifi28

Member
""""""""""Leaked"""""""""

Imagine someone breaking into your house and taking your belongings. I highly doubt you'd say your stuff got leaked instead of stolen.
If they make a photocopy to take and leave all my stuff intact, I'm cool with it.
 

intbal

Member
Good job, hackers.

Now go steal the full source code and design documents for the NV2A, so emulation can be perfected.
 
EDIT: The version leaked is for DLSS 2.2

OOOOOOOOH FUUUUUUUUUCK! :messenger_grinning_sweat:
I know people lie a lot on the internet, hope this is true.
But... there's that Intel competitor that was made by the same person and will be open source, right? So I think this leak becomes pointless once that other solution ships publicly.
 
Everyone is going to look at the source code for DLSS, for what little good it will do them without having the intended hardware in place that it was built for (AMD, Intel, etc).
 

M1chl

Currently Gif and Meme Champion
Seems like there is fuck all to it, given the size of those files... But this is probably just the "front end", with most of it implemented in the CUDA SDK.
 