Add DLSS 2.0 support #2239
According to Wikipedia, DLSS 2.0 works as follows:[14]
DLSS 2.0 cannot be added to the game engine (or any game engine, really). It has to be implemented on a game-by-game basis and is not general enough to work at the engine level. Maybe DLSS 3.0 or later, when the AI gets more general, or efficient enough that the training for a game can take place on local hardware instead of needing very powerful servers.
I am sorry to say, but DLSS 2.0 is not per-game; it's per-engine. See: https://developer.nvidia.com/dlss
OK, my bad. I was under the impression that DLSS 2.0 was still not general enough and that it too needed per-game training.
DLSS is a proprietary library; we can't integrate it officially in Godot. Since this feature must be implemented as a third-party effort, I'll close this as godot-proposals is only meant to discuss features to be added to core Godot.
When (and only when) AMD makes this solution public, it should be discussed in a separate proposal. We'll need to see if it's actually open-source-friendly, because GPUOpen solutions have historically not always been open source (see RadeonRays 2.0 vs 4.0).
Add DLSS now! It's already open source: https://github.com/NVIDIA/DLSS
DLSS is still under a proprietary EULA: https://github.com/NVIDIA/DLSS/blob/main/LICENSE.txt

Only NIS (NVIDIA's spatial upscaler, equivalent to FSR 1.0) is open source. Since FSR 1.0 is already present in 4.0.beta, I don't see much point in adding NIS. In general, FSR 1.0 is considered to look slightly better in practice while also being faster. NIS goes for a more conservative approach but ends up being more expensive, making it a less "bang for buck" spatial upscaler.

There are already plans to integrate FSR 2.0 in future 4.x releases.

Edit: FSR2 will be available in Godot 4.2: godotengine/godot#81197
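To make concrete what "spatial upscaler" means in this comparison: FSR 1.0 and NIS work from a single frame with no temporal history, which is essentially interpolation plus a sharpening pass. The following NumPy sketch is purely illustrative (a plain bilinear filter, not the actual FSR 1.0/NIS algorithms, which add edge-adaptive reconstruction and sharpening on top):

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Toy single-frame (spatial) upscale of a 2D grayscale image.

    Each output pixel is a weighted average of the four nearest input
    pixels. Real spatial upscalers (FSR 1.0, NIS) build on this idea
    with edge detection and sharpening, but also use no temporal data.
    """
    h, w = img.shape
    out_h, out_w = int(h * scale), int(w * scale)
    # Map output pixel centers back to input coordinates.
    ys = (np.arange(out_h) + 0.5) / scale - 0.5
    xs = (np.arange(out_w) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None]   # vertical blend weights
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
    bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy

lo = np.array([[0.0, 1.0], [1.0, 0.0]])
hi = bilinear_upscale(lo, 2)
print(hi.shape)  # (4, 4)
```

The relevant point for the cost comparison above is that all the work is per-output-pixel on one frame; a temporal solution like FSR 2.0 or DLSS 2 additionally consumes history buffers and motion vectors.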
https://github.com/NVIDIAGameWorks/Streamline They have an open-source wrapper for both XeSS and DLSS, and it's MIT-licensed, so it should be compatible. Obviously, developers will still need to include the proprietary blobs to actually enable those features, but this makes it possible to have almost ready-to-use support while keeping the blobs out of the engine.
It's better than nothing, and the DLSS-G plugin can be added by a game developer without help from the engine, though that may not be optimal.
As far as I know, all Linux DLSS-enabled titles currently run through Proton, using an open-source NVAPI implementation to pass the data through to the Linux-native DLSS binaries. I understand this limitation and the wish to have native Linux ports of games, but if this Wine passthrough could help more gamers leave Windows, I still think it's better to have it than not.
That's a valid issue. But since Streamline isn't changing a lot, maybe it could be kept as a separate blob build, especially given its Windows-only situation for now.
It doesn't support FSR2 for reasons, and I think everyone agrees we don't want a Starfield situation happening to game developers. I hope AMD lets it support FSR2 so there are no more complaints. And adding Streamline support on top of an FSR2 implementation is quite straightforward.
That's on the game developer. You don't need to inform NVIDIA or Intel to use Streamline. Since developers need to get the DLSS/XeSS SDK blobs from them anyway, they should follow whatever EULA came with those files. I'm not making this a higher priority than it should be (there's much more important work to be done), just noting that we may have to deal with this, since FSR will always lag behind by quite some margin until AMD finally decides to put some CDNA genes into RDNA to properly run a better kernel. And by that time, nobody can guarantee AMD will make that FSR version open source. Plus, this should be lower-hanging fruit since we already have FSR2 support. Making developers do the modders' job isn't ideal anyway.
Oh where do I start...
DLSS is not even close to being low-hanging fruit, nor a priority, given it's only for Nvidia users, and we already have good-enough upscaling tech, thanks to AMD, that also works for Nvidia users. And before somebody tries to call me an AMD shill: I've run nothing but Nvidia for the last 22 years of my life, and only about two months ago did I switch to AMD for the first time, as I was not keen on getting screwed over by Nvidia's insane price premium for a significantly worse product (240€ for an RX 6600 vs 300€ for an RTX 3050 in my region at the time of purchase).
@mrjustaguy And no, FSR2 is not "good enough", as it's clearly worse in quality compared to XeSS/MetalFX or DLSS. High-quality XeSS/DLSS means those cards can drop the render scale to get better performance.
That's even better: we could replace the FSR implementation with Streamline and just use FSR through Streamline then.
I'm not referring to Starfield's poor quality in general. I'm talking about PC gamers being angry about it only supporting FSR2. It's clear AMD blocked DLSS in some games, and this makes Starfield look super suspicious. And day-one modded-in DLSS/XeSS support makes it look even worse.
It is, but the same goes for FSR2. FSR2 is still unpredictable. All TAAU solutions are a QA nightmare, but we are not some AAA studio that has to put every single story through the same QA procedure to make shareholders happy. I think we're being pragmatic here instead of theoretically fixing everything.
It's quite simple: we don't. DLSS/XeSS bugs are shared across all games, including big releases. If they can live with it, there's no reason we need to be nitpicking. Our audience wants it, and it brings a meaningful improvement, so why should we block it just because it's not bug-free? Streamline benefits Intel and NVIDIA users.

And down the line, you'll need DLSS to fully utilize the next Switch. It will happen sooner or later. You cannot say FSR2 is good enough for a low-power device that needs the advantage of DLSS.

And open-sourcing DLSS isn't helping anyone: you'd still have to rely on the ML model blobs. Even if NVIDIA provided the full training code and dataset for that model, you don't have a world-top-20 supercomputer to run the training anyway. So a model blob versus a model plus some wrapper blob isn't a huge difference, especially when we have an open-source, MIT-licensed wrapper around that blob (plus the Intel XeSS blob). Even if NVIDIA opened the vendor lock, the fact that AMD has nothing more than 2x FP32 throughput would still limit its quality. It would not help AMD users anyway, and Intel already has similar-quality XeSS, so there's nothing for Intel users either.

Streamline has support for everyone already, and it's not like we'd be implementing DLSS and blocking XeSS. The problem is you never looked into the NVIDIA and Intel solutions. FSR2's quality mode has more artifacts than DLSS/XeSS's performance mode. It's "good enough" only for AMD users, because that is likely as good as it gets for them. But Intel and NVIDIA users deserve a better option, since this is under-utilizing their hardware.
If DLSS is known to be buggy, why integrate it?
It definitely brings more benefits than problems. Plus, this is not only about DLSS: you get XeSS for free from integrating Streamline. FSR is also buggy, yet it still got integrated. Not having this under-utilizes Intel and NVIDIA hardware.
I'm sorry, but devs supporting FSR and not DLSS is absolutely no evidence that DLSS was blocked, and that's the ONLY evidence I've seen for it, while plenty of evidence has piled up against those accusations. Look at it from their perspective: you have to QA only one tech instead of two, so go with the one more people can use. It needs to be open source with a license compatible with Godot's to be officially integrated, so yes, open source with a compatible license is very important. And just because AMD users (don't forget Intel here, too; they don't have specialized/optimized hardware for DLSS either) wouldn't benefit from DLSS now doesn't mean they couldn't get proper hardware acceleration in the future if it were open-sourced.
Streamline is MIT-licensed, so it is definitely compatible. Intel does have dedicated hardware for XeSS: better int8 throughput, and XMX units on discrete GPUs. As for the QA thing: all machine-learning stuff has this problem, but you cannot avoid it forever. It brings more benefits than trouble, and it should be up to game developers to decide, not the engine. Fortnite added DLSS back because they knew that, even with the QA cost, they needed it to meet their audience's expectations. FSR2 runs on NVIDIA and Intel, but nobody wants that when they have better solutions that utilize their hardware for better quality and performance. It's like saying you're building your game for PC but only shipping an Android emulator. It's unfair treatment. By the way, GPUs are full of firmware blobs, and there's no way to avoid that. Under-utilization in the name of open source is a bad excuse, especially when the vendor provides a universal wrapper to work around the license. You can't say "we only support open-source drivers" and crash for NVIDIA users on Linux.
Streamline is MIT, but none of the tech we'd need it for (that we haven't already implemented) is compatible; that's the problem. RDNA3 afaik has int8 too, but I'm not sure how that's relevant. I mean, everything can be awful for QA; the problem with DLSS (maybe XeSS too, though I've never heard of this type of issue with it yet) is that every single version "upgrade" often comes with a massive downgrade in some other area, which is a well-known issue, and why TechPowerUp's DLSS DLL library is so useful for tinkering: you swap out which version you're using, going to older (or newer) versions for a game until you find one that works best. I know, hence temporarily removed, but nonetheless removed due to QA; plenty of devs will just say "no thanks" to all that work. DLSS, XeSS, and FSR2 all have more or less identical performance (especially DLSS and FSR2), and better quality depends. They're called drivers. Anything that isn't optimized is under-utilized, and Godot does make intentional decisions that are "unoptimized" in the name of open source. How does supporting only FSR, which everyone supports, while not being able to support DLSS, equate to making the experience unusable for an Nvidia user?
@mrjustaguy This is not hypothetical; it's real, and since you never really looked into those techniques, you don't believe this is what's happening. DLSS/XeSS is a must-have feature, not something you can argue about. It's basically unfair that AMD's lack of such hardware is dragging down NVIDIA and Intel users, especially those with weak GPUs that could really benefit from these features. 1080p -> 4K is at least 2x performance for NVIDIA and Intel users; with FSR2 you have to start from a much higher base resolution, and sometimes that's still a pixelated mess. The same performance for a different result is already different performance. It's always up to the developer to choose whether they want to enable these features (and include the blobs) or not; you cannot make that decision for them. And please stop open-source-gating Streamline: Godot supports DX12, and that's not open source anyway.
PS: If your mindset is that letting some (in fact, the majority of) PC players use a lower resolution while under-utilizing their hardware is fair, then I don't know what keeps you working on open-source projects. Just tell everyone to drop their resolution and call it a day; most people will not notice 720p upscaled to 4K anyway, right? I hate to say this, but if any reasonable feature proposal has to become a drama before getting accepted, then it's a huge waste of time for all of us. As I'm running a 4090, this feature will absolutely not benefit me from an end-user perspective. But hey, we are making games for others, so why would we intentionally leave the NVIDIA and Intel audience un-optimized to match AMD?
I happen to have been testing the FSR2 integration in Godot, and I can say right away that I'd rather play at 720p FSR2-to-1440p than 1080p bilinear-to-1440p, so it's clearly better than just dropping native res, though I'd only ever use DLSS/FSR/XeSS/TSR/TAAU as a last resort anyhow. Your assumption is wrong; I've been following these techniques since they came out. Also, how am I fooled by AMD's marketing if I stated they support something that they do, and it's just not as good as the competition, which I never claimed it was? I can argue that DLSS/XeSS is not a must-have feature; the argument is "to hell with upscaling, optimize your games". Yes, DX12 is a requirement for platform support (console), while DLSS and XeSS are not a requirement for ANYTHING. I just stated a fact: my mindset is "to hell with any upscaling", but the point I made is that plenty of people just don't care, as it's good enough to successfully fool them. Also, you are ALWAYS under-utilizing your hardware, and there are always tradeoffs to be made between things like simplicity, performance, quality, flexibility, size, compute usage, memory usage, and maintainability. You cannot get the best of everything. You started dramatizing this whole thing, really; most of the time the discussion is kept civil, though some proposals can be controversial. Optimization is a relative term: what are you optimizing for? Godot is optimizing for ease of use, maintainability, and a highly portable design over performance, unless the performance difference is massive, which here it isn't.
@mrjustaguy I still hate to call FSR2/DLSS "upscaling", since they aren't really doing any upscaling anyway. I'm not saying anyone should cover their under-optimization with DLSS; usually those are just excuses, as DLSS can't help them with frame spikes. You have to have a stable game to begin with to benefit from these techniques, and when they're making up excuses, it doesn't matter what's involved in those claims. FSR2 is just like a variant of DLSS with lower quality, but that quality difference means DLSS/XeSS are no longer an image-quality tradeoff. Plus, this decision puts game developers in danger of being called "AMD shills", since it quite clearly benefits AMD by under-optimizing for Intel and NVIDIA. As always, I'm not saying this feature should be there next week. It should be put in the backlog and picked up sometime down the line, maybe early next year or so.
You said it's barely better than not having it; that's what I commented on. Upscaling is exactly what it's doing if you're not running native, by definition, as it's adding pixels that didn't exist in the image before. And if it were free 2x performance, there'd be no reason not to use it, and you wouldn't care which one you're using; but it's not, as image quality suffers the lower you go. Upscaling shouldn't be used for that, totally agree, but excuses matter if they're legit. For example, if I'm making a game on an iGPU that has no RT support, my excuse for not making a AAA blockbuster path-traced masterpiece is that I don't have the ability to make it. I can't say I understand what you're trying to say here. I don't see any game dev being called an Nvidia shill because they only support DLSS; I do, however, hear it a ton about AMD, and occasionally even Intel, which I personally find a bit funny. And besides, smaller devs have different expectations versus AAA studios that have the resources to do this, or versus cases where they don't support something their engine does, as was the case with Star Wars.
Wait a second. They are not magically generating pixels, and with a good enough frame rate the result can be quite good, or even better than native, because more pixels get reused and it effectively becomes supersampling.
Yeah, and that still fits the definition of upscaling. I know it's not magic, though DLSS 1 was just generating pixels based on input data: if I remember right, it didn't take temporal data, and just had to be trained on all the scenes at ultra-high res, which is why it had such insanely horrible artifacts; it was just AI doing educated guesswork. Though I may be wrong about this, as I didn't bother digging into DLSS 1 when it was indisputably garbage, worse than simple interpolation and sharpening (essentially NIS and FSR1).
You are absolutely correct about DLSS 1, as far as I know. I still don't think it's upscaling when you have more than the native number of pixels to work with. Those pixels got wasted in traditional rendering, and reusing them is just a clever way to render; such innovation has been happening throughout the 3D gaming era.
By dictionary definition it is upscaling, but "reconstruction" works too.
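The sample-reuse point being debated above can be made concrete with a toy sketch: render several low-resolution frames of a static scene, each with a different subpixel jitter, and accumulate them into a high-resolution history buffer. This shows only the accumulation idea behind TAAU-style reconstruction; real TAAU/FSR 2/DLSS 2 additionally reproject history with motion vectors and reject stale samples (and DLSS weights the blend with an ML model). All names and numbers here are illustrative.

```python
import numpy as np

# Toy "scene": a 1D high-res signal standing in for ground truth.
HI_RES = 256
FACTOR = 4                      # render at 1/4 resolution
scene = np.sin(np.linspace(0.0, 8.0 * np.pi, HI_RES))

def render_lowres(jitter):
    """Render one low-res frame: sample the scene at low-res pixel
    centers shifted by a subpixel jitter (in low-res pixel units)."""
    n = HI_RES // FACTOR
    positions = (np.arange(n) + 0.5 + jitter) * FACTOR
    idx = np.clip(positions.astype(int), 0, HI_RES - 1)
    return idx, scene[idx]

# Accumulate four jittered frames into a high-res history buffer.
accum = np.zeros(HI_RES)
hits = np.zeros(HI_RES)
for jitter in (-0.375, -0.125, 0.125, 0.375):
    idx, samples = render_lowres(jitter)
    accum[idx] += samples
    hits[idx] += 1

reconstructed = accum / np.maximum(hits, 1)

# With 4 distinct jitters at 4x scale, every high-res position gets
# covered, and (since the scene is static) the history reconstructs
# the full-res signal exactly. Motion is what makes the real problem
# hard, hence motion vectors and history rejection.
print(int((hits > 0).sum()), bool(np.allclose(reconstructed, scene)))  # 256 True
```

In this static toy the jittered history recovers every full-resolution sample, which is why the thread's "reconstruction / effectively supersampling" framing is defensible even though, by output-pixel count, the result is still an upscale.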
Anyway, I still think Streamline is worth consideration, as it really could help low-end NVIDIA users. Reconstruction sometimes gives you a quality boost compared to native, especially when you have to enable TAA. This would pave the way for next-Switch enablement and let Intel iGPU users play Godot engine games at much higher performance (expecting 2x, with a quality improvement). Since native Metal support is on the roadmap, MetalFX is obviously coming with that, or even before, if MoltenVK somehow supports it. I think bringing in NVIDIA and Intel optimizations through an open-source wrapper is pretty much needed.
Hello, I'm interested in this proposal and wonder what its current status is. I noticed that in SDK v3.5, support for Linux kernel 2.6.32 and newer is mentioned (https://developer.nvidia.com/rtx/dlss/get-started#sdk-requirements). Does this mean that Godot will be able to use DLSS on Linux platforms soon? Thank you for your work and attention.
As mentioned above, DLSS 2.x and later has had native Linux support for a long time (even if no Linux-native games currently support DLSS 2). It's also possible to use DLSS on Linux via Proton, thanks to Proton's NVAPI support. The issue is that we can't integrate proprietary libraries in core, so it must be a community extension or engine fork. Right now, this will likely require an engine fork until more of the rendering engine is exposed to extensions. The engine fork can distribute precompiled editor and export template binaries on relevant platforms (Windows/Linux) to make it easier to use. Similar concerns apply to XeSS, although it doesn't even have native Linux libraries available (it can still be used in Proton). Developing community extensions or engine forks doesn't require going through the proposal process; someone can already start working on it on their own 🙂
With the addition of The Compositor in 4.3 would it be viable to create extensions now instead of an engine fork or would the compositor still need to be expanded in the future to support this? |
There is no reliable way exposed to properly control camera jitter outside of core at the moment.
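For reference, the camera jitter in question is a per-frame subpixel offset applied to the projection matrix so that successive frames sample different subpixel positions. A common choice in TAA-style pipelines is a low-discrepancy Halton (2, 3) sequence; whether any given upscaler requires exactly this sequence is not established in this thread, so treat the following as an illustrative sketch with hypothetical names:

```python
def halton(index, base):
    """Radical-inverse (Halton) value in [0, 1) for a 1-based index."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def camera_jitter(frame, period=8):
    """Subpixel jitter in [-0.5, 0.5) for a given frame, cycling every
    `period` frames. In an engine, these offsets (divided by the render
    resolution) would be added to the projection matrix each frame,
    which is exactly the hook that is not yet exposed outside of core.
    """
    i = (frame % period) + 1          # Halton is conventionally 1-based
    return (halton(i, 2) - 0.5, halton(i, 3) - 0.5)

offsets = [camera_jitter(f) for f in range(8)]
print(offsets[0])   # frame 0: x component is halton(1, 2) - 0.5 = 0.0
```

The upscaler also needs the jitter value it was rendered with passed back in each frame, which is why "just an extension" is hard without core exposing this control.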
Describe the project you are working on
I am working on a 3D Horror Game World with 2K textures
Describe the problem or limitation you are having in your project
As the GPU market shifts toward AI up- and downscaling, games will be more likely to have DLSS in the future. So, first, why should everyone else have it but not Godot? Second, it gives pretty much every game a major performance boost. Third, even AMD will soon announce their FidelityFX driver, which will have their so-called Super Resolution.
See: https://techengage.com/best-gaming-graphics-cards-gpus/
Describe the feature / enhancement and how it helps to overcome the problem or limitation
DLSS helps games with bad performance on higher settings in an immense way. It would overcome some of the limitations developers face when considering players' hardware.
Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams
I personally don't know. NVIDIA, for example, has a DLSS beta program, or you need to consult an NVIDIA developer to work it out.
If this enhancement will not be used often, can it be worked around with a few lines of script?
I don't think it can be worked around, as it comes from the companies themselves.
Is there a reason why this should be core and not an add-on in the asset library?
You could see DLSS as a rendering enhancement, and I don't think changing the rendering via an add-on is such a good idea. I don't know how big the program itself is, so if it is too big for Godot (in terms of download size), then it could be added the same way as the Android build template.
Thanks for reading :)