Allow users to determine package resolution strategy during package restore - direct or transitive #5553
Comments
Lowest should be the default, but that's just my opinion. It's the best way to get a "working set of packages". Everyone in the graph gets what they compiled against and things only roll forward minimally if sibling packages need higher versions.
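A minimal illustration of the "lowest applicable version" behavior described above, using hypothetical packages A, B, and X (a sketch, not taken from the thread):

```xml
<!-- App references A and B directly.
     A depends on X >= 1.0.0; B depends on X >= 1.2.0.
     Restore resolves X to 1.2.0: the lowest version that satisfies every
     constraint in the graph, rolling forward only because a sibling (B)
     requires more than A's minimum. -->
<ItemGroup>
  <PackageReference Include="A" Version="1.0.0" />
  <PackageReference Include="B" Version="1.0.0" />
</ItemGroup>
```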
It's more of a "safest as default" strategy, because people want to get a working set of packages by default. In a system where you need binary compatibility between components in order to make things function, things have a better chance of working if you stay closer to what the components were built and tested against.
This is already possible when using packages.config based projects by picking the dependency behavior on install. We don't own NPM; that's the Node package manager, so not sure if you mean something else. To properly do this with PackageReference, a lock file needs to be introduced (which has been discussed anyway). |
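For reference, the packages.config-era knob being referred to here is the dependencyVersion setting (also exposed as the -DependencyVersion parameter of Install-Package in the Package Manager Console). A minimal NuGet.Config sketch:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Applies to packages.config installs only: controls which version of each
         dependency is selected (Lowest, HighestPatch, HighestMinor, Highest). -->
    <add key="dependencyVersion" value="Highest" />
  </config>
</configuration>
```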
I am not going to debate what the default ought to be; I really am not concerned about that. Technically this is NOT a default when you can't configure it otherwise. It's only a default if you can change it. We can't. My company is frustrated that Microsoft will not allow businesses to DECIDE NuGet resolution behavior for themselves. It makes absolutely no sense why Microsoft should have a monopoly on the business policies of the private sector, yet this is in effect what Microsoft does here. Make package resolution configurable via a simple switch. By NPM I mean the NuGet Package Manager in Visual Studio. I don't see why any lock file is required. Just add an additional switch in nuget.exe and the NuGet Package Manager that indicates whether lowest, highest, etc. should be the expected behavior. Packages.config is for NuGet 1 and 2; that's well deprecated by now. This is clearly a business decision that Microsoft currently has constrained to Microsoft to decide. There are many switches in NuGet already; I don't understand why we cannot simply add one more. I fail to understand why it takes over two years for Microsoft to resolve this situation. The current workaround is to explicitly declare EVERY dependency in package references (for NuGet 4) and in project.json (for NuGet 3). This is a clumsy and labor intensive workaround. Just add a darn switch to indicate NuGet's expected resolution behavior so users can manage it themselves. It makes no sense why Microsoft must control this in the first place. Just add a switch. |
It's required with |
I think having the flexibility to choose resolution strategies is just as important as having a deterministic build. I do agree a deterministic build is important, but so is flexibility, which has been in short supply for long enough. There is much debate about what the restore default should be (highest or lowest), but that point is irrelevant: higher or lower should be a choice the business makes, not Microsoft. If I want a predictable build I can always just stick with project.json, check in my project.lock.json file with locked set to true, or just use fixed version numbers (the latter is the most predictable way, but the least flexible and most labor intensive). A package can get de-listed at any moment, so defaulting to the lowest version does not guarantee a predictable build output either. Microsoft needs to add switches to allow businesses to define their own resolution defaults. Microsoft has no business making these decisions for businesses in the private sector. |
Sure. Like I said, it's possible with packages.config and that functionality should be ported to PackageReference (project.json is pretty much deprecated at this point) along with a way to lock the graph after resolution.
I think you can have both though. IMO it wouldn't be great if we had a feature that most people couldn't use because we didn't think through the end to end.
That doesn't work. Locked: true isn't supported anymore (hasn't been for a while).
Fixed versions of the entire dependency graph would work, but it's extremely labor intensive.
I believe this isn't respected when resolving dependencies in |
project.json will resolve to the lowest version it can find. If the version the build normally uses gets de-listed, your deterministic build is out the window. If you are going to use dependencies from nuget.org or any other third party site, you cannot get a deterministic build every time, even with nuget restore working as it does now. When you depend on a third party for some of your libraries you cannot determine what your build will look like every time. The alternative is to host any libraries you use locally so you have complete control over their listing status. I think we both agree packages.config with fixed version numbers is far too labor intensive. project.json is far more flexible than packages.config, and packages.config is even more deprecated and out of date than project.json. We still use project.json where I work, along with NuGet 3.5, since NuGet 4 does not offer any more flexibility with regard to package resolution options. When Microsoft smartens up and lets go of its monopoly on package resolution strategy logic we will adopt a later version. As it stands now, NuGet 4 does not offer anything that NuGet 3.5 doesn't already do for us. Locked: true does work for us with NuGet 3.5, though we don't want fixed version numbers; we update libraries quite regularly. We need flexibility here more than we need predictability. We use TFS automated builds with a fully featured artifact server here as well; if we need any build of any configuration we can pull it from TFS in just seconds. Flexible builds are a more pressing need for my company than deterministic builds. A lower version is no guarantee of being safer. If that were the case we would all still be running Windows 95. |
You can 98% of the time (I made that up), because de-listing on nuget.org is rare and deleting is impossible. Now, if you're mirroring your dependencies to a MyGet feed or another controlled feed, it's even better, since you'd rarely delete or de-list anything that was being depended on. Still, even though de-listing/deleting is rare, if we support resolving the latest dependency version without a sane story to lock it, we're turning the 2% case into the 98% case.
Not following this argument.
Hopefully you'll switch to PackageReference since that's where all of our energy is at the moment. Good to know. All of that said, I agree with you; I just think when we do that feature it'll be accompanied by a lock file so that things remain deterministic. That shouldn't be too hard to accomplish regardless. |
I would have to double check for NuGet 4, but you can get a deterministic build using the highest version with a version number limit. I don't think NuGet 4 will allow this scenario out of the box either. It's not 100% predictable, but the point is we cannot even configure that. If you want to lock, then use VS 2015 and NuGet 3.5; you just have to check in your project.lock.json. I like the high level design of NuGet, but the implementation is dreadful on many levels. MS has had over five years now to get NuGet to a sensible level and it's still not there. If locks are out the window with NuGet 4, and NuGet 4 STILL won't offer a switch for businesses to configure their own business policies, how is NuGet 4 any better? Moving references to the project file accomplishes what? That work effort could have been put into offering what businesses need: configurable restore policies and restore locks. I don't recall seeing a single post of anyone complaining that the project.json file format urgently needed a renovation, yet there are hundreds of people (I haven't counted, but I stopped a long time ago) complaining about the inflexible lowest-is-best logic hard coded into NuGet right now. On the face of it, it looks like MS wants everyone to use the latest version of their software, but the oldest version of everyone else's software. I find it ironic how aggressively Microsoft endorses adoption of new releases of their products, while NuGet has a "lowest is safer" constraint hard coded into the binary. If lowest were safer we would all still be running Windows 95. |
Not only is it not predictable, it's probably more broken (especially for dependencies out of your control).
VS 2017 with PackageReference is the future of nuget and it needs a proper lock file story. |
How is getting the latest version "out of control"? Our approach at our company is to use a fixed version number for third party dependencies and the highest version for in-house packages. We don't update third party dependencies nearly as often as we update internal packages. We have a very large in-house class library framework that relies on NuGet to be sensible; right now it's not. We can make NuGet sensible, but we have to hack project.json to make that happen. This is not out of control. Our dependencies are quite stable. We have a solid change management process and automated testing. Adopting a new version when that version gets released does not make a process out of control; it just makes your company agile and aggressive in adopting the new. This goes back to my original point: this is a controversial debate left to the business to decide, not a tool vendor. I cannot say for any given company whether a lower or higher version resolution strategy is better; I think it depends on the company and how aggressively they want to adopt new tools. Again, this is such a controversial topic that it is absurd to have a tool vendor make these decisions for everyone and hard code those policies into their binaries. That being said, just offer a darn switch. That ends all debate on the matter. lol |
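In PackageReference terms, the mixed policy described in this comment might look roughly like the sketch below; package names and versions are illustrative, not taken from the thread:

```xml
<ItemGroup>
  <!-- Third-party dependency: pinned to an exact version ([x] means "exactly x"). -->
  <PackageReference Include="Newtonsoft.Json" Version="[13.0.3]" />
  <!-- In-house dependency: floats on the patch segment so every restore
       picks up the newest matching internal release. -->
  <PackageReference Include="Contoso.Internal.Framework" Version="2.1.*" />
</ItemGroup>
```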
You misread, I said "dependencies that are out of your control".
Sure. How do you restrict the resolution strategy to only update certain dependencies?
I think I agreed with you each time you made this comment but my position hasn't changed when it comes to needing a lock file with this feature.
I have little say in what the nuget team does here so I can't say how or when this would ever happen. I'm guessing it won't work on VS 2015 with |
You can use project.json syntax to restrict to specific versions or simply de-list other packages so you only have one version in your package line. It's a hack, but it works. Yes, you would want a switch or option for both NPM and the command line restore of course. |
That's a long thread over the weekend. :)
Both #1 and #2 seem to be lacking with PackageReference. #2 seems to already work with the lowest-is-best strategy, but will falter once #1 comes into the fore. So I think NuGet will have to solve both together, at least for PackageReference. |
What is the timeline to resolve this? This is very much needed. Requests for this go back to 2015 (see #4789 and aspnet/dnx#2657). |
I think they just closed it and ignored me. And yes, MANY people have raised this as a SERIOUS flaw in the Nuget system. It's been over five years and it still behaves the same way with no flexibility. I have seen MANY tickets opened for this and they just close them and walk away. |
Closed what? The issue is still open. Like I said before, a lock file is needed before any realistic progress can be made here.
It has been raised many times, but honestly, when you talk through the implications, people ignore the consequences that don't affect their immediate needs. We need to think about the larger ecosystem impact when we design features like this. As a super basic example, any package that transitively references Newtonsoft.Json, say 6.0.8, will now start pulling in 10.0.1. Not only that, but you end up with a completely jagged, untested dependency graph by default. Here's an example of that going bad: I also think @anangaur wrote a spec for a proposed solution to the lock file but I can't find it on the wiki. |
@Wil73 We closed many related issues that were essentially the same ask in this issue. This issue is still open :) Here is the "Enable repeatable builds via lock file" spec. |
I have yet to grasp what this has to do with adding a switch of some sort to nuget RESTORE to ensure that packages resolve by default to the HIGHEST version available? |
I fail to see how on earth a lock file has any relevance at all here either. |
We already discussed this in the earlier comments. We would need to handle both scenarios together. |
Just use a default that maintains the current behavior, and allow for different expressions of version ranges. Npm does this brilliantly with its semver range operators (^ and ~). |
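For comparison, NuGet's closest analogues to npm's range operators are version ranges and floating versions on a PackageReference; note that NuGet still resolves to the lowest version inside a bracketed range, while npm picks the highest match. The package names below are illustrative:

```xml
<ItemGroup>
  <!-- Roughly npm's "^1.2.3": at or above 1.2.3, below 2.0.0
       (but NuGet picks the lowest version in this window). -->
  <PackageReference Include="Example.Package" Version="[1.2.3,2.0.0)" />
  <!-- Roughly npm's "~1.2": a floating version that resolves to the
       highest available 1.2.x. -->
  <PackageReference Include="Example.Other" Version="1.2.*" />
</ItemGroup>
```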
Sorry, but that's just silly. Every time you come up with a new version you then have to edit all the packages that use the new version to increase the darn range. We have * as an option right now, but that only works for direct dependencies: child dependencies still resolve to the LOWEST version despite the * in the grandparent library. Ranges will not resolve this problem; ranges will just create a maintenance nightmare. |
Would be nice to have the |
My issue is in large part that there is no pretty way to solve outdated transitive dependencies, especially when there is a critical security issue, which is often. If I want to update a transitive dependency I currently need to add it as a direct dependency, even though I have no actual direct dependency on it. This bloats the project file and confuses the hell out of anyone reading the project or inspecting the dependency tree, whether through tools or lock files. I would prefer to be able to set the default transitive dependency resolution and/or (preferably and) override transitive dependencies in a clearer manner that doesn't add them as direct dependencies, ideally for specific transitive dependencies. |
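One mechanism in more recent NuGet versions that targets this exact complaint is Central Package Management with transitive pinning, which lifts a transitive package's version from one central file instead of adding a direct reference to every project. A Directory.Packages.props sketch; the package and version here are illustrative:

```xml
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
    <!-- Packages listed below are pinned to the stated version even when
         they only appear transitively in a project's graph. -->
    <CentralPackageTransitivePinningEnabled>true</CentralPackageTransitivePinningEnabled>
  </PropertyGroup>
  <ItemGroup>
    <PackageVersion Include="System.Text.Json" Version="8.0.5" />
  </ItemGroup>
</Project>
```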
It's crazy that there is no plan articulated for this problem. Our vulnerability scanners check dependencies and report problems on dependencies we don't even deploy. Manually managing these alerts is not practical. Automation is needed. |
Take a hint! Microsoft will never fix this issue until there is some bad press about it. Microsoft is being apathetic to the risk they are putting their own customers in with regard to critical vulnerabilities. Maybe this project can become mainstream... https://fsprojects.github.io/Paket/ |
I just re-read this, and there have been lots of changes to nuget since this was filed ~5 years ago. Have the problems changed? There are now lock files and central package management that allow you to control the resolution of packages whether they are transitive or not. I don't want to come off as being "paternalistic" and I want developers to have control, but package dependency resolution requires a global view of the world (the world being your project's transitive closure). Paket had to work around many assumptions in how package versioning works, and some strategies will never work with certain packages. What might help move this forward are examples of problems and existing workarounds. That would help narrow down where the existing tools have gaps. Some questions off the top of my head:
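For anyone landing here later, the lock file mentioned above is opt-in per project; a minimal sketch of the documented PackageReference workflow:

```xml
<PropertyGroup>
  <!-- Restore writes packages.lock.json next to the project; combine with
       "dotnet restore --locked-mode" (or RestoreLockedMode=true) in CI so the
       build fails if the resolved graph drifts from the committed lock file. -->
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```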
I'm in the same boat; however, there are other approaches that could be implemented for this specific concern. I've suggested these in a separate ticket: #12341 |
I stumbled upon this discussion looking to do exactly as the OP stated; however, after reading the discussion and thinking long and hard about it, I definitely changed my mind: I don't think this feature is really needed nor desirable as stated, but I think some tweaks to the current features could help. Keep in mind I'm talking only about transitive packages, since we already have the ability to specify versions for direct dependencies.

Security
I think relying on "highest possible" dependency version resolution for security is a terrible idea, since there is absolutely no guarantee you will get a version without security issues. And assuming we could specify

Another point: we can classify packages into two types.

Self-contained packages
I don't think there is a need to change anything. As long as they work, they work, even if the version used is "ancient". And if you do rely on specific optimizations or features of a transitive package, well, it should definitely be part of your direct project dependencies.

External resources packages
There is usually NO good reason to use an older version of those packages. Most of the time, you do want the latest package possible. There is usually only one instance of a Web API you want to consume, which definitely works with the latest packages and most definitely does not work with older packages. So those packages sound like a good candidate for the feature. However, those packages are well identified, you definitely know which resources you are consuming (or else you have more pressing concerns), and you can already use Central Package Management Transitive Pinning to say which version you want (though you cannot at the moment specify floating versions there).

TL;DR: I think we only really need custom transitive dependency resolution for a small, well known subset of packages; allowing a global "Highest" version for transitive packages will cause more trouble than it is worth.

So then, a couple of suggestions:

1. Allow floating versions in Central Package Management (CPM): basically allow the exact syntax we already have in PackageReference.

2. Add a |
|
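A sketch of what suggestion 1 above might look like in Directory.Packages.props. This illustrates the commenter's proposal, not a feature that is necessarily supported by your NuGet version (floating versions in central PackageVersion entries were not supported when this was written); the package name is hypothetical:

```xml
<Project>
  <ItemGroup>
    <!-- Proposed: let a central version float so the "external resource" client
         package always resolves to the newest matching release. -->
    <PackageVersion Include="Some.ExternalApi.Client" Version="3.*" />
  </ItemGroup>
</Project>
```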
@clyvari If this strategy can be set individually for each dependency/package source, I'd turn it on for packages in my private feed, so whenever |
@voroninp That's already possible for direct dependecies of your projet. If you add a depency with |
@clyvari, let's replace System.Text.Json with MyLib and imagine a larger (deeper) dependency graph. Now imagine we found a bug in MyLib. How many manipulations/rebuilds are required to update the end project with the fixed MyLib, depending on the resolution strategy? |
@voroninp Assuming a bug in any dependency, at the moment you can (and probably should) use CPM with Transitive Pinning. If it is at the root of a solution, it will be common to all projects in the solution, which is useful if the dependency is used in multiple projects. Here is why using a "highest possible" resolution strategy isn't sufficient: the version that satisfies your requirements can still be capped by another constraint in the graph, e.g. another dependency requiring VulnerableLib [5.0, 5.7). Despite your "highest possible version" resolution strategy, you would still resolve below the patched release. And worse, because it is not intuitive (the version requirements might be deep down the dependency tree), you might be misled into thinking: "VulnerableLib was bumped, all my projects use "highest possible version" for transitive packages, I'm safe." |
It's true that "highest possible" doesn't always fix the vulnerability, but in my experience it fixes most. An audit report will tell you if it didn't fix something and lets you focus on the less easily resolved issues.
|
@clyvari
> another VulnerableLib [5.0, 5.7)

I'd first question this range ;-)
I never understood why NuGet had to re-invent the wheel here and come up with this bonkers system that we now sit with. It's confusing for users coming from anywhere else and almost ensures you are going to have security vulnerabilities, even though patches are available. It has also encouraged an ecosystem where only a minimum version constraint is specified on dependencies, MS packages being a prime example of this. Pre-dating NuGet, package managers had some simple concepts:
|
NuGet developed into (or maybe always was) a new kind of DLL hell, a source of random "Method not found" exceptions at runtime (can't the exe compiler, or at least the publish task, just check everything once?). I'm right now adding the net6 package ErikEJ.EntityFrameworkCore.SqlServer.DateOnlyTimeOnly. That's what "Preview Changes" tells me:

Uninstalling:
Microsoft.Win32.Registry.5.0.0
System.Globalization.4.3.0
System.IO.4.3.0
System.Reflection.4.3.0
System.Reflection.Primitives.4.3.0
System.Resources.ResourceManager.4.3.0
System.Threading.Tasks.4.3.0
Updates:
Azure.Core.1.24.0 -> Azure.Core.1.25.0
Azure.Identity.1.6.0 -> Azure.Identity.1.7.0
Microsoft.Data.SqlClient.5.0.1 -> Microsoft.Data.SqlClient.5.1.0
Microsoft.Data.SqlClient.SNI.runtime.5.0.1 -> Microsoft.Data.SqlClient.SNI.runtime.5.1.0
Microsoft.Identity.Client.4.45.0 -> Microsoft.Identity.Client.4.47.2
Microsoft.IdentityModel.Abstractions.6.21.0 -> Microsoft.IdentityModel.Abstractions.6.24.0
Microsoft.IdentityModel.JsonWebTokens.6.21.0 -> Microsoft.IdentityModel.JsonWebTokens.6.24.0
Microsoft.IdentityModel.Logging.6.21.0 -> Microsoft.IdentityModel.Logging.6.24.0
Microsoft.IdentityModel.Protocols.6.21.0 -> Microsoft.IdentityModel.Protocols.6.24.0
Microsoft.IdentityModel.Protocols.OpenIdConnect.6.21.0 -> Microsoft.IdentityModel.Protocols.OpenIdConnect.6.24.0
Microsoft.IdentityModel.Tokens.6.21.0 -> Microsoft.IdentityModel.Tokens.6.24.0
Microsoft.NETCore.Platforms.5.0.0 -> Microsoft.NETCore.Platforms.1.1.0
Microsoft.Win32.SystemEvents.5.0.0 -> Microsoft.Win32.SystemEvents.6.0.0
System.Buffers.4.5.1 -> System.Buffers.4.5.0
System.Configuration.ConfigurationManager.5.0.0 -> System.Configuration.ConfigurationManager.6.0.1
System.Diagnostics.DiagnosticSource.5.0.0 -> System.Diagnostics.DiagnosticSource.6.0.0
System.Drawing.Common.5.0.0 -> System.Drawing.Common.6.0.0
System.IdentityModel.Tokens.Jwt.6.21.0 -> System.IdentityModel.Tokens.Jwt.6.24.0
System.Runtime.Caching.5.0.0 -> System.Runtime.Caching.6.0.0
System.Security.AccessControl.5.0.0 -> System.Security.AccessControl.6.0.0
System.Security.Cryptography.ProtectedData.5.0.0 -> System.Security.Cryptography.ProtectedData.6.0.0
System.Security.Permissions.5.0.0 -> System.Security.Permissions.6.0.0
System.Text.Encoding.CodePages.5.0.0 -> System.Text.Encoding.CodePages.6.0.0
System.Windows.Extensions.5.0.0 -> System.Windows.Extensions.6.0.0
Installing:
ErikEJ.EntityFrameworkCore.SqlServer.DateOnlyTimeOnly.7.0.1
System.Runtime.CompilerServices.Unsafe.6.0.0

It's a wild mix of up- and downgrades; an excerpt:

Microsoft.NETCore.Platforms.5.0.0 -> Microsoft.NETCore.Platforms.1.1.0
Microsoft.Win32.SystemEvents.5.0.0 -> Microsoft.Win32.SystemEvents.6.0.0
System.Buffers.4.5.1 -> System.Buffers.4.5.0

I can only hope this has no serious effect on the final app. |
In my company, we develop several nuget packages and consume them in other programs. P1 references N1 with a floating version (e.g. 1.*), so that P1 (and all the other Ps) automatically take all the bug fixes in the next build. N1 in turn references an external package E1 (e.g. 1.0.0). Now if a newer version of E1 is released (e.g. 1.0.1), we want P1 to get it (because it might contain a bug fix or a security fix). What we usually do is update N1 to reference the newer E1 (1.0.1); otherwise, we would need to go over all the consuming projects (P1, P2, …) and reference E1 (1.0.1) directly, which is ugly and time consuming. But since P1 might be directly referencing an older version of E1 (1.0.0), it can cause a package downgrade warning, and because we treat warnings as errors it breaks P1. So eventually the change to N1 is a breaking change, which is not ideal. Either we break some builds or we bump the major version of N1 each time we change some minor version of some referenced nuget package (time consuming because all the Ps need to be changed). Is there a better way to do it, one that solves both problems? What I would like to do is have N1 reference the minimum version it requires (e.g. E1 1.0.0) and, when P1 is compiled, have it take the highest E1, with the bug fixes. This way there won't be a package downgrade error and the fixes will still arrive in all the referencing programs (Px). This is not feasible currently since there is no "biggest major" dependency resolution strategy. We don't really care about deterministic builds. Our builds are not deterministic anyway, since we use floating versions and for other reasons. |
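To make the layering above concrete, here is a sketch in PackageReference/nuspec terms using the names from the comment (P1, N1, E1 are the commenter's placeholders; the nuspec fragment is simplified):

```xml
<!-- P1.csproj: floats on the in-house package N1. -->
<ItemGroup>
  <PackageReference Include="N1" Version="1.*" />
</ItemGroup>

<!-- N1's nuspec dependency entry: declares only a minimum bound on the external
     package E1. Under the current "lowest applicable" rule, P1 keeps restoring
     E1 1.0.0 even after E1 1.0.1 ships, unless N1 is rebuilt against 1.0.1 or
     P1 references E1 directly. -->
<dependencies>
  <dependency id="E1" version="1.0.0" />
</dependencies>
```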
It has been a long time with no response; any update on this? I really need to be able to use the latest minor version of dependencies without having to recompile all the other packages |
Is this even being worked on? |
@Madajevas It doesn't seem that this is even being discussed; they just released their roadmap for .NET 9 and it is not in it. See #13143. |
Pretty sure similar examples have been made, but here is another one: I was addressing CVE-2024-21319, identified in a transitive dependency. My code was working fine, so to fix this I explicitly brought in the patched package as a direct reference. This is a simple example, but imagine there are more. Now I am having to manage versions of dependencies of dependencies, and at scale I can see this causing conflicts and a lot of work to get it right. I don't want to specify every transitive package explicitly, and I don't want to miss out on minor fixes and improvements. If I do specify them, I run the risk of them not actually being needed if whatever library used them changes. I'd prefer to use the latest patch or even minor version automatically and, if there are issues, explicitly "downgrade" as needed (like maybe ...). Lots of opinions here; I think everyone would be happy if we had the option though. For fun, run: |
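For completeness, the workaround described here in sketch form; the package names and versions are placeholders, not the actual packages behind the CVE mentioned:

```xml
<ItemGroup>
  <!-- The dependency the project actually uses. -->
  <PackageReference Include="Some.Library" Version="2.3.0" />
  <!-- Present only to lift a transitive dependency above its vulnerable version;
       this is the "explicit direct reference" workaround the comment complains about. -->
  <PackageReference Include="Hypothetical.Transitive.Package" Version="4.5.1" />
</ItemGroup>
```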
Hello, I'm proposing a solution to this issue by introducing an opt-in mode for SemVer compatibility, which would align NuGet with the default behaviors of other modern package managers. This approach would allow developers to choose whether they want to enable SemVer-compatible resolution for both top-level and transitive dependencies. I've created a new issue to discuss this proposal in more detail: please feel free to add your comments and upvote over there. Since this issue has over 100 comments and has been open for a while, shifting the conversation to a new issue will help us focus on whether this feature makes sense and how it could resolve the original challenges brought up here. Also check out some of our docs on this topic, which provide guidance today: https://learn.microsoft.com/nuget/concepts/auditing-packages and https://github.com/NuGet/docs.microsoft.com-nuget/pull/3336/files |
I am not going to debate if highest or lowest should be the default: leave the actual behavior up to the consumer.
Why on earth should Microsoft FORCE users to adopt the "safest is best" strategy in the first place?
Is there any reason why we cannot add a switch to nuget.exe and NPM that allows the user to decide their own package restore strategy?
This way consumers of nuget can align nuget behavior to their own business strategies, be they conservative or aggressive.
What I don't understand is why Microsoft has to decide this for consumers in the first place.
To me this is a straight up consumer decision based on private business strategy and policy.
It makes absolutely no sense why a tool vendor should allow themselves to unilaterally decide business strategy for all nuget consumers on the planet without an option to override that.
It's been two years and there's no apparent movement on this issue.
Leave lowest version as the default strategy if you want Microsoft, but for goodness sakes offer a switch to allow consumers to make their own business decisions.