dotnet restore --no-cache --force completes without error with no internet connection #5619
From @emgarten on July 19, 2017 21:53
The global packages folder (.nuget\packages) is considered a source, not a cache.
These flags aren't the most obvious; we're currently looking at improving this for NuGet. To force a restore to the level you might be expecting here, you would need to delete all packages and caches first. You can do this with:
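Presumably the NuGet locals command, which clears the global packages folder, the HTTP cache, and the temp cache:

```
# clears all local NuGet resources: global-packages, http-cache, and temp
dotnet nuget locals all --clear
```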
FWIW chasing these issues as they're moved between repos, for what appears to the user to be a single product, is frustrating.
...this is pretty much the opposite of everything I've ever been told about how NuGet caching works.
That, to me, means both of them, which is consistent with messaging elsewhere. Why would I care about the HTTP cache most of the time? I believe this behavior should change.
Would you provide some examples of when this is needed? Are you reusing the same package id/versions? The best way to bypass .nuget is to set the packages folder path so that the project or solution restores to its own folder and avoids the user-wide folder.
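A sketch of that per-project isolation (the `--packages` option is a real `dotnet restore` flag; the path here is illustrative):

```
# restore into a folder owned by this project instead of ~/.nuget/packages
dotnet restore --packages ./local-packages --no-cache --force
```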
Can you explain what you expect restore to do with the global folder when --no-cache is set?
I think the problem is:
Sure! Right now I'm testing restore times. If we get a bad cache somehow on a build server we want to be able to bypass it without nuking the cache.
I'd expect it to do nothing with the global folder at all.
I was referring to the proposed workaround above. In general there's a large disconnect between team and public terminology here, evidenced in talks by Microsoft folks as well. To a programmer, this is a cache: it's quite literally caching content locally and not going to the source. If we explained exactly how it works and asked 100 programmers, I'd fully expect no fewer than 90 of them to think of this as a cache. I get that naming is hard, but in this case it's especially problematic because the term is used elsewhere in the same product and in commands that don't match expectations.
I agree on better messaging. For this repo, which covers the restore part of this, I don't see any actionable work here. I'm going to close this issue unless there is further input on that.
@emgarten Then where should this live? It's been moved and closed once already. There is an issue here; closing this doesn't make the unintuitive behavior and improper terminology go away.
@NickCraver which places contain improper terminology? As for the rest, the issue here seems to be a general misunderstanding of the packages folder; in NuGet itself we have been very careful to always message this correctly. Issues need to be filed on the docs that are not messaging this correctly.
@emgarten I asked others as a sanity check: https://twitter.com/Nick_Craver/status/890160768007770113
This is a problem. It is, in every behavioral way, a cache. It seems that, overwhelmingly, people expect --no-cache to bypass it.
I understand the confusion and appreciate the feedback. Here is the problem: restore's purpose is to put packages on disk for build/publish/run to consume. If this doesn't occur, then build will be missing inputs and everything will be broken. Having a flag that puts users in this broken state would be confusing. The packages need to go somewhere, and they need to be persisted for future operations to use them. This is why setting a custom packages folder path, as suggested above, is the recommended way to isolate a restore.
Would you define where you think the problem is at this point? Your poll indicates that it is with the outputs of restore.
@emgarten If I specify --no-cache, I would expect it to ignore the copy already sitting in the global packages folder and re-download a new copy of it to the global packages folder, overwriting the existing folder if necessary. If this isn't the current behavior, I suspect that will come as a surprise to just about everyone using the flag.
What @bording describes is exactly the behavior I'd expect: don't use any cached copies. I'm not proposing an elaborate new system of storage and resolution here, only that a re-download is forced. And I'd really expect --no-cache to cover that. I honestly have yet to meet anyone who cares about the HTTP cache in practice. In fact I googled it and found this doc as the first result: https://docs.microsoft.com/en-us/nuget/consume-packages/managing-the-nuget-cache (the screenshot from that doc calls it a cache). It's been called a cache for a long time; let's treat it as users expect.
The global packages folder could be shared between different restores, builds, and executing the app. Removing files, even temporarily, would cause problems. For this reason packages are never replaced or removed from the global folder.
@bording do you have a scenario where a package needs to be replaced in the global folder? I'd like to understand whether extra validation/replacement on the global packages folder is needed, or whether the issue is that users do not trust restore and the packages to be correct and feel that they need to give it a push, in which case I'd like to find a way to improve stability without requiring additional downloads.
@emgarten I do have a scenario where this would be useful. I have a project that produces packages, and as part of testing, those packages are consumed from another project by having a local folder configured as the package feed. During development, I could end up with different builds of those packages, but they won't have a different version, so the restore for the consuming test project will pick up the old copy in the global folder instead of always looking at the local folder to get the newly built package. Wiping out the entire global folder every time before doing the restore seems like overkill, but currently I have to manually delete the package from the global folder, which is a pain. |
We're also testing restores from remote repos, firewall rules, VPN configs, custom NuGet servers, multiple builds of the same package (fixing NuGet/.NET tooling issues - we've filed many), and dealing with local source packages between builds when building libraries (again, the version doesn't increment every time). For example, when a bad cache from a source happens, I need to be able to force a re-download from a good source after removing the bad one. This can also happen when people do a bad push from MyGet to NuGet and the version doesn't differ. The global cache has no concept of feeds, AFAIK. I might also be testing that pulling from one of our other data centers is working as expected. Common usage is why I came across this in the first place: we often need a way to force a re-download of packages. There are plenty of scenarios where we want to force a re-download. The objection seems to be that replacing the file at any point in time may cause issues. How is that any different from the manual deletion we're already doing today to work around this? Also, I agree that emptying the entire cache (on a build server, for example) is total overkill to refresh a few bad packages specific to a certain build.
This is a known pain point when developing packages; I think something should be done here to improve this workflow. Replacing packages in the global packages folder would help solve this.
In general, re-using the same id/version of a package is not a supported scenario for NuGet. Packages are expected to be unique on id/version and the same across all sources and the global packages folder so that they can be used interchangeably. This is because nuspecs reference other packages using just the id and version, not a hash, which gives no way to differentiate and select between different variations of a package with the same id/version. The recommended way to deal with this is to increase the version number; this is often done with git versioning or by using a time-based prerelease label, anything to make it unique and of a higher version.
In summary, NuGet should look into replacing packages in the global folder when --no-cache is set. The downsides of this are:
@emgarten I am using git versioning, but that only helps me if I'm making commits between package builds, which is not always going to be true when I'm in the middle of development.
Can you explain these two a bit more? I'm not sure I understand them. Are you saying performance would always be decreased, or just when the flag is used?
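For the time-based prerelease versioning suggested above, a minimal sketch (assuming an SDK-style project, where the MSBuild Version property flows into the package version; the version format is illustrative):

```
# give every local build a unique, strictly increasing prerelease version
dotnet pack -o ./local-feed /p:Version=1.2.0-dev.$(date +%Y%m%d%H%M%S)
```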
I have similar concerns to those @bording and @NickCraver already clearly explained, but let me describe my use case. On my local machine I have two projects, both libraries with different release cycles (e.g. LibA and LibB). LibB depends on LibA and consumes NuGet packages from it. It depends on some version of LibA in the csproj:
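Presumably a reference along these lines (the id and version are illustrative):

```xml
<ItemGroup>
  <!-- LibB's dependency on a dev build of LibA -->
  <PackageReference Include="LibA" Version="1.2.0-dev" />
</ItemGroup>
```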
Both libs are developed in parallel, so at any given time there is some dev (unreleased) version of LibA being consumed by LibB (by the time LibB is released, that LibA version should itself be released). To propagate new changes of LibA to LibB, I build LibA packages and run restore in LibB. LibA packages are built into a folder which is used as a package source for LibB's restore. Despite @emgarten saying that re-using the same id/version isn't supported:
it worked pretty well with "old" nuget for years.
In LibB I had repositoryPath configured in NuGet.config:
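Presumably something like this (the path is illustrative; repositoryPath is a real NuGet.config setting, but it only applies to packages.config projects):

```xml
<configuration>
  <config>
    <!-- restore packages into a folder next to the solution -->
    <add key="repositoryPath" value="packages" />
  </config>
</configuration>
```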
So all packages for LibB are put inside that folder, and it's used during build both via msbuild/nuget and VS. If I needed to refresh a package, I just deleted it inside the "packages" folder in LibB, then ran restore. Now with "new" NuGet and the dotnet CLI I have a headache: it's simply not working at all. repositoryPath is ignored. All restored packages are put inside a new global "cache" (which is not a cache, as it turned out) - I mean C:\Users\UserName\.nuget\packages.
Overriding the packages folder on the command line partially works:
All packages are put inside the specified folder, but VS won't see that folder and continues to use the global one. So currently I have to manually delete my packages from the global folder. Moreover, I have no idea what to do in a CI scenario (on a build server); it seems that the only workaround is clearing the entire global folder. If supporting repositoryPath for PackageReference projects isn't planned, what is the recommended approach?
@evil-shrike Packages.config uses a different folder format than PackageReference, which is why the same property cannot be applied to both project types. The new equivalent is the RestorePackagesPath MSBuild property.
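A sketch of setting it in a csproj (the path is illustrative; RestorePackagesPath redirects restore output for PackageReference projects):

```xml
<PropertyGroup>
  <!-- restore this project's packages next to the project instead of ~/.nuget/packages -->
  <RestorePackagesPath>$(MSBuildProjectDirectory)\packages</RestorePackagesPath>
</PropertyGroup>
```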
@emgarten Thank you, using RestorePackagesPath helps.
I just stumbled upon weird default restore behaviour: after pushing a new package version, restore couldn't find it for a while. Checking the NuGet server (with my browser) did show the new package immediately. I solved this by adding "-NoCache" to my build script after reading about dotnet/nuget restore caching HTTP results. Caching packages for some time might be OK depending on the use case, but caching HTTP results when that leads to a restore error isn't useful. BTW: I tried updating the package with Visual Studio's built-in NuGet dialog and VS couldn't find the new package either. One can argue whether caching might be OK here, but I would expect VS to bypass the cache when I explicitly hit the Reload button. (VS 15.6.0, NuGet 4.1.0.2450, dotnet 2.1.100)
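If the HTTP response caching is the specific problem, a narrower option than -NoCache is clearing just that cache (http-cache is a real `dotnet nuget locals` target):

```
# clears only cached HTTP responses, leaving downloaded packages alone
dotnet nuget locals http-cache --clear
```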
How I ran into this annoyance: we switched package source from MyGet to Azure Artifacts and wanted to test that NuGet was still able to actually find all the packages after the move (correct URL, authentication working, etc.). Cleaning out the whole local cache of packages in this case is quite unwanted, in case a package actually hasn't been transferred correctly. So I tried to run with no-cache and force, but, as for others here, NuGet kept using the local "cache" still. I could tell, because NuGet never got to a point where it asked me to authenticate with Azure. Not even after deleting the projects' obj folders. How this isn't considered a cache is beyond me.
Similar to @Svish, I'm switching to a single NuGet feed which has a bunch of additional feeds configured as upstream sources. I'd like to be able to validate that the feed is configured correctly, but the global package cache gets in the way. Honestly, renaming the flag so it no longer says "cache" would at least make this less surprising.
I just hit this with .NET SDK 7.0.100 Preview 6 and @bording's scenario. Deleting the package from the global packages folder is still the only workaround.
Team Triage: Unfortunately the --no-cache option does not apply to the global packages folder, which restore treats as a package source rather than a cache. The easiest way to "disable" the global package folder is to set the NUGET_PACKAGES environment variable to a different location. For example, from the root directory of a repository:
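A likely shape of that command (bash; the directory name is illustrative, NUGET_PACKAGES is the real environment variable that overrides the global packages folder):

```
# point the global packages folder at a per-repo directory for this restore
NUGET_PACKAGES="$(pwd)/.packages" dotnet restore --no-cache --force
```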
I'm developing a set of libraries and testing the resulting nugets locally before the release process. As you might guess, I'm building the same version number several times, and whenever one nupkg build is stored under the global packages folder, that copy wins on every subsequent restore. On projects consuming my local build, it would be useful to have a csproj option telling dotnet (core) not to restore packages from the global folder.
I went through the pain of setting up a local NuGet server, only to find there's no way to bypass the global package cache, so it's back to powershell/bash recursive delete scripts. It's a wild decision to name a flag "no cache" when it still uses precisely the one cache you would expect it to bypass.
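A sketch of the kind of delete script being described (bash; the package id is illustrative; the global folder stores each package under its lowercased id):

```
# remove every cached version of one package so the next restore must re-fetch it
rm -rf "$HOME/.nuget/packages/my.package.id"
```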
From @NickCraver on July 19, 2017 21:06
This is with .NET Core 2.0 Preview 2 tooling. I'd expect restores instructed to bypass the cache (--no-cache) to fail when no upstream sources are available... but that doesn't happen. Looking at procmon, it seems to be completely ignoring the --no-cache argument. Here's what dotnet restore --no-cache --force registers during a run:
Steps to reproduce
With no internet connection, run a restore with cache bypassed (--no-cache) and forced (--force).
Expected behavior
The restore would fail, unable to connect to any package sources and bypassing local cache.
Actual behavior
The restore succeeds without issue:
Environment data
dotnet --info output:
/cc @rrelyea @emgarten
Copied from original issue: dotnet/cli#7195