FR: allow restoring repo without snapshot #3616

Closed · 0xdeafbeef opened this issue May 3, 2024 · 9 comments
Labels: polish🪒🐃 Make existing features more convenient and more consistent

@0xdeafbeef (Member)

Is your feature request related to a problem? Please describe.
Your app accidentally created a billion files, and you want to restore the repository to a clean state. If one of the files is larger than snapshot.max-new-file-size, you're in trouble: you can either snapshot the files just to delete them, or remove them by hand, both of which are cumbersome.

cd $(mktemp -d)
jj git init .
echo a > a
jj describe -m 'init'
jj new
dd if=/dev/urandom of=rand_10m bs=1M count=10
jj restore .
# Error: Failed to snapshot the working copy
# The file '/tmp/tmp.CQHWzRWxd6/rand_10m' is too large to be snapshotted: it is 10.0MiB; the maximum size allowed is ~1.0MiB.
# Hint: This is to prevent large files from being added on accident. You can fix this error by:
#   - Adding the file to `.gitignore`
#   - Run `jj config set --repo snapshot.max-new-file-size 10485760`
#     This will increase the maximum file size allowed for new files, in this repository only.
#   - Run `jj --config-toml 'snapshot.max-new-file-size=10485760' st`
#     This will increase the maximum file size allowed for new files, for this command only.

Describe the solution you'd like
jj restore --force, which would skip snapshotting the repo. Alternatively, we could introduce a new jj reset subcommand.
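
A rough sketch of the proposed UX (neither the flag nor the subcommand exists today; the names are illustrative):

# Hypothetical: restore tracked paths without snapshotting first
jj restore . --force
# Or, as a dedicated subcommand with the same effect:
jj reset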

@noahmayr (Contributor) commented May 3, 2024

Have you tried jj restore . --ignore-working-copy?

@0xdeafbeef (Member, Author)

> Have you tried jj restore . --ignore-working-copy?

❯ jj restore . --ignore-working-copy
Nothing changed.

/tmp/tmp.fmt69QzBUM [🍐 kxrwyuon 2d9285c5 on <no branch> 📝]
❯ ls
a  rand_10m
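
For completeness, the per-command override from the error hint should work, at the cost of snapshotting the 10MiB file once just so the restore can delete it:

# Raise the limit for this one command only (10485760 bytes = 10MiB):
jj --config-toml 'snapshot.max-new-file-size=10485760' restore .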

@yuja (Contributor) commented May 4, 2024

IIRC, there has been some discussion about demoting the max-new-file-size error to a warning, which would allow jj restore in this situation. I'm not sure that's a good idea, because it could be a footgun.

@0xdeafbeef (Member, Author)

> IIRC, there has been some discussion about demoting the max-new-file-size error to a warning, which would allow jj restore in this situation. I'm not sure that's a good idea, because it could be a footgun.

Perhaps it would be feasible to actually remove newly created files when --ignore-working-copy is set?

@yuja (Contributor) commented May 5, 2024

> Perhaps it would be feasible to actually remove newly created files when --ignore-working-copy is set?

--ignore-working-copy is the option for not touching working-copy files, so having it remove files seems scarier than allowing jj restore in a dirty working copy.
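
For comparison, the flag's intended use is read-only operation on a working copy that can't (or shouldn't) be snapshotted, e.g.:

# Skips both snapshotting and checkout; succeeds even with rand_10m present,
# but never creates or deletes any file on disk:
jj log --ignore-working-copy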

@noahmayr (Contributor) commented May 5, 2024

Maybe jj restore --untracked, which would get rid of all files that are neither tracked nor ignored? (A rough sketch of the usage follows the list below.)

Maybe we could even extend the error and hint to include suggestions for commands to:
a. increase the max file size
b. ignore the file
c. restore untracked files
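
A sketch of how the proposed flag might be used (the flag, and especially the --dry-run option, are hypothetical):

# Hypothetical: preview which untracked files would be removed
jj restore --untracked --dry-run
# Hypothetical: delete everything that is neither tracked nor ignored
jj restore --untracked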

@yuja (Contributor) commented May 6, 2024

> Maybe jj restore --untracked, which would get rid of all files that are neither tracked nor ignored?

Yeah, an hg purge/git clean-like command is also an option (#3154).
(But we'll need to deal with the max-new-file-size error somehow, because jj purge --ignore-working-copy doesn't make sense.)

@martinvonz (Member)

Another option might be a new global flag similar to --ignore-working-copy, but instead of leaving the working copy stale, it would update the working-copy state to the new target commit without touching the files on disk (by calling WorkingCopy::reset()).
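
A sketch of how such a flag might be invoked (the name is purely illustrative):

# Hypothetical: record the target tree as the new working-copy state,
# leaving the files on disk untouched (WorkingCopy::reset() semantics):
jj restore . --reset-working-copy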

@thoughtpolice (Member) commented May 6, 2024

I think jj purge is a good idea and the right solution to the "my app created too many files" problem. I regularly use git clean -xfd to purge working directories of stuff (e.g. deleting all the build artifacts from make so I can do a completely clean rebuild). However, this requires a lot of care to make sure you're not in the wrong state. (Normally we'd be safe in these cases thanks to autosnapshots, but of course here they're broken.)
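
In git, the usual safeguard is a dry run before the destructive form:

# Preview what would be deleted, including ignored files and directories:
git clean -nxd
# Then actually delete:
git clean -xfd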

There's another problem that isn't spelled out directly in the OP. From a UX perspective, it's important to know what you're going to destroy first, but that currently isn't possible, because jj st is broken by the very thing you're trying to fix. jj st should not fail on a max-file-size snapshot error. Actually, it should probably always try to succeed, even in the face of outright corruption: it's one of the most important tools for understanding repository state, and breaking it because a tool made a file too large is not great.

Instead, it should report such files as ?? or similar, to indicate that they are not part of the snapshot. Something like the following:

$ jj st
Working copy changes:
A  foobar
M  barbaz
R  bazbaz
?? too-large.txt

So I think we need to fix both of these so that such cases don't hurt so badly.

@PhilipMetzger added the polish🪒🐃 Make existing features more convenient and more consistent label on May 8, 2024
yuja added a commit to yuja/jj that referenced this issue Dec 10, 2024
I think this provides a better UX than refusing any operation due to large files. Because untracked files won't be overwritten, it's usually safe to continue the operation while ignoring the untracked files. One caveat is that new large files can become tracked if files of the same name are checked out (see the test case).

FWIW, the warning will be printed only once if watchman is enabled. If we use the snapshot stats to print untracked paths in "jj status", this will be a problem.

Closes jj-vcs#3616, jj-vcs#3912
yuja added a commit to yuja/jj that referenced this issue Dec 11, 2024
@yuja closed this as completed in 168c797 on Dec 11, 2024