
Optionally don't redownload/reinstall a file if it exists #984

Open
zany130 opened this issue Nov 21, 2023 · 7 comments
Labels
enhancement New feature or request

Comments

@zany130
Collaborator

zany130 commented Nov 21, 2023

System Information

  • SteamTinkerLaunch version: v14.0.20231120-2
  • Distribution: Garuda Linux

As was discussed previously, it may be beneficial (though in many cases the savings are negligible, to be honest) to only download specific tools like SpecialK when there is actually an update.

I want to use this issue to brainstorm ideas for my possible PR.


Doing a bit of googling, I found it is possible to tell Wget not to download a file if it already exists.

https://www.reddit.com/r/wget/comments/34nxry/will_wget_overwrite_files_if_theyre_already/

So, in theory, could we not add the option -nc to

```shell
"$WGET" -q "$DLSRC" -O "$DLDST" 2>&1 | sed -u -e "s:\.::g;s:.*K::g;s:^[[:space:]]*::g" | grep -v "SSL_INIT"
```

That way, steamtinkerlaunch will not redownload already existing files.
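In principle yes, though one caveat: if I read the wget manual correctly, -nc is only accepted together with -O if the output file does not already exist, so it may be cleaner to build the option list in shell. A minimal sketch, where NOREDOWNLOAD and buildWgetOpts are invented names rather than anything from the STL codebase:

```shell
# Hypothetical sketch: gate wget's -nc (no-clobber) flag behind a toggle.
# NOREDOWNLOAD and buildWgetOpts are invented names, not real STL variables.
buildWgetOpts() {
    local opts="-q"
    if [ "${NOREDOWNLOAD:-0}" -eq 1 ]; then
        opts="$opts -nc"  # skip the download if the file already exists
    fi
    echo "$opts"
}
```

The call site could then use something like `"$WGET" $(buildWgetOpts) "$DLSRC"` (left unquoted on purpose so the options word-split).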


This should be a toggle, with the default behavior being to overwrite files.

The reasoning is overwriting files can be helpful if the file gets corrupted, and usually, these downloads are only a few MB anyway.

The other half of this would be not to install tools like ReShade or SpecialK if the installed version is already up to date.

This should probably be a separate PR, and you can use this: https://github.com/sonic2kk/steamtinkerlaunch/wiki/PEV-PE32-file-analysis.

Let me know if I misunderstood anything or if this is a horrible idea lol 😅

EDIT:

One glaring flaw I should have thought through: what if the filename in the download link is always the same?

For example, myexample.com/myfile.zip is always the latest version of myfile.zip?

In the case of SpecialK nightlies, I don't think this is a problem, as the redirected URL will always contain the version.

EDIT2:

It seems we can use the -N option, which checks the timestamp of the remote file against the local one:
https://stackoverflow.com/a/16840827
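For reference, the two flags behave differently: -nc never re-downloads an existing file, while -N re-downloads only when the remote copy is newer. One hedged caveat from my reading of the wget manual: -N does nothing in combination with -O, so the existing `-O "$DLDST"` call would need restructuring for timestamping to work. A tiny illustrative helper to keep the modes straight (all names invented):

```shell
# Hypothetical helper mapping a download mode to the matching wget flag.
# Note: per the wget manual, -N (timestamping) is ignored when combined
# with -O, so the '-O "$DLDST"' usage would need adjusting for that mode.
dlModeFlag() {
    case "$1" in
        noclobber) echo "-nc" ;;  # never re-download an existing file
        timestamp) echo "-N"  ;;  # re-download only if the remote file is newer
        *)         echo ""    ;;  # default: always download and overwrite
    esac
}
```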

@zany130 zany130 added the enhancement New feature or request label Nov 21, 2023
@zany130 zany130 changed the title Optionally don't redownload a file if it exists Optionally don't redownload/reinstall a file if it exists Nov 21, 2023
@sonic2kk
Owner

sonic2kk commented Nov 21, 2023

I think this is a good idea overall :-) I don't see it being quite as straightforward to implement as we may hope, at least not in a way that's generic for most (any?) downloaded file.

The place you've suggested, in dlCheck, makes perfect sense to me. I am also fully in agreement on having this as a toggle.

The other half of this would be not to install tools like Reshade or Specialk if the version installed is already up to date

ReShade may already have this check to some extent, but on the DLLs themselves. I think it does redownload ReShade, but it won't reinstall if the version doesn't need an update. I believe this was the cause of ReShade auto-update not working if ReShade Addon was used, but I might be remembering wrong.

SpecialK does not check at all though, to my understanding.

The reasoning is overwriting files can be helpful if the file gets corrupted, and usually, these downloads are only a few MB anyway.

Users can still remove the files manually in the meantime, but this could be a good convenience factor.


I tried to write out my implementation concerns, but honestly, I just have no idea if any of the proposed ideas would work in all scenarios. I think it starts off OK: for archive names, we can tell wget not to download if the file already exists.

When the file already exists and has a fixed name, though, a timestamp check might be a little bit... unreliable? Then again, even if the created/updated times are not tracked properly on downloaded files, there shouldn't really be a case where they mismatch: a newer remote file would always be newer than the time the local copy was downloaded, otherwise the newer file would have been downloaded in the first place.

My thoughts on this overall are: I'm all for this change if it can be done, and if you're willing to work on it absolutely go for it and I'll try to help however I can, though I would also say that I'm glad I'm not the one attempting this 😅


Thanks for taking the initiative on this one. For implementation, feel free to start with trying this out for one tool initially. You could pick SpecialK if you wanted. I guess the steps would be:

  • Conditionally remove the -nc flag from wget in dlCheck based on a parameter (maybe NOCHK=1 for "no check"?)
  • For SpecialK Nightly: (different filename based on version and hash)
    • Don't remove the archive (add an exception in the file cleanup logic for the variable with the archive name, SPEKARCH I think is the name) -- We can't remove this block because it manages cleaning up the stray innoextract files when extracting custom SpecialK versions
    • If the archive with the name already exists, don't download it (set the toggle to 1)
  • For SpecialK Stable: (fixed file name)
    • Compare the timestamps for the local and hosted archive versions. Download if the hosted timestamp is newer than the existing file's timestamp, or if no timestamp comes back for the hosted file ([ -z "$DLFILETIMESTAMP" ] or something, I think)

We would have to be careful with this network timestamp check though. It's very important that it's a toggle, because for other tools (e.g. x64dbg) a user may manually place an archive, which, if just placed, could be newer than the one from the download link.
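That timestamp comparison could be sketched roughly like this, assuming the hosted timestamp has already been fetched and converted to epoch seconds (the function and variable names here are invented for illustration):

```shell
# Hypothetical check: should we download? Yes if there is no local copy,
# or no remote timestamp came back, or the remote timestamp (epoch seconds)
# is newer than the local file's mtime.
shouldDownload() {
    local localfile="$1" remote_epoch="$2"
    [ -f "$localfile" ] || return 0           # nothing local yet: download
    [ -n "$remote_epoch" ] || return 0        # no remote timestamp: download to be safe
    local local_epoch
    local_epoch="$(stat -c %Y "$localfile")"  # local mtime (GNU stat)
    [ "$remote_epoch" -gt "$local_epoch" ]    # remote newer => download
}
```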


It would be pretty awesome if you could do this, I think I would end up bashing my head against the keyboard trying to figure this one out. You've already done a great deal of investigation work and seem to have a reasonable idea of where to go from here, which is a lot further than I would've gotten I think. Thank you for considering this :-)

@sonic2kk
Owner

If all else fails, we could potentially try to implement some way to compare files using an md5 hash check, with that and all of the following behind the toggle you mentioned. If we have to, we could download the existing file under a different name (change DLDST I think to have an underscore, or to go to a new folder at /tmp/stl maybe). Then we could get an md5 for both the new and existing archives, and compare that to see if they're different.
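The checksum comparison itself could be as small as this (md5sum is in coreutils; the function name is made up):

```shell
# Hypothetical helper: do two files have identical md5 checksums?
# Note this only detects *difference*, not which file is newer.
sameChecksum() {
    [ "$(md5sum "$1" | cut -d' ' -f1)" = "$(md5sum "$2" | cut -d' ' -f1)" ]
}
```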

We could also, in theory, check the files in the archive before extracting them to see if they are actually different, either with a version check like we do for ReShade or with a checksum like above. This and the above are less reliable, I think, because they don't necessarily tell you whether the file is newer or older, just that it is different in some way. Although this would still handle the corruption case, it would not handle the case where an archive may have been downloaded elsewhere: even if it is more or less the same, it might contain a couple of extra or different files.

And both of these would be a last resort, since it means we'd download the archives anyway, we just wouldn't use them if the version did not match.


Also, I alluded to this in my very rough SpecialK implementation outline, but this check, no matter how we implement it, will only work if the archive is already present for us to compare against. That's not necessarily an issue, but it may require a change in behaviour when implementing this, like we'd need for SpecialK.

@zany130
Collaborator Author

zany130 commented Nov 21, 2023

No promises this will turn into a proper PR that can be merged, but I at least want to do the research and try to implement this.

Maybe I'll pave the way for someone else to pick up where I left off, or maybe I'll teach someone how not to do this lol.

I am actually surprised by how many options a simple tool like wget has.

Just out of curiosity, do you prefer we stick to wget? I don't think curl or any other tool would have options that can serve us better for this, but it's just an idea.

Maybe one of those tools can compare checksums with the remote file or something? Another research topic for later, I guess (I honestly have no clue lol).

@sonic2kk
Owner

Yup, there is no hurry with this, and I want to reiterate again that I'm very glad you're even considering looking at this. Anything we discuss here will not go to waste, as it'll either serve as a reference for why this was not implemented, or just like you said, it'll be a springboard for someone else.

Just out of curiosity, do you prefer we stick to wget? I don't think curl or any other tool would have options that can better serve us for this but just an idea.

Honestly, not really 😅 I have no particular attachment. If SteamOS has curl and it makes anything easier, I'd say go for it. I'm fairly sure it does, though, and I think the SteamGridDB stuff uses curl.

If there's something else you want to use, try to keep it "light" if at all possible, in that it has few dependencies, or the dependencies are things we already have or reasonable things to expect most systems to have. Ideally, a new dependency would be something that can have its binary dragged and dropped into the STL deps folder on SteamOS.

We'd also have to be mindful of downstream packages for STL and how it may impact them.

Having said all that, I really doubt we're going to end up in territory like that. Curl is pretty much everywhere and is lighter than some of our other dependencies, so I have no issue if you can use that to do the job better. The above is in case you find some external dependency that makes this a billion times easier; we'd have to evaluate it a little 🙂


Honestly, I'm just very happy someone else is interested in contributing :-) But there's no time pressure on this, it's not massively critical or anything, so don't feel like you have to rush.

And of course let me know if you need help with anything!

@Mte90
Contributor

Mte90 commented Mar 15, 2024

So to recap this ticket:

  • Add an option to disable overwrite, but where? Something globally?
  • Every wget call should follow this rule in STL?
    • In that case, it could make sense to have a function that wraps wget
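A rough sketch of what such a wrapper might look like; every name here (stlDownload, NOREDOWNLOAD, WGET) is illustrative rather than taken from the codebase:

```shell
# Hypothetical wget wrapper: one place to honor a global "don't overwrite"
# toggle instead of touching every call site individually.
stlDownload() {
    local src="$1" dst="$2"
    if [ "${NOREDOWNLOAD:-0}" -eq 1 ] && [ -s "$dst" ]; then
        return 0  # a non-empty file is already there; keep it, skip the fetch
    fi
    "${WGET:-wget}" -q "$src" -O "$dst"
}
```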

@zany130
Collaborator Author

zany130 commented Mar 15, 2024

  • Add an option to disable overwrite, but where? Something globally?

Yeah, I think that would make the most sense. Probably in Misc Options, above Reset Log, is where I would put it:

(screenshot of the suggested location in Misc Options)

  • In that case could make sense a function that wrap wget

I believe there is a function in the code that handles the downloading with wget, if I remember correctly (it's been a while since I looked at it 😅).

I believe I got as far as creating a config variable for the toggle in yad and adding a conditional into the wget function to add the -nc (no-clobber) parameter if the variable (I was thinking of just calling it Redownload) is set to 1.

EDIT: NVM, there is no generic download function. I was testing on this: https://github.com/sonic2kk/steamtinkerlaunch/blob/dd9cd5343e02985b10cd4a397d66a9fd21847cba/steamtinkerlaunch#L15628 which is called every time we want to get the latest version from GitHub.

@Mte90
Contributor

Mte90 commented Mar 20, 2024

There is a generic download function, but there is also a variable for wget, so it is possible to add the parameter there and it will be applied everywhere.
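Assuming the wget command really does live in a single variable, the toggle could be applied once where that variable is built; the variable and function names below are guesses for illustration, not the real STL names:

```shell
# Hypothetical: build the wget command string once so the -nc flag applies
# everywhere the variable is used. NOREDOWNLOAD is an invented toggle name.
buildWgetCmd() {
    local cmd="wget"
    if [ "${NOREDOWNLOAD:-0}" -eq 1 ]; then
        cmd="$cmd -nc"  # apply no-clobber globally when the toggle is on
    fi
    echo "$cmd"
}
```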
