Errors when caching torch objects in rmarkdown #1199
A possible strategy for caching objects with torch might be:
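A minimal sketch of such a scheme (the chunk labels, `train_model()`, and the `fit.pt` path are illustrative, not from the original comment):

````
```{r train, cache=TRUE}
# Heavy training runs only when this chunk's code changes;
# the trained module is also persisted to disk.
fit <- train_model()   # hypothetical training function
torch_save(fit, "fit.pt")
```

```{r load-fit, cache=FALSE}
# Always re-runs, so `fit` is restored as a valid torch object
# instead of a stale reference coming from knitr's cache.
fit <- torch_load("fit.pt")
```
````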
This scheme automatically skips training once the chunk is cached (as long as the chunk is not changed). A dependency on it can be explicitly added to other cached chunks with the `dependson` chunk option.
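For example, a downstream cached chunk can declare the dependency in its header (labels illustrative), so its cache is invalidated whenever the `train` chunk changes:

````
```{r evaluate, cache=TRUE, dependson="train"}
fit <- torch_load("fit.pt")      # reload rather than trust the cached reference
metrics <- evaluate_model(fit)   # hypothetical evaluation helper
```
````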
Thanks to @atusy. My understanding is that these functions are best included in the package (see yihui/knitr#2340 (comment)) so that caching works transparently for the end user.
Caching torch objects in an R Markdown document gives errors (see the issue in the knitr repository for more details). The underlying issue is that torch objects are implemented with classes (R6/Rcpp) that use reference semantics.

A possible solution is to save and reload the object in each cached chunk. However, `torch_save` and `torch_load` work only with torch tensors and torch modules; as far as I know, they don't work with torch datasets and torch dataloaders. Moreover, torch objects can be very big, which makes this solution not ideal.

Given the complexity of knitr's caching mechanism, it would be great to have some guidelines for torch users (see also here and here).
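A minimal sketch of the failure mode (illustrative; not code from the issue). knitr caches chunk results with R's native serialization, which cannot preserve the external pointer wrapped by a torch object, whereas `torch_save()`/`torch_load()` serialize the underlying data:

```r
library(torch)

t <- torch_tensor(c(1, 2, 3))

# Round trip through R's native serialization, as knitr's cache does:
rds <- tempfile(fileext = ".rds")
saveRDS(t, rds)
t2 <- readRDS(rds)

# The restored object carries an invalid external pointer, so using it errors:
tryCatch(print(t2), error = function(e) message("broken cached object: ", conditionMessage(e)))

# torch's own serializers survive the round trip for tensors and modules:
pt <- tempfile(fileext = ".pt")
torch_save(t, pt)
print(torch_load(pt))  # prints the tensor correctly
```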