Proposed refactor
It would be nice if the API supported pathlib.Path paths in methods like
load_from_checkpoint
save_checkpoint
and in arguments like
default_root_dir (in Trainer)
Motivation
pathlib.Path paths are concise, e.g. root_dir / "model.pt".
Pitch
This could be done via a Union[str, Path] annotation or similar.
Additional context
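As additional context, here is a minimal sketch of the intended usage. It assumes a hypothetical LightningModule subclass MyModel and a toy dataset, both invented only to illustrate the calls involved:

```python
from pathlib import Path

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


# Hypothetical minimal LightningModule, used only to illustrate the API calls.
class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


root_dir = Path("experiments") / "run_01"
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))), batch_size=8
)

# default_root_dir passed as a pathlib.Path instead of a str
trainer = pl.Trainer(default_root_dir=root_dir, max_epochs=1)
trainer.fit(MyModel(), train_loader)

# save_checkpoint / load_from_checkpoint with pathlib.Path paths
ckpt_path = root_dir / "model.pt"
trainer.save_checkpoint(ckpt_path)
model = MyModel.load_from_checkpoint(ckpt_path)
```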
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
@dyollb Thanks for noticing this! We actually do support it 🎉. load_from_checkpoint just needs a tiny update to its type annotation; see my changes in #15540. Trainer.save_checkpoint already supports it today.
Would you be interested in looking into Trainer.default_root_dir? Everything should work there too; it just needs a type update on the Trainer argument.
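For anyone picking this up, the change meant here is essentially widening the annotations, roughly along these lines. This is a simplified sketch, not the actual Lightning signatures; only the annotation-relevant parts are shown:

```python
from pathlib import Path
from typing import Optional, Union

_PATH = Union[str, Path]  # the Union[str, Path] alias suggested in the issue


def load_from_checkpoint(checkpoint_path: _PATH, **kwargs):
    """Accepts str or pathlib.Path; the underlying I/O already handles either."""
    ...


class Trainer:
    def __init__(self, default_root_dir: Optional[_PATH] = None, **kwargs) -> None:
        # Only the annotation changes; the existing logic already works with Path objects.
        self.default_root_dir = default_root_dir
```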
cc @Borda