Shiweiliuiiiiiii
Royal Society Newton International Fellow
- University of Oxford
- UK
- https://shiweiliuiiiiiii.github.io/
Pinned
- In-Time-Over-Parameterization: [ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
- VITA-Group/GraNet: [NeurIPS 2021] "Sparse Training via Boosting Pruning Plasticity with Neuroregeneration"
- VITA-Group/Random_Pruning: [ICLR 2022] "The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training" by Shiwei Liu, Tianlong Chen, Xiaohan Chen, Li Shen, Decebal Constantin Mocanu, Z…
- VITA-Group/SLaK: [ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity"; [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"