I made this repo as a learning experience for myself, so I could better understand the fastai source code. Having done that, I am archiving this repo. I did not push all my changes, because as my learning continued I realized that the abstractions fastai provides are necessary for ease of use.
fwf is not meant to be a library you import functions from; for that you have fastai. The motivation behind this project is to make the useful functions of fastai, like `lr_find` and `fit_one_cycle`, available in plain PyTorch.
(If you are not familiar with cyclical learning rates, look at the papers of Leslie N. Smith. I highly recommend reading his papers, as you will get to know the state-of-the-art methods for training neural networks in 2019; also see the fastai courses, where Jeremy teaches his own best methods.)
There are two main folders:
- `src`
- `notebooks`
In the `src` folder you will find `.py` scripts named after the function they implement, and in the `notebooks` folder you will find a notebook with the same name, which shows you how to use the functions in the `.py` script. In the notebooks I also compare my results with the fastai library's results.
In the scripts you will find something like this at the beginning:
```python
# NOT -> ParameterModule
# NOT -> children_and_parameters
# NOT -> flatten_model
# NOT -> lr_range
# NOT -> scheduling functions
# NOT -> SmoothenValue
# YES -> lr_find
# NOT -> plot_lr_find
```
It lists all the functions/classes in that script. NOT means you don't need to modify that function; YES means you will have to modify the function in some cases. Here is an example of the changes you would have to make:
```python
################### TO BE MODIFIED ###################
# Depending on your model, you will have to modify your
# data pipeline and how you give inputs to your model.
inputs, labels = data
if use_gpu:
    inputs = inputs.to(device)
    labels = labels.to(device)
outputs = model(inputs)
loss = loss_fn(outputs, labels)
#####################################################
```
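As a concrete illustration, here is a minimal, self-contained sketch of where such a block sits in a training step. The dummy `nn.Linear` model and the random batch are placeholders standing in for your real model and data pipeline; only the marked block mirrors the snippet above.

```python
import torch
import torch.nn as nn

# Dummy stand-ins for your real model, loss, and data pipeline.
model = nn.Linear(10, 2)
loss_fn = nn.CrossEntropyLoss()
use_gpu = torch.cuda.is_available()
device = torch.device("cuda" if use_gpu else "cpu")
model.to(device)

# One fake batch: 4 samples, 10 features, 2 classes.
data = (torch.randn(4, 10), torch.randint(0, 2, (4,)))

################### TO BE MODIFIED ###################
inputs, labels = data
if use_gpu:
    inputs = inputs.to(device)
    labels = labels.to(device)
outputs = model(inputs)
loss = loss_fn(outputs, labels)
#####################################################

loss.backward()  # gradients are now populated for an optimizer step
```

If your model takes multiple inputs (or a dict of tensors), only the code inside the marked block changes; everything around it stays the same.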
I wrote the `.py` scripts so that if you want to understand all the underlying code, you can start at the beginning of a file and read the code sequentially.
- `lr_find.py`
  - `lr_find`
  - `plot_lr_find`
- `fit_one_cycle.py`
  - TODO
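For intuition, the learning-rate range test behind `lr_find` (from Leslie N. Smith's work) can be sketched in plain PyTorch as follows. This is a simplified illustration with locally defined names, not the actual implementation in `lr_find.py`: the learning rate grows exponentially each step while the loss is recorded, and plotting loss against LR shows roughly where training is fastest.

```python
import torch
import torch.nn as nn

def lr_range_test(model, loss_fn, batches, start_lr=1e-7, end_lr=10.0, num_it=100):
    """Exponentially increase the LR each step and record the loss."""
    optimizer = torch.optim.SGD(model.parameters(), lr=start_lr)
    # Multiplier so that the LR goes from start_lr to end_lr in num_it steps.
    mult = (end_lr / start_lr) ** (1 / max(num_it - 1, 1))
    lrs, losses = [], []
    lr = start_lr
    for it, (inputs, labels) in enumerate(batches):
        if it >= num_it:
            break
        for group in optimizer.param_groups:
            group["lr"] = lr
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        optimizer.step()
        lrs.append(lr)
        losses.append(loss.item())
        lr *= mult
    return lrs, losses

# Tiny smoke run with fake data.
torch.manual_seed(0)
model = nn.Linear(10, 2)
batches = [(torch.randn(8, 10), torch.randint(0, 2, (8,))) for _ in range(20)]
lrs, losses = lr_range_test(model, nn.CrossEntropyLoss(), batches, num_it=20)
```

In practice you would pick a learning rate a bit below the point where the recorded loss starts to blow up.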
Python type annotations

```python
def func(a: int, b: bool = True, model: nn.Module = None) -> int: pass
```

To understand the code, first remove all the types and you get

```python
def func(a, b, model): pass
```

What the annotated code means is: `a` is of type `int`, `b` is of type `bool` with default value `True`, and `model` is of type `torch.nn.Module` with default `None`. The final `-> int` means that the function returns an `int`. Type annotations are not necessary for these functions to work, but they help you understand the arguments better.
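A quick way to see that annotations are purely informative at runtime (a small standalone example, not code from this repo): Python stores them in the function's `__annotations__` attribute and does not enforce them.

```python
def func(a: int, b: bool = True) -> int:
    # The body ignores the annotations entirely.
    return a if b else -a

# Annotations are stored as metadata on the function...
print(func.__annotations__)
# {'a': <class 'int'>, 'b': <class 'bool'>, 'return': <class 'int'>}

# ...but not enforced: passing a str where an int is annotated raises no error.
print(func("hello"))  # hello
```

Static checkers like mypy can flag the mismatched call, but plain Python runs it without complaint.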
I have written a blog post summarizing the latest techniques used to train neural networks. These techniques were proposed by Leslie N. Smith and form one of the main training approaches used in fastai. I have summarized the 4 research papers in this notebook: Reproducing Leslie N. Smith's papers using fastai.
If some documentation is missing or some piece of code is not clear, open an issue and I will be happy to clarify. Also, if you find a bug, file a PR or open an issue.
fastai_without_fastai is MIT licensed, as found in the LICENSE file.