[Figure: kan_plot]

Kolmogorov-Arnold Networks (KANs)

This is the GitHub repo for the paper "KAN: Kolmogorov-Arnold Networks". Find the documentation here.

Kolmogorov-Arnold Networks (KANs) are promising alternatives to Multi-Layer Perceptrons (MLPs). KANs have strong mathematical foundations just like MLPs: MLPs are based on the universal approximation theorem, while KANs are based on the Kolmogorov-Arnold representation theorem. KANs and MLPs are dual: KANs have activation functions on edges, while MLPs have activation functions on nodes. This simple change makes KANs better (sometimes much better!) than MLPs in terms of both model accuracy and interpretability. A quick intro to KANs is available here.
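
As a concrete illustration of the edge-activation idea, here is a minimal sketch of building a KAN with this library (assuming pykan's KAN class, installed as described below; constructor arguments such as grid and k may vary slightly between versions):

import torch
from kan import KAN

# width=[2, 5, 1]: 2 input nodes, 5 hidden nodes, 1 output node.
# Every edge between consecutive layers carries its own learnable spline
# activation (grid = number of grid intervals, k = spline order), whereas
# an MLP would place fixed activations on the nodes instead.
model = KAN(width=[2, 5, 1], grid=5, k=3, seed=0)

x = torch.rand(100, 2)   # a batch of 100 two-dimensional inputs
y = model(x)             # forward pass; y has shape (100, 1)
print(y.shape)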

[Figure: mlp_kan_compare]

Accuracy

KANs scale faster than MLPs and achieve better accuracy with fewer parameters.

Example 1: fitting symbolic formulas

Example 2: fitting special functions

Example 3: PDE solving

Example 4: avoid catastrophic forgetting

Interpretability

KANs can be visualized intuitively and offer interpretability and interactivity that MLPs cannot provide. They can potentially be used to discover new scientific laws.

Example 1: Symbolic formulas

Example 2: Discovering mathematical laws of knots

Example 3: Discovering physical laws of Anderson localization

Example 4: Training of a three-layer KAN

[Animation: kan_training_low_res]

Installation

There are two ways to install pykan: via PyPI or via GitHub.

Installation via GitHub

git clone https://github.com/KindXiaoming/pykan.git
cd pykan
pip install -e .

Installation via PyPI

pip install pykan

Requirements

# python==3.9.7
matplotlib==3.6.2
numpy==1.24.4
scikit_learn==1.1.3
setuptools==65.5.0
sympy==1.11.1
torch==2.2.2
tqdm==4.66.2

To install requirements:

pip install -r requirements.txt

Computation requirements

Examples in the tutorials typically run on a single CPU in less than 10 minutes. All examples in the paper are runnable on a single CPU in less than one day. Training KANs for PDE solving is the most expensive and may take hours to days on a single CPU. We use CPUs to train our models because we carried out parameter sweeps (for both MLPs and KANs) to obtain Pareto frontiers; this involves thousands of small models, which is why we use CPUs rather than GPUs. Admittedly, our problem scales are smaller than typical machine learning tasks, but they are typical for science-related tasks. If your task is large in scale, it is advisable to use GPUs, as sketched below.
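
As a reference point, here is a hedged sketch of moving a KAN onto a GPU. It assumes your pykan version accepts a device argument (recent releases do); since KAN is an ordinary torch.nn.Module, the usual model.to(device) is the fallback otherwise:

import torch
from kan import KAN

# use a GPU if one is available, otherwise fall back to CPU
device = 'cuda' if torch.cuda.is_available() else 'cpu'

# device places the spline grids and parameters on the chosen device
# (assumption: your pykan version exposes this argument; otherwise use .to(device))
model = KAN(width=[2, 5, 1], grid=5, k=3, seed=0, device=device)

x = torch.rand(1024, 2, device=device)  # inputs must live on the same device as the model
y = model(x)                            # forward pass on the GPU (or the CPU fallback)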

Documentation

The documentation can be found here.

Tutorials

Quickstart

Get started with the hellokan.ipynb notebook.
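
The notebook walks through a small regression task end to end. The sketch below mirrors that flow under mild assumptions: it uses pykan's create_dataset helper and the KAN methods fit, plot, prune, auto_symbolic, and symbolic_formula; older pykan versions name the training call model.train instead of model.fit.

import torch
from kan import KAN, create_dataset

# target function to recover: f(x1, x2) = exp(sin(pi*x1) + x2^2)
f = lambda x: torch.exp(torch.sin(torch.pi * x[:, [0]]) + x[:, [1]] ** 2)
dataset = create_dataset(f, n_var=2)  # builds train/test inputs and labels

# a [2, 5, 1] KAN: 2 inputs, 5 hidden nodes, 1 output, cubic splines (k=3) on every edge
model = KAN(width=[2, 5, 1], grid=3, k=3, seed=0)
model.fit(dataset, opt="LBFGS", steps=50, lamb=0.001)  # lamb adds sparsity regularization

model(dataset['train_input'])  # forward pass so activations are cached for plotting
model.plot()                   # visualize the learned edge activations

model = model.prune()                           # drop edges/nodes that contribute little
model.fit(dataset, opt="LBFGS", steps=50)       # brief refit of the pruned network
model.auto_symbolic(lib=['sin', 'exp', 'x^2'])  # snap spline activations to symbolic candidates
model.fit(dataset, opt="LBFGS", steps=50)       # refit the affine parameters of the symbolic fits
print(model.symbolic_formula()[0][0])           # the recovered closed-form expression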

More demos

More notebook tutorials can be found in the tutorials folder.

Citation

@misc{liu2024kan,
      title={KAN: Kolmogorov-Arnold Networks}, 
      author={Ziming Liu and Yixuan Wang and Sachin Vaidya and Fabian Ruehle and James Halverson and Marin Soljačić and Thomas Y. Hou and Max Tegmark},
      year={2024},
      eprint={2404.19756},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Contact

If you have any questions, please contact [email protected]
