Spiking Neural Networks (SNNs), as brain-inspired machine learning algorithms, are attracting attention due to their event-driven computing style. Most unsupervised SNNs are trained through competitive learning with Spike-Timing-Dependent Plasticity (STDP).
However, previous SNNs trained through this approach are limited by slow learning speed and/or sub-optimal learning capability.
To ease these limitations, we propose a Spiking Inception architecture for unsupervised SNNs.
Compared to the widely used Fully-Connected (FC) and Locally-Connected (LC) architectures, STDP-based unsupervised SNNs built on our architecture
show much improved learning capability, learning efficiency, and robustness.
For more details, please refer to our papers: [Elsevier] [arXiv].
The Inception module follows a Split-and-Merge strategy: the input is split into several parallel pathways with a set of specialized filters (e.g., 3×3, 5×5, 7×7 convolutional kernels, pooling, etc.), and then all pathways are merged by concatenation. Under this strategy, Inception modules can integrate multi-scale spatial information and improve the network's parallelism. Following this idea, we designed an Inception-like multi-pathway network architecture. To further improve the architecture's learning efficiency and robustness, we divided each pathway into multiple parallel sub-networks by partitioning competition areas. Finally, we constructed a high-parallelism Inception-like network architecture consisting of 21 parallel sub-networks.
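As a plain illustration of the Split-and-Merge idea (this sketch is NumPy only and is not the SNN code in this repository; the averaging filters and kernel sizes are assumptions chosen for demonstration):

```python
# Illustration only: the Split-and-Merge idea in plain NumPy.
# The averaging filters and the kernel sizes (3, 5, 7) are stand-ins for the
# specialized pathways, not the filters used in the actual spiking network.
import numpy as np

def avg_filter(image, k):
    """Simple k x k average filter ('valid' convolution) standing in for
    one pathway with its own receptive-field size."""
    h, w = image.shape
    out = np.zeros((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = image[i:i + k, j:j + k].mean()
    return out

image = np.random.rand(28, 28)                          # an MNIST-sized input
pathways = [avg_filter(image, k) for k in (3, 5, 7)]    # split into parallel pathways
merged = np.concatenate([p.ravel() for p in pathways])  # merge by concatenation
print(merged.shape)  # (1736,) = 26*26 + 24*24 + 22*22 features
```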
Here we provide an implementation of our Spiking Inception architecture. All code is written in Python 2.
- Python 2.7
- Brian 2.2.1
Other versions of Brian (>= 2.0) might work as well, but this is not guaranteed.
Please `cd` to the directory containing all source code, e.g. `cd /path_to_src`, and then you can train the SNN with a simple command:
`python Train.py`
If it is the first time you run `Train.py`, you need to download the MNIST dataset and set `MNIST_data_path` in `Functions.py` to specify the directory containing the data.
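For example, the setting in `Functions.py` might look like the following (the path is only a placeholder; point it at wherever you stored the dataset):

```python
# In Functions.py: directory containing the downloaded MNIST files.
# '/path_to_mnist_data/' is only a placeholder path.
MNIST_data_path = '/path_to_mnist_data/'
```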
The trained weight file will be saved in a directory named `weights`.
You need to set `load_ending` in `Test.py` to specify which weight file you want to load from `weights`.
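For example (the value below is purely a placeholder, and its exact form depends on how the weight files were named during training):

```python
# In Test.py: select which weight file under weights/ to load.
# '10000' is a hypothetical example, not a file shipped with this repository.
load_ending = '10000'
```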
Then, you can test the trained SNN with a simple command:
`python Test.py`
Note that running `Test.py` won't directly give you a testing result (accuracy); it saves the spiking activities in a directory named `activity`.
You can use the following command to get a result (accuracy) on the MNIST test set:
`python Evaluation.py`
Note that you need to set `trained_sample` in `Evaluation.py` to specify which activity file you want to load from `activity`.
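For example (again, the value is a placeholder; it should correspond to one of the activity files that `Test.py` produced):

```python
# In Evaluation.py: select which activity file under activity/ to evaluate.
# '10000' is a hypothetical example and depends on how Test.py named its output.
trained_sample = '10000'
```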
If this repository helps your work, please kindly cite our papers:
- Mingyuan Meng, Xingyu Yang, Lei Bi, Jinman Kim, Shanlin Xiao, Zhiyi Yu, "High-parallelism Inception-like Spiking Neural Networks for Unsupervised Feature Learning," Neurocomputing, vol. 441, pp. 92-104, 2021, doi: 10.1016/j.neucom.2021.02.027. [Elsevier] [arXiv]
- Mingyuan Meng, Xingyu Yang, Shanlin Xiao, Zhiyi Yu, "Spiking Inception Module for Multi-layer Unsupervised Spiking Neural Networks," International Joint Conference on Neural Networks (IJCNN), pp. 1-8, 2020, doi: 10.1109/IJCNN48605.2020.9207161. [IEEE] [arXiv]