Have you ever wondered what your data would look like in a neuromorphic representation? Have you ever wondered what is actually going on inside a neuron and what a bunch of them sounds like? Just run the WiN_GUI.py file and you will find out!
Further information can be found in our related publication WiN-GUI: A graphical tool for neuron-based encoding.
Here we present the WiN-GUI (Watch inside Neurons-GUI) to interactively change neuron models and parameters. It lets you load any sample-based time-series data (for more details see Data structure) and converts it into membrane voltage traces and spike trains. Playing with the parameter sliders helps to understand how the parameters interact, how the neuron mechanics work, and how your encoded data changes accordingly. The neuron model can be changed with a single click, and the GUI will update the visualization. If you have your own neuron model and want to tune the parameters, check out the How-to: include custom neuron models chapter.
Available neuron models are:
- Mihalas-Niebur neuron.
- Izhikevich neuron.
- Leaky integrate-and-fire neuron.
- Recurrent leaky integrate-and-fire neuron.
The WiN-GUI is based on PyQt6 and PyTorch. Some other packages are required, too. Use the provided requirements file to set everything up. Further libraries that are often missing are listed in Packages you might need to install.
The WiN-GUI is intentionally designed to visualize data from robotic experiments with different classes and multiple recordings per class. Nonetheless, the GUI also supports a single class with a single recording. In that case you only have to make sure that the data is structured as: nb_time_steps x nb_sensors. If you provide the data as a Python dictionary with an entry 'label', the data selection is available via the drop-down menu, otherwise via the dial. If you want to visualize multiple classes and multiple recordings, you have to provide each recording with the corresponding label (e.g. 'label': class_name).
Below you can find some example data structures:
- single trial: nb_time_steps x nb_sensors
- single class, multiple repetitions: nb_trial x nb_time_steps x nb_sensors (do not forget to provide the label 'repetition' as dictionary key)
- multi-class, no repetition: nb_classes x nb_time_steps x nb_sensors (do not forget to provide the label 'class' as dictionary key)
- multi-class, multiple repetitions: (nb_classes x nb_repetitions) x nb_time_steps x nb_sensors
Please have a look at the example data provided for further details.
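As a rough illustration, a multi-class, multi-repetition dataset could be assembled like the sketch below. Only the 'label' key is described above; the 'data' key and all sizes are hypothetical, so please compare against the provided example data for the exact structure the GUI expects.

```python
import numpy as np

# Hypothetical dataset: 3 classes, 5 repetitions each, 1000 time steps, 12 sensors.
nb_classes, nb_repetitions, nb_time_steps, nb_sensors = 3, 5, 1000, 12

recordings = np.random.randn(nb_classes * nb_repetitions, nb_time_steps, nb_sensors)
labels = [f"class_{c}" for c in range(nb_classes) for _ in range(nb_repetitions)]

dataset = {
    "data": recordings,  # (nb_classes x nb_repetitions) x nb_time_steps x nb_sensors
    "label": labels,     # one class name per recording, enables the drop-down selection
}
```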
Neuron models are implemented using PyTorch. The models can be found here, and the corresponding parameters here. Implementing a neuron requires setting all parameters with default values. Parameters that should be adjustable in the GUI must be added under the same name in the parameters file. For all parameters not added, the GUI falls back to the default values specified in the neuron model. Each parameter entry must start with the lower boundary, followed by the upper boundary, the step size, and finally the initial value.
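As a sketch, a slider definition could look like the following. The parameter names are illustrative; they must match the names used in your neuron model's init, and the exact file format is defined in the linked parameters file.

```python
# Hypothetical slider definitions following the order described above:
# [lower boundary, upper boundary, step size, initial value].
LIF_slider_parameters = {
    "threshold": [0.1, 2.0, 0.1, 1.0],    # spike threshold
    "tau_mem":   [1.0, 50.0, 1.0, 10.0],  # membrane time constant (ms)
    "tau_syn":   [1.0, 50.0, 1.0, 5.0],   # synaptic time constant (ms)
}
```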
We use SciPy's multidimensional image processing module (scipy.ndimage) for filtering, and its signal processing module (scipy.signal) for resampling.
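A minimal sketch of what such a resampling step looks like; the exact function and arguments used by the GUI may differ, and the recording here is random data for illustration only.

```python
import numpy as np
from scipy import signal

# Resample a recording of 1000 time steps and 12 sensors
# to 2000 time steps along the time axis.
recording = np.random.randn(1000, 12)
resampled = signal.resample(recording, 2000, axis=0)
```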
The channel selection panel allows enabling or disabling individual channels to improve the readability of the remaining ones. Channels to visualize are highlighted in green, whereas hidden channels are highlighted in red. The channel selection can be adapted to personal needs and preferences regarding shape, spacing, and location.
If you want to change the filter properties, you can do this here. To change the filter type, replace the default filter here.
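For example, swapping one scipy.ndimage filter for another could look roughly like the sketch below. The default filter and its parameters are defined in the linked source, not here; the values shown are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

recording = np.random.randn(1000, 12)

# Illustrative smoothing along the time axis with a Gaussian filter:
smoothed = ndimage.gaussian_filter1d(recording, sigma=3, axis=0)

# Swapping the filter type, e.g. for a median filter:
smoothed_alt = ndimage.median_filter(recording, size=(5, 1))
```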
The WiN-GUI can be extended with further neuron models! To use your custom neuron model, the model has to be added here and the corresponding parameters here. Finally, you need to load your model to make it available in the GUI. A brief overview of a custom neuron model in PyTorch can be found here.
Here is a step-by-step guide based on the LIF neuron (a minimal model sketch in PyTorch follows the list):
- Add your neuron model:
- Add the neuron parameters:
  - Add a new dictionary in the neuron_parameters file, e.g.:
  - Each entry has to match a single member in the neuron model's "init" list.
- Load the neuron model and parameters:
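The sketch below illustrates what such a parameterized neuron model might look like in PyTorch. It is not the repository's interface; class, attribute, and parameter names are assumptions chosen to mirror the steps above.

```python
import torch
import torch.nn as nn


class LIF(nn.Module):
    """Minimal leaky integrate-and-fire sketch for illustration only."""

    def __init__(self, threshold=1.0, alpha=0.9):
        super().__init__()
        self.threshold = threshold  # spike threshold (slider parameter)
        self.alpha = alpha          # membrane decay per time step (slider parameter)

    def forward(self, x):
        # x: nb_time_steps x nb_sensors
        mem = torch.zeros(x.shape[1])
        mem_rec, spk_rec = [], []
        for t in range(x.shape[0]):
            mem = self.alpha * mem + x[t]           # leaky integration of the input
            spk = (mem >= self.threshold).float()   # spike when threshold is crossed
            mem = mem * (1.0 - spk)                 # reset membrane after a spike
            mem_rec.append(mem)
            spk_rec.append(spk)
        return torch.stack(mem_rec), torch.stack(spk_rec)


# Usage sketch: convert a recording into voltage traces and spike trains.
neuron = LIF(threshold=1.0, alpha=0.9)
voltage, spikes = neuron(torch.randn(1000, 12))  # nb_time_steps x nb_sensors
```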
If you implement a new model and want to make it available for the community, let us know!
Those who want to give their GUI the perfect personal touch can change the sensor visualization to reflect the physical setup. A custom visualization must be able to handle the splitting of each sensor stream into two streams. In other words, two layouts are needed: 1. the original one and 2. one with twice as many positions as the original.
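A rough sketch of what the two layouts could look like, assuming a simple grid of sensor positions; the actual layout format used by the GUI is defined in the source and may differ.

```python
# Hypothetical sensor layout: grid positions (row, column) for each stream.
# Two layouts are needed because every sensor stream is split into two streams.
def grid_layout(nb_streams, nb_columns=4):
    return [(i // nb_columns, i % nb_columns) for i in range(nb_streams)]

nb_sensors = 12
original_layout = grid_layout(nb_sensors)      # one position per sensor
split_layout = grid_layout(2 * nb_sensors)     # one position per split stream
```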
apt-get packages: apt-get -y install '^libxcb.*-dev' libx11-xcb-dev libglu1-mesa-dev libxrender-dev libxi-dev libxkbcommon-dev libxkbcommon-x11-dev
apt packages: apt -y install ffmpeg portaudio19-dev
When running on an older Python version with the most recent PyQt6 package, you might encounter the following error: `symbol lookup error: [your_path]/python3.8/site-packages/PyQt6/Qt6/plugins/platforms/../../lib/libQt6WaylandClient.so.6: undefined symbol: wl_proxy_marshal_flags`
To fix this, you can enforce the use of X11 instead of Wayland by adding the following line to your bashrc: `export QT_QPA_PLATFORM=xcb`