Experiments on bio-inspired plasticity to improve reservoir computing
HAG is a biologically inspired approach to improving Reservoir Computing networks. Grounded in Hebbian plasticity, HAG dynamically constructs and optimizes reservoir architectures to make time-series prediction and classification more adaptable and efficient. By autonomously forming and pruning connections between neurons based on their Pearson correlation, HAG tailors each reservoir to the demands of the task at hand, in line with biological neural network principles and Cover's theorem.
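The connection-update rule can be sketched in a few lines of NumPy. This is only an illustration of the idea described above: the function name, thresholds, weight initialization, and even the direction of the rule (which correlation levels trigger growth versus pruning) are placeholders, not the repository's actual API.

```python
import numpy as np

def update_connectivity(states, W, grow_thresh=0.6, prune_thresh=0.1):
    """Illustrative Hebbian-style update of a reservoir's connectivity.

    states: (timesteps, n_neurons) activations recorded while running the task
    W:      (n_neurons, n_neurons) current recurrent weight matrix
    """
    corr = np.corrcoef(states.T)      # pairwise Pearson correlations between neurons
    np.fill_diagonal(corr, 0.0)       # ignore self-correlations

    grow = (np.abs(corr) > grow_thresh) & (W == 0)    # strongly correlated, unconnected pairs
    prune = (np.abs(corr) < prune_thresh) & (W != 0)  # weakly correlated, connected pairs

    W = W.copy()
    W[grow] = np.random.uniform(-0.5, 0.5, size=grow.sum())  # create new connections
    W[prune] = 0.0                                            # remove uninformative ones
    return W
```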
Related publications:

- **A Bio-Inspired Model for Audio Processing**
  Tanguy Cazalets, Joni Dambre. Presented at C1 Conference 2023.
  Introduces a biologically inspired approach to audio processing, emphasizing homeostatic mechanisms and plasticity for efficient neural network performance.
- **A Homeostatic Activity-Dependent Structural Plasticity Algorithm for Richer Input Combination**
  Tanguy Cazalets, Joni Dambre. Presented at P1 Conference 2023.
  Explores an algorithm for structural plasticity that enhances neural network adaptability to diverse input combinations.
| | Feature | Summary |
|---|---|---|
| 🔬️ | Dynamic Reservoirs | Dynamically generates connectivity matrices using Hebbian-inspired rules, ensuring optimized, task-specific reservoir properties for enhanced linear separability and efficiency. |
| 🧩 | Structural Plasticity | Implements biologically plausible mechanisms to create or prune connections based on activity levels and correlations, enabling the reservoir to self-organize around task requirements. |
| ⚡️ | Performance Boost | Outperforms traditional Echo State Networks (ESNs) across various benchmarks, offering higher accuracy in classification and reduced error in time-series prediction tasks. |
| 📊 | Comprehensive Metrics | Evaluates reservoirs with metrics including Pearson correlation, spectral radius, and cumulative explained variance to ensure enriched dynamics and decorrelated feature representations. |
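The metrics in the last row are standard quantities that can be reproduced with NumPy and scikit-learn. A minimal sketch follows, with illustrative function names that are not taken from the project's code:

```python
import numpy as np
from sklearn.decomposition import PCA

def spectral_radius(W):
    """Largest absolute eigenvalue of the recurrent weight matrix."""
    return float(np.max(np.abs(np.linalg.eigvals(W))))

def mean_abs_correlation(states):
    """Mean absolute pairwise Pearson correlation between neuron activation
    traces; lower values indicate more decorrelated feature representations."""
    corr = np.corrcoef(states.T)
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]
    return float(np.mean(np.abs(off_diag)))

def cumulative_explained_variance(states):
    """Cumulative explained variance ratio of the principal components of the
    reservoir states; a slower rise suggests richer, higher-dimensional dynamics."""
    pca = PCA().fit(states)
    return np.cumsum(pca.explained_variance_ratio_)
```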
Before getting started with HAG, ensure your runtime environment meets the following requirements:
- `reservoirPy` for reservoir computing training and inference
- `optuna` for hyperparameter optimization
- `librosa` for time-series preprocessing
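For reference, a plain Echo State Network baseline can be put together with reservoirPy in a few lines. The hyperparameters and toy data below are placeholders; the project's own training scripts may be configured differently.

```python
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge

# Toy task: one-step-ahead prediction of a noisy sine wave
t = np.linspace(0, 50, 2000)
series = np.sin(t).reshape(-1, 1) + 0.01 * np.random.randn(2000, 1)
X, y = series[:-1], series[1:]

reservoir = Reservoir(units=300, sr=0.9, lr=0.3)  # spectral radius and leak rate are placeholder values
readout = Ridge(ridge=1e-6)                       # linear readout with Tikhonov regularization

esn = reservoir >> readout                        # connect reservoir to readout
esn = esn.fit(X[:1500], y[:1500], warmup=100)     # train the readout on the first part
pred = esn.run(X[1500:])                          # predict on the held-out tail
```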
Install HAG using one of the following methods:
Build from source:
- Clone the HAG repository:
❯ git clone https://github.com/Finebouche/HAG
- Navigate to the project directory:
❯ cd HAG
- Install the project dependencies:
❯ conda env create -f environment.yml
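- Activate the environment before running any experiments (the environment name `HAG` is an assumption here; the actual name is defined in environment.yml):
❯ conda activate HAG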
This project is protected under the MIT License.
- This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860949.