Neurons — 122 Models¶
122 neuron models spanning 83 years of computational neuroscience (1943-2026): 109 individual biological model files under neurons/models/, plus 5 core stochastic computing neurons, plus 9 AI-optimized models (ArcaneNeuron and 8 novel designs in ai_optimized.py).
Quick Start¶
```python
# Flat import (any model)
from sc_neurocore.neurons import HodgkinHuxleyNeuron, AdExNeuron

# Individual file import
from sc_neurocore.neurons.models.hodgkin_huxley import HodgkinHuxleyNeuron
```
Core SC Neurons (bitstream-capable)¶
| Class | Domain |
|---|---|
| StochasticLIFNeuron | Software simulation (fast) |
| FixedPointLIFNeuron | Bit-true Q8.8 hardware model |
| HomeostaticLIFNeuron | Self-regulating firing rate |
| SCIzhikevichNeuron | Rich dynamics (bursting, chattering) |
| StochasticDendriticNeuron | XOR dendritic processing |
sc_neurocore.neurons.base.BaseNeuron¶
Bases: ABC
Abstract base class for stochastic neuron models.
All neurons should expose:

- step(input_current) -> spike (0 or 1)
- reset_state()
- get_state() -> dict
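A standalone sketch of that contract. The stand-in base class mirrors the documented interface; CountingNeuron is a hypothetical toy subclass, not part of the library:

```python
from abc import ABC, abstractmethod

class BaseNeuron(ABC):
    """Stand-in mirroring the documented interface."""
    @abstractmethod
    def step(self, input_current): ...
    @abstractmethod
    def reset_state(self): ...
    @abstractmethod
    def get_state(self): ...

class CountingNeuron(BaseNeuron):
    """Toy neuron that fires once every `period` calls to step()."""
    def __init__(self, period=4):
        self.period, self.counter = period, 0

    def step(self, input_current):
        self.counter += 1
        if self.counter >= self.period:
            self.counter = 0
            return 1
        return 0

    def reset_state(self):
        self.counter = 0

    def get_state(self):
        return {"counter": self.counter}

n = CountingNeuron()
spikes = [n.step(1.0) for _ in range(8)]  # fires on every 4th step
```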
Source code in src/sc_neurocore/neurons/base.py
step(input_current) (abstractmethod)¶
Advance the neuron by one time step and return a spike (0 or 1).
Source code in src/sc_neurocore/neurons/base.py
reset_state() (abstractmethod)¶
Reset the internal state to default / initial values.
Source code in src/sc_neurocore/neurons/base.py
get_state() (abstractmethod)¶
Return a dict with the internal state (e.g., membrane potential).
Source code in src/sc_neurocore/neurons/base.py
sc_neurocore.neurons.stochastic_lif.StochasticLIFNeuron (dataclass)¶
Bases: BaseNeuron
Discrete-time noisy leaky integrate-and-fire neuron.
dv/dt = -(v - v_rest) / tau_mem + R * I + noise
Parameters use normalised units (voltage [0,1], time in ms). Defaults from Gerstner & Kistler, Spiking Neuron Models, 2002.
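A minimal forward-Euler sketch of this update rule (illustrative only; the actual class also handles the refractory counter and other details, and the helper name lif_step is an assumption):

```python
import random

def lif_step(v, I, v_rest=0.0, v_reset=0.0, v_threshold=1.0,
             tau_mem=20.0, R=1.0, dt=1.0, noise_std=0.0):
    """One forward-Euler step of dv/dt = -(v - v_rest)/tau_mem + R*I + noise."""
    v += dt * (-(v - v_rest) / tau_mem + R * I)
    if noise_std > 0.0:
        v += random.gauss(0.0, noise_std)
    if v >= v_threshold:
        return v_reset, 1   # spike, then reset
    return v, 0

# Subthreshold drive: the membrane charges toward R*I*tau_mem = 2.0
# and crosses threshold roughly every 14 steps.
v, spikes = 0.0, 0
for _ in range(50):
    v, s = lif_step(v, 0.1)
    spikes += s
```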
Example¶
```python
>>> neuron = StochasticLIFNeuron(v_threshold=1.0, tau_mem=20.0, noise_std=0.0)
>>> spikes = [neuron.step(1.5) for _ in range(50)]
>>> sum(spikes) > 0
True
>>> neuron.get_state()  # membrane voltage + refractory counter
```
Process a bitstream as input current:
```python
>>> import numpy as np
>>> bits = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)
>>> neuron.reset_state()
>>> out = neuron.process_bitstream(bits, input_scale=2.0)
>>> out.shape
(8,)
```
Source code in src/sc_neurocore/neurons/stochastic_lif.py
process_bitstream(input_bits, input_scale=1.0)¶
Process a bitstream (array of 0s and 1s) as input current. Returns an array of spikes (0s and 1s).
input_scale: scaling factor to convert bit (0/1) to current amplitude.
Source code in src/sc_neurocore/neurons/stochastic_lif.py
sc_neurocore.neurons.fixed_point_lif.FixedPointLIFNeuron (dataclass)¶
Bit-true fixed-point model of the Verilog sc_lif_neuron.
All arithmetic is performed in signed Q(FRACTION) fixed-point with explicit bit-width masking so that overflow/wrap behaviour matches the hardware exactly.
Parameters¶
- data_width (int): total bit width of all fixed-point values (default 16).
- fraction (int): number of fractional bits (default 8, giving Q8.8).
- v_rest, v_reset, v_threshold (int): membrane parameters in Q(FRACTION) fixed-point.
- refractory_period (int): number of clock cycles to hold after a spike.
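The Q8.8 wrap arithmetic can be illustrated with two small conversion helpers (a sketch of the representation only; to_fixed and to_float are hypothetical names, not library API):

```python
def to_fixed(x, fraction=8, data_width=16):
    """Float -> signed Q(fraction) integer with hardware-style wraparound."""
    raw = int(round(x * (1 << fraction))) & ((1 << data_width) - 1)
    if raw >= 1 << (data_width - 1):   # reinterpret the top bit as sign
        raw -= 1 << data_width
    return raw

def to_float(raw, fraction=8):
    """Signed Q(fraction) integer -> float."""
    return raw / (1 << fraction)
```

In Q8.8 one unit of voltage is 256, and values outside [-128, 128) wrap exactly as a 16-bit register would.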
Example¶
```python
>>> neuron = FixedPointLIFNeuron()
>>> spike, v = neuron.step(leak_k=240, gain_k=16, I_t=100)
>>> spike in (0, 1)
True
>>> neuron.reset()
```
Source code in src/sc_neurocore/neurons/fixed_point_lif.py
step(leak_k, gain_k, I_t, noise_in=0)¶
Execute one clock cycle — bit-true match to Verilog RTL.
Parameters¶
- leak_k (int): ALPHA_LEAK in Q(FRACTION)
- gain_k (int): GAIN_IN in Q(FRACTION)
- I_t (int): input current in Q(FRACTION)
- noise_in (int): external noise in Q(FRACTION)
Returns¶
(spike, v_out) : tuple[int, int]
Source code in src/sc_neurocore/neurons/fixed_point_lif.py
reset()¶
Reset neuron state to power-on defaults.
Source code in src/sc_neurocore/neurons/fixed_point_lif.py
reset_state()¶
Reset internal state (alias for reset()).
Source code in src/sc_neurocore/neurons/fixed_point_lif.py
get_state()¶
Return dict with internal state.
Source code in src/sc_neurocore/neurons/fixed_point_lif.py
sc_neurocore.neurons.sc_izhikevich.SCIzhikevichNeuron (dataclass)¶
Bases: BaseNeuron
Stochastic Izhikevich neuron (software-only).
Standard Izhikevich model (IEEE TNN 14(6), 2003):

v' = 0.04v^2 + 5v + 140 - u + I + noise
u' = a(bv - u)
When v >= 30 mV: spike, then v <- c, u <- u + d.
Example¶
```python
>>> neuron = SCIzhikevichNeuron(noise_std=0.0)
>>> spikes = [neuron.step(10.0) for _ in range(100)]
>>> sum(spikes) > 0  # regular spiking with I=10
True
```
Source code in src/sc_neurocore/neurons/sc_izhikevich.py
sc_neurocore.neurons.homeostatic_lif.HomeostaticLIFNeuron (dataclass)¶
Bases: StochasticLIFNeuron
LIF neuron with homeostatic threshold adaptation.
Self-regulates firing rate toward a target setpoint via exponential moving average of spike rate. Based on Turrigiano (2012).
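The adaptation loop can be sketched as an exponential moving average of recent spikes driving the threshold (illustrative only; names such as ema_alpha and lr are assumptions, not the class's actual attributes):

```python
def homeostatic_update(v_threshold, rate_ema, spike,
                       target_rate=0.1, ema_alpha=0.01, lr=0.05):
    """EMA of the spike train; raise the threshold when firing exceeds
    target_rate, lower it when firing falls short."""
    rate_ema += ema_alpha * (spike - rate_ema)
    v_threshold += lr * (rate_ema - target_rate)
    return v_threshold, rate_ema

# Toy input firing at rate 0.5, well above target_rate=0.1,
# so the threshold climbs above its initial value of 1.0.
thr, ema = 1.0, 0.0
for t in range(200):
    thr, ema = homeostatic_update(thr, ema, 1 if t % 2 == 0 else 0)
```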
Example¶
```python
>>> neuron = HomeostaticLIFNeuron(target_rate=0.1, noise_std=0.0)
>>> for _ in range(200):
...     neuron.step(1.5)
>>> neuron.v_threshold != 1.0  # threshold adapted
True
```
Source code in src/sc_neurocore/neurons/homeostatic_lif.py
sc_neurocore.neurons.dendritic.StochasticDendriticNeuron (dataclass)¶
XOR-nonlinearity neuron with shunting inhibition.
Implements d1 + d2 - 2*d1*d2 (XOR truth table for binary inputs).
Based on Koch, Biophysics of Computation, 1999, Ch. 12.
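The nonlinearity itself is one line; a standalone sketch (the function name is illustrative, not library API):

```python
def dendritic_xor(d1, d2):
    """Soft XOR via shunting-style interaction: d1 + d2 - 2*d1*d2.
    Exact XOR on {0, 1}; graded mutual suppression for inputs in [0, 1]."""
    return d1 + d2 - 2 * d1 * d2
```

For binary inputs this reproduces the XOR truth table; for graded inputs each branch shunts the other in proportion to their product.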
Source code in src/sc_neurocore/neurons/dendritic.py
reset_state()¶
Reset internal state to defaults.
Source code in src/sc_neurocore/neurons/dendritic.py
get_state()¶
Return dict with internal state.
Source code in src/sc_neurocore/neurons/dendritic.py
Extended Model Library (109 models in neurons/models/)¶
Integrate-and-Fire Variants (26)¶
| Model | File | Reference |
|---|---|---|
| AdEx | adex.py | Brette & Gerstner 2005 |
| ExpIF | expif.py | Fourcaud-Trocme 2003 |
| Lapicque | lapicque.py | Lapicque 1907 |
| QIF | quadratic_if.py | Latham 2000 |
| GLIF (5 levels) | glif.py | Teeter 2018, Allen Institute |
| MAT | mat.py | Kobayashi 2009 |
| SFA | sfa.py | Benda & Herz 2003 |
| Stochastic IF | stochastic_if.py | Brunel & Hakim 1999 |
| Escape-rate | escape_rate.py | Gerstner 2000 |
| Fractional LIF | fractional_lif.py | Lundstrom 2008 |
| COBA LIF | coba_lif.py | Conductance-based |
| Perfect Integrator | perfect_integrator.py | Non-leaky IF |
| NLIF | nlif.py | Cubic nonlinearity |
| Adaptive Threshold | adaptive_threshold_if.py | Dynamic threshold |
| PLIF | plif.py | Fang 2021, learnable tau |
| Non-Resetting LIF | non_resetting_lif.py | Kobayashi 2009 |
| Gated LIF | gated_lif.py | Yao 2022, NeurIPS |
| Sigma-Delta | sigma_delta.py | Yoon 2017 |
| TC-LIF | tc_lif.py | AAAI 2024 |
| Benda-Herz | benda_herz.py | Benda 2003 |
| Integer QIF | iqif.py | Lo 2021, fixed-point |
| Complementary LIF | clif.py | ICML 2024, dual paths |
| K-LIF | klif.py | Learnable scaling |
| Inhibitory LIF | ilif.py | 2025, temporal inhibition |
| E-prop ALIF | e_prop_alif.py | Bellec 2020, eligibility |
| Energy LIF | energy_lif.py | Fardet 2020 |
Biophysical / Conductance-Based (11)¶
| Model | File | Reference |
|---|---|---|
| Hodgkin-Huxley | hodgkin_huxley.py | HH 1952 (Nobel Prize) |
| Connor-Stevens | connor_stevens.py | Connor 1977, A-type K+ |
| Wang-Buzsaki | wang_buzsaki.py | Wang 1996, FS interneuron |
| Pinsky-Rinzel | pinsky_rinzel.py | Pinsky 1994, 2-compartment |
| Destexhe | destexhe_thalamic.py | Destexhe 1993, T-current |
| Huber-Braun | huber_braun.py | Braun 1998, cold receptor |
| Gutkin-Ermentrout | gutkin_ermentrout.py | Gutkin 1998 |
| Traub-Miles | traub_miles.py | Traub 1991, hippocampal |
| Golomb FS | golomb_fs.py | Golomb 2007, Kv3 channels |
| Mainen-Sejnowski | mainen_sejnowski.py | Mainen 1996, axonal Na |
| Pospischil | pospischil.py | Pospischil 2008, 5 types |
Oscillatory / Qualitative (7)¶
| Model | File | Reference |
|---|---|---|
| FitzHugh-Nagumo | fitzhugh_nagumo.py | FitzHugh 1961 |
| Morris-Lecar | morris_lecar.py | Morris 1981 |
| Hindmarsh-Rose | hindmarsh_rose.py | HR 1984, chaotic bursting |
| Resonate-and-Fire | resonate_and_fire.py | Izhikevich 2001 |
| Theta | theta.py | Ermentrout 1986 |
| FitzHugh-Rinzel | fitzhugh_rinzel.py | FitzHugh 1976, 3D |
| Terman-Wang | terman_wang.py | Terman 1995, LEGION |
Bursting (6)¶
| Model | File | Reference |
|---|---|---|
| Chay | chay.py | Chay 1985, pancreatic beta |
| Butera | butera_respiratory.py | Butera 1999, respiratory |
| Sherman-Rinzel-Keizer | sherman_rinzel_keizer.py | Sherman 1988 |
| Plant R15 | plant_r15.py | Plant 1981, Aplysia |
| Bertram Phantom | bertram_phantom.py | Bertram 2008 |
| Pernarowski | pernarowski.py | Pernarowski 1994 |
Multi-Compartment (4)¶
| Model | File | Reference |
|---|---|---|
| Hay L5 Pyramidal | hay_l5.py | Hay 2011, 3-compartment BAC firing |
| Booth-Rinzel | booth_rinzel.py | Booth 1995, bistable motoneuron |
| Dendrify | dendrify.py | Beniaguev 2022, active dendrite |
| TC-LIF | tc_lif.py | AAAI 2024, soma+dendrite |
Synaptic (3)¶
Alpha, Synaptic (dual-exp), Tsodyks-Markram (STP)
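The alpha and dual-exponential kernels these models use have standard closed forms, sketched here with illustrative time constants (function names are assumptions, not library API):

```python
import math

def alpha_kernel(t, tau=5.0):
    """Alpha synapse: (t/tau) * exp(1 - t/tau); peaks at t = tau with amplitude 1."""
    return (t / tau) * math.exp(1.0 - t / tau) if t >= 0 else 0.0

def dual_exp_kernel(t, tau_rise=1.0, tau_decay=5.0):
    """Difference of exponentials (unnormalised): fast rise, slow decay."""
    return math.exp(-t / tau_decay) - math.exp(-t / tau_rise) if t >= 0 else 0.0
```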
Map-Based / Discrete (6)¶
Rulkov, Chialvo, Courbage-Nekorkin, Medvedev, Ibarz-Tanaka, Cazelles
Stochastic (4)¶
Poisson, Inhomogeneous Poisson, Galves-Locherbach, GLM (Pillow 2008)
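A homogeneous Poisson spike train can be sketched as a per-step Bernoulli draw with P(spike) = rate * dt (a standard approximation, independent of the library's own generator):

```python
import random

def poisson_spikes(rate_hz, duration_ms, dt_ms=1.0, seed=0):
    """Bernoulli approximation of a Poisson process, valid for rate*dt << 1."""
    rng = random.Random(seed)
    p = rate_hz * dt_ms / 1000.0
    return [1 if rng.random() < p else 0 for _ in range(int(duration_ms / dt_ms))]

spikes = poisson_spikes(rate_hz=50, duration_ms=1000)  # expect ~50 spikes
```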
Population / Neural Mass (7)¶
Wilson-Cowan, Jansen-Rit (EEG), Wong-Wang (decision), Ermentrout-Kopell (exact mean-field), Amari (neural field), Wendling (extended JR, epilepsy EEG), Larter-Breakspear (TVB whole-brain)
Hardware-Specific (9)¶
Loihi CUBA, Loihi 2, TrueNorth, BrainScaleS AdEx, SpiNNaker LIF, SpiNNaker2, DPI/DYNAP-SE, Akida, Sigma-Delta
Rate Models (3)¶
McCulloch-Pitts (1943), Sigmoid Rate, Threshold-Linear (ReLU)
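The McCulloch-Pitts unit is a weighted-sum threshold gate; a minimal sketch showing how AND and OR fall out of threshold choice:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Binary threshold unit (McCulloch & Pitts 1943):
    fire iff the weighted input sum reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logic gates as threshold units
AND = lambda a, b: mcculloch_pitts([a, b], [1, 1], 2)
OR = lambda a, b: mcculloch_pitts([a, b], [1, 1], 1)
```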
Other (5)¶
SRM/SRM0 (kernel), McKean (piecewise FHN), Leaky-Compete-Fire (WTA), Prescott (Type I/II/III), Compte (NMDA working memory)
PyTorch Training Cells (10)¶
Differentiable spiking neurons for surrogate gradient training:
| Cell | Module | Reference |
|---|---|---|
| LIFCell | training.snn_modules | Standard LIF |
| IFCell | training.snn_modules | No leak |
| SynapticCell | training.snn_modules | Dual-exponential |
| ALIFCell | training.snn_modules | Bellec 2020 |
| RecurrentLIFCell | training.snn_modules | Orthogonal init |
| ExpIFCell | training.snn_modules | Exponential |
| AdExCell | training.snn_modules | Adaptive exponential |
| LapicqueCell | training.snn_modules | RC circuit |
| AlphaCell | training.snn_modules | Alpha synapse |
| SecondOrderLIFCell | training.snn_modules | Inertial term |
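The surrogate-gradient idea all these cells share: the forward pass uses the non-differentiable hard threshold, while the backward pass substitutes a smooth pseudo-derivative. A framework-free sketch of the two halves (the actual cells wire this into PyTorch autograd; the fast-sigmoid form and its slope are illustrative choices):

```python
def spike_forward(v, threshold=1.0):
    """Forward pass: non-differentiable Heaviside step."""
    return 1.0 if v >= threshold else 0.0

def spike_surrogate_grad(v, threshold=1.0, slope=4.0):
    """Backward pass: fast-sigmoid surrogate,
    d(spike)/dv ~= slope / (1 + slope*|v - threshold|)^2."""
    return slope / (1.0 + slope * abs(v - threshold)) ** 2
```

The surrogate is largest at the threshold and decays on both sides, so gradients flow most strongly through neurons whose membrane potential is near firing.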