Framework Integrations

Module: sc_neurocore.integrations Source: src/sc_neurocore/integrations/ — 2 files, 157 LOC Status (v3.14.0): one bridge present (Intel Lava / Loihi); 3 tests exist but all skip without lava-nc installed; the __init__.py re-exports and the docstring's non-existent class name were both fixed by task #33 this session (§3).

This page covers the entire integrations subpackage: today that means the Lava bridge for deployment on Loihi 2 neuromorphic hardware (or the Lava CPU simulation). Other planned bridges (PyTorch, ONNX, Brian2) are not implemented in v3.14.0.


1. Public surface

sc_neurocore.integrations.__init__ re-exports the 5 always-importable symbols from lava_bridge.py plus, conditionally, the 2 Lava-only classes:

Python
# Always works (lava-nc not required)
from sc_neurocore.integrations import (
    HAS_LAVA,
    LoihiNetworkConfig,
    SCtoLavaConverter,
    export_weights_loihi,
    loihi_threshold_from_sc,
)

# Only when HAS_LAVA is True
if HAS_LAVA:
    from sc_neurocore.integrations import SCDenseProcess, PySCDenseModel

The submodule path from sc_neurocore.integrations.lava_bridge import ... continues to work. Closes task #33.

The module-level HAS_LAVA: bool flag in lava_bridge.py is True when the lava-nc library is importable. When HAS_LAVA is False, the SCDenseProcess and PySCDenseModel classes (§6) are not defined, but everything else still imports cleanly.

| Symbol | Defined in | Available without lava-nc? |
| --- | --- | --- |
| HAS_LAVA | lava_bridge.py:37,39 | yes (always set) |
| LoihiNetworkConfig | lava_bridge.py:66 (dataclass) | yes |
| SCtoLavaConverter | lava_bridge.py:79 | yes |
| export_weights_loihi | lava_bridge.py:42 | yes |
| loihi_threshold_from_sc | lava_bridge.py:60 | yes |
| SCDenseProcess | lava_bridge.py:121 (in if HAS_LAVA block) | no — undefined when lava-nc absent |
| PySCDenseModel | lava_bridge.py:135 (in if HAS_LAVA block) | no — undefined when lava-nc absent |

Importing one of the conditional names when lava-nc is missing raises ImportError at import time: the if HAS_LAVA guard simply omits the class definitions, so the names never exist in the module namespace.


2. lava-nc dependency

lava_bridge.py:26-39 performs a multi-import inside try/except ImportError. The block depends on the following lava modules:

Python
from lava.magma.core.process.process import AbstractProcess
from lava.magma.core.process.ports.ports import InPort, OutPort
from lava.magma.core.process.variable import Var
from lava.magma.core.model.py.model import PyLoihiProcessModel
from lava.magma.core.model.py.type import LavaPyType
from lava.magma.core.model.py.ports import PyInPort, PyOutPort
from lava.magma.core.resources import CPU
from lava.magma.core.decorator import implements, requires
from lava.magma.core.sync.protocols.loihi_protocol import LoihiProtocol

If any one of these nine imports fails, HAS_LAVA is set to False and the 12 imported names are treated as unavailable; the conditional class definitions (§6) are skipped.

Install instructions (per the bridge's module docstring): pip install lava-nc. Note that lava-nc officially supports Python 3.10 (the test file test_lava_integration.py documents this constraint at line 11). On Python 3.12 — the system interpreter on this workstation — lava-nc is not available, so all three Lava tests skip.
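The guard reduces to the standard optional-dependency pattern; a minimal sketch with one representative import standing in for the nine (not the bridge's exact code):

```python
# Minimal optional-import sketch: one representative Lava import stands in
# for the nine performed in lava_bridge.py:26-39. Any ImportError in the
# block flips the flag, and the Lava-only classes are then never defined.
try:
    from lava.magma.core.process.process import AbstractProcess  # noqa: F401
    HAS_LAVA = True
except ImportError:
    HAS_LAVA = False

print(HAS_LAVA)  # False wherever lava-nc is not installed
```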


3. Docstring + re-exports (FIXED by task #33)

The earlier lava_bridge.py module-docstring usage example imported a non-existent LoihiDenseProcess. The class has always been called SCDenseProcess, paired with PySCDenseModel. The docstring is now corrected:

Python
from sc_neurocore.integrations.lava_bridge import (
    SCtoLavaConverter, export_weights_loihi, SCDenseProcess,
)

Plus the package __init__.py now re-exports the 5 always-importable symbols (so from sc_neurocore.integrations import X works for the helpers without lava-nc) and conditionally exposes SCDenseProcess + PySCDenseModel only when HAS_LAVA is True. See §1 for the import patterns.

Regression coverage: tests/test_lava_integration.py::test_integrations_init_reexports_helpers asserts the 5 always-importable names; the companion test_integrations_init_lava_classes_only_when_has_lava verifies the SCDenseProcess / PySCDenseModel exposure flips with HAS_LAVA.


4. Weight + threshold quantisation

The two helper functions handle the SC ↔ Loihi numeric mapping.

4.1 export_weights_loihi(weights, weight_bits=8, weight_exp=0)

Python
max_val =  (1 << (weight_bits - 1)) - 1     # 127 for 8-bit
min_val = -(1 << (weight_bits - 1))         # -128
scaled = (weights * 2.0 - 1.0) * max_val    # [0, 1] → [-127, 127]
quantised = np.clip(np.round(scaled), min_val, max_val).astype(np.int32)
return quantised * (2 ** weight_exp)

SC weights live in [0, 1] (probability scale); Loihi expects signed integers with configurable bit width. The mapping treats 0.5 as the midpoint and scales to ±max_val:

  • w = 0.0 → -127
  • w = 0.5 → 0
  • w = 1.0 → +127

weight_exp > 0 shifts the result left by weight_exp bits, giving the caller a pre-scaled magnitude for Loihi's wgtExp field (Loihi uses effective_weight = stored_weight * 2**wgtExp). The default weight_exp = 0 returns the raw quantised value.

The mapping treats 0.5 as the SC zero-point. This matches Loihi's balanced bipolar encoding but differs from many SC frameworks where the zero-point is 0 (unsigned probabilities). Document the convention before importing existing SC weights.
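A worked run of the mapping (the function body is reproduced from §4.1 so the example is self-contained):

```python
import numpy as np

def export_weights_loihi(weights, weight_bits=8, weight_exp=0):
    max_val =  (1 << (weight_bits - 1)) - 1     # 127 for 8-bit
    min_val = -(1 << (weight_bits - 1))         # -128
    scaled = (weights * 2.0 - 1.0) * max_val    # [0, 1] → [-127, 127]
    quantised = np.clip(np.round(scaled), min_val, max_val).astype(np.int32)
    return quantised * (2 ** weight_exp)

w = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(export_weights_loihi(w).tolist())                # [-127, -64, 0, 64, 127]
print(export_weights_loihi(w, weight_exp=2).tolist())  # each value shifted left 2 bits
```

Note the -64 / +64 pair: np.round rounds halves to even, so ±63.5 lands on ±64.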

4.2 loihi_threshold_from_sc(sc_threshold, weight_bits=8)

Python
max_val = (1 << (weight_bits - 1)) - 1
return int(np.round(sc_threshold * max_val))

Threshold scales linearly: sc_threshold = 1.0 → 127. Used by SCtoLavaConverter.convert_dense_layer to set the per-output thresholds array (defaults to 1.0 → 127 for every output).
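The rounding behaviour is worth noting (NumPy rounds halves to even), sketched with the body from §4.2:

```python
import numpy as np

def loihi_threshold_from_sc(sc_threshold, weight_bits=8):
    max_val = (1 << (weight_bits - 1)) - 1      # 127 for 8-bit
    return int(np.round(sc_threshold * max_val))

print(loihi_threshold_from_sc(1.0))   # 127, the converter's uniform default
print(loihi_threshold_from_sc(0.5))   # 64 (63.5 rounds half-to-even to 64)
print(loihi_threshold_from_sc(0.25))  # 32 (31.75 rounds to 32)
```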


5. LoihiNetworkConfig and SCtoLavaConverter

Python
@dataclass
class LoihiNetworkConfig:
    n_inputs: int
    n_outputs: int
    weights: np.ndarray
    thresholds: np.ndarray
    weight_bits: int = 8
    weight_exp: int = 0
    decay: int = 128

A POD struct that captures everything PySCDenseModel.run_spk needs to instantiate a Loihi-style integrate-and-fire layer. decay = 128 is the default linear decay factor used in the integer membrane update (§6).

5.1 SCtoLavaConverter

Python
class SCtoLavaConverter:
    def __init__(self, weight_bits: int = 8): ...

    def convert_dense_layer(self, sc_layer) -> LoihiNetworkConfig: ...
    def convert_training_model(self, spiking_net) -> list[LoihiNetworkConfig]: ...

convert_dense_layer(sc_layer) reads sc_layer.weights (a 2-D array of shape (n_out, n_in)), calls export_weights_loihi, and builds a single LoihiNetworkConfig with thresholds set uniformly to the quantised value of 1.0.

convert_training_model(spiking_net) walks the result of spiking_net.to_sc_weights() (assumed to return an iterable of (n_out, n_in) weight tensors, possibly PyTorch — handled by hasattr(w, "numpy")) and produces one LoihiNetworkConfig per layer. There is no explicit handling of layer-specific thresholds, biases, or sparse connectivity — every layer gets the same threshold default.

Both methods carry # type: ignore[attr-defined] markers because sc_layer.weights and spiking_net.to_sc_weights are duck-typed. A Protocol class (e.g. class _ConvertibleLayer(Protocol)) would let the type-ignores go away.


6. SCDenseProcess and PySCDenseModel (Lava only)

When HAS_LAVA is True, two Lava classes are defined.

6.1 SCDenseProcess(AbstractProcess)

A Lava Process with:

  • s_in: InPort(shape=(n_inputs,)) — incoming spike vector
  • s_out: OutPort(shape=(n_outputs,)) — outgoing spike vector
  • weights: Var(shape=(n_out, n_in)) — quantised weight matrix
  • v: Var(shape=(n_outputs,)) — membrane state, initialised to zeros
  • threshold: Var(shape=(n_outputs,)) — per-output spike threshold
  • decay: Var(shape=(1,)) — single decay factor

6.2 PySCDenseModel(PyLoihiProcessModel)

The Python-side model implementation that runs in Lava's Loihi simulator. The hot path is run_spk (lava_bridge.py:143-149):

Python
def run_spk(self) -> None:
    spikes_in = self.s_in.recv()
    current = self.weights @ spikes_in
    self.v[:] = (self.v * self.decay[0]) // 256 + current
    spikes_out = (self.v >= self.threshold).astype(int)
    self.v[spikes_out == 1] = 0
    self.s_out.send(spikes_out)

This is integer current-based LIF:

  • (v * decay) // 256 — linear decay scaled by the integer decay field (default 128 → keeps half the membrane each step)
  • + current — synaptic input from weights @ spikes_in
  • threshold-and-reset — binary spike emission, hard reset to zero

The // 256 divisor implies Loihi-style 8-bit fixed-point decay scaling (decay value d in [0, 255] scales by d / 256). This matches Loihi 1's CUBA integer model qualitatively but is not a verified Loihi 2 parity model — Loihi 2 has additional features (threshold adaptation, conductance-based COBA mode, programmable neuron logic) that this bridge does not cover. Tracked as task #34.
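The dynamics can be reproduced outside Lava; a pure-NumPy sketch of one run_spk step (just the arithmetic from §6.2, not the Lava model itself):

```python
import numpy as np

def step(v, weights, spikes_in, threshold, decay=128):
    """One integer-LIF update mirroring run_spk: leak, integrate, fire, reset."""
    current = weights @ spikes_in
    v = (v * decay) // 256 + current            # decay=128 keeps half the membrane
    spikes_out = (v >= threshold).astype(int)
    v = np.where(spikes_out == 1, 0, v)         # hard reset to zero on spike
    return v, spikes_out

# Drive a single neuron (weight 100, threshold 127) with a spike every step
v = np.zeros(1, dtype=np.int64)
weights = np.array([[100]])
history = []
for _ in range(4):
    v, s = step(v, weights, np.array([1]), threshold=np.array([127]))
    history.append(int(s[0]))
print(history)  # [0, 1, 0, 1]: v reaches 150 every second step and fires
```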

The @implements(proc=SCDenseProcess, protocol=LoihiProtocol) and @requires(CPU) decorators bind the model to the Lava simulator on CPU. The bridge does not currently expose a hardware-deployment helper for actual Loihi silicon.


7. Performance — not measured

lava-nc is not installable on Python 3.12, so end-to-end benchmarks (Lava-CPU vs sc-neurocore Python) cannot be exercised in this environment. Document the gap; do not fabricate numbers.

The pure-Python helpers (export_weights_loihi, loihi_threshold_from_sc, SCtoLavaConverter.convert_dense_layer) should run in microseconds for typical layer sizes (≤1000 neurons): quantisation cost is dominated by np.round + np.clip, both NumPy-vectorised.


8. Pipeline wiring

| Surface | How it's wired | Verifier |
| --- | --- | --- |
| from sc_neurocore.integrations.lava_bridge import HAS_LAVA | top-level try/except | tests/test_lava_integration.py::test_lava_import (skips without lava) |
| SCtoLavaConverter.convert_dense_layer → export_weights_loihi → LoihiNetworkConfig | direct call inside the converter | test_sc_to_lava_converter (skips without lava) |
| PySCDenseModel.run_spk integer LIF | bound by @implements(proc=SCDenseProcess) | test_spike_train_parity (skips without lava) |
| loihi_threshold_from_sc | called from convert_dense_layer | covered transitively |

The __init__.py re-export was added by task #33, so the from sc_neurocore.integrations import X pattern now works (§1).


9. Audit (7-point checklist)

| # | Dimension | Status | Detail |
| --- | --- | --- | --- |
| 1 | Pipeline wiring | ✅ PASS | lava_bridge.py internals wire correctly; __init__.py re-exports 5 always-importable symbols + 2 conditional Lava classes (closes task #33; see §1) |
| 2 | Multi-angle tests | ⚠️ WARN | 3 tests defined but all skip without lava-nc (Python 3.10 only). No tests cover export_weights_loihi / loihi_threshold_from_sc / SCtoLavaConverter independently of Lava. |
| 3 | Rust path | N/A | I/O + integer quantisation; downstream Lava is the compute layer |
| 4 | Benchmarks | ❌ FAIL | None — lava-nc unavailable in this env. |
| 5 | Performance docs | ⚠️ WARN | §7 is honest but empty. |
| 6 | Documentation page | ✅ PASS | This page |
| 7 | Rules followed | ⚠️ WARN | SPDX header on every file ✅. Docstring + __init__ re-exports FIXED by task #33 (§3, §10.1, §10.2). 4 undocumented # type: ignore markers (lines 87, 101, 121, 135) — still open. |

Net: 3 WARN, 1 FAIL. Three WARNs trace back to the lava-nc Python-version constraint; the FAIL is the inability to benchmark without it. Task #33 closed in this session.


10. Known issues

10.1 LoihiDenseProcess docstring (FIXED by task #33)

The earlier docstring referenced a non-existent class. Now corrected to SCDenseProcess. See §3.

10.2 __init__.py re-exports (FIXED by task #33)

integrations/__init__.py now re-exports 5 always-importable symbols (HAS_LAVA, LoihiNetworkConfig, SCtoLavaConverter, export_weights_loihi, loihi_threshold_from_sc) and conditionally adds SCDenseProcess + PySCDenseModel to __all__ only when HAS_LAVA is True. See §1.

10.3 PySCDenseModel.run_spk is not a verified Loihi 2 parity model

The integer LIF in §6.2 matches Loihi 1 CUBA qualitatively but does not verify against Loihi 2 reference behaviour. The bridge ships this as "deployment on Intel Loihi 2" in the module docstring but there is no parity test against Loihi 2 silicon or the official Lava Loihi 2 simulator. Tracked as task #34.

10.4 Four # type: ignore markers without rationale

  • lava_bridge.py:87# type: ignore[attr-defined] on sc_layer.weights (duck-typed)
  • lava_bridge.py:101 — same on spiking_net.to_sc_weights
  • lava_bridge.py:121# type: ignore[misc] on class SCDenseProcess(AbstractProcess) (Lava base type)
  • lava_bridge.py:135 — same on class PySCDenseModel(...)

The first two can go away with a Protocol defining the duck-typed methods. The latter two are Lava base-class typing limitations and need a per-line rationale comment.

10.5 No to_sc_weights typing contract documented

convert_training_model calls spiking_net.to_sc_weights() and assumes it returns an iterable of 2-D weight tensors. The type contract is enforced by # type: ignore rather than by a Protocol class or an __abstractmethod__ declaration. Callers learn the contract by reading the converter source.

10.6 Weight-zero-point convention may surprise users

export_weights_loihi treats SC 0.5 as the integer zero. SC frameworks that use unsigned probabilities (zero = "always off") will wrap to the negative half of the integer range. Document the convention prominently in the function docstring or add an unsigned: bool = False flag.
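A sketch of what the suggested flag could map to (export_weights_loihi_unsigned is hypothetical, not in the codebase):

```python
import numpy as np

def export_weights_loihi_unsigned(weights, weight_bits=8):
    """Hypothetical unsigned mapping: [0, 1] → [0, max_val], so zero stays zero."""
    max_val = (1 << (weight_bits - 1)) - 1
    return np.clip(np.round(weights * max_val), 0, max_val).astype(np.int32)

print(export_weights_loihi_unsigned(np.array([0.0, 0.5, 1.0])).tolist())  # [0, 64, 127]
```

This sacrifices the negative half of the integer range but preserves the unsigned-probability semantics ("zero = always off") that many SC frameworks assume.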


11. Tests

Bash
PYTHONPATH=src python3 -m pytest tests/test_lava_integration.py -v
# 3 skipped (lava-nc not installed) - verified 2026-04-17

Test breakdown (all skip without lava-nc):

| Test | What it would check |
| --- | --- |
| test_lava_import | import lava.lib.dl.slayer and from lava.proc.lif.process import LIF succeed |
| test_sc_to_lava_converter | Build a Dense + LIF chain, run 100 steps on Loihi1SimCfg, assert the membrane shape |
| test_spike_train_parity | Compare an sc_neurocore.neurons.StochasticLIFNeuron 100-step spike train against a Lava LIF(shape=(1,), vth=256) 100-step run; asserts both produce >0 spikes (does not assert exact count parity) |

What is NOT covered (independent of lava-nc):

  • export_weights_loihi — no test of [0, 1] → [-127, 127] mapping, no test of weight_exp shift, no test of the zero-point convention
  • loihi_threshold_from_sc — no test
  • SCtoLavaConverter.convert_dense_layer — no test with a mock sc_layer (could be done without Lava)
  • LoihiNetworkConfig dataclass round-trip — no test

These should all be addable as no-skip tests since they don't need Lava.
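A sketch of one such no-skip test (the real version would import export_weights_loihi from sc_neurocore.integrations; the body is reproduced inline here so the sketch stands alone):

```python
import numpy as np

def export_weights_loihi(weights, weight_bits=8, weight_exp=0):
    max_val = (1 << (weight_bits - 1)) - 1
    min_val = -(1 << (weight_bits - 1))
    scaled = (weights * 2.0 - 1.0) * max_val
    return np.clip(np.round(scaled), min_val, max_val).astype(np.int32) * (2 ** weight_exp)

def test_mapping_endpoints():
    # zero-point convention from §4.1: 0.0 → -127, 0.5 → 0, 1.0 → +127
    assert export_weights_loihi(np.array([0.0, 0.5, 1.0])).tolist() == [-127, 0, 127]

def test_weight_exp_shift():
    # weight_exp=2 pre-scales by 2**2 for Loihi's wgtExp field
    assert export_weights_loihi(np.array([1.0]), weight_exp=2).tolist() == [508]

test_mapping_endpoints()
test_weight_exp_shift()
```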


12. References

  • Davies M. et al. "Loihi: A Neuromorphic Manycore Processor with On-Chip Learning." IEEE Micro 38(1):82-99 (2018). The Loihi 1 architecture this bridge mostly targets.
  • Orchard G. et al. "Efficient Neuromorphic Signal Processing with Loihi 2." 2021 IEEE Workshop on Signal Processing Systems (SiPS). Loihi 2 architecture overview.
  • Lava framework — github.com/lava-nc/lava — Intel's open-source Python framework for neuromorphic computing.

Internal:

  • Stochastic LIF neuron used in the parity test: api/neurons.md
  • Compiler Q-format reference (Loihi uses 8-bit, sc-neurocore uses Q8.8 for FPGA): api/cli.md

13. Auto-rendered API

sc_neurocore.integrations.lava_bridge

Intel Lava / Loihi integration bridge.

Maps SC-NeuroCore networks to Lava Processes for deployment on Intel Loihi 2 neuromorphic hardware or Lava CPU simulation.

Requires: pip install lava-nc (optional dependency)

Usage

Python
from sc_neurocore.integrations.lava_bridge import (
    SCtoLavaConverter, export_weights_loihi, SCDenseProcess,
)

The two Lava classes (SCDenseProcess and PySCDenseModel) are only defined when HAS_LAVA is True; otherwise they are absent from the module namespace. Helpers (the converter and the weight/threshold encoders plus LoihiNetworkConfig) are always importable.

HAS_LAVA = True module-attribute

LoihiNetworkConfig dataclass

Configuration for a deployed Loihi network.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
@dataclass
class LoihiNetworkConfig:
    """Configuration for a deployed Loihi network."""

    n_inputs: int
    n_outputs: int
    weights: np.ndarray
    thresholds: np.ndarray
    weight_bits: int = 8
    weight_exp: int = 0
    decay: int = 128

SCtoLavaConverter

Convert SC-NeuroCore layer stack to Lava Process network.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
class SCtoLavaConverter:
    """Convert SC-NeuroCore layer stack to Lava Process network."""

    def __init__(self, weight_bits: int = 8):
        self.weight_bits = weight_bits

    def convert_dense_layer(self, sc_layer: object) -> LoihiNetworkConfig:
        """Convert an SCDenseLayer or VectorizedSCLayer to Loihi config."""
        weights = np.array(sc_layer.weights)  # type: ignore[attr-defined]
        loihi_weights = export_weights_loihi(weights, self.weight_bits)
        thresholds = np.full(weights.shape[0], loihi_threshold_from_sc(1.0, self.weight_bits))
        return LoihiNetworkConfig(
            n_inputs=weights.shape[1],
            n_outputs=weights.shape[0],
            weights=loihi_weights,
            thresholds=thresholds,
            weight_bits=self.weight_bits,
        )

    def convert_training_model(self, spiking_net: object) -> list[LoihiNetworkConfig]:
        """Convert a trained SpikingNet to a list of LoihiNetworkConfigs."""
        configs = []
        sc_weights = spiking_net.to_sc_weights()  # type: ignore[attr-defined]
        for w in sc_weights:
            w_np = w.numpy() if hasattr(w, "numpy") else np.array(w)
            loihi_w = export_weights_loihi(w_np, self.weight_bits)
            n_out, n_in = w_np.shape
            thresholds = np.full(n_out, loihi_threshold_from_sc(1.0, self.weight_bits))
            configs.append(
                LoihiNetworkConfig(
                    n_inputs=n_in,
                    n_outputs=n_out,
                    weights=loihi_w,
                    thresholds=thresholds,
                    weight_bits=self.weight_bits,
                )
            )
        return configs

convert_dense_layer(sc_layer)

Convert an SCDenseLayer or VectorizedSCLayer to Loihi config.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
def convert_dense_layer(self, sc_layer: object) -> LoihiNetworkConfig:
    """Convert an SCDenseLayer or VectorizedSCLayer to Loihi config."""
    weights = np.array(sc_layer.weights)  # type: ignore[attr-defined]
    loihi_weights = export_weights_loihi(weights, self.weight_bits)
    thresholds = np.full(weights.shape[0], loihi_threshold_from_sc(1.0, self.weight_bits))
    return LoihiNetworkConfig(
        n_inputs=weights.shape[1],
        n_outputs=weights.shape[0],
        weights=loihi_weights,
        thresholds=thresholds,
        weight_bits=self.weight_bits,
    )

convert_training_model(spiking_net)

Convert a trained SpikingNet to a list of LoihiNetworkConfigs.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
def convert_training_model(self, spiking_net: object) -> list[LoihiNetworkConfig]:
    """Convert a trained SpikingNet to a list of LoihiNetworkConfigs."""
    configs = []
    sc_weights = spiking_net.to_sc_weights()  # type: ignore[attr-defined]
    for w in sc_weights:
        w_np = w.numpy() if hasattr(w, "numpy") else np.array(w)
        loihi_w = export_weights_loihi(w_np, self.weight_bits)
        n_out, n_in = w_np.shape
        thresholds = np.full(n_out, loihi_threshold_from_sc(1.0, self.weight_bits))
        configs.append(
            LoihiNetworkConfig(
                n_inputs=n_in,
                n_outputs=n_out,
                weights=loihi_w,
                thresholds=thresholds,
                weight_bits=self.weight_bits,
            )
        )
    return configs

export_weights_loihi(weights, weight_bits=8, weight_exp=0)

Convert SC probability weights [0,1] to Loihi fixed-point format.

Loihi uses signed integer weights with configurable precision. Maps [0,1] → [-128, 127] for 8-bit weights.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
def export_weights_loihi(
    weights: np.ndarray,
    weight_bits: int = 8,
    weight_exp: int = 0,
) -> np.ndarray:
    """Convert SC probability weights [0,1] to Loihi fixed-point format.

    Loihi uses signed integer weights with configurable precision.
    Maps [0,1] → [-128, 127] for 8-bit weights.
    """
    max_val = (1 << (weight_bits - 1)) - 1
    min_val = -(1 << (weight_bits - 1))
    # SC weights are [0,1], shift to [-1,1] then scale
    scaled = (weights * 2.0 - 1.0) * max_val
    quantised = np.clip(np.round(scaled), min_val, max_val).astype(np.int32)
    return quantised * (2**weight_exp)

loihi_threshold_from_sc(sc_threshold, weight_bits=8)

Convert SC normalised threshold to Loihi integer threshold.

Source code in src/sc_neurocore/integrations/lava_bridge.py
Python
def loihi_threshold_from_sc(sc_threshold: float, weight_bits: int = 8) -> int:
    """Convert SC normalised threshold to Loihi integer threshold."""
    max_val = (1 << (weight_bits - 1)) - 1
    return int(np.round(sc_threshold * max_val))