Chaos — Chaotic RNG for Stochastic Computing¶
Deterministic-chaos random number generators for stochastic computing bitstream encoding. Provides alternatives to linear PRNGs (LFSR, Mersenne Twister) with desirable statistical properties for SC arithmetic.
Why Chaotic RNG for Stochastic Computing?¶
Stochastic computing encodes values as the probability of a 1-bit in a random bitstream. The quality of the random source directly affects arithmetic accuracy. Linear PRNGs have short-range correlations that bias SC multiplication (AND gates). Chaotic maps produce sequences with:
- Broadband spectrum — no periodic structure to alias with SC gate frequencies
- Low short-range autocorrelation — adjacent bits are nearly independent
- Deterministic reproducibility — same seed → same bitstream on hardware
- Minimal state — one float (logistic map) vs 624 words (MT19937)
Available Generators¶
ChaoticRNG — Logistic Map¶
The logistic map x_{n+1} = r * x_n * (1 - x_n) at r=4.0 is fully chaotic with Lyapunov exponent ln(2) ≈ 0.693. The invariant density is Beta(0.5, 0.5) (the arcsine distribution) on (0, 1) — values cluster near 0 and 1. The generate_bitstream() method applies the Beta(0.5, 0.5) CDF, u = (2/π) * arcsin(√x), to uniformize the samples before thresholding.
| Parameter | Default | Meaning |
|---|---|---|
| `r` | 4.0 | Bifurcation parameter. Must be in (3.57, 4.0] for chaos. |
| `x` | 0.37 | Initial condition in (0, 1). Avoid 0, 0.5, 1 (fixed/periodic points). |
| `burn_in` | 100 | Steps to discard before first output. |
Analysis methods:
- `lyapunov_exponent(n_steps)` — Estimate the maximal Lyapunov exponent via derivative averaging. At r=4.0 the theoretical value is ln(2) ≈ 0.6931.
- `shannon_entropy(n_samples, n_bins)` — Estimate Shannon entropy in bits. At r=4.0, values follow Beta(0.5, 0.5), with binned entropy roughly 0.27 bits below the uniform value log2(n_bins).
- `autocorrelation(n_samples, max_lag)` — Compute autocorrelation up to `max_lag`. A good chaotic RNG shows near-zero autocorrelation for all lags > 0.
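The generation mechanism can be illustrated with a standalone sketch (plain Python, not the library implementation): iterate the map, uniformize through the arcsine CDF, and threshold at the target probability.

```python
import math

def logistic_bitstream(p, length, x=0.37, r=4.0, burn_in=100):
    """Iterate the logistic map, uniformize via the arcsine CDF, threshold at p."""
    for _ in range(burn_in):              # discard transients
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(length):
        x = r * x * (1.0 - x)
        u = (2.0 / math.pi) * math.asin(math.sqrt(x))  # arcsine CDF -> uniform
        bits.append(1 if u < p else 0)
    return bits

bits = logistic_bitstream(p=0.7, length=20000)
print(sum(bits) / len(bits))              # close to 0.70
```

Because the CDF of a random variable applied to itself is uniform, the thresholded stream hits the target P(1) without any bias correction.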
TentMapRNG — Piecewise Linear Alternative¶
The tent map x_{n+1} = μ * min(x_n, 1 - x_n) is, at μ=2, topologically conjugate to the logistic map at r=4, but has uniform invariant density on (0, 1) — no CDF correction is needed for SC bitstreams.
| Parameter | Default | Meaning |
|---|---|---|
| `mu` | 1.9999 | Slope parameter. Must be in (1, 2]. Default slightly below 2.0 to avoid float64 degeneracy. |
| `x` | 0.37 | Initial condition in (0, 1). |
Reference: Phatak & Rao, "Logistic map as a random number generator", Physical Review E 51(4), 1995.
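A minimal standalone sketch of the tent map iteration (assuming the same burn-in convention as the logistic generator) shows the uniform marginal directly:

```python
def tent_samples(n, x=0.37, mu=1.9999, burn_in=100):
    """Iterate the tent map; the invariant density is (near-)uniform on (0, 1)."""
    for _ in range(burn_in):              # discard transients
        x = mu * min(x, 1.0 - x)
    out = []
    for _ in range(n):
        x = mu * min(x, 1.0 - x)
        out.append(x)
    return out

xs = tent_samples(50000)
print(sum(xs) / len(xs))                  # near 0.5, as expected for uniform
```

Running with `mu=2.0` exactly would eventually collapse the float64 orbit to 0 (each step shifts the binary expansion), which is why the default sits just below 2.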
Usage¶
```python
from sc_neurocore.chaos import ChaoticRNG, TentMapRNG

# Logistic map — generate SC bitstream
rng = ChaoticRNG(r=4.0, x=0.37)
bitstream = rng.generate_bitstream(p=0.7, length=10000)
print(f"P(1) = {bitstream.mean():.3f}")            # ≈ 0.700

# Quality check
print(f"Lyapunov: {rng.lyapunov_exponent():.4f}")  # ≈ 0.6931
print(f"Entropy: {rng.shannon_entropy():.2f} bits")

# Tent map — uniform output, no CDF correction
tent = TentMapRNG(mu=1.9999, x=0.37)
samples = tent.random(10000)
print(f"Mean: {samples.mean():.3f}")               # ≈ 0.500

# Vectorized parallel maps for bulk generation
bulk = rng.random_vectorized(size=100000, n_maps=8)
```
sc_neurocore.chaos.rng¶
Chaotic random number generators for stochastic computing bitstreams.
Provides deterministic-chaos alternatives to linear PRNGs (LFSR, MT19937) for SC bitstream encoding. Chaotic maps produce sequences with desirable statistical properties for stochastic arithmetic: a uniform marginal distribution (directly from the tent map, or after arcsine-CDF correction at r=4), low short-range autocorrelation, and tunable correlation structure.
References
Phatak & Rao, "Logistic map as a random number generator", Physical Review E 51(4), 1995.
Li et al., "Chaotic random bit generator realized with a microcontroller", J. Micromech. Microeng., 2019.
ChaoticRNG (dataclass)¶
Logistic-map chaotic RNG for SC bitstream generation.
x_{n+1} = r * x_n * (1 - x_n)
At r=4.0 the logistic map is fully chaotic with Lyapunov exponent ln(2) ~ 0.693 and an invariant density Beta(0.5, 0.5) on (0, 1). The 100-step burn-in discards transients from the initial condition.
Parameters¶
- `r` (float): Bifurcation parameter. Must be in (3.57, 4.0] for chaos. Default 4.0 gives maximal chaos.
- `x` (float): Initial condition in (0, 1). Avoid 0.0, 0.5, 1.0 exactly (these are fixed/periodic points at r=4).
- `burn_in` (int): Steps to discard before first output.
Example¶
```python
>>> rng = ChaoticRNG(r=4.0, x=0.37)
>>> bits = rng.generate_bitstream(p=0.5, length=1000)
>>> 0.4 < bits.mean() < 0.6
True
```
Source code in src/sc_neurocore/chaos/rng.py, lines 32–196
state (property)¶
Current internal state.
random(size)¶
Generate size chaotic floats in (0, 1).
Uses scalar iteration (the logistic map is inherently sequential). For bulk generation, see `random_vectorized()`.
Source code in src/sc_neurocore/chaos/rng.py, lines 75–88
random_vectorized(size, n_maps=8)¶
Generate samples from n_maps independent logistic maps in parallel.
Seeds each map with a different initial condition derived from the primary state. Returns size samples by round-robin interleaving. Faster than scalar iteration for large size at the cost of slightly different correlation structure.
Source code in src/sc_neurocore/chaos/rng.py, lines 90–111
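The interleaving idea can be sketched as follows. This is an illustrative NumPy re-implementation, not the library code; in particular, the golden-ratio offsets used to derive the per-map initial conditions are an assumption, not necessarily the library's seeding scheme.

```python
import numpy as np

def parallel_logistic(size, n_maps=8, x0=0.37, r=4.0, burn_in=100):
    """Run n_maps logistic maps side by side; interleave outputs round-robin."""
    # Derive distinct initial conditions from x0 (golden-ratio offsets are an
    # illustrative choice, not necessarily the library's seeding scheme).
    x = (x0 + np.arange(1, n_maps + 1) * 0.6180339887) % 1.0
    x = np.clip(x, 1e-6, 1.0 - 1e-6)      # keep strictly inside (0, 1)
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    n_steps = -(-size // n_maps)          # ceil(size / n_maps)
    out = np.empty((n_steps, n_maps))
    for i in range(n_steps):
        x = r * x * (1.0 - x)
        out[i] = x
    return out.ravel()[:size]             # row-major flatten = round-robin

samples = parallel_logistic(100000)
print(samples.mean())                     # arcsine marginal has mean 0.5
```

Each vectorized step advances all eight maps at once, so Python-level loop overhead drops by a factor of `n_maps`; the cost is that consecutive outputs come from different orbits, changing the correlation structure slightly, as the docstring notes.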
generate_bitstream(p, length)¶
Generate SC bitstream where P(bit=1) ~ p.
Applies the Beta(0.5, 0.5) CDF to map chaotic samples to uniform, then thresholds at p. This corrects for the non-uniform invariant density of the logistic map at r=4.
Parameters¶
- `p` (float): Target probability in [0, 1].
- `length` (int): Number of bits.
Returns¶
np.ndarray Binary array of shape (length,), dtype uint8.
Source code in src/sc_neurocore/chaos/rng.py, lines 113–135
lyapunov_exponent(n_steps=10000)¶
Estimate the maximal Lyapunov exponent via derivative averaging.
For the logistic map: lambda = (1/N) * sum(ln|r(1 - 2x_n)|). At r=4.0, the theoretical value is ln(2) ~ 0.6931.
Source code in src/sc_neurocore/chaos/rng.py, lines 137–152
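The derivative-averaging estimator is short enough to show in full. This is a self-contained sketch of the formula above, not the library source:

```python
import math

def lyapunov_logistic(n_steps=10000, x=0.37, r=4.0, burn_in=100):
    """Average log|f'(x_n)| along the orbit, with f'(x) = r * (1 - 2x)."""
    for _ in range(burn_in):              # discard transients
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_steps):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_steps

lam = lyapunov_logistic()
print(lam)                                # near ln(2) = 0.6931...
```

A positive average confirms exponential divergence of nearby orbits, the defining signature of chaos.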
shannon_entropy(n_samples=10000, n_bins=100)¶
Estimate Shannon entropy of the chaotic sequence in bits.
A uniform distribution on (0, 1) has binned entropy log2(n_bins). At r=4.0, values follow Beta(0.5, 0.5), whose binned entropy is roughly 0.27 bits below that uniform value.
Source code in src/sc_neurocore/chaos/rng.py, lines 154–165
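A histogram-based entropy estimate can be sketched in a few lines (a standalone illustration, not the library method):

```python
import math

def binned_entropy(samples, n_bins=100):
    """Histogram-based Shannon entropy in bits; uniform data gives ~log2(n_bins)."""
    counts = [0] * n_bins
    for s in samples:
        counts[min(int(s * n_bins), n_bins - 1)] += 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

# Logistic-map samples at r=4 (arcsine-distributed, so entropy sits below uniform)
x, r = 0.37, 4.0
for _ in range(100):                      # burn-in
    x = r * x * (1.0 - x)
xs = []
for _ in range(10000):
    x = r * x * (1.0 - x)
    xs.append(x)
H = binned_entropy(xs)
print(H)                                  # below log2(100) = 6.64...
```

The deficit relative to log2(n_bins) comes from the arcsine density piling probability into the bins near 0 and 1.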
autocorrelation(n_samples=10000, max_lag=50)¶
Compute autocorrelation of the chaotic sequence up to max_lag.
A good chaotic RNG should show near-zero autocorrelation for all lags > 0. Returns array of shape (max_lag + 1,) with R[0] = 1.0.
Source code in src/sc_neurocore/chaos/rng.py, lines 167–185
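A minimal standalone estimator (plain Python, not the library method) illustrates the computation and the near-zero tail for the r=4 logistic map:

```python
def autocorr(xs, max_lag=50):
    """Normalized autocorrelation R[k] = Cov(x_n, x_{n+k}) / Var(x_n); R[0] = 1."""
    n = len(xs)
    mean = sum(xs) / n
    d = [v - mean for v in xs]
    var = sum(v * v for v in d) / n
    return [sum(d[i] * d[i + k] for i in range(n - k)) / ((n - k) * var)
            for k in range(max_lag + 1)]

# Logistic-map orbit at r=4 after a 100-step burn-in
x, xs = 0.37, []
for _ in range(100):
    x = 4.0 * x * (1.0 - x)
for _ in range(10000):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
R = autocorr(xs)
print(R[0], max(abs(v) for v in R[1:]))   # 1.0, then small magnitudes
```

For the r=4 logistic map the theoretical autocorrelation vanishes at every nonzero lag, so the sample values at lags > 0 reflect only finite-sample noise of order 1/√n.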
reset(x=None)¶
Reset to initial condition (with fresh burn-in).
Source code in src/sc_neurocore/chaos/rng.py, lines 187–191
TentMapRNG (dataclass)¶
Tent map chaotic RNG — piecewise linear alternative to logistic map.
x_{n+1} = mu * min(x_n, 1 - x_n)
At mu=2.0 the tent map is topologically conjugate to the logistic map at r=4.0 but has uniform invariant density on (0, 1) — better for SC bitstream generation where uniform marginals are desired.
Parameters¶
- `mu` (float): Slope parameter. Must be in (1, 2] for chaos. Default 1.9999, slightly below 2.0 to avoid float64 degeneracy.
- `x` (float): Initial condition in (0, 1).
Source code in src/sc_neurocore/chaos/rng.py, lines 199–263
random(size)¶
Generate size chaotic floats in (0, 1).
Source code in src/sc_neurocore/chaos/rng.py, lines 240–248
generate_bitstream(p, length)¶
Generate SC bitstream where P(bit=1) ~ p.
Source code in src/sc_neurocore/chaos/rng.py, lines 250–253
reset(x=None)¶
Reset to initial condition (with fresh burn-in).
Source code in src/sc_neurocore/chaos/rng.py, lines 255–259