# Wire-9M (H32 L8)

WireNative 9M: n_harmonics=32, 8 layers, best-BPB checkpoint.
Part of the Harmonic GPT research into oscillator-based neural computation.
**Architecture:** WireNative
| Property | Value |
|---|---|
| Parameters | 8,894,520 |
| BPB | 3.0866 |
| Training step | 5,000 |
| n_harmonics | 64 |
| n_layers | 8 |
| n_groups | 7 |
| d_model | 896 |
| Vocab | 256 (raw bytes) |
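Since the vocabulary is raw bytes (one token per byte), the reported BPB (bits per byte) is just the byte-level cross-entropy expressed in bits: a loss of `L` nats converts as `bpb = L / ln 2`. A minimal sketch of that conversion (the 2.1395-nat loss value below is chosen for illustration so that it lands near this card's 3.0866 BPB, not read from the checkpoint):

```python
import math

def nats_to_bpb(loss_nats: float) -> float:
    # Byte-level model: one token = one byte, so bits-per-byte
    # is the cross-entropy loss converted from nats to bits.
    return loss_nats / math.log(2)

# Illustrative loss value, approximately matching the card's BPB.
print(round(nats_to_bpb(2.1395), 4))  # → 3.0866
```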
## Usage

```python
import json

from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

weights = load_file(hf_hub_download("MonumentalSystems/wire-9m-best", "model.safetensors"))
config = json.load(open(hf_hub_download("MonumentalSystems/wire-9m-best", "config.json")))
```
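After loading, one quick sanity check is to total the tensor element counts and compare against the 8,894,520 parameters in the table above. A sketch of that check, using hypothetical stand-in shapes so it runs offline (with the real `weights` dict, sum `t.numel()` over `weights.values()` instead):

```python
from math import prod

# Hypothetical shapes standing in for the real state dict entries;
# names and sizes are illustrative only (vocab 256, d_model 896).
shapes = {
    "embed.weight": (256, 896),
    "head.weight": (256, 896),
}

# Total element count across all tensors.
total = sum(prod(s) for s in shapes.values())
print(f"{total:,} mock parameters")  # the real checkpoint reports 8,894,520
```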
All operations are native Clifford algebra / harmonic oscillator dynamics: no softmax attention, no MLP, no ReLU.
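The card does not document the actual layer update rule, but a classical harmonic-oscillator integration step illustrates the kind of dynamics the name points at. This is a generic symplectic (leapfrog) integrator for a unit-mass oscillator x'' = -ω²x, purely an illustrative sketch, not the model's real computation:

```python
def leapfrog_step(x: float, v: float, omega: float, dt: float):
    # One leapfrog update of a unit-mass harmonic oscillator x'' = -omega^2 x.
    v_half = v - 0.5 * dt * omega**2 * x       # half-step velocity
    x_new = x + dt * v_half                    # full-step position
    v_new = v_half - 0.5 * dt * omega**2 * x_new  # remaining half-step velocity
    return x_new, v_new

x, v, omega, dt = 1.0, 0.0, 2.0, 0.01
e0 = 0.5 * v**2 + 0.5 * omega**2 * x**2  # initial energy
for _ in range(1000):
    x, v = leapfrog_step(x, v, omega, dt)
e1 = 0.5 * v**2 + 0.5 * omega**2 * x**2  # energy after 1000 steps
```

A symplectic integrator keeps the oscillation stable over many steps (energy is approximately conserved), which is one reason oscillator-style dynamics are attractive for deep recurrent computation.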