Aethon-N1-Base-Open-Structure
Introduction
Aethon-N1-Base-Open-Structure is a public release built around a simple claim:
intelligence does not have to live inside transformer weights to be real, portable, updateable, and useful.
This release presents Open Structure as an alternative model artifact.
Instead of shipping only a frozen parameter block, Aethon ships the learned structure itself:
- persistent memory
- semantic grounding
- query-form understanding
- reasoning policy
- surface realization
- contradiction history
The released artifact is:
- metadata.json
- graph.sqlite3
That is the model.
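As a minimal sketch of what shipping structure instead of weights means in practice, the graph store can be opened directly with Python's standard sqlite3 module. The snippet below discovers table names at runtime rather than assuming any schema:

```python
import sqlite3

# Open the released structure file directly; no weights loader or GPU needed.
conn = sqlite3.connect("bundle/graph.sqlite3")

# List whatever tables the bundle actually ships; no schema assumptions made here.
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
).fetchall()
for (name,) in tables:
    print(name)

conn.close()
```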
Model Summary
What Open Structure Means
Transformer releases usually center on:
- attention
- weights
- steps
- epochs
- checkpoint snapshots
Aethon centers on:
- durable learned structure
- direct one-shot integration
- revision
- contradiction tracking
- abstraction
- reusable memory
This is why the release is called Open Structure, not open weights.
Why Not Weights
Weights are only one storage strategy.
Aethon stores usable intelligence in explicit, portable structure:
- concepts
- active relations
- contradiction records
- semantic aliases
- query forms
- reasoning rules
- surface realization patterns
The artifact is not just “parameters that once learned something.”
It is “the learned structure that still knows.”
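To make that contrast concrete, here is a deliberately simplified, hypothetical sketch of what one explicit relation record could look like. The field names are illustrative only and are not the bundle's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class RelationRecord:
    """Hypothetical shape of one explicit relation; illustrative only."""
    subject: str          # e.g. "Amina"
    predicate: str        # e.g. "lives_in"
    obj: str              # e.g. "Accra"
    active: bool = True   # superseded relations stay recorded, not erased
    contradicts: list[str] = field(default_factory=list)  # ids of conflicting records

# A revision keeps the old fact as history instead of overwriting it.
old = RelationRecord("Amina", "lives_in", "Lagos", active=False)
new = RelationRecord("Amina", "lives_in", "Accra",
                     contradicts=["amina-lives-in-lagos"])
```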
Why Not Attention
Aethon does not depend on transformer attention as its primary persistence mechanism.
Its memory is not a temporary prompt window that disappears after generation.
Its memory persists across sessions inside the released bundle.
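A minimal sketch of that claim, using only the from_hub, ask, and close calls documented in the Quickstart below; it assumes the bundle's memory behaves as described:

```python
from aethon_open_structure import AethonOpenStructureModel

# Session one: load the bundle, ask, and shut down completely.
model = AethonOpenStructureModel.from_hub("OkeyMetaLtd/Aethon-N1-Base-Open-Structure")
print(model.ask("Where does Amina live now?").text)
model.close()

# Session two: a fresh load behaves the same way, because the memory
# lives in the bundle on disk, not in a prompt window carried between calls.
model = AethonOpenStructureModel.from_hub("OkeyMetaLtd/Aethon-N1-Base-Open-Structure")
print(model.ask("Where does Amina live now?").text)
model.close()
```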
Why One-Shot Instead Of Epoch Training
For Aethon, one-shotting is not a weaker substitute for training.
It is the core learning act.
One-shot structural integration means the model grows by:
- absorbing new knowledge
- binding it into persistent structure
- preserving contradictions instead of washing them out
- materializing abstractions from learned structure
- reusing that structure on future prompts
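As a conceptual sketch only, not the runtime's actual integration code, the single-pass idea can be pictured like this; every name below is illustrative:

```python
# Illustrative toy structural store, not Aethon's internals.
store = {"relations": [], "contradictions": []}

def integrate_once(store, subject, predicate, obj):
    """Bind one new fact in a single pass; conflicts are recorded, not averaged away."""
    for rel in store["relations"]:
        if rel["subject"] == subject and rel["predicate"] == predicate and rel["obj"] != obj:
            rel["active"] = False  # keep the old fact as history
            store["contradictions"].append((rel["obj"], obj))
    store["relations"].append(
        {"subject": subject, "predicate": predicate, "obj": obj, "active": True}
    )

integrate_once(store, "Amina", "lives_in", "Lagos")
integrate_once(store, "Amina", "lives_in", "Accra")  # one shot, revision recorded
```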
What SC Means
SC means Structural Capacity.
It is Aethon’s size unit.
SC is used because parameter count does not describe what this system stores or how it grows.
SC reflects growth in usable learned structure:
- concepts
- explicit relations
- abstractions
- revisions
- persistent memory
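One hedged reading of the evaluation numbers below: for the current ship candidate, the published counts of concepts, explicit relations, and abstractions sum exactly to the reported SC, which suggests, though the release does not state it outright, that SC is that sum:

```python
concepts = 27_344
explicit_relations = 81_899
abstractions = 3_654

sc = concepts + explicit_relations + abstractions
print(sc)  # 112897, matching the SC reported for aethon_n1_base_full_parallel_v26
```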
Why We Believe This Is AGI-Shaped
Here, AGI means intelligence that transfers across many human task types rather than staying trapped in one narrow lane.
The release path for Aethon is built around that transfer claim.
The current release wall includes:
- multilingual mixed prompts
- planning, business, and scheduling transfer
- longer story continuity
- adversarial unseen cross-domain composition
- code, math, world, identity, and reasoning transfer
That is why Aethon is described here as AGI-shaped:
- it learns persistently
- it transfers across domains
- it reasons over what it has learned
- it generalizes into prompts it has not seen exactly before
Human-Like Learning
The claim is not that Aethon is biologically human.
The claim is that its learning behavior is closer to human-style accumulation and revision than to frozen transformer replay.
Human learners:
- absorb new facts
- keep durable memory
- revise beliefs when conflicting evidence appears
- transfer what they know across domains
- reuse prior structure to answer new questions
Aethon does the same in structural form:
- new facts are integrated
- contradictions are recorded
- abstractions are materialized
- prior knowledge is transferred into new answers
- memory persists after the prompt ends
Capability Snapshot
Can It Learn
Yes.
Aethon learns by structural integration and keeps the result as persistent structure.
Can It Reason
Yes.
Aethon reasons through:
- multi-hop traversal
- composition
- revision tracking
- structural derivation
- cross-domain transfer
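A minimal sketch of the first item, multi-hop traversal, over a toy relation set; the data and code are illustrative and are not the runtime's traversal engine:

```python
from collections import deque

# Toy active relations; the real graph lives in bundle/graph.sqlite3.
relations = {
    "Amina": [("lives_in", "Accra")],
    "Accra": [("capital_of", "Ghana")],
    "Ghana": [("located_in", "West Africa")],
}

def multi_hop(start, max_hops=3):
    """Breadth-first traversal, collecting the chain of facts behind each hop."""
    queue = deque([(start, [])])
    paths = []
    while queue:
        node, path = queue.popleft()
        if len(path) >= max_hops:
            continue
        for predicate, target in relations.get(node, []):
            new_path = path + [(node, predicate, target)]
            paths.append(new_path)
            queue.append((target, new_path))
    return paths

for path in multi_hop("Amina"):
    print(" -> ".join(f"{s} {p} {o}" for s, p, o in path))
```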
Can It Write Long-Form
Yes.
Aethon can produce longer reasoning text, story continuation, and multi-part responses rather than collapsing into one short canned line.
Can It Generalize
Yes.
Aethon generalizes by reusing learned structure across:
- unseen prompts
- multilingual prompts
- planning prompts
- ontology prompts
- adversarial mixed-domain prompts
Quickstart
Aethon ships with its own portable runtime as part of the release.
There is no PyPI package named aethon-open-structure-python.
Use the Hugging Face release directly.
Install from the release
```bash
git lfs install
git clone https://huggingface.co/OkeyMetaLtd/Aethon-N1-Base-Open-Structure
cd Aethon-N1-Base-Open-Structure
pip install -r requirements.txt
```
Python usage
```python
from aethon_open_structure import AethonOpenStructureModel

# Load the Open Structure bundle from the Hugging Face release.
model = AethonOpenStructureModel.from_hub("OkeyMetaLtd/Aethon-N1-Base-Open-Structure")
try:
    # Single free-form question.
    reply = model.ask(
        "Amina used to live in Lagos, now lives in Accra, and keeps her notebook where she sleeps. "
        "Where is the notebook now, and explain the reasoning clearly."
    )
    print(reply.text)

    # System-guided, multi-part instruction following.
    instructed = model.ask_messages(
        [
            {"role": "system", "content": "Answer in exactly three sentences and keep each sentence grounded."},
            {
                "role": "user",
                "content": "Take this carefully and answer each part in one flowing response: where is Amina, what does regional launch depend on, and what is your tokenizer?",
            },
        ]
    )
    print(instructed.text)
finally:
    model.close()
```
CLI usage
```bash
python run_aethon.py --ask "What is your tokenizer?"
python run_aethon.py --ask "Amina moved from Lagos to Accra. What changed about her location?"
```
Runtime files included in the release:
- aethon_open_structure/...
- examples/aethon_open_structure_python.py
- run_aethon.py
- runtime/aethon/...
Prompt Examples
These are examples, not a fixed prompt menu.
Long reasoning
Amina used to live in Lagos, but she moved to Accra and now keeps her blue notebook in the same place she sleeps. If someone asks where her notebook is now, answer directly and explain the reasoning in your own words.
Planning and scheduling
Tunde has a client call at 2 PM, lunch at 2 PM, and a report that must be finished before the client call. What should happen first, what should be rescheduled, and why?
Story continuity
Tell me the story of Zainab starting from the point where she misses the last train, finds a stranger's map, and decides not to give up. Then continue the story after she reaches the station and discovers the map was outdated.
Multilingual mixed prompt
Donde esta Amina now, and what changed about her location after she left Lagos? Puis explique la relation between Amina and Nigeria in simple words.
Adversarial cross-domain composition
If a module depends on CPython, a planner says the deployment must happen before testing, and the meeting time conflicts with the deployment window, what should be revised first and how would you explain that plan to a human teammate?
Evaluation
Current Ship Candidate
| Item | Value |
|---|---|
| Bundle | aethon_n1_base_full_parallel_v26 |
| Public contract | aethon.n1.bundle.v1 |
| Release class | open-structure |
| Size unit | Structural Capacity (SC) |
| SC | 112,897 |
| Concepts | 27,344 |
| Explicit relations | 81,899 |
| Abstractions | 3,654 |
| Raw unit residue | 0 |
Native Benchmark Wall
| Suite | Result | Accuracy | Time |
|---|---|---|---|
| aethon_n1_benchmark_v6.jsonl | 43 / 43 | 1.0 | 3.476s |
| aethon_n1_benchmark_v7.jsonl | 15 / 15 | 1.0 | 18.488s |
| aethon_n1_benchmark_v8.jsonl | 10 / 10 | 1.0 | 89.170s |
What This Wall Covers
- multilingual mixed prompts
- planning, business, and scheduling transfer
- longer story continuity
- adversarial unseen composition
- code, math, world, identity, and reasoning transfer
- open-grounded answers on unseen prompts
- religion transfer under fresh setup facts
- instruction-sensitive prompt checks
- native system-guided instruction following
- long mixed prompts with exact sentence-shape pressure
One-Shot Data
This ship bundle was one-shotted across six native lanes:
- identity
- reasoning
- math
- code
- story
- world
The ship corpus includes:
- Aethon native identity, code, math, story, and reasoning corpora
- Aethon AGI transfer corpora
- Humanity's Last Exam transfer corpus
- Anthropic HH-RLHF instruction and safety corpus
- theology and religion grounding corpora
- curated reasoning bases
- multilingual base mixes
- code and tool-use corpora
- story and chat continuity corpora
- world knowledge corpora
- multilingual news and world sources
The file-level one-shot provenance for this ship candidate is included in:
bundle/corpus_manifest.json
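A minimal way to inspect that provenance file, assuming only that it is valid JSON; its internal layout is not documented here:

```python
import json

# Load the file-level one-shot provenance shipped with the bundle.
with open("bundle/corpus_manifest.json") as f:
    manifest = json.load(f)

# Print the top-level structure without assuming any particular schema.
if isinstance(manifest, dict):
    print(list(manifest.keys()))
else:
    print(type(manifest).__name__, len(manifest))
```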
Humanity's Last Exam And Unseen Questions
This release bundle includes Humanity's Last Exam transfer data through the native one-shot pipeline.
That does not mean the model is limited to replaying HLE content.
The point of the inclusion is to widen transfer pressure and improve unseen-question handling inside the Open Structure base.
More Versions Coming
This release is part of a continuing line.
Future Open Structure releases will push:
- larger one-shot corpora
- harder benchmark walls
- broader multilingual coverage
- deeper planning and story continuity
- stronger portable runtimes
License
This Open Structure release is published under:
CC BY-NC 4.0
See:
docs/AETHON_OPEN_STRUCTURE_LICENSE.md
Citation
If you use, benchmark, discuss, or build on this release, cite it.
Suggested citation:
```bibtex
@misc{aethon_open_structure_v25,
  title        = {Aethon-N1-Base-Open-Structure},
  author       = {OkeyMeta Ltd},
  year         = {2026},
  howpublished = {\url{https://huggingface.co/OkeyMetaLtd/Aethon-N1-Base-Open-Structure}},
  note         = {Aethon Open Structure release}
}
```
Public Contract
See:
- docs/AETHON_N1_BUNDLE_SPEC.md
- docs/aethon_n1_bundle_schema.json
- docs/AETHON_OPEN_STRUCTURE_RUNTIME.md
Release Artifact
Bundle files:
- bundle/metadata.json
- bundle/graph.sqlite3
- bundle/corpus_manifest.json
- bundle/integration_report.json
Current release facts:
- public_contract: aethon.n1.bundle.v1
- release_class: open-structure
- tokenizer: Aethon Native Concept Codec (ANCC)
- size_unit: Structural Capacity (SC)
Additional docs in this release:
- docs/AETHON_N1_BUNDLE_SPEC.md
- docs/aethon_n1_bundle_schema.json
- docs/AETHON_OPEN_STRUCTURE_RUNTIME.md
Python package entry point:
- aethon_open_structure/__init__.py
- aethon_open_structure/model.py
Portable runtime entry points:
- python -c "from aethon_open_structure import AethonOpenStructureModel; ..."
- examples/aethon_open_structure_python.py
- run_aethon.py
- runtime/aethon/...