arxiv:2601.19942

Latent Object Permanence: Topological Phase Transitions, Free-Energy Principles, and Renormalization Group Flows in Deep Transformer Manifolds

Published on Jan 16
Abstract

Deep Transformer language models exhibit a phase transition in representation space marked by reduced effective dimensionality and the emergence of stable concept basins, identified through spectral analysis and geometric properties of hidden states.

AI-generated summary

We study the emergence of multi-step reasoning in deep Transformer language models through a geometric and statistical-physics lens. Treating the hidden-state trajectory as a flow on an implicit Riemannian manifold, we analyze the layerwise covariance spectrum of activations, C^(ℓ) = E[h^(ℓ) h^(ℓ)ᵀ], and track deviations from a random-matrix bulk. Across model scales (1.5B–30B), we observe a sharp reduction in effective dimensionality consistent with a phase transition: an order parameter based on sparsity/localization, Ω(h) = 1 − ‖h‖₁ / (d ‖h‖₂), exhibits a discontinuity near a critical normalized depth γ_c ≈ 0.42 in sufficiently large models. We formalize the forward pass as a discrete coarse-graining map and relate the appearance of stable "concept basins" to fixed points of this renormalization-like dynamics. The resulting low-entropy regime is characterized by a spectral-tail collapse and by the formation of transient, reusable object-like structures in representation space, which we call Transient Class Objects (TCOs). We provide theoretical conditions connecting logical separability to spectral decay and validate the predicted signatures with layerwise probes on multiple open-weight model families.
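The two layerwise diagnostics the abstract names can be sketched in a few lines of numpy. This is not the authors' code: the synthetic "early" and "late" hidden states below are an assumption used purely to illustrate the quantities — the covariance spectrum C^(ℓ) = E[h^(ℓ) h^(ℓ)ᵀ] with a participation-ratio effective dimensionality, and the order parameter Ω(h) exactly as written in the abstract.

```python
# Sketch (not the paper's implementation) of the abstract's layerwise
# diagnostics on synthetic hidden states.
import numpy as np

def covariance_spectrum(H):
    """Eigenvalues (descending) of C = E[h h^T] for states H of shape (n, d)."""
    C = (H.T @ H) / H.shape[0]          # empirical second-moment matrix
    return np.linalg.eigvalsh(C)[::-1]

def effective_dim(eigvals):
    """Participation-ratio effective dimensionality: (sum λ)^2 / sum λ^2."""
    return eigvals.sum() ** 2 / np.square(eigvals).sum()

def omega(H):
    """Order parameter Ω(h) = 1 − ‖h‖₁ / (d ‖h‖₂), as stated in the abstract."""
    d = H.shape[-1]
    return 1.0 - np.abs(H).sum(-1) / (d * np.linalg.norm(H, axis=-1))

rng = np.random.default_rng(0)
d = 256
# Dense isotropic states vs. states collapsed onto 8 directions plus weak
# noise, mimicking the spectral-tail collapse reported after depth γ_c.
H_early = rng.normal(size=(2048, d))
basis = rng.normal(size=(8, d))
H_late = rng.normal(size=(2048, 8)) @ basis + 0.05 * rng.normal(size=(2048, d))

for name, H in [("early", H_early), ("late", H_late)]:
    ev = covariance_spectrum(H)
    print(name, "eff_dim=%.1f" % effective_dim(ev),
          "mean Ω=%.3f" % float(omega(H).mean()))
```

The effective dimensionality drops from near d for the isotropic states to near 8 for the collapsed ones; a discontinuity in Ω across normalized depth is the paper's claimed signature, which this toy data does not attempt to reproduce.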


Get this paper in your agent:

hf papers read 2601.19942
Don't have the latest CLI?
curl -LsSf https://hf.co/cli/install.sh | bash
