Dieu, Patrick-Olivier: A Measurement Framework for Divergence in Representation Stability Under Transformation Complexity

We introduce a measurement framework for analyzing how neural representations diverge under transformations of increasing complexity. Rather than proposing a universal theory of representation learning or invariance, we define a single observable, $\Delta(c)$, which measures inter-model variance in representation stability across heterogeneous neural architectures. The framework is designed as an empirical instrument: it specifies how to measure divergence, how to control for parameterization ar
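The observable described above can be illustrated with a minimal sketch. The excerpt does not give the exact definition of $\Delta(c)$, so everything here is an assumption: stability is taken as the mean cosine similarity between a model's representations of inputs and their transformed versions, and $\Delta(c)$ as the variance of that stability score across models. The names `stability`, `delta`, and the `transform_at_complexity` factory are hypothetical.

```python
import numpy as np

def stability(model_fn, inputs, transform):
    """One plausible stability notion (assumed, not from the paper):
    mean cosine similarity between representations of the inputs
    and representations of their transformed versions."""
    z = model_fn(inputs)                    # representations of raw inputs
    zt = model_fn(transform(inputs))        # representations after transformation
    num = np.sum(z * zt, axis=1)
    den = np.linalg.norm(z, axis=1) * np.linalg.norm(zt, axis=1)
    return float(np.mean(num / den))

def delta(model_fns, inputs, transform_at_complexity, c):
    """Delta(c) under the assumptions above: inter-model variance of
    stability scores under a transformation of complexity c."""
    t = transform_at_complexity(c)
    scores = [stability(f, inputs, t) for f in model_fns]
    return float(np.var(scores))
```

As a usage sketch, `transform_at_complexity` might map a complexity level `c` to a deterministic perturbation such as `lambda x: x + c * np.sin(x)`; at `c = 0` the transformation is the identity, every model's stability is 1, and $\Delta(0) = 0$ by construction.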