Modern Physics Ends Where Life Begins: An Introduction to Random Gyration Matrix Theory
1Jarod Benowitz, 2,3Mike Schnieders, and 1Janice Robertson
1Department of Molecular Physiology and Biophysics, 2Department of Biochemistry, 3Department of Biomedical Engineering, Carver College of Medicine, University of Iowa, Iowa City, IA 52240
Abstract
Conformational state space is vast, and for living systems its structure is rich. The greatest hurdle in trajectory analysis is navigating conformational state space in a statistically rigorous and computationally tractable way. Here we introduce a new paradigm using the tools of Random Matrix Theory and Information Theory, while maintaining the flexibility and robustness needed for pragmatic biological applications.
Beautiful Canonical Relations
Gyrotropic Entropy Data Compression
Building a Universal Model
1. Flexible: equipped to handle any MD simulation.
2. Robust: requires only coordinate data; computationally efficient.
3. Rigorous: theorem-driven; mathematical formulation of biological questions.
4. Falsifiable: NMR, X-ray, neutron, static, and dynamic light scattering.
The Fundamentals
Conformational State Space
To navigate conformational state space we need to derive an entropy function that is representative of that space. In other words, our entropy function must be an appropriate measure on conformational state space. It is important to note that entropy as a measure of 'disorder' is a misnomer: entropy is a measure of the information required to specify the state of a system. It is NOT an observable.
The Entropy Problem
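Since entropy here means the information needed to specify a state, a minimal reference sketch may help; it is a generic Shannon-entropy calculation in plain NumPy (my own illustration, not the poster's formal definition), with a made-up uniform distribution as input:

```python
import numpy as np

def shannon_entropy_bits(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]              # convention: 0 * log(0) contributes nothing
    p = p / p.sum()           # normalize defensively
    return float(-(p * np.log2(p)).sum())

# A uniform distribution over 8 states takes exactly 3 bits to specify a state.
print(shannon_entropy_bits(np.ones(8) / 8))   # 3.0
```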
Formal definition
Random Matrix Theory
Calculation of Probability Densities
Random Matrix Theory (RMT) is the extension of probability theory to matrices. For orthogonally and unitarily invariant matrices, the joint probability densities of the matrix entries, relative to the Lebesgue measure, are functions of the eigenvalues alone.
From Cartesian Coordinates to Eigen-Coordinates
Calculation of probability densities is drastically simplified by transforming to eigen-coordinates, which form a complete orthonormal basis.
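As a concrete sketch of the eigen-coordinate step (assuming NumPy and an N-by-3 coordinate array; the mass weighting mirrors the 'Mass-Averaged Gyration Matrix' label later in the poster, but the function names are mine), one can build the gyration matrix from center-of-mass-free coordinates and diagonalize it:

```python
import numpy as np

def gyration_matrix(coords, masses=None):
    """Mass-averaged 3x3 gyration matrix from an (N, 3) coordinate array."""
    coords = np.asarray(coords, dtype=float)
    masses = np.ones(len(coords)) if masses is None else np.asarray(masses, dtype=float)
    com = np.average(coords, axis=0, weights=masses)
    x = coords - com                                  # the CM = 0 step
    return (masses[:, None] * x).T @ x / masses.sum()

def eigen_coordinates(S):
    """Diagonalize the symmetric gyration matrix: eigenvalues and an orthonormal basis."""
    return np.linalg.eigh(S)

# Toy example with random coordinates.
rng = np.random.default_rng(0)
lam, U = eigen_coordinates(gyration_matrix(rng.normal(size=(100, 3))))
print(lam, np.sqrt(lam.sum()))   # eigenvalues and the radius of gyration
```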
Figure 1. Conceptualization of Gyrotropic Entropy. As we move up the potential energy landscape, we move from a state of low entropy to a state of high entropy. The number of bits (Gyrotropic Entropy) required to specify the locations of the atoms in the simulation increases as we move up the landscape. If we imagine a periodic lattice, fewer bits of information are required to specify the locations of the sites, since we need only specify the locations within a single unit cell. If we perturb the lattice in such a way as to create defects or waves across its surface, periodicity is lost and more bits of information are required to specify the locations of the sites.
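The lattice picture in the caption can be made concrete with a toy compression experiment (my own illustration; zlib's compressed size stands in, very loosely, for the bits needed to specify the sites):

```python
import zlib
import numpy as np

def bits(arr):
    """Compressed size in bits of an integer array's raw bytes."""
    return 8 * len(zlib.compress(np.ascontiguousarray(arr).tobytes(), level=9))

# Perfect 2D lattice: positions are fixed by the unit cell, so the data are very regular.
grid = np.stack(np.meshgrid(np.arange(50), np.arange(50), indexing="ij"), axis=-1).astype(np.int32)

# 'Defective' lattice: small random displacements break the periodicity.
rng = np.random.default_rng(4)
perturbed = grid + rng.integers(-2, 3, size=grid.shape, dtype=np.int32)

print(bits(grid), "<", bits(perturbed))   # the perturbed lattice needs more bits
```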
Conclusion
We have used RMT to calculate probability densities that are representative of the state space of the system. How we handle these probability densities lies within the field of Information Theory. We have set the stage for a novel paradigm of trajectory analysis, where probing state space is reformulated as a universal data compression problem. Future work includes equipping the model with a symplectic geometry via the Fisher information metric, where the change of entropy becomes the action along a path on the statistical manifold provided by the Fisher information.
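For orientation on the future-work direction, the Fisher information metric mentioned here has the standard textbook form (quoted as background, not as the poster's derivation):

```latex
g_{ij}(\theta) = \mathbb{E}\left[\frac{\partial \log p(x \mid \theta)}{\partial \theta^{i}}\,
                                 \frac{\partial \log p(x \mid \theta)}{\partial \theta^{j}}\right]
```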
Theorem 1.1 (Gaussian Orthogonal Gyration Density)
Model Workflow
Boltzmann-Gibbs Entropy
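For reference under this label, the textbook Boltzmann-Gibbs form (a standard definition, not recovered from the poster itself) is:

```latex
S_{\mathrm{BG}} = -k_{B} \sum_{i} p_{i} \ln p_{i}
```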
How Ought We Approach the Problem?
1. Derive a representative entropy function.
2. Indirectly measure the difference in entropy from one time to another using an adequately chosen observable.
3. Rigorously compare the entropic differences of the system prepared in state a with respect to the system prepared in state b.
4. Big Data Approach: capture microstates of the system by measuring information loss from lossy data compression schemes.
Proof: set the center of mass to zero (CM = 0), then diagonalize.
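Point 4 can be illustrated with a toy sketch (my own construction, not the poster's actual compression scheme): quantize the trajectory to a chosen precision (the lossy step), then let a standard lossless codec such as zlib count how many bits remain.

```python
import zlib
import numpy as np

def compressed_bits(traj, decimals):
    """Bits zlib needs after quantizing coordinates to 10**-decimals (the lossy step)."""
    q = np.round(np.asarray(traj, dtype=float) * 10**decimals).astype(np.int32)
    return 8 * len(zlib.compress(q.tobytes(), level=9))

# Toy trajectory: 100 frames x 50 atoms x 3 coordinates.
rng = np.random.default_rng(1)
traj = rng.normal(size=(100, 50, 3))

# Coarser quantization throws information away, so fewer bits are needed;
# the gap between precisions is a crude proxy for the information lost.
for d in (1, 2, 3):
    print(f"{d} decimal place(s): {compressed_bits(traj, d)} bits")
```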
Problem Statement
Entropy estimates built from histograms depend on the choice of bin size.
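A small sketch of why this is a problem (my own illustration): the entropy estimated from a histogram of the same samples keeps changing as the bins get finer, so no single number is meaningful without fixing a resolution.

```python
import numpy as np

def histogram_entropy_bits(samples, bins):
    """Entropy, in bits, of the discrete distribution induced by a histogram."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
samples = rng.normal(size=10_000)

for bins in (10, 100, 1000):
    print(bins, round(histogram_entropy_bits(samples, bins), 3))
```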
Proposition
Characteristic-Time
The characteristic time is the time required for some particular random fluctuation to occur in the system. It is a unitless value that is often fine-tuned to be proportional to length scales.
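One common proxy for a characteristic time, offered here only as an assumption about what is meant (the poster's own construction is not shown), is the lag at which a fluctuating observable's normalized autocorrelation decays below 1/e; measured in frames, it is unitless:

```python
import numpy as np

def characteristic_time(signal):
    """Lag (in frames, hence unitless) where the normalized autocorrelation first drops below 1/e."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]
    below = np.nonzero(acf < 1.0 / np.e)[0]
    return int(below[0]) if below.size else len(x)

# Toy observable: an AR(1) process that decorrelates over roughly 20 frames.
rng = np.random.default_rng(3)
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = 0.95 * x[t - 1] + rng.normal()
print(characteristic_time(x))   # ~20
```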
Fourier
Mass-Averaged Gyration Matrix
Inverse Fourier Matrix Properties
Gyration Density
Radius of Gyration
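These last labels tie back to the gyration matrix sketched earlier; as a standard relation (a textbook fact, not recovered from the poster's missing panels), the squared radius of gyration is the trace of the gyration matrix, i.e. the sum of its eigenvalues:

```latex
R_g^2 = \operatorname{Tr} S = \lambda_1 + \lambda_2 + \lambda_3
```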
Shannon Entropy
Gyrotropic Entropy (bits)
It's VERY fast!