Random matrix

Many physical phenomena, such as the spectrum of nuclei of heavy atoms,[1][2] the thermal conductivity of a lattice, or the emergence of quantum chaos,[3] can be modeled mathematically as problems concerning large, random matrices.

In nuclear physics, random matrices were introduced by Eugene Wigner to model the nuclei of heavy atoms.[4]

In solid-state physics, random matrices model the behaviour of large disordered Hamiltonians in the mean-field approximation.

Random matrix theory has also found applications in neuronal networks[17] and deep learning, with recent work using random matrices to show that hyper-parameter tunings can be cheaply transferred between large neural networks without the need for re-training.[18]

In numerical analysis, random matrices have been used since the work of John von Neumann and Herman Goldstine[19] to describe computation errors in operations such as matrix multiplication.

Voiculescu introduced the concept of freeness around 1983 in an operator-algebraic context; initially it had no connection at all with random matrices.

In the field of computational neuroscience, random matrices are increasingly used to model the network of synaptic connections between neurons in the brain.

Dynamical models of neuronal networks with a random connectivity matrix were shown to exhibit a phase transition to chaos[24] when the variance of the synaptic weights crosses a critical value, in the limit of infinite system size.
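As a rough illustration (a minimal sketch, not taken from the cited work, assuming the standard scaling in which the synaptic weights are i.i.d. Gaussian with mean zero and variance g²/N), the onset of chaos corresponds to the spectral radius of the connectivity matrix crossing 1:

```python
import numpy as np

def spectral_radius(N: int, g: float, seed: int = 0) -> float:
    """Largest eigenvalue modulus of an N x N connectivity matrix whose
    entries are i.i.d. Gaussian with mean 0 and variance g**2 / N."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    return float(np.abs(np.linalg.eigvals(J)).max())

# By the circular law the spectral radius concentrates around g, so the
# linearized rate dynamics x' = -x + J x lose stability (and the network
# becomes chaotic) once g crosses the critical value 1.
for g in (0.5, 0.9, 1.1, 1.5):
    print(g, spectral_radius(1000, g))
```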

Results on random matrices have also shown that the dynamics of random-matrix models are insensitive to mean connection strength.[27][28]

In the analysis of massive data such as fMRI, random matrix theory has been applied to perform dimension reduction.

The criteria for selecting components can vary (explained variance, Kaiser's criterion, eigenvalue thresholds, etc.).

A random correlation matrix constructed in this way serves as the null model, against which one identifies the eigenvalues (and their eigenvectors) that deviate from the range expected for purely random data.
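A minimal sketch of this use of the null model, assuming a data matrix with n observations of p variables and using the Marchenko–Pastur upper edge for unit-variance entries as the cut-off (function and variable names are illustrative, not taken from the cited studies):

```python
import numpy as np

def significant_components(X):
    """Keep eigenvalues of the sample correlation matrix that exceed the
    Marchenko-Pastur upper edge; everything below is treated as noise."""
    n, p = X.shape                          # n observations, p variables
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    C = Z.T @ Z / n                         # sample correlation matrix
    evals, evecs = np.linalg.eigh(C)
    lam_plus = (1 + np.sqrt(p / n)) ** 2    # MP upper edge for unit-variance data
    keep = evals > lam_plus
    return evals[keep], evecs[:, keep]

# Pure-noise data should yield (almost) no components above the edge.
rng = np.random.default_rng(1)
vals, _ = significant_components(rng.normal(size=(2000, 100)))
print(len(vals))
```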

When such operators are discretized for computational simulations, their accuracy is limited by the physics omitted from the model.

Random matrices have been used in this context,[33] with applications in vibroacoustics, wave propagation, materials science, fluid mechanics, heat transfer, etc.

Random matrix theory can be applied to research in electrical and communications engineering to study, model and develop massive multiple-input multiple-output (MIMO) radio systems.[citation needed]

Random matrix theory first gained attention beyond the mathematics literature in the context of nuclear physics.

Experiments by Enrico Fermi and others provided evidence that individual nucleons cannot be treated as moving independently, leading Niels Bohr to formulate the idea of a compound nucleus.

Because direct nucleon-nucleon interactions were not known, Eugene Wigner and Leonard Eisenbud approximated the nuclear Hamiltonian by a random matrix.[34]

The most commonly studied random matrix distributions are the Gaussian ensembles: the Gaussian orthogonal ensemble (GOE), the Gaussian unitary ensemble (GUE) and the Gaussian symplectic ensemble (GSE).

The joint probability density for the eigenvalues λ1, λ2, ..., λn of GUE/GOE/GSE is given by

$$p_\beta(\lambda_1,\dots,\lambda_n) \;=\; \frac{1}{Z_{\beta,n}} \prod_{k=1}^{n} e^{-\frac{\beta}{4}\lambda_k^2} \prod_{i<j} \left|\lambda_j - \lambda_i\right|^{\beta},$$

where β = 1 for GOE, β = 2 for GUE and β = 4 for GSE, and Zβ,n is a normalization constant which can be explicitly computed; see Selberg integral.
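For illustration, a hedged sketch of how the β = 1 and β = 2 ensembles are commonly sampled (one standard convention among several); the scarcity of very small nearest-neighbour gaps reflects the repulsion factor |λ_j − λ_i|^β:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_goe(n):
    """Real symmetric GOE matrix (beta = 1), in the convention where the
    off-diagonal entries have variance 1."""
    A = rng.normal(size=(n, n))
    return (A + A.T) / np.sqrt(2)

def sample_gue(n):
    """Complex Hermitian GUE matrix (beta = 2)."""
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (A + A.conj().T) / 2

# The factor |lambda_j - lambda_i|**beta suppresses small eigenvalue gaps,
# more strongly for GUE (beta = 2) than for GOE (beta = 1).
for sampler in (sample_goe, sample_gue):
    gaps = np.diff(np.linalg.eigvalsh(sampler(500)))
    print(sampler.__name__, float(gaps.min()))
```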

Wigner matrices are random Hermitian (or real symmetric) matrices whose entries above the main diagonal are independent random variables with zero mean and identical second moments.[37][38]

The spectral theory of random matrices studies the distribution of the eigenvalues as the size of the matrix goes to infinity.

The cumulative distribution function of the limiting measure is called the integrated density of states and is denoted N(λ).[40][41]
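In symbols, a standard way of writing the definition just given, with μ denoting the limiting spectral measure and λ_1, …, λ_n the eigenvalues of the n × n matrix:

```latex
N(\lambda) \;=\; \mu\bigl((-\infty,\lambda]\bigr)
           \;=\; \lim_{n\to\infty} \frac{1}{n}\,\#\{\, j : \lambda_j \le \lambda \,\}.
```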

The limit of the empirical spectral measure of invariant matrix ensembles is described by a certain integral equation which arises from potential theory.

For example, as n, the number of dimensions of the Gaussian ensemble, increases, the proportion of the (suitably rescaled) eigenvalues falling within a fixed interval [a, b] converges to

$$\int_a^b \rho_{\mathrm{sc}}(x)\,dx, \qquad \rho_{\mathrm{sc}}(x) \;=\; \frac{1}{2\pi}\sqrt{4 - x^2}\;\mathbf{1}_{\{|x|\le 2\}},$$

the Wigner semicircle distribution.
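A quick numerical check of this convergence (a sketch assuming a GOE matrix whose off-diagonal entries have unit variance, with eigenvalues rescaled by √n):

```python
import numpy as np

# Eigenvalues of a large GOE matrix, rescaled by sqrt(n), should follow
# the semicircle density (1 / 2pi) * sqrt(4 - x^2) on [-2, 2].
rng = np.random.default_rng(0)
n = 2000
A = rng.normal(size=(n, n))
H = (A + A.T) / np.sqrt(2)                  # GOE, off-diagonal variance 1
lam = np.linalg.eigvalsh(H) / np.sqrt(n)    # rescaled eigenvalues

hist, edges = np.histogram(lam, bins=40, range=(-2, 2), density=True)
centers = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.clip(4 - centers**2, 0, None)) / (2 * np.pi)
print(float(np.max(np.abs(hist - semicircle))))  # shrinks as n grows
```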

If, instead of keeping the interval fixed, one lets it shrink as n increases, then one obtains strictly stronger theorems, named "local laws" or the "mesoscopic regime".

For example, the Ginibre ensemble satisfies a mesoscopic law: for any sequence of disks whose areas shrink while the expected number of eigenvalues they contain still tends to infinity, the proportion of eigenvalues falling within the disks converges to the value predicted by the circular law, that is, by the uniform distribution on the unit disk.
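A sketch of the underlying (global) circular law for the Ginibre ensemble, assuming i.i.d. complex Gaussian entries of variance 1/n so that the eigenvalues asymptotically fill the unit disk uniformly:

```python
import numpy as np

# Eigenvalues of an n x n matrix with i.i.d. complex Gaussian entries of
# variance 1/n are asymptotically uniform on the unit disk (circular law).
rng = np.random.default_rng(0)
n = 2000
G = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)
lam = np.linalg.eigvals(G)

print(float(np.abs(lam).max()))            # close to 1: the edge of the disk
print(float((np.abs(lam) <= 0.5).mean()))  # close to 0.25: area fraction of a radius-1/2 disk
```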

The following result expresses these correlation functions as determinants of the matrices formed from evaluating the appropriate integral kernel at the pairs of points $(x_i, x_j)$.
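A standard statement of this determinantal identity, writing $R_k$ for the $k$-point correlation function and $K_n$ for the ensemble's correlation kernel (the Hermite kernel in the GUE case), is:

```latex
R_k(x_1,\dots,x_k) \;=\; \det\bigl( K_n(x_i, x_j) \bigr)_{i,j=1}^{k}.
```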

In the important special case considered by Wishart, the entries of X are identically distributed Gaussian random variables (either real or complex).

The limit of the empirical spectral measure of Wishart matrices was found[40] by Vladimir Marchenko and Leonid Pastur.
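A numerical sketch of the Marchenko–Pastur law, assuming X is an n × p matrix of i.i.d. real standard Gaussian entries, aspect ratio γ = p/n ≤ 1, and the Wishart matrix taken as XᵀX/n:

```python
import numpy as np

# Eigenvalues of the Wishart matrix X^T X / n, with X an n x p matrix of
# i.i.d. standard Gaussian entries, follow the Marchenko-Pastur law.
rng = np.random.default_rng(0)
n, p = 4000, 1000
gamma = p / n
X = rng.normal(size=(n, p))
evals = np.linalg.eigvalsh(X.T @ X / n)

lam_minus = (1 - np.sqrt(gamma)) ** 2
lam_plus = (1 + np.sqrt(gamma)) ** 2
print(float(evals.min()), lam_minus)   # smallest eigenvalue near the lower edge
print(float(evals.max()), lam_plus)    # largest eigenvalue near the upper edge

def mp_density(x):
    """Marchenko-Pastur density for unit-variance entries and ratio gamma <= 1."""
    return np.sqrt(np.clip((lam_plus - x) * (x - lam_minus), 0, None)) / (2 * np.pi * gamma * x)

hist, edges = np.histogram(evals, bins=50, range=(lam_minus, lam_plus), density=True)
centers = (edges[:-1] + edges[1:]) / 2
print(float(np.max(np.abs(hist - mp_density(centers)))))  # shrinks as n, p grow
```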

Figure: Spectral density of GOE/GUE/GSE for finite matrix size N, normalized so that the distributions converge to the semicircle distribution as N → ∞. The number of "humps" equals N.