Examples of quantitative properties and qualitative phenomena that are explored with interatomic potentials include lattice parameters, surface energies, interfacial energies, adsorption, cohesion, thermal expansion, and elastic and plastic material behavior, as well as chemical reactions.
[5][6][7][8][9][10][11] Interatomic potentials can be written as a series expansion of functional terms that depend on the positions of one, two, three, etc. atoms at a time.
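In this expansion, the total potential energy of a system of $N$ atoms takes the form (the notation below is a common convention rather than a unique standard):

$$ V_\mathrm{tot} = \sum_{i=1}^{N} V_1(\vec{r}_i) + \sum_{i<j}^{N} V_2(\vec{r}_i, \vec{r}_j) + \sum_{i<j<k}^{N} V_3(\vec{r}_i, \vec{r}_j, \vec{r}_k) + \cdots $$

where the one-body terms describe external fields, the two-body terms pairwise interactions, the three-body terms angular effects, and so on.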
Even for single well-known elements such as silicon, a wide variety of potentials quite different in functional form and motivation have been developed.
Over time, interatomic potentials have largely grown more complex and more accurate, although this trend has not been strict.
Until recently, all interatomic potentials could be described as "parametric", having been developed and optimized with a fixed number of (physical) terms and parameters.
New research focuses instead on non-parametric potentials, which can be systematically improved by using complex local atomic-neighbor descriptors and separate mappings to predict system properties, such that the total number of terms and parameters is flexible.
[16] These non-parametric models can be significantly more accurate, but since they are not tied to physical forms and parameters, there are many potential issues surrounding extrapolation and uncertainties.
[6] On its own, the Lennard-Jones potential is quantitatively accurate only for noble gases and has been extensively studied over the past decades,[20] but it is also widely used for qualitative studies and in systems where dipole interactions are significant, particularly in chemistry force fields describing intermolecular interactions, especially in fluids.
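For reference, the Lennard-Jones pair potential has the form

$$ V_\mathrm{LJ}(r) = 4\varepsilon \left[ \left( \frac{\sigma}{r} \right)^{12} - \left( \frac{\sigma}{r} \right)^{6} \right] $$

where $\varepsilon$ is the depth of the potential well and $\sigma$ the separation at which the pair energy crosses zero.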
For very short interatomic separations, important in radiation materials science, the interactions can be described quite accurately with screened Coulomb potentials, which have the general form

$$ V(r_{ij}) = \frac{Z_i Z_j e^2}{4 \pi \varepsilon_0 r_{ij}} \, \varphi(r_{ij}/a) $$

Here, $Z_i$ and $Z_j$ are the charges of the interacting nuclei, $\varphi(x)$ is a screening function that approaches 1 as $x \to 0$, and $a$ is a screening parameter. A widely used screening function is the universal "ZBL" one,[24] and more accurate ones can be obtained from all-electron quantum chemistry calculations.[25] In binary collision approximation simulations this kind of potential can be used to describe the nuclear stopping power.
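A minimal Python sketch of the ZBL form is given below (the function name and unit conventions are our own choices; the coefficients are the standard universal screening-function values):

```python
import numpy as np

COULOMB_EV_A = 14.3996   # e^2 / (4*pi*eps0) in eV*Angstrom
BOHR_A = 0.529177        # Bohr radius in Angstrom

def zbl_potential(r, z1, z2):
    """Screened Coulomb (ZBL universal) pair potential in eV.

    r      : separation in Angstrom (scalar or numpy array)
    z1, z2 : atomic numbers of the two nuclei
    """
    # Universal screening length
    a = 0.8854 * BOHR_A / (z1**0.23 + z2**0.23)
    x = r / a
    # Universal screening function: a sum of four exponentials
    phi = (0.18175 * np.exp(-3.19980 * x)
           + 0.50986 * np.exp(-0.94229 * x)
           + 0.28022 * np.exp(-0.40290 * x)
           + 0.02817 * np.exp(-0.20162 * x))
    return COULOMB_EV_A * z1 * z2 / r * phi

# Example: Si-Si repulsion at 0.5 Angstrom
print(zbl_potential(0.5, 14, 14))
```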
[43][44][45][46] EAM potentials have also been extended to describe covalent bonding by adding angular-dependent terms to the electron density function.
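For context, the embedded atom method writes the total energy in the standard form (symbol names vary between sources):

$$ E_\mathrm{tot} = \sum_i F_i \left( \sum_{j \neq i} \rho(r_{ij}) \right) + \frac{1}{2} \sum_{i \neq j} V_2(r_{ij}) $$

where $F_i$ is an embedding function of the host electron density $\rho$ at the site of atom $i$ and $V_2$ is a pair term; the angular-dependent extensions mentioned above modify $\rho$.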
The term force field characterizes the collection of parameters for a given interatomic potential (energy function) and is often used within the computational chemistry community.
Force fields are used for the simulation of metals, ceramics, molecules, chemistry, and biological systems, covering the entire periodic table and multiphase materials.
Current research in interatomic potentials involves using systematically improvable, non-parametric mathematical forms and increasingly complex machine learning methods.
An accurate machine-learning potential requires both a robust descriptor and a suitable machine learning framework.
However, unlike with analytical models, the accuracy of a machine-learning potential can be systematically converged to be comparable with that of the underlying quantum calculations.
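As a toy illustration of the descriptor half of this pipeline, the sketch below computes Behler-Parrinello-style radial symmetry functions, which map an atom's neighborhood to rotation- and permutation-invariant features (all names and parameter choices here are our own, not any particular library's API):

```python
import numpy as np

def radial_descriptor(positions, i, etas, r_cut=5.0):
    """Toy Behler-Parrinello-style radial symmetry functions for atom i.

    positions : (N, 3) array of Cartesian coordinates in Angstrom
    etas      : Gaussian widths, one per descriptor channel
    """
    d = np.linalg.norm(positions - positions[i], axis=1)
    d = d[(d > 0.0) & (d < r_cut)]   # neighbors inside the cutoff
    # Smooth cutoff so each feature decays to zero at r_cut
    fc = 0.5 * (np.cos(np.pi * d / r_cut) + 1.0)
    return np.array([np.sum(np.exp(-eta * d**2) * fc) for eta in etas])

# Example: features for atom 0 of a random 4-atom toy cluster
pos = np.random.default_rng(0).uniform(0.0, 4.0, size=(4, 3))
print(radial_descriptor(pos, 0, etas=[0.5, 1.0, 2.0]))
```

Each atom's feature vector is then mapped to an atomic energy by a regression model such as a neural network, and the total energy is the sum over atoms; this per-atom decomposition is what lets the approach scale to large systems.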
[62] Since interatomic potentials are approximations, they all necessarily involve parameters that need to be adjusted to some reference values.
[63][64] A Lennard-Jones potential can typically describe the lattice parameters, surface energies, and approximate mechanical properties.
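As a minimal sketch of this fitting step, the snippet below adjusts Lennard-Jones parameters against a handful of reference pair energies (the numbers are made-up placeholders purely to show the mechanics, and the names are our own):

```python
import numpy as np
from scipy.optimize import curve_fit

def lj(r, eps, sigma):
    """Lennard-Jones pair energy in eV for separation r in Angstrom."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6**2 - sr6)

# Hypothetical reference pair energies, e.g. from ab initio data
# (placeholder values, chosen only to illustrate the procedure)
r_ref = np.array([3.0, 3.5, 4.0, 4.5, 5.0])
e_ref = np.array([0.050, -0.010, -0.009, -0.005, -0.003])

(eps, sigma), _ = curve_fit(lj, r_ref, e_ref, p0=[0.01, 3.4])
print(f"fitted epsilon = {eps:.4f} eV, sigma = {sigma:.3f} A")
```

Real parameterizations fit many more observables (lattice constants, elastic moduli, defect energies) simultaneously, which is where the sophisticated optimization methods mentioned below come in.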
[66][67] For solids, a many-body potential can often describe well the lattice constant of the equilibrium crystal structure, the cohesive energy, linear elastic constants, and the basic point defect properties of all the elements and stable compounds, although deviations in surface energies often exceed 50%.
For any but the simplest model forms, sophisticated optimization and machine learning methods are necessary for useful potentials.
Key aspects here are the correct representation of chemical bonding, validation of structures and energies, as well as interpretability of all parameters.
Early neural networks showed promise, but their inability to systematically account for interatomic energy interactions limited their applications to smaller, low-dimensional systems, keeping them largely within the confines of academia.
[80][81][82] Modern neural networks have revolutionized the construction of highly accurate and computationally light potentials by integrating theoretical understanding of materials science into their architectures and preprocessing.
Encoding symmetry has been pivotal in enhancing machine learning potentials by drastically constraining the neural networks' search space.
In 2017, the first message-passing neural network (MPNN) model, a deep tensor neural network, was used to calculate the properties of small organic molecules.
Advancements in this technology led to the development of Matlantis in 2022, which commercially applies machine learning potentials for new materials discovery.
[84] Matlantis, which can simulate 72 elements, handle up to 20,000 atoms at a time, and execute calculations up to 20 million times faster than density functional theory with almost indistinguishable accuracy, showcases the power of machine learning potentials in the age of artificial intelligence.
[97] Classical interatomic potentials often exceed the accuracy of simplified quantum mechanical methods such as density functional theory at a million times lower computational cost.
A limitation, however, is that electron densities and quantum processes at the local scale of hundreds of atoms are not included.
[98] The robustness of a model under conditions other than those used in the fitting process is often measured in terms of the transferability of the potential.