There is no general definition of an operator, but the term is often used in place of function when the domain is a set of functions or other structured objects.
Also, the domain of an operator is often difficult to characterize explicitly (for example, in the case of an integral operator), and may be extended so as to act on related objects (an operator that acts on functions may be extended to act on differential equations whose solutions are functions).
In more technical terms, linear operators are morphisms between vector spaces.
In the finite-dimensional case linear operators can be represented by matrices in the following way.
Let U and V be finite-dimensional vector spaces over a field K, and select a basis u1, …, un in U and v1, …, vm in V. A linear operator A from U to V is then determined by its values on the basis vectors: writing Auj = a1j v1 + … + amj vm defines an m-by-n matrix (aij). Thus, in fixed bases, m-by-n matrices are in bijective correspondence with linear operators from U to V.
The important concepts directly related to operators between finite-dimensional vector spaces are the ones of rank, determinant, inverse operator, and eigenspace.
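This correspondence between operators and matrices can be sketched in a few lines of Python; the map `apply_A` and the helper names below are illustrative choices, not from the text. Each column of the matrix is recovered by applying the operator to a standard basis vector.

```python
# Sketch: recover the matrix of a linear operator A : U -> V in fixed bases.
# Here U = R^3, V = R^2 with the standard bases; apply_A is a hypothetical
# example map chosen for illustration.

def apply_A(x):
    # A(x1, x2, x3) = (x1 + 2*x3, 4*x2) -- an arbitrary linear map R^3 -> R^2
    return [x[0] + 2 * x[2], 4 * x[1]]

def matrix_of(op, dim_in, dim_out):
    """Column j of the matrix is op applied to the j-th standard basis vector."""
    cols = [op([1 if i == j else 0 for i in range(dim_in)]) for j in range(dim_in)]
    # Transpose the list of columns into rows: a (dim_out x dim_in) matrix.
    return [[cols[j][i] for j in range(dim_in)] for i in range(dim_out)]

def mat_vec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

A = matrix_of(apply_A, 3, 2)        # [[1, 0, 2], [0, 4, 0]]
x = [1.0, 2.0, 3.0]
assert mat_vec(A, x) == apply_A(x)  # matrix and operator agree on every vector
```

Applying the extracted matrix to any vector reproduces the operator, which is exactly the bijective correspondence described above.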
Linear operators also play an important role in the infinite-dimensional case.
The concepts of rank and determinant cannot be extended to infinite-dimensional matrices.
The study of linear operators in the infinite-dimensional case is known as functional analysis (so called because various classes of functions form interesting examples of infinite-dimensional vector spaces).
The most important cases are sequences of real or complex numbers, and these spaces, together with linear subspaces, are known as sequence spaces.
Bounded linear operators over a Banach space form a Banach algebra with respect to the standard operator norm.
The theory of Banach algebras develops a very general concept of spectra that elegantly generalizes the theory of eigenspaces.
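In the finite-dimensional case the spectrum reduces to the set of eigenvalues. A minimal sketch in plain Python, using an illustrative 2×2 matrix and the characteristic polynomial det(A − tI) = t² − (trace)t + det:

```python
import math

# Sketch: the spectrum of a 2x2 real matrix, read off from its characteristic
# polynomial. The matrix is chosen for illustration (symmetric, so the
# eigenvalues are real).
A = [[2.0, 1.0],
     [1.0, 2.0]]

trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(trace**2 - 4 * det)
eigenvalues = sorted([(trace - disc) / 2, (trace + disc) / 2])  # [1.0, 3.0]

# For eigenvalue 3, v = (1, 1) spans the eigenspace: A v = 3 v.
v = [1.0, 1.0]
Av = [A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1]]
assert Av == [3.0 * v[0], 3.0 * v[1]]
```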
Let U and V be two vector spaces over the same ordered field (for example, the real numbers), equipped with norms. Then a linear operator T from U to V is called bounded if there exists c > 0 such that ‖Tx‖ ≤ c‖x‖ for every x in U. Bounded operators form a vector space.
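For a matrix operator on Euclidean space, one valid (though not necessarily smallest) constant c is the Frobenius norm of the matrix, by the Cauchy–Schwarz inequality applied row by row. A sketch checking the bound on random vectors, with an illustrative matrix:

```python
import math
import random

# Sketch of the boundedness condition ||A x|| <= c ||x|| for a matrix operator
# on R^2. The Frobenius norm of A serves as the constant c.
A = [[1.0, -2.0],
     [0.5, 3.0]]
c = math.sqrt(sum(a * a for row in A for a in row))  # Frobenius norm

def norm(x):
    return math.sqrt(sum(t * t for t in x))

def apply(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-10, 10) for _ in range(2)]
    assert norm(apply(A, x)) <= c * norm(x) + 1e-9  # bound holds for every x
```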
It is possible to generalize spectral theory to such algebras.
C*-algebras, which are Banach algebras with some additional structure, play an important role in quantum mechanics.[3]
In geometry, additional structures on vector spaces are sometimes studied.
Operators that map such a vector space to itself bijectively are very useful in these studies; under composition they naturally form the general linear group. However, they do not form a vector space under operator addition: for example, both the identity and −identity are invertible (bijective), but their sum, 0, is not.
Operators preserving the Euclidean metric on such a space form the isometry group, and those that fix the origin form a subgroup known as the orthogonal group.
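A rotation is a concrete example of both: it is invertible (an element of the general linear group) and it preserves the Euclidean norm and fixes the origin (an element of the orthogonal group). A short sketch, with an arbitrary angle chosen for illustration:

```python
import math

# Sketch: a 2D rotation matrix preserves the Euclidean norm, and composing it
# with its inverse (here, its transpose) gives the identity.
theta = 0.3
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def apply(M, x):
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

def compose(M, N):  # matrix product: the group operation
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

x = [3.0, 4.0]
y = apply(R, x)
assert abs(math.hypot(*y) - math.hypot(*x)) < 1e-12  # isometry: norm preserved

# For an orthogonal matrix the transpose is the inverse:
R_inv = [[R[0][0], R[1][0]], [R[0][1], R[1][1]]]
I = compose(R, R_inv)
assert abs(I[0][0] - 1) < 1e-12 and abs(I[0][1]) < 1e-12
```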
Operators also appear in probability theory, where many basic notions are dot products in disguise: every covariance is basically a dot product; every variance is a dot product of a vector with itself, and thus is a quadratic norm; every standard deviation is a norm (the square root of the quadratic norm); the cosine associated with this dot product is the Pearson correlation coefficient; and expected value is basically an integral operator (used to measure weighted shapes in the space).
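These identifications can be checked directly on mean-centered samples; the data below is illustrative. Covariance becomes a (scaled) dot product, variance a dot product of a vector with itself, and the Pearson coefficient the cosine of the angle between the centered vectors:

```python
import math

# Sketch: covariance as a dot product of mean-centered samples, and the
# Pearson correlation as the cosine of the angle between them.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 1.0, 4.0, 3.0]

def centered(v):
    m = sum(v) / len(v)
    return [t - m for t in v]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

cx, cy = centered(xs), centered(ys)
cov = dot(cx, cy) / len(xs)        # covariance = (scaled) dot product
var_x = dot(cx, cx) / len(xs)      # variance = dot product with itself
std_x = math.sqrt(var_x)           # standard deviation = a norm
pearson = dot(cx, cy) / math.sqrt(dot(cx, cx) * dot(cy, cy))  # cosine of angle
```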
The Fourier transform is useful in applied mathematics, particularly physics and signal processing.
It is another integral operator; it is useful mainly because it converts a function on one (temporal) domain into a function on another (frequency) domain in a way that is effectively invertible.
No information is lost, as there is an inverse transform operator.
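The discrete analogue of this invertibility claim can be checked in a few lines: a naive discrete Fourier transform followed by its inverse recovers the original samples up to floating-point error. The sample values below are arbitrary.

```python
import cmath

# Sketch: DFT followed by inverse DFT is the identity (no information lost).
def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

x = [1.0, 2.0, 0.0, -1.0]
roundtrip = idft(dft(x))
assert all(abs(a - b) < 1e-9 for a, b in zip(roundtrip, x))
```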
In the simple case of periodic functions, this result is based on the theorem that any continuous periodic function can be represented as the sum of a series of sine waves and cosine waves:

f(t) = a0/2 + Σn≥1 ( an cos(nt) + bn sin(nt) )

The tuple ( a0, a1, b1, a2, b2, ... ) is in fact an element of an infinite-dimensional vector space ℓ2, and thus the Fourier series is a linear operator.
For general (non-periodic) functions, the transform takes an integral form:

f̂(ω) = (1/√(2π)) ∫ f(t) e^(−iωt) dt

The Laplace transform is another integral operator and is involved in simplifying the process of solving differential equations.
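A textbook Laplace-transform pair can be checked numerically; the function f(t) = e^(−at) and its transform 1/(s + a) are standard, while the quadrature routine below is a crude illustrative sketch.

```python
import math

# Sketch: numerically approximating the Laplace integral
# F(s) = integral_0^inf e^{-s t} f(t) dt for f(t) = e^{-a t},
# whose exact transform is 1/(s + a).
def laplace(f, s, T=50.0, steps=200000):
    """Crude trapezoidal approximation of the Laplace integral on [0, T]."""
    h = T / steps
    total = 0.5 * (f(0.0) + math.exp(-s * T) * f(T))
    for i in range(1, steps):
        t = i * h
        total += math.exp(-s * t) * f(t)
    return total * h

a, s = 1.0, 2.0
approx = laplace(lambda t: math.exp(-a * t), s)
assert abs(approx - 1.0 / (s + a)) < 1e-6  # matches 1/(s + a) = 1/3
```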