For example, the median filter is widely used to remove spike noise, which affects only a small percentage of the samples, possibly by very large amounts.
In particular, if the signal S and the noise N do not overlap in the frequency domain, they can be completely separated by linear bandpass filters. For almost any other form of noise, on the other hand, some sort of nonlinear filter will be needed for maximum signal recovery.
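As a minimal sketch of the frequency-separation case, assuming SciPy is available; the 50 Hz signal tone, 400 Hz noise tone, sample rate, and filter band are illustrative choices, not values from the text above:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative setup: a 50 Hz signal S plus a narrowband noise tone N at
# 400 Hz, sampled at 2 kHz, so their spectra do not overlap.
fs = 2000.0
t = np.arange(0, 1.0, 1.0 / fs)
S = np.sin(2 * np.pi * 50 * t)
N = 0.5 * np.sin(2 * np.pi * 400 * t)
x = S + N

# Linear Butterworth bandpass keeping only the band that contains S.
b, a = butter(4, [10 / (fs / 2), 100 / (fs / 2)], btype="bandpass")
S_hat = filtfilt(b, a, x)

print("rms error:", np.sqrt(np.mean((S_hat - S) ** 2)))  # small: S is recovered
```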
In digital image processing, for example, one may wish to preserve the sharpness of silhouette edges of objects in photographs, or the connectivity of lines in scanned drawings.
For example, if an image is corrupted by noise that affects relatively few pixels but with high magnitude, then a median filter may be more appropriate than a linear smoothing filter.
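A minimal sketch of median filtering for this kind of sparse, high-magnitude noise, shown on a one-dimensional signal with SciPy's medfilt; the ramp signal, 2% spike rate, and window length are illustrative assumptions:

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(0)

# Illustrative signal: a smooth ramp corrupted by spikes that hit only
# about 2% of the samples, but with very large amplitude.
x = np.linspace(0.0, 1.0, 500)
noisy = x.copy()
spikes = rng.random(x.size) < 0.02
noisy[spikes] += rng.choice([-10.0, 10.0], size=spikes.sum())

# Each sample is replaced by the median of its 5-sample neighborhood,
# which discards isolated outliers while leaving the ramp almost intact.
clean = medfilt(noisy, kernel_size=5)

print("max error after filtering:", np.max(np.abs(clean - x)))
```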
A further context is the formulation of the nonlinear filtering problem within the theory of stochastic processes.
In this context, both the random signal and the noisy partial observations are described by continuous time stochastic processes.
Because the dynamics are nonlinear, the familiar frequency-domain concepts that apply to linear filters are no longer viable, and the theory is instead formulated in terms of a state-space representation.
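In a commonly used form of this state-space formulation (a generic sketch; the drift f, diffusion coefficient σ, and observation function h are placeholders, and the exact setup varies across the references), the signal and the observations solve a pair of stochastic differential equations:

```latex
\begin{aligned}
  dX_t &= f(X_t)\,dt + \sigma(X_t)\,dW_t, && \text{(hidden signal process)}\\
  dY_t &= h(X_t)\,dt + dV_t,              && \text{(noisy partial observations)}
\end{aligned}
```

Here W_t and V_t denote independent Wiener processes, and the filtering problem is to compute the conditional distribution of the signal X_t given the observation history up to time t.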
The problem of optimal nonlinear filtering in this context was solved in the late 1950s and early 1960s by Ruslan L. Stratonovich[1][2][3][4] and Harold J. Kushner.[10] Particle filters,[11] which are related to sequential Monte Carlo methods, are one way of approximating the resulting optimal filter in practice.[12]
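The following is a minimal bootstrap particle filter sketch, not the algorithm of any particular reference above; the one-dimensional model, noise levels, and particle count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative nonlinear state-space model (all parameters are assumptions):
#   x_k = 0.5*x_{k-1} + 25*x_{k-1}/(1 + x_{k-1}^2) + process noise
#   y_k = x_k^2 / 20 + measurement noise
def step(x):
    return 0.5 * x + 25 * x / (1 + x**2) + rng.normal(0.0, 1.0, size=x.shape)

def observe(x):
    return x**2 / 20

# Simulate a short trajectory and its noisy observations.
T, N = 50, 1000
x_true = np.zeros(T)
y = np.zeros(T)
for k in range(1, T):
    x_true[k] = step(np.array([x_true[k - 1]]))[0]
    y[k] = observe(x_true[k]) + rng.normal(0.0, 1.0)

# Bootstrap particle filter: propagate particles through the dynamics,
# weight them by the observation likelihood, then resample.
particles = rng.normal(0.0, 2.0, size=N)
estimates = np.zeros(T)
for k in range(1, T):
    particles = step(particles)                        # predict
    logw = -0.5 * (y[k] - observe(particles)) ** 2     # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[k] = np.sum(w * particles)               # posterior-mean estimate
    particles = particles[rng.choice(N, size=N, p=w)]  # multinomial resampling

print("RMSE of the state estimate:", np.sqrt(np.mean((estimates - x_true) ** 2)))
```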
A nonlinear filter can also reshape the frequency content of a signal: energy can be moved to higher or lower frequency bands, spread over a designed range, or focused.
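As a toy illustration of energy moving into new bands, the following passes a pure tone through a memoryless clipping nonlinearity (a deliberately simple stand-in for a nonlinear filter) and inspects the output spectrum; the tone frequency, clip level, and threshold are arbitrary choices:

```python
import numpy as np

# A 50 Hz sine passed through hard clipping. A linear time-invariant filter
# could only scale the 50 Hz component; the nonlinearity moves energy into
# bands that were empty at the input (odd harmonics).
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)
y = np.clip(x, -0.5, 0.5)

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
print("bands with significant energy:", freqs[spectrum > 0.01])  # 50 Hz plus odd harmonics
```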
For rank-order filters such as the minimum and maximum filters, the size and shape of the neighborhood are defined by a structuring element, typically a square or circular mask.
For example, if you have text that is lightly printed, the minimum filter makes the letters thicker; conversely, if text has been drawn with a thick pen, the maximum filter can be used to make the strokes thinner.
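A minimal sketch of both filters with a square structuring element, using scipy.ndimage on a tiny synthetic image (the 9×9 "stroke" image and 3×3 mask are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter

# Synthetic "scanned text": dark strokes (0) on a white background (255),
# here a single vertical stroke one pixel wide.
img = np.full((9, 9), 255, dtype=np.uint8)
img[1:8, 4] = 0

# 3x3 square structuring element: the neighborhood over which the
# minimum or maximum is taken at each pixel.
footprint = np.ones((3, 3), dtype=bool)

thicker = minimum_filter(img, footprint=footprint)  # dark strokes grow
thinner = maximum_filter(img, footprint=footprint)  # dark strokes shrink

print("stroke width before:", np.sum(img[4] == 0))       # 1 pixel
print("after minimum filter:", np.sum(thicker[4] == 0))  # 3 pixels
print("after maximum filter:", np.sum(thinner[4] == 0))  # 0 pixels
```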