The original design was proposed in the 1940 IRE paper, "A New Standard Volume Indicator and Reference Level", written by experts from CBS, NBC, and Bell Telephone Laboratories.[1]
The Acoustical Society of America then standardized it in 1942 (ANSI C16.5-1942)[2][3] for use in telephone installations and radio broadcast stations.
This has the effect of averaging out short-duration peaks and troughs, and it reflects the perceived loudness of the material more closely than the more modern (and initially more expensive) PPM does.
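The averaging behaviour described above can be sketched numerically. The classic specification calls for the pointer to reach about 99% of its final deflection roughly 300 ms after a sine tone is applied. As an illustrative simplification (not the actual damped mechanical model in the standard), the meter can be treated as a full-wave rectifier followed by a one-pole low-pass whose step response hits 99% in 300 ms:

```python
import math

def vu_ballistics(samples, fs, t99=0.3):
    """Simplified VU-style ballistics: full-wave rectification followed by
    a one-pole low-pass whose step response reaches 99% of its final value
    in t99 seconds. A first-order stand-in (an assumption) for the meter's
    damped mechanical movement."""
    tau = t99 / math.log(100.0)              # ~65 ms time constant for t99 = 300 ms
    alpha = 1.0 - math.exp(-1.0 / (fs * tau))
    y = 0.0
    out = []
    for x in samples:
        y += alpha * (abs(x) - y)            # rectify, then smooth
        out.append(y)
    return out

# A suddenly applied 1 kHz tone: the reading climbs gradually toward the
# tone's average rectified value (2/pi ~ 0.637 for a unit sine) instead of
# jumping to the instantaneous peaks.
fs = 8000
tone = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(fs)]  # 1 second
reading = vu_ballistics(tone, fs)
```

Short transients barely move `reading`, which is exactly the peak-and-trough averaging the text describes.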
In the broadcast industry, loudness monitoring was standardized by ATSC A/85 in the United States (2009), EBU R 128 in Europe (2010), OP-59 in Australia (2010), and TR-B32 in Japan (2011).
The original designers of the VU meter were tasked with finding a way to measure complex audio signals with a simple technology.
This was yet another "standard" established in the early years of audio (used at CFRB Toronto and CFPL London, Canada), and the VU meter was altered by changing the series resistors to adjust its sensitivity.
The VU meter and its attenuator should present a 7,500-ohm impedance to the circuit to which it is applied, measured with a sinusoidal signal that sets the indicator to 0 dB.[11]
In the 1970s–80s, neon-filled planar dual displays with up to 201 segments per stereo channel[12] were popular among broadcasters as fast bar-graph VU meters.
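The 7,500-ohm bridging figure above invites a quick arithmetic check. Assuming the conventional 0 VU = +4 dBu reference of the same era (an assumption; this section does not state the reference level), the voltage at 0 VU and the load the meter branch places on a classic 600-ohm programme line work out as:

```python
import math

Z_LINE = 600.0    # ohms: classic programme-line impedance (assumed context)
Z_METER = 7500.0  # ohms: the bridging impedance quoted above
REF_DBU = 4.0     # dBu commonly taken as 0 VU (an assumption here)

def dbu_to_vrms(dbu):
    """0 dBu is the voltage that dissipates 1 mW in 600 ohms (~0.775 V RMS)."""
    v0 = math.sqrt(0.001 * Z_LINE)
    return v0 * 10 ** (dbu / 20.0)

v_ref = dbu_to_vrms(REF_DBU)                  # ~1.228 V RMS at 0 VU
p_meter_mw = 1000.0 * v_ref ** 2 / Z_METER    # power drawn by the meter branch
# ~0.2 mW: the 7,500-ohm bridge loads the 600-ohm line only lightly,
# which is the point of specifying a high bridging impedance.
```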
The ballistics of this instrument, in response to signals with a large crest factor, place its readings roughly halfway between the signal's average and peak levels.
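Crest factor is the ratio of a signal's peak to its RMS level. A small sketch, using a hypothetical impulsive test signal as a stand-in for transient-rich audio, shows why an averaging instrument reads well below the peaks of such material:

```python
import math

def crest_factor_db(samples):
    """Crest factor: peak-to-RMS ratio, expressed in dB."""
    peak = max(abs(x) for x in samples)
    rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    return 20.0 * math.log10(peak / rms)

fs = 8000
# A steady tone: low crest factor (sqrt(2), about 3 dB).
sine = [math.sin(2 * math.pi * 100 * n / fs) for n in range(fs)]
# A hypothetical click train over a quiet tone: same peak, far lower RMS,
# so a much larger crest factor.
clicks = [1.0 if n % 1000 == 0
          else 0.05 * math.sin(2 * math.pi * 100 * n / fs)
          for n in range(fs)]

cf_sine = crest_factor_db(sine)      # ~3 dB
cf_clicks = crest_factor_db(clicks)  # tens of dB
```

A meter that tracks something close to the rectified average will sit far below the peaks of `clicks`, which is why high-crest-factor material under-reads on a VU meter relative to a peak-reading instrument.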