Bandwidth (computing)

In computing, bandwidth is the maximum rate of data transfer across a given path, measured in bits per second (bit/s).[3][4] This definition contrasts with its use in signal processing, wireless communications, modem data transmission, digital communications, and electronics,[citation needed] where bandwidth refers to analog signal bandwidth measured in hertz: the frequency range between the lowest and highest attainable frequencies while meeting a well-defined impairment level in signal power.

The consumed bandwidth, in bit/s, corresponds to the achieved throughput or goodput, i.e., the average rate of successful data transfer through a communication path.

For example, a channel rated at x bit/s may not actually transmit data at rate x, since protocols, encryption, and other factors can add appreciable overhead.
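The gap between channel rate and payload rate can be illustrated with a simple back-of-the-envelope calculation. The sketch below uses common textbook framing values (a 1500-byte Ethernet MTU and 40 bytes of IP-plus-TCP headers without options); it is an illustrative upper bound, not a measurement, and ignores retransmissions and link-layer framing.

```python
def goodput_ceiling(link_bps, mtu=1500, header_bytes=40):
    """Upper bound on TCP payload rate over a link of link_bps.

    mtu:          IP packet size in bytes (1500 for standard Ethernet)
    header_bytes: IP (20 B) + TCP (20 B) headers, without options
    """
    payload_fraction = (mtu - header_bytes) / mtu
    return link_bps * payload_fraction

# A nominal 100 Mbit/s channel carries at most about 97.3 Mbit/s
# of TCP payload, before any other sources of overhead.
print(goodput_ceiling(100e6) / 1e6)
```

Even this best case overstates real goodput, since it omits acknowledgements, retransmissions, and encryption overhead.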

For instance, much internet traffic uses the transmission control protocol (TCP), which requires a three-way handshake to establish each connection.
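The handshake's cost can be seen in a minimal timing model: before any payload moves over a fresh connection, one round trip is spent on the handshake, so short transfers never approach link speed. This is an illustrative simplification (it ignores TCP slow start and server processing time), not a standard formula.

```python
def effective_rate(transfer_bytes, link_bps, rtt_s):
    """Effective data rate for one transfer over a fresh TCP connection.

    Simplified model: total time = one round trip (the three-way
    handshake) + serialization time of the payload at link speed.
    """
    transfer_time = rtt_s + (transfer_bytes * 8) / link_bps
    return transfer_bytes * 8 / transfer_time

# A 10 kB transfer over a 100 Mbit/s link with 50 ms RTT achieves
# only about 1.6 Mbit/s, because the handshake dominates.
print(effective_rate(10_000, 100e6, 0.05) / 1e6)
```

Larger transfers amortize the handshake: with the same link and RTT, a 100 MB transfer comes much closer to the nominal 100 Mbit/s.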

Asymptotic bandwidths are usually estimated by sending a number of very large messages through the network and measuring the end-to-end throughput.
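The estimation procedure can be sketched as a small measurement harness: time one very large message end to end and report bits per second. The sketch below runs over a loopback TCP connection purely for self-containedness, so the number it prints reflects local stack overhead rather than any real link's capacity; all names here are illustrative, not from the source.

```python
import socket
import threading
import time

def measure_throughput(payload_size=8 * 1024 * 1024):
    """Estimate end-to-end throughput (bit/s) by timing one large
    message over a loopback TCP connection."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    received = bytearray()

    def sink():
        # Receive until the whole message has arrived.
        conn, _ = server.accept()
        while len(received) < payload_size:
            chunk = conn.recv(65536)
            if not chunk:
                break
            received.extend(chunk)
        conn.close()

    t = threading.Thread(target=sink)
    t.start()

    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(("127.0.0.1", port))
    start = time.perf_counter()
    client.sendall(b"\x00" * payload_size)
    client.close()
    t.join()                         # wait for full delivery
    elapsed = time.perf_counter() - start
    server.close()

    return len(received) * 8 / elapsed  # bits per second

if __name__ == "__main__":
    bps = measure_throughput()
    print(f"throughput ~ {bps / 1e6:.1f} Mbit/s")
```

In practice such measurements are repeated with increasing message sizes; the throughput approaches its asymptotic value as per-message overheads become negligible.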

The most widely used data compression technique for media bandwidth reduction is the discrete cosine transform (DCT), which was first proposed by Nasir Ahmed in the early 1970s.[7]

The MOSFET (MOS transistor) was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959,[14][15][16] and went on to become the basic building block of modern telecommunications technology.