[1] He is the co-inventor of the Widrow–Hoff least mean squares (LMS) adaptive filter algorithm, developed with his then-doctoral student Ted Hoff.
[3] He is the namesake of "Uncle Bernie's Rule": the training sample size should be 10 times the number of weights in a network.
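As a minimal illustration of the rule, the sketch below counts the weights of a hypothetical small network and applies the 10× heuristic; the layer sizes are invented for the example.

```python
def min_training_samples(num_weights):
    """'Uncle Bernie's Rule': train on at least 10x as many
    examples as the network has weights."""
    return 10 * num_weights

# Hypothetical two-layer network: 784 inputs -> 32 hidden -> 10 outputs,
# counting weights and biases of both layers.
weights = 784 * 32 + 32 + 32 * 10 + 10  # 25450 parameters
needed = min_training_samples(weights)   # 254500 examples
```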
For his master's thesis (1953, advised by William Linvill), he worked on raising the signal-to-noise ratio of the sensing signal of magnetic core memory.
At the time, the hysteresis loops of magnetic core memory were not square enough, which made the sensing signal noisy.
Widrow and Hoff improved the earlier adaptive filter so that it performs a gradient-descent step on each data point, resulting in the delta rule and the ADALINE.
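The per-sample update can be sketched as follows. This is a minimal, illustrative implementation of the delta (LMS) rule for a linear unit in the spirit of ADALINE, not Widrow and Hoff's original hardware; the data, learning rate, and epoch count are assumptions chosen so the example converges.

```python
def lms_train(samples, targets, lr=0.1, epochs=100):
    """Fit weights w so that dot(w, x) approximates the target,
    taking one gradient-descent step per data point (delta rule)."""
    w = [0.0] * len(samples[0])
    for _ in range(epochs):
        for x, d in zip(samples, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))  # linear output
            err = d - y                               # error signal
            # Delta rule: w <- w + lr * err * x
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

# Learn y = 2*x1 - 1; the constant 1.0 in each sample acts as a bias input.
data = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0), (3.0, 1.0)]
targets = [-1.0, 1.0, 3.0, 5.0]
w = lms_train(data, targets)  # w approaches [2.0, -1.0]
```

Taking the step after every single sample, rather than after a full pass over the data, is the defining feature of the rule.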
During a meeting with Frank Rosenblatt, Widrow argued that the S-units in the perceptron machine should not be connected randomly to the A-units.
[14] At a 1985 conference in Snowbird, Utah, he noticed that neural network research was reviving, and he also learned of the backpropagation algorithm.