Typically, audiometric tests determine a subject's hearing levels with the help of an audiometer, but may also measure ability to discriminate between different sound intensities, recognize pitch, or distinguish speech from background noise.
Another model used a tripped hammer to strike a metal rod and produce the testing sound; in another, a tuning fork was struck.
Following the development of the induction coil in 1849 and audio transducers (the telephone) in 1876, a variety of audiometers were invented in the United States and overseas.
In 1885, Arthur Hartmann designed an "Auditory Chart" which included left and right ear tuning fork representation on the x-axis and percent of hearing on the y-axis.
In 1899, Carl E. Seashore, Professor of Psychology at the University of Iowa in the United States, introduced the audiometer as an instrument to measure the "keenness of hearing", whether in the laboratory, schoolroom, or office of the psychologist or aurist.
Schaefer and G. Gruschke, and B. Griessmann and H. Schwarzkopf, demonstrated before the Berlin Otological Society two instruments designed to test hearing acuity.
It was not until 1922 that otolaryngologist Dr. Edmund P. Fowler and physicists Dr. Harvey Fletcher and Robert Wegel of the Western Electric Co. first employed frequency at octave intervals plotted along the x-axis and intensity, as degree of hearing loss, plotted downward along the y-axis.
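This frequency-versus-intensity layout remains the convention for the clinical audiogram. As a rough illustration only, not drawn from the historical sources above, the following Python sketch assumes matplotlib is available and uses invented threshold values to plot a hypothetical pure-tone audiogram, with octave-interval frequencies on a logarithmic x-axis and hearing level in dB increasing downward on the y-axis.

```python
import matplotlib.pyplot as plt

# Octave-interval test frequencies (Hz) and hypothetical thresholds (dB HL).
# The threshold values are illustrative only, not measured data.
frequencies = [125, 250, 500, 1000, 2000, 4000, 8000]
right_ear_db = [10, 15, 20, 30, 45, 55, 60]

fig, ax = plt.subplots()
ax.plot(frequencies, right_ear_db, "o-", label="Right ear (air conduction)")

ax.set_xscale("log")                  # octave steps appear evenly spaced on a log axis
ax.set_xticks(frequencies)
ax.set_xticklabels([str(f) for f in frequencies])
ax.set_xlabel("Frequency (Hz)")
ax.set_ylabel("Hearing level (dB HL)")
ax.set_ylim(120, -10)                 # larger losses plotted downward, as on a clinical audiogram
ax.grid(True, which="both", linestyle=":")
ax.legend()
plt.show()
```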
With further technological advances, bone conduction testing capabilities became a standard component of all Western Electric audiometers by 1928.
In 1967, Sohmer and Feinmesser were the first to publish auditory brainstem responses (ABR) recorded with surface electrodes in humans, showing that cochlear potentials could be obtained non-invasively.
The motion of the stapes against the oval window sets up waves in the fluids of the cochlea, causing the basilar membrane to vibrate.