Sumikawa coined "earcon" as the auditory equivalent of the icon in a 1985 article, 'Guidelines for the integration of audio cues into computer user interfaces'.
Assistive technologies for computing devices—such as screen readers, including ChromeOS's ChromeVox, Android's TalkBack and Apple's VoiceOver—use earcons as a convenient and fast means of conveying contextual information about the interface to blind or visually impaired users as they navigate it.
Earcons in screen readers largely serve as auditory cues to inform the user that they have selected a particular type of interface element, such as a button, hyperlink or text input field.
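The role-to-earcon association described above can be illustrated with a minimal sketch. The role names, sound file names, and the `announce` function below are hypothetical and not taken from any real screen reader; the sketch only shows the general pattern of pairing a short sound with a spoken description when an element receives focus.

```python
# Hypothetical mapping from interface element types (roles) to earcon
# sound files; both the roles and the file names are illustrative.
EARCONS = {
    "button": "earcon_button.wav",
    "link": "earcon_link.wav",
    "text_input": "earcon_text_input.wav",
}

def announce(role: str, label: str):
    """Return the earcon (if any) to play and the text to speak
    when an element with the given role receives focus."""
    earcon = EARCONS.get(role)  # unmapped roles get no earcon
    speech = f"{label}, {role.replace('_', ' ')}"
    return earcon, speech

# Focusing a "Submit" button yields the button earcon plus speech.
print(announce("button", "Submit"))
# → ('earcon_button.wav', 'Submit, button')
```

In a real screen reader the earcon would be played through the audio subsystem while the description is sent to the speech synthesizer; here both are simply returned for inspection.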
To help users learn these associations, some screen readers will also speak the meaning of each earcon, albeit at the end of the element's full spoken description.
It is recommended that earcons be introduced early when learning to use a screen reader, so that through habitual use they become reflexively (and eventually subconsciously) associated with their meanings.