Natural language understanding

There is considerable commercial interest in the field because of its application to automated reasoning,[3] machine translation,[4] question answering,[5] news-gathering, text categorization, voice-activation, archiving, and large-scale content analysis.

In 1965, Joseph Weizenbaum at MIT wrote ELIZA, an interactive program that carried on a dialogue in English on any topic, the most popular being psychotherapy.

ELIZA worked by simple parsing and substitution of key words into canned phrases, and Weizenbaum sidestepped the problem of giving the program a database of real-world knowledge or a rich lexicon.
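
To make the mechanism concrete, the following is a minimal sketch of ELIZA-style keyword substitution in Python. The keyword patterns, canned responses, and pronoun reflections are invented for illustration and are not Weizenbaum's original DOCTOR script.

```python
import re
import random

# Illustrative keyword -> canned-response rules in the spirit of ELIZA;
# the patterns and phrasings here are invented for this sketch.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your {0}."]),
]

# Simple first-person/second-person swaps applied to captured fragments,
# so "I need my book" reflects back as "your book".
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    for pattern, responses in RULES:
        match = pattern.search(utterance)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(responses).format(*groups)
    # No keyword matched: fall back to a content-free prompt,
    # which is exactly how ELIZA masked its lack of understanding.
    return "Please tell me more."

if __name__ == "__main__":
    print(respond("I need my notes"))  # e.g. "Why do you need your notes?"
```

The point of the sketch is that no knowledge representation is involved at any step: the program never models what "notes" are, it only rearranges the user's own words.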

Schank's conceptual dependency model, partially influenced by the work of Sydney Lamb, was extensively used by his students at Yale University, such as Robert Wilensky, Wendy Lehnert, and Janet Kolodner.[12]

SHRDLU could understand simple English sentences in a restricted world of children's blocks to direct a robotic arm to move items.
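
As a rough illustration of why a restricted domain makes understanding tractable, here is a minimal, invented blocks-world command interpreter. It is not Winograd's actual program, which used a far richer procedural grammar, a planner, and dialogue memory; the command pattern and world state below are assumptions made for this sketch.

```python
import re

# A toy blocks world: block name -> what the block currently sits on.
world = {"red": "table", "green": "table", "blue": "green"}

# One invented command pattern; a closed vocabulary and fixed phrase
# structure are what keep the "understanding" problem small.
COMMAND = re.compile(
    r"(?:please\s+)?(?:put|move|place)\s+the\s+(\w+)\s+block\s+"
    r"(?:on|onto)\s+the\s+(\w+)(?:\s+block)?",
    re.IGNORECASE,
)

def execute(command: str) -> str:
    match = COMMAND.match(command.strip())
    if not match:
        return "I don't understand."
    block, destination = match.group(1).lower(), match.group(2).lower()
    if block not in world:
        return f"I don't know a {block} block."
    if any(base == block for base in world.values()):
        return f"Something is on the {block} block."  # must be cleared first
    world[block] = destination  # stands in for moving the robotic arm
    return "OK."

print(execute("Put the red block onto the blue block"))  # -> OK.
```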

A number of commercial efforts based on this research were undertaken; for example, in 1982 Gary Hendrix founded Symantec Corporation, originally as a company developing a natural language interface for database queries on personal computers.

"[21] The umbrella term "natural language understanding" can be applied to a diverse set of computer applications, ranging from small, relatively simple tasks such as short commands issued to robots, to highly complex endeavors such as the full comprehension of newspaper articles or poetry passages.

Many real-world applications fall between the two extremes. For instance, text classification for the automatic analysis of emails and their routing to a suitable department in a corporation does not require an in-depth understanding of the text,[22] but it must handle a much larger vocabulary and more diverse syntax than the management of simple queries to database tables with fixed schemata.
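
As a sketch of the kind of shallow classification this describes, the following uses a bag-of-words naive Bayes model from scikit-learn. The department labels and training emails are invented placeholders; a production router would train on a corporation's own labeled mail.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented toy training data: a handful of emails labeled with the
# department that should handle them.
emails = [
    "My invoice from last month is wrong, please correct the amount",
    "I was charged twice for my subscription",
    "The application crashes when I click the export button",
    "I cannot log in after the latest update",
    "I would like a quote for 500 units of your product",
    "Do you offer volume discounts for enterprise customers?",
]
departments = ["billing", "billing", "support", "support", "sales", "sales"]

# Bag-of-words features plus multinomial naive Bayes: no parsing and no
# semantics, just word statistics -- shallow, yet often enough for routing.
router = make_pipeline(CountVectorizer(), MultinomialNB())
router.fit(emails, departments)

print(router.predict(["The app keeps crashing on startup"]))  # -> ['support']
```

The model never "understands" the complaint; it only correlates vocabulary with departments, which is precisely why the task sits well below full comprehension on the spectrum described above.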

Over the years, various attempts at processing natural language or English-like sentences presented to computers have taken place with varying degrees of complexity.

For example, Wayne Ratliff originally developed the Vulcan program with an English-like syntax to mimic the English-speaking computer in Star Trek.

Systems that attempt to understand the contents of a document such as a news release beyond simple keyword matching, and to judge its suitability for a user, are broader and require significant complexity,[26] but they are still somewhat shallow.