ELIZA

Whereas the ELIZA program itself was originally written[7] in MAD-SLIP, the pattern-matching directives that contained most of its language capability were provided in separate "scripts", expressed in a Lisp-like notation.
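The general shape of such a script rule can be sketched as data. The following is an illustrative Python rendering, not Weizenbaum's actual script notation: each keyword carries a decomposition pattern (with "*" standing for any run of words) and a list of reassembly templates that splice matched fragments back into the reply.

```python
# Illustrative sketch only; the rule format and names are simplified,
# not Weizenbaum's original Lisp-like script notation.
RULES = {
    "remember": {
        # "*" matches any (possibly empty) run of words.
        "decomposition": "*i remember*",
        # "{1}" is filled with the second matched fragment, reflected
        # into second person (e.g. "my mother" -> "your mother").
        "reassembly": [
            "Do you often think of {1}?",
            "What else do you remember?",
        ],
    },
}
```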

Many academics believed that the program could positively influence the lives of many people, particularly those with psychological issues, and that it could aid doctors in treating such patients.

Joseph Weizenbaum's ELIZA, running the DOCTOR script, created a conversational interaction somewhat similar to what might take place in the office of "a [non-directive] psychotherapist in an initial psychiatric interview"[17] and was intended to "demonstrate that the communication between man and machine was superficial".

Weizenbaum chose to set the DOCTOR script in the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge",[3] allowing it to reflect back the patient's statements in order to carry the conversation forward.
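That reflecting back rests largely on simple pronoun substitution. A minimal sketch follows, with an illustrative word list rather than ELIZA's actual substitution table; it is reused by the response-cycle sketch further down.

```python
# Swap first- and second-person words so a captured fragment of the
# user's input can be echoed back; the mapping is illustrative.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my", "yours": "mine",
}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word)
                    for word in fragment.lower().split())

# reflect("my mother hates me") -> "your mother hates you"
```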

Weizenbaum first implemented ELIZA in his own SLIP list-processing language, where, depending on the user's initial entries, the illusion of human intelligence could appear, or be dispelled over several interchanges.

Some of ELIZA's responses were so convincing that Weizenbaum and several others have anecdotes of users becoming emotionally attached to the program, occasionally forgetting that they were conversing with a computer.

Weizenbaum was surprised by this, later writing: "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."

It was 11 years before the personal computer became familiar to the general public, and three decades before most people encountered attempts at natural language processing in Internet services like Ask.com or PC help systems such as Microsoft Office Clippit.

Weizenbaum originally wrote ELIZA in MAD-SLIP for CTSS on an IBM 7094, as a program to make natural-language conversation with a computer possible.

These steps represent the bulk of the procedure that ELIZA follows to create a response from a typical input, though there are several specialized situations to which ELIZA/DOCTOR can respond.
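That procedure is commonly described as a keyword-decomposition-reassembly cycle: scan the input for a known keyword, match that keyword's decomposition pattern against the input, then fill one of its reassembly templates with the reflected fragment. The sketch below, reusing the illustrative RULES and reflect() above, is a simplification under those assumptions; it omits the keyword precedence ranking, the "memory" queue, and the word pre-substitution that the original performed.

```python
import random
import re

def respond(user_input: str) -> str:
    """Sketch of ELIZA's core response cycle (simplified)."""
    text = user_input.lower().strip(" .!?")
    for keyword, rule in RULES.items():
        if keyword not in text.split():
            continue
        # Turn "*i remember*" into the regex "(.*)i remember(.*)".
        pattern = rule["decomposition"].replace("*", "(.*)")
        match = re.fullmatch(pattern, text)
        if match:
            fragments = [reflect(g.strip()) for g in match.groups()]
            template = random.choice(rule["reassembly"])
            return template.format(*fragments)
    return "Please go on."  # stock fallback when nothing matches

# respond("I remember my mother.")
#   -> "Do you often think of your mother?"
#      (or "What else do you remember?")
```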

This would allow the program to be applied in multiple situations, including the well-known DOCTOR script, which simulates a Rogerian psychotherapist.

The 1965 source code has been dated as part of a software archaeology project that brings together researchers from USC, the University of Sussex, Oxford, and Stanford to unravel the complicated history of ELIZA.

He conducted several conversations with an APL implementation of ELIZA and published them, in English and in his own Hebrew translation, under the title My Electronic Psychiatrist – Eight Authentic Talks with a Computer.

Other versions adapted ELIZA around a religious theme, such as ones featuring Jesus (both serious and comedic), and another Apple II variant called I Am Buddha.

ELIZA has been referenced in popular culture and continues to be a source of inspiration for programmers and developers focused on artificial intelligence.

Inhabitants of the underground future world of THX 1138, when stressed, would retreat to "confession booths" and initiate a one-sided ELIZA-formula conversation with a Jesus-faced computer who claimed to be "OMM".

Frederik Pohl's science-fiction novel Gateway has the narrator undergo therapy at a practice run by an AI that performs the role of a Freudian therapist, whom he calls "Sigfrid von Shrink".

Don Daglow claims he wrote an enhanced version of the program called Ecala on a DEC PDP-10 minicomputer at Pomona College in 1973.

The twelfth episode of the American sitcom Young Sheldon, aired in January 2018, included the protagonist "conversing" with ELIZA, hoping to resolve a domestic issue.

In A Murder at the End of the World, the anthropomorphic LLM-powered character Ray cites ELIZA as an example of how some may seek refuge in a non-human therapist.

This exclusivity was especially pronounced during the bot's creation and testing stages, which marginalized the experiences of the intended users and of those who did not fit the characteristics mentioned.

He criticizes this decision, arguing that when technologies such as chatbots are created in this way, they reinforce the idea that emotional and nurturing work is inherently feminine.

ELIZA, while a pioneering chatbot for its time, reveals the need to reevaluate the Turing Test's relevance in assessing AI capabilities.

In a study titled "Does GPT-4 Pass the Turing Test?", University of California, San Diego researchers Cameron R. Jones and Benjamin K. Bergen compared how various AI models, including ELIZA, GPT-3.5, and GPT-4, performed alongside human participants at imitating human conversation, and highlighted several factors that contributed to ELIZA's surprisingly strong performance.

Further, the researchers observed that ELIZA lacked traits characteristic of modern AI, such as helpfulness or excessive verbosity, which led participants to view it as an uncooperative human.

In the same research, Jones and Bergen also observed that ELIZA ignores grammatical structure and the context of the sentence.

A conversation between a human and ELIZA's DOCTOR script