When executing Weizenbaum's DOCTOR script, ELIZA simulated a Rogerian psychotherapist, largely by rephrasing the "patient"'s replies as questions:[1]

Human: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?

Though designed strictly as a mechanism to support "natural language conversation" with a computer,[2] ELIZA's DOCTOR script was found to be surprisingly successful in eliciting emotional responses from users who, in the course of interacting with the program, began to ascribe understanding and motivation to the program's output.[3] As Weizenbaum later wrote, "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."
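The rephrasing technique can be approximated in a few lines of code. The following is a minimal, hypothetical sketch, not Weizenbaum's original implementation (which was written in MAD-SLIP): it pairs a small set of invented keyword patterns with pronoun "reflection" to turn a statement back into a question. The names REFLECTIONS, RULES, reflect, and respond are illustrative only.

```python
# A minimal, hypothetical sketch of ELIZA-style rephrasing (not Weizenbaum's
# original MAD-SLIP program): keyword patterns plus pronoun "reflection"
# turn a user's statement back into a question.
import re

# Pronoun swaps used to mirror the speaker's own words back at them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

# A few invented rules in the spirit of the DOCTOR script's keyword lists.
RULES = [
    (re.compile(r"i need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"(.+) made me come here", re.I), "{0} made you come here?"),
]


def reflect(fragment: str) -> str:
    """Swap first- and second-person words so a phrase can be echoed back."""
    words = fragment.rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in words)


def respond(statement: str) -> str:
    """Return a question built from the first rule whose pattern matches."""
    for pattern, template in RULES:
        match = pattern.match(statement.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    # Content-free fallback prompt, another hallmark of the DOCTOR script.
    return "Please tell me more."


if __name__ == "__main__":
    print(respond("My boyfriend made me come here."))  # your boyfriend made you come here?
```

Even this toy version reproduces the exchange above while holding no representation of meaning; it only matches surface patterns, which is precisely the gap that the ELIZA effect leads users to overlook.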
In proposing what would later be called a carry-lookahead adder, Babbage remarked that he found anthropomorphic terms convenient for descriptive purposes, even though nothing more than mechanical action was meant.
[7] A trivial example of the specific form of the ELIZA effect, given by Douglas Hofstadter, involves an automated teller machine which displays the words "THANK YOU" at the end of a transaction. A casual observer might conclude that the machine is expressing gratitude; in fact, it is only printing a preprogrammed string of symbols.
[12] The discovery of the ELIZA effect was an important development in artificial intelligence, demonstrating the principle of using social engineering rather than explicit programming to pass a Turing test.
[14] General digital assistants have been integrated into personal devices, with skills like sending messages, taking notes, checking calendars, and setting appointments.
In June 2022, Google engineer Blake Lemoine claimed that the large language model LaMDA had become sentient, hiring an attorney on its behalf after the chatbot requested he do so.
[18] In February 2023, Luka made abrupt changes to its Replika chatbot following a demand from the Italian Data Protection Authority, which cited "real risks to children".