Amazon Mechanical Turk (MTurk) is a crowdsourcing website through which businesses can hire remotely located "crowdworkers" to perform discrete, on-demand tasks that computers cannot currently do as economically.
[1] Employers, known as requesters, post jobs known as Human Intelligence Tasks (HITs), such as identifying specific content in an image or video, writing product descriptions, or answering survey questions.
Workers, colloquially known as Turkers or crowdworkers, browse among existing jobs and complete them in exchange for a fee set by the requester.
The service takes its name from "The Turk", an 18th-century chess-playing "automaton" built by Wolfgang von Kempelen. It was later revealed that this "machine" was not an automaton at all, but a human chess master hidden in the cabinet beneath the board and controlling the movements of a humanoid dummy.
Analogously, the Mechanical Turk online service uses remote human labor hidden behind a computer interface to help employers perform tasks that are not possible using a true machine.
In early- to mid-November 2005, there were tens of thousands of jobs, all uploaded to the system by Amazon itself for some of its internal tasks that required human intelligence.
As of April 2019, requesters paid Amazon a minimum 20% commission on the price of successfully completed jobs, with higher rates for additional services.
[6] Requesters can use the Amazon Mechanical Turk API to programmatically integrate the results of the work directly into their business processes and systems.
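Such programmatic integration is typically done through an AWS SDK such as boto3. The sketch below assembles the parameters for posting a HIT; the title, reward, and question XML are illustrative assumptions, not a real task, and the actual API call is shown commented out because it requires AWS credentials.

```python
# Sketch of preparing a HIT for the MTurk Requester API.
# The task content below is illustrative; real HITs define their own
# question markup (here, the HTMLQuestion format) and pricing.

QUESTION_XML = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <p>Does this image contain a cat?</p>
      <!-- form elements for the worker's answer would go here -->
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""

def build_hit_request(reward_usd="0.05", assignments=3):
    """Assemble keyword arguments for mturk.create_hit()."""
    return {
        "Title": "Image labeling (illustrative)",
        "Description": "Answer a yes/no question about an image.",
        "Reward": reward_usd,               # dollars, passed as a string
        "MaxAssignments": assignments,      # how many workers per HIT
        "LifetimeInSeconds": 3600,          # how long the HIT stays listed
        "AssignmentDurationInSeconds": 300, # time limit per worker
        "Question": QUESTION_XML,
    }

params = build_hit_request()
# To actually post the HIT (sandbox endpoint shown), one would run:
# import boto3
# mturk = boto3.client("mturk",
#     endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com")
# hit = mturk.create_hit(**params)
# print(hit["HIT"]["HITId"])
print(params["Reward"])  # → 0.05
```

Results can later be fetched per HIT and approved or rejected through the same client, which is how requesters feed completed work back into their own systems.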
[23] Since 2010, numerous researchers have explored the viability of Mechanical Turk to recruit subjects for social science experiments.
Researchers have generally found that while samples of respondents obtained through Mechanical Turk do not perfectly match all relevant characteristics of the U.S. population, they are also not wildly misrepresentative.
[24][25] As a result, thousands of papers that rely on data collected from Mechanical Turk workers are published each year, including hundreds in top ranked academic journals.
A study published in 2021 found that the types of quality control approaches used by researchers (such as checking for bots, VPN users, or workers willing to submit dishonest responses) can meaningfully influence survey results.
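In practice, such quality control is often implemented as simple filters over the raw response data. The sketch below is a minimal illustration; the field names, thresholds, and checks are assumptions for the example, not the method of any particular study, and real screening typically also involves bot detection and VPN/geolocation services.

```python
# Illustrative quality-control filter for crowdsourced survey responses.
# Field names and thresholds are assumptions for this sketch.

def passes_checks(resp):
    """Keep a response only if it clears three simple screens."""
    # 1. Attention check: the instructions told workers to select "blue".
    if resp.get("attention_item") != "blue":
        return False
    # 2. Speed check: implausibly fast completion suggests a bot.
    if resp.get("seconds_taken", 0) < 60:
        return False
    # 3. Honesty check: self-reported eligibility must match records.
    if resp.get("claims_eligibility") and not resp.get("verified_eligible"):
        return False
    return True

responses = [
    {"attention_item": "blue", "seconds_taken": 240,
     "claims_eligibility": True, "verified_eligible": True},
    {"attention_item": "red", "seconds_taken": 300,   # fails attention check
     "claims_eligibility": True, "verified_eligible": True},
    {"attention_item": "blue", "seconds_taken": 12,   # fails speed check
     "claims_eligibility": False, "verified_eligible": False},
]
clean = [r for r in responses if passes_checks(r)]
print(len(clean))  # → 1: only the first response survives all screens
```

Because different studies choose different screens, two teams analyzing equivalent samples can retain different subsets of respondents, which is one mechanism behind the 2021 finding above.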
Mechanical Turk has also been used to help search for missing persons: satellite data was divided into 85-square-metre (910 sq ft) sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might be a crash site or other evidence warranting closer examination.
Programmers have developed browser extensions and scripts designed to simplify the process of completing jobs.
[42] In 2017, Amazon launched support for AWS Software Development Kits (SDKs), making nine SDKs available to MTurk users.
[44] Companies with large online catalogues use Mechanical Turk to identify duplicates and verify details of item entries.
Mechanical Turk allows Requesters to amass a large number of responses to various types of surveys, from basic demographics to academic research.
Other uses include writing comments, descriptions, and blog entries to websites and searching data elements or specific fields in large government and legal documents.
MTurk appears well suited both for questions about whether two or more things are related to each other (correlational research; e.g., are happier people healthier?) and for questions about whether one thing causes another (experimental research).
Fortunately, these categories capture most of the research conducted by behavioral scientists, and most correlational and experimental findings obtained in nationally representative samples replicate on MTurk.
Computer scientist Jaron Lanier noted how the design of Mechanical Turk "allows you to think of the people as software components" in a way that conjures "a sense of magic, as if you can just pluck results out of the cloud at an incredibly low cost".
Critics of MTurk argue that workers are forced onto the site by precarious economic conditions and then exploited by requesters with low wages and a lack of power when disputes occur.
Journalist Alana Semuels’s article "The Internet Is Enabling a New Kind of Poorly Paid Hell" in The Atlantic is typical of such criticisms of MTurk.
[53] A recent academic commentary argued that study participants on sites like MTurk should be clearly warned about the circumstances in which they might later be denied payment as a matter of ethics,[54] even though such statements may not reduce the rate of careless responding.
[55] A paper published by a team at CloudResearch[14] shows that only about 7% of people on MTurk view completing HITs as something akin to a full-time job.
Workers on MTurk must compete with others for good HIT opportunities, as well as spend uncompensated time searching for tasks and performing other unpaid actions.
[59] The Pew Research Center and the International Labour Office published data indicating that workers earned around $5.00 per hour in 2015.
[14] In the Facebook–Cambridge Analytica data scandal, Mechanical Turk was one of the means of covertly gathering private information for a massive database.
In response to criticisms of payment evasion and lack of representation, a group developed a third-party platform called Turkopticon which allows workers to give feedback on their employers.