It is part of a theory of mental content proposed by Dennett, which provides the underpinnings of his later works on free will, consciousness, folk psychology, and evolution.
Folk psychology provides a systematic, "reason-giving explanation" for a particular action, and an account of the historical origins of that action, based on deeply embedded assumptions about the agent.[6] This approach is also consistent with the earlier work of Fritz Heider and Marianne Simmel, whose joint study revealed that, when subjects were presented with an animated display of two-dimensional shapes, they were inclined to ascribe intentions to the shapes.
The more concrete the level, the more accurate our predictions are in principle; the more abstract, the greater the computational power we gain by zooming out and skipping over irrelevant details.
Dennett argues that it is best to understand human behavior at the level of the intentional stance, without making any specific commitments to any deeper reality of the artifacts of folk psychology.[20] As a way of thinking about things, Dennett's intentional stance is entirely consistent with everyday commonsense understanding; and, thus, it meets Eleanor Rosch's (1978, p. 28) criterion of the "maximum information with the least cognitive effort".
The general notion of a three-level system was widespread in the late 1970s and early 1980s; for example, when discussing the mental representation of information from a cognitive psychology perspective, Glass and his colleagues (1979, p. 24) distinguished three important aspects of representation. Other significant cognitive scientists who also advocated a three-level system were Allen Newell, Zenon Pylyshyn, and David Marr.
The claim is that we do not merely imagine the intentional states of other people in order to predict their behavior; the fact that they have thoughts and feelings just as we do is central to notions such as trust, friendship, and love.
Philip Robbins and Anthony I. Jack suggest that "Dennett's philosophical distinction between the physical and intentional stances has a lot going for it" from the perspective of psychology and neuroscience.[39] Robbins and Jack point to a 2003 study[40] in which participants viewed animated geometric shapes in different "vignettes," some of which could be interpreted as constituting social interaction, while others suggested mechanical behavior.
The authors suggest "that these findings reveal putative 'core systems' for social and mechanical understanding that are divisible into constituent parts or elements with distinct processing and storage capabilities."
The authors suggest that psychopathy may represent a deficit in the phenomenal but not the intentional stance, while people with autism appear to have intact moral sensibilities, just not mind-reading abilities.[38][41] In a follow-up paper, Robbins and Jack describe four experiments about how the intentional and phenomenal stances relate to feelings of moral concern.[42] Bryce Huebner (2010) performed two experimental philosophy studies to test students' ascriptions of various mental states to humans compared with cyborgs and robots.