Rationality

[4][5] While actions and beliefs are the most paradigmatic forms of rationality, the term is used both in ordinary language and in many academic disciplines to describe a wide variety of things, such as persons, desires, intentions, decisions, policies, and institutions.

In this regard, different fields often focus their investigation on one specific conception, type, or aspect of rationality without trying to cover it in its most general sense.

Many additional activities of the higher cognitive faculties are often included as well, such as acquiring concepts, judging, deliberating, planning, and deciding, along with the formation of desires and intentions.

Examples of behaviors considered irrational in ordinary discourse are giving in to temptations, going out late even though one has to get up early in the morning, smoking despite being aware of the health risks, or believing in astrology.

Examples of irrationality in this sense include cognitive biases and violating the laws of probability theory when assessing the likelihood of future events.
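
As a simple illustration of such a violation (a standard textbook example rather than one drawn from the cited sources), the conjunction rule of probability theory requires that a conjunction of two events can never be more probable than either event on its own:

    P(A \land B) \le P(A) \quad\text{and}\quad P(A \land B) \le P(B)

Judging the combined event A-and-B to be more likely than A alone, as in the well-known conjunction fallacy, therefore breaks this law, however intuitive the judgment may feel.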

This view has been criticized based on the claim that, in order to respond to reasons, people have to be aware of them, i.e., they have to have some form of epistemic access to them.

[7][15] A common approach is to hold that this access is given through the possession of evidence in the form of cognitive mental states, like perceptions and knowledge.

[2][19] These versions avoid the previous objection since rationality no longer requires the agent to respond to external factors of which they could not have been aware.

For example, if terrorists threaten to blow up a city unless the agent forms an irrational belief, this is a very weighty reason to do everything in one's power to violate the norms of rationality.

In complex cases, inconsistencies may be difficult to detect, for example, when a person believes in the axioms of Euclidean geometry and is nonetheless convinced that it is possible to square the circle.

[38] On this view, critique of irrational behavior, like the doctor prescribing drug B, involves a negative evaluation of the agent in terms of responsibility but remains silent on normative issues.

[6][12][1] For example, the ideal rational norms of decision theory demand that the agent always choose the option with the highest expected value.
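
In a standard formulation (sketched here for illustration; the symbols are not taken from the cited sources), the expected value of an option is the probability-weighted sum of the values of its possible outcomes, and the norm requires picking an option that maximizes this sum:

    \mathrm{EV}(a) = \sum_{i} P(o_i \mid a)\, V(o_i),
    \qquad
    a^{*} \in \operatorname*{arg\,max}_{a} \mathrm{EV}(a)

where P(o_i | a) is the probability of outcome o_i if option a is chosen and V(o_i) is the value assigned to that outcome.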

Another is that enormous mental resources would be required to constantly keep track of all the justificatory relations connecting non-fundamental beliefs to fundamental ones.

An important research question in this field is how cognitive agents use heuristics rather than brute-force calculations to solve problems and make decisions.

[6][8][9] However, this strong affirmation has been subjected to many criticisms, for example, that humans are not rational all the time and that non-human animals also show diverse forms of intelligence.

Some forms of research restrict themselves to one specific domain while others investigate the topic in an interdisciplinary manner by drawing insights from different fields.

[7][77][21] The German scholar Max Weber proposed an interpretation of social action that distinguished between four different idealized types of rationality.

Here the action is undertaken for what one might call reasons intrinsic to the actor: some ethical, aesthetic, religious, or other motives, independent of whether it will lead to success.

The third type was affectual, determined by an actor's specific affect, feeling, or emotion; Weber himself said that this kind of rationality was on the borderline of what he considered "meaningfully oriented."

Audi is committed to a form of foundationalism: the idea that justified beliefs, or in his case, rational states in general, can be divided into two groups: the foundation and the superstructure.

[97] The psychologist Jean Piaget gave an influential account of how the stages in human development from childhood to adulthood can be understood in terms of the increase of rational and logical abilities.

Studies in cognitive science and neuroscience show that no human has ever satisfied this criterion, except perhaps a person with no affective feelings, for example, an individual with a massively damaged amygdala or severe psychopathy.

[12][6] While decision theory gives a very precise formal treatment of this issue, it leaves open the empirical problem of how to assign utilities and probabilities.

Game theory can be used to analyze various situations, like playing chess, firms competing for business, or animals fighting over prey.
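
As a minimal sketch of how such an analysis can proceed (an illustrative example with assumed payoff numbers, not an analysis taken from the cited sources), the case of animals fighting over prey is often modeled as a Hawk-Dove game; the following Python fragment enumerates the pure-strategy Nash equilibria of a small two-player payoff matrix:

    from itertools import product

    # Hawk-Dove payoffs (row player, column player), with an assumed prey value V = 4
    # and fight cost C = 6; the numbers are illustrative, not from the article.
    payoffs = {
        ("Hawk", "Hawk"): (-1, -1),  # each gets (V - C) / 2
        ("Hawk", "Dove"): (4, 0),
        ("Dove", "Hawk"): (0, 4),
        ("Dove", "Dove"): (2, 2),    # each gets V / 2
    }
    strategies = ("Hawk", "Dove")

    def is_nash(row, col):
        # A profile is a pure Nash equilibrium if neither player gains by deviating unilaterally.
        row_pay, col_pay = payoffs[(row, col)]
        row_best = all(payoffs[(alt, col)][0] <= row_pay for alt in strategies)
        col_best = all(payoffs[(row, alt)][1] <= col_pay for alt in strategies)
        return row_best and col_best

    equilibria = [profile for profile in product(strategies, strategies) if is_nash(*profile)]
    print(equilibria)  # [('Hawk', 'Dove'), ('Dove', 'Hawk')]

With these assumed payoffs the script prints the two asymmetric equilibria, (Hawk, Dove) and (Dove, Hawk), reflecting the idea that mutual aggression and mutual passivity are both unstable.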

Others think that any kind of rationality along the lines of rational choice theory is a useless concept for understanding human behavior; the term homo economicus (economic man: the imaginary man assumed in economic models, who is logically consistent but amoral) was coined largely in honor of this view.

In order to make a safe agent that plays defensively, a nonlinear function of performance is often desired, so that the reward for winning is lower than the punishment for losing.
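
A minimal sketch of this idea (the specific function and probabilities are illustrative assumptions, not taken from the source): weighting losses more heavily than wins of the same size yields a nonlinear, defensively biased utility, so an agent maximizing its expectation prefers a safe line of play over a risky one.

    def defensive_utility(outcome, loss_weight=2.0):
        # Outcome: +1 for a win, -1 for a loss, 0 for a draw.
        # Losses are scaled up, so the punishment for losing exceeds the reward for winning.
        return outcome if outcome >= 0 else loss_weight * outcome

    def expected_utility(win_p, loss_p, draw_p):
        return (win_p * defensive_utility(1.0)
                + loss_p * defensive_utility(-1.0)
                + draw_p * defensive_utility(0.0))

    # A risky line of play (many wins, many losses) vs. a safe, drawish one:
    print(expected_utility(win_p=0.5, loss_p=0.4, draw_p=0.1))    # -0.3: rejected by the defensive agent
    print(expected_utility(win_p=0.2, loss_p=0.05, draw_p=0.75))  #  0.1: preferred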

Abulof finds that some 40% of all scholarly references to "foreign policy" allude to "rationality", a ratio that rises to more than half of pertinent academic publications in the 2000s.

"[114] The concept of rationality has been subject to criticism by various philosophers who question its universality and capacity to provide a comprehensive understanding of reality and human existence.

Friedrich Nietzsche, in his work "Beyond Good and Evil" (1886), criticized the overemphasis on rationality and argued that it neglects the irrational and instinctual aspects of human nature.

"[115] Martin Heidegger, in "Being and Time" (1927), offered a critique of the instrumental and calculative view of reason, emphasizing the primacy of our everyday practical engagement with the world.

The German scholar Max Weber notably articulated a theory of rationality that divided the human capacity to think through things into four ways.[78]