Causal decision theory

Causal decision theory (CDT) is a school of thought within decision theory which states that, when a rational agent is confronted with a set of possible actions, one should select the action which causes the best outcome in expectation.

CDT contrasts with evidential decision theory (EDT), which recommends the action which would be indicative of the best outcome if one received the "news" that it had been taken.

Suppose, for instance, that you must decide whether to eat an apple that may or may not be contaminated. In this case you don't know the causal effects of eating the apple; instead, each possible outcome is weighted by the probability that the action would bring it about.

In a 1981 article, Allan Gibbard and William Harper explained causal decision theory as maximization of the expected utility U(A) of an action A "calculated from probabilities of counterfactuals":

U(A) = Σ_j P(A > O_j) D(O_j)

where D(O_j) is the desirability of outcome O_j and P(A > O_j) is the counterfactual probability that, if A were done, then O_j would hold.
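Under these definitions, U(A) is just a probability-weighted sum, and can be computed directly. A minimal sketch follows, reusing the apple example above; the outcomes, counterfactual probabilities, and desirabilities are all assumed purely for illustration.

```python
# Gibbard-Harper expected utility: U(A) = sum_j P(A > O_j) * D(O_j).
# All outcomes and numbers below are assumed for illustration.

def causal_expected_utility(counterfactual_probs, desirabilities):
    """Sum each outcome's desirability weighted by P(A > O_j)."""
    return sum(counterfactual_probs[o] * d for o, d in desirabilities.items())

desirabilities = {"enjoy the apple": 1.0, "get sick": -10.0, "nothing happens": 0.0}

# Counterfactual probabilities P(A > O_j) for each action, e.g. an
# assumed 10% chance that the apple is contaminated:
eat    = {"enjoy the apple": 0.9, "get sick": 0.1, "nothing happens": 0.0}
refuse = {"enjoy the apple": 0.0, "get sick": 0.0, "nothing happens": 1.0}

print(causal_expected_utility(eat, desirabilities))     # -0.1
print(causal_expected_utility(refuse, desirabilities))  #  0.0 -> CDT refuses
```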

Gibbard and Harper showed that if we accept two axioms (one related to the controversial principle of the conditional excluded middle[5]), then the statistical independence of the action A and the counterfactual A > O_j suffices to guarantee that P(A > O_j) = P(O_j | A), in which case causal and evidential decision theory agree. There are cases, however, in which actions and conditionals are not independent, and then the two theories diverge.

Gibbard and Harper give an example in which King David wants Bathsheba but fears that summoning her would provoke a revolt.

Further, David has studied works on psychology and political science which teach him the following: Kings have two personality types, charismatic and uncharismatic.

A king's degree of charisma depends on his genetic make-up and early childhood experiences, and cannot be changed in adulthood.

Charismatic kings tend to act justly and uncharismatic kings unjustly; successful revolts against charismatic kings are rare, whereas successful revolts against uncharismatic kings are frequent. Unjust acts themselves, however, do not cause successful revolts. David does not know whether or not he is charismatic; he does know that it is unjust to send for another man's wife. (p. 164)

In this case, evidential decision theory recommends that David abstain from Bathsheba, since summoning her would be evidence that he is uncharismatic and thus at high risk of revolt, while causal decision theory, noting that whether David is charismatic or uncharismatic cannot be changed, recommends sending for her.[6]
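A small numerical sketch can make the divergence concrete. Every probability and utility below is assumed purely for illustration, including the evidential credences (e.g. that sending for Bathsheba would shift David's credence that he is charismatic from 0.5 to 0.1).

```python
# King David example with assumed numbers.
P_CHARISMATIC = 0.5                                   # David's prior about his type
P_REVOLT = {"charismatic": 0.01, "uncharismatic": 0.9}
U_BATHSHEBA, U_REVOLT = 10.0, -1000.0

def revolt_prob(p_char):
    return p_char * P_REVOLT["charismatic"] + (1 - p_char) * P_REVOLT["uncharismatic"]

# EDT conditions on the action as evidence about David's type:
# an unjust act suggests he is uncharismatic (assumed credences).
p_char_given = {"send": 0.1, "abstain": 0.9}
edt = {a: (U_BATHSHEBA if a == "send" else 0.0) + revolt_prob(p) * U_REVOLT
       for a, p in p_char_given.items()}

# CDT holds the prior fixed: the act cannot change David's type.
cdt = {a: (U_BATHSHEBA if a == "send" else 0.0) + revolt_prob(P_CHARISMATIC) * U_REVOLT
       for a in ("send", "abstain")}

print(max(edt, key=edt.get))  # 'abstain' (EDT)
print(max(cdt, key=cdt.get))  # 'send'    (CDT)
```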

Different decision theories are often compared by examining their recommendations in various thought experiments.

In Newcomb's paradox, there is a predictor, a player, and two boxes designated A and B. The player may take box B alone or both boxes; box A always contains a visible $1,000, while box B contains $1,000,000 if the predictor forecast that the player would take only box B, and nothing otherwise.

Causal decision theory recommends taking both boxes in this scenario, because at the moment when the player must make a decision, the predictor has already made a prediction (therefore, the player's action cannot affect the contents of the boxes); holding the contents fixed, taking both boxes yields $1,000 more either way.
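This reasoning can be checked directly with the standard payoffs; the 0.99 predictor accuracy below is an assumed figure.

```python
# Newcomb's problem: box A holds $1,000; box B holds $1,000,000 iff the
# predictor foresaw one-boxing. Predictor accuracy assumed to be 0.99.
A, B, ACCURACY = 1_000, 1_000_000, 0.99

# CDT: the prediction is already fixed, so compare actions state by state.
for b_full in (True, False):
    one_box = B if b_full else 0
    two_box = one_box + A
    print(b_full, two_box - one_box)   # two-boxing is $1,000 better either way

# EDT: condition on the action as evidence about the prediction.
edt_one_box = ACCURACY * B             # predictor likely foresaw one-boxing
edt_two_box = (1 - ACCURACY) * B + A   # predictor likely foresaw two-boxing
print(edt_one_box, edt_two_box)        # 990000.0 vs 11000.0 -> EDT one-boxes
```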

Evaluating the probabilities of counterfactuals such as P(A > O_j) is itself a difficulty for the theory.[5] One proposal is the "imaging" technique suggested by Lewis:[8] to evaluate P(A > O_j), move the probability mass of each possible world w to w_A, the closest possible world in which A holds (assuming A is possible), and then sum the probabilities of the worlds in which O_j obtains.[5]
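A toy sketch of this imaging step, with an assumed set of worlds and an assumed closeness metric: each world's probability mass is moved to the nearest world where A holds, and P(A > O_j) can then be read off the imaged distribution.

```python
# Lewis-style "imaging" on an action A (worlds and distances assumed).
worlds  = {"w1": 0.5, "w2": 0.3, "w3": 0.2}        # prior P(w)
holds_A = {"w1": True, "w2": False, "w3": True}    # worlds where A holds
dist    = {("w2", "w1"): 1, ("w2", "w3"): 2}       # assumed closeness metric

def image_on_A(worlds, holds_A, dist):
    imaged = {w: 0.0 for w in worlds}
    for w, p in worlds.items():
        if holds_A[w]:
            imaged[w] += p                         # A-worlds keep their mass
        else:                                      # shift mass to the closest A-world
            closest = min((v for v in worlds if holds_A[v]),
                          key=lambda v: dist[(w, v)])
            imaged[closest] += p
    return imaged

print(image_on_A(worlds, holds_A, dist))  # {'w1': 0.8, 'w2': 0.0, 'w3': 0.2}
```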

There are innumerable "counterexamples" where, it is argued, a straightforward application of CDT fails to produce a defensibly "sane" decision.

Philosopher Andy Egan argues this is due to a fundamental disconnect between the intuitive rational rule, "do what you expect will bring about the best results", and CDT's algorithm of "do whatever has the best expected outcome, holding fixed our initial views about the likely causal structure of the world."

In this view, it is CDT's requirement to "hold fixed the agent’s unconditional credences in dependency hypotheses" that leads to irrational decisions.[citation needed]

Because your choice of one or two boxes cannot causally affect the Predictor's guess, causal decision theory recommends the two-boxing strategy.

Philosophers disagree whether one-boxing or two-boxing is the "rational" strategy.[10]

Similar concerns may arise even in seemingly straightforward problems like the prisoner's dilemma,[11] especially when playing opposite your "twin", whose choice to cooperate or defect correlates strongly with, but is not caused by, your own choice.
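With assumed payoffs and an assumed 90% correlation between the twins' choices, the two theories again come apart.

```python
# Twin prisoner's dilemma (payoffs and correlation assumed for illustration).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}
MATCH = 0.9   # probability the twin's choice matches yours

# EDT: your act is strong evidence about the twin's act.
edt = {a: MATCH * PAYOFF[(a, a)]
          + (1 - MATCH) * PAYOFF[(a, "D" if a == "C" else "C")]
       for a in "CD"}
print(edt)  # {'C': 2.7, 'D': 1.4} -> EDT cooperates

# CDT: hold the credence about the twin fixed (say 0.5 each way);
# defection then dominates whatever the twin does.
p_twin_c = 0.5
cdt = {a: p_twin_c * PAYOFF[(a, "C")] + (1 - p_twin_c) * PAYOFF[(a, "D")]
       for a in "CD"}
print(cdt)  # {'C': 1.5, 'D': 3.0} -> CDT defects
```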

In the Death in Damascus problem, a man learns that Death will come for him tomorrow in whichever city he is then in, and he must choose between remaining in Damascus and fleeing to Aleppo. As in Newcomb's problem, we postulate that Death is a reliable predictor: his appointment book was filled in with foreknowledge of where the man will be.
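Concretely, CDT's evaluation never settles here: holding Death's predicted location fixed, whichever city the agent plans to choose makes the other city look better. The utilities in the sketch below are assumed (dying costs 100, fleeing costs 1).

```python
# CDT's instability in Death in Damascus (assumed utilities).
U = {("stay", "damascus"): -100, ("stay", "aleppo"): 0,
     ("flee", "damascus"): -1,   ("flee", "aleppo"): -101}

plan = "stay"
for _ in range(6):
    # Death reliably predicts the agent's city...
    death_city = "damascus" if plan == "stay" else "aleppo"
    # ...but CDT holds that prediction fixed and best-responds to it:
    plan = max(("stay", "flee"), key=lambda a: U[(a, death_city)])
    print(plan)   # alternates: flee, stay, flee, stay, ...
```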

A CDT agent would be unable to process the correlation, and may as a consequence make irrational decisions.[9][13][14] Recently, a few variants of Death in Damascus have been proposed in which following CDT’s recommendations voluntarily loses money or, relatedly, forgoes a guaranteed payoff.

In one such variant, the "adversarial offer", a buyer may pay $1 to acquire one of two opaque boxes, or may decline to buy. Yesterday, the seller put $3 in each box that she predicted the buyer would not acquire.

Adopting the buyer's perspective, CDT reasons that at least one box contains $3, so the average box is worth at least $1.50 and buying appears to have positive expected value. But because the seller accurately predicts which box the buyer will choose, the chosen box is likely to be empty, and the buyer loses money on average.
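Both sides of this reasoning can be computed; the seller's accuracy of 0.75 below is an assumed figure.

```python
# Adversarial offer: each box costs $1; the seller, a fairly reliable
# predictor, put $3 in each box she predicted the buyer would NOT take.
PRICE, PRIZE, ACCURACY = 1.0, 3.0, 0.75   # accuracy assumed

# CDT's reasoning: the boxes are already filled and at least one holds $3,
# so a randomly chosen box is worth at least $1.50 in causal expectation.
cdt_value = 0.5 * PRIZE
print(cdt_value)          # 1.5 > 1.0, so CDT buys

# Actual prospects: with probability ACCURACY the seller predicted the
# buyer's chosen box and left it empty.
actual_value = (1 - ACCURACY) * PRIZE - PRICE
print(actual_value)       # -0.25: the CDT buyer loses money on average
```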

In Egan's "psychopath button" case, Paul is deciding whether to press a button that will kill all psychopaths. He thinks it would be much better to live in a world without psychopaths, but he is confident that only a psychopath would press such a button, and Paul very strongly prefers living in a world with psychopaths to dying.[9] Egan argues that CDT, holding fixed Paul's low credence that he is himself a psychopath, recommends pressing, even though pressing would be strong evidence that Paul is a psychopath and hence about to die.
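With assumed credences and utilities, the point can be put numerically: CDT evaluates pressing with Paul's prior about his own psychopathy held fixed, while conditioning on the act reverses the verdict.

```python
# Psychopath button (all credences and utilities assumed for illustration).
P_PSYCHO_PRIOR    = 0.01   # Paul's unconditional credence he is a psychopath
P_PSYCHO_IF_PRESS = 0.9    # his credence conditional on pressing
U = {"no_psychos": 10.0, "die": -100.0}

def eu_press(p_psycho):
    # If Paul is not a psychopath, he gets a psychopath-free world;
    # if he is one, pressing kills him.
    return (1 - p_psycho) * U["no_psychos"] + p_psycho * U["die"]

print(eu_press(P_PSYCHO_PRIOR))     #  8.9 -> CDT presses (vs. 0 for not pressing)
print(eu_press(P_PSYCHO_IF_PRESS))  # -89.0 -> conditioning on the act says don't
```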

Philosopher Jim Joyce, perhaps the most prominent modern defender of CDT,[19] argues that CDT is naturally capable of taking into account any "information about what one is inclined or likely to do as evidence".