Deductive-nomological model

Because of problems concerning humans' ability to define, discover, and know causality, causality itself was omitted from initial formulations of the DN model.

Causality was thought to be incidentally approximated by realistic selection of premises that derive the phenomenon of interest from observed starting conditions plus general laws.

In the early 1980s, a revision to the DN model emphasized maximal specificity for relevance of the conditions and axioms stated.

[1] The DN model holds a view of scientific explanation whose conditions of adequacy (CA), semiformal but stated classically, are derivability (CA1), lawlikeness (CA2), empirical content (CA3), and truth (CA4).

[2] In the DN model, a law axiomatizes an unrestricted generalization from antecedent A to consequent B via the conditional proposition "If A, then B", and has testable empirical content.

[6][7] Thus, given the explanans as initial, specific conditions C1, C2, ... Cn plus general laws L1, L2, ... Ln, the phenomenon E as explanandum is a deductive consequence, thereby scientifically explained.
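The schema just described can be set out explicitly. The following LaTeX rendering is a sketch of the standard Hempel–Oppenheim layout, with the horizontal line marking deductive entailment (the layout is illustrative, not a quotation of the original paper):

```latex
\[
\begin{array}{ll}
C_1, C_2, \ldots, C_n & \text{(initial, specific conditions)} \\
L_1, L_2, \ldots, L_n & \text{(general laws)} \\ \hline
\therefore\; E        & \text{(phenomenon as explanandum)}
\end{array}
\]
```

Because the inference is strictly deductive, if E fails to occur then at least one stated condition or law must be false, which is what gives the explanans its testable empirical content.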

[7] The framework of Aristotelian physics, that is, Aristotelian metaphysics, reflected the perspective of Aristotle, principally a biologist, who, amid living entities' undeniable purposiveness, formalized vitalism and teleology, an intrinsic morality in nature.

[15] Near 1780, countering Hume's ostensibly radical empiricism, Immanuel Kant highlighted extreme rationalism—as by Descartes or Spinoza—and sought middle ground.

[14][17][18] Abandoning Francis Bacon's inductivist mission to dissolve the veil of appearance and uncover noumena (a metaphysical view of nature's ultimate truths), Kant's transcendental idealism tasked science with simply modeling patterns of phenomena.

Safeguarding metaphysics, too, it found the mind's constants to hold universal moral truths as well,[19] and launched German idealism.

[20] Meanwhile, evolutionary theory's natural selection brought the Copernican Revolution into biology and eventuated in the first conceptual alternative to vitalism and teleology.

[40] In 1948, when explicating the DN model and stating scientific explanation's semiformal conditions of adequacy, Hempel and Oppenheim acknowledged the redundancy of the third, empirical content, which is implied by the other three: derivability, lawlikeness, and truth.

[2] In the early 1980s, amid the widespread view that causality ensures the explanans' relevance, Wesley Salmon called for returning cause to because,[44] and along with James Fetzer helped replace CA3 empirical content with CA3' strict maximal specificity.

Blurring the epistemic with the ontic, as by incautiously presuming a natural law to refer to a causal mechanism, to trace structures realistically during unobserved transitions, or to be a true regularity that never varies, tends to generate a category mistake.

[5][6] Thus, the epistemic success of Newtonian theory's law of universal gravitation is reduced to, and thereby explained by, Albert Einstein's general theory of relativity, although Einstein's theory discards Newton's ontic claim that universal gravitation's epistemic success in predicting Kepler's laws of planetary motion[49] works through a causal mechanism of a straightly attractive force instantly traversing absolute space despite absolute time.

[66] Mach as well as Ostwald viewed matter as a variant of energy, and molecules as mathematical illusions,[66] as even Boltzmann thought possible.

[67] In 1905, via statistical mechanics, Albert Einstein predicted the phenomenon Brownian motion—unexplained since reported in 1827 by botanist Robert Brown.

[66] Also in 1905, Einstein explained the electromagnetic field's energy as distributed in particles, doubted until this helped resolve atomic theory in the 1910s and 1920s.

[63][80] Originally epistemic or instrumental, this was interpreted as ontic or realist—that is, a causal mechanical explanation—and the principle became a theory,[81] refuting Newtonian gravitation.

[79][82] By predictive success in 1919, general relativity apparently overthrew Newton's theory, a revolution in science[83] resisted by many yet fulfilled around 1930.

[88] From it, Dirac interpreted and predicted the electron's antiparticle, soon discovered and termed the positron,[89] but QED failed at high energies.

[94] Meeting in 1947, Freeman Dyson, Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga soon introduced renormalization, a procedure converting QED to physics' most predictively precise theory,[90][95] subsuming chemistry, optics, and statistical mechanics.

[98][102] General philosophers of science commonly believe that aether, rather, is fictitious,[103] "relegated to the dustbin of scientific history ever since" 1905 brought special relativity.

[100] Objects became conceived as pinned directly to space and time[111] by abstract geometric relations lacking any ghostly or fluid medium.

[90] Comprising EWT, QCD, and the Higgs field, this Standard Model of particle physics is an "effective theory",[113] not truly fundamental.

[92][122] By now, most theoretical physicists infer that the four known fundamental interactions would reduce to superstring theory, whereby atoms and molecules, after all, are energy vibrations holding mathematical, geometric forms.