[9] Recommendations for folic acid supplementation during pregnancy have reduced the risk of infant neural tube defects.
The value of eating certain foods to maintain health was recognized long before vitamins were identified.
The ancient Egyptians knew that feeding a person liver could help with night blindness, an illness now known to be caused by a vitamin A deficiency.
[24] The advent of ocean voyages during the Age of Discovery resulted in prolonged periods without access to fresh fruits and vegetables, and made illnesses from vitamin deficiency common among ships' crews.
[25] In 1747, the Scottish surgeon James Lind discovered that citrus foods helped prevent scurvy, a particularly deadly disease in which collagen is not properly formed, causing poor wound healing, bleeding of the gums, severe pain, and death.
However, during the 19th century, limes grown in the West Indies were substituted for lemons; these were subsequently found to be much lower in vitamin C.[27] As a result, Arctic expeditions continued to be plagued by scurvy and other deficiency diseases.
In the early 20th century, when Robert Falcon Scott made his two expeditions to the Antarctic, the prevailing medical theory was that scurvy was caused by "tainted" canned food.
In 1881, the Russian physician Nikolai Lunin fed mice an artificial mixture of all the separate constituents of milk known at that time, namely the proteins, fats, carbohydrates, and salts.
He concluded that "a natural food such as milk must therefore contain, besides these known principal ingredients, small quantities of unknown substances essential to life."
[29] A similar result by Cornelis Adrianus Pekelharing appeared in the Dutch medical journal Nederlands Tijdschrift voor Geneeskunde in 1905,[a] but it was not widely reported.
[29] In East Asia, where polished white rice was the common staple food of the middle class, beriberi resulting from lack of vitamin B1 was endemic.
In 1884, Takaki Kanehiro, a British-trained medical doctor of the Imperial Japanese Navy, observed that beriberi was endemic among low-ranking crew who often ate nothing but rice, but not among officers who consumed a Western-style diet.
This convinced Takaki and the Japanese Navy that diet was the cause of beriberi, but they mistakenly believed that sufficient amounts of protein prevented it.
[31] That diseases could result from some dietary deficiencies was further investigated by Christiaan Eijkman, who in 1897 discovered that feeding unpolished rice instead of the polished variety to chickens helped to prevent a kind of polyneuritis that was the equivalent of beriberi.
[32] The following year, Frederick Hopkins postulated that some foods contained "accessory factors" – in addition to proteins, carbohydrates, fats, etc. – that are necessary for the functions of the human body.
Max Nierenstein, a friend and Reader of Biochemistry at Bristol University, reportedly suggested the name "vitamine" (from "vital amine").
Thirty-five years earlier, Eijkman had observed that chickens fed polished white rice developed neurological symptoms similar to those observed in military sailors and soldiers fed a rice-based diet, and that the symptoms were reversed when the chickens were switched to whole-grain rice.
Paul Karrer and Norman Haworth confirmed Albert Szent-Györgyi's discovery of ascorbic acid and made significant contributions to the chemistry of flavins, which led to the identification of lactoflavin.
[39] In 1931, Albert Szent-Györgyi and his fellow researcher Joseph Svirbely suspected that "hexuronic acid" was actually vitamin C, and gave a sample to Charles Glen King, who proved its activity against scurvy in his long-established guinea pig scorbutic assay.
In 1943, Edward Adelbert Doisy and Henrik Dam were awarded the Nobel Prize in Physiology or Medicine for their discovery of vitamin K and its chemical structure.
[38][42] Once discovered, vitamins were actively promoted in articles and advertisements in McCall's, Good Housekeeping, and other media outlets.
[32] Marketers enthusiastically promoted cod-liver oil, a source of vitamin D, as "bottled sunshine", and bananas as a "natural vitality food".
[43] They promoted foods such as yeast cakes, a source of B vitamins, on the basis of scientifically determined nutritional value, rather than taste or appearance.
[32] Robert W. Yoder is credited with first using the term vitamania, in 1942, to describe the appeal of relying on nutritional supplements rather than on obtaining vitamins from a varied diet of foods.
The continuing preoccupation with a healthy lifestyle led to an obsessive consumption of vitamins and multi-vitamins, the beneficial effects of which are questionable.
[53] Once growth and development are completed, vitamins remain essential nutrients for the healthy maintenance of the cells, tissues, and organs that make up a multicellular organism; they also enable a multicellular life form to efficiently use the chemical energy provided by the food it eats, and help it process the proteins, carbohydrates, and fats required for cellular respiration.
The European Union and the governments of several countries have established tolerable upper intake levels (ULs) for those vitamins which have documented toxicity (see table).
[69][71] A 2018 meta-analysis found no evidence that intake of vitamin D or calcium for community-dwelling elderly people reduced bone fractures.
Vitamin products above these regulatory limits are not considered supplements and should be registered as prescription or non-prescription (over-the-counter) drugs due to their potential side effects.
Others, such as PABA (formerly B10), are biologically inactive, toxic, or with unclassifiable effects in humans, or not generally recognised as vitamins by science,[82] such as the highest-numbered, which some naturopath practitioners call B21 and B22.