Many colleges also allow students to declare a minor field, a secondary discipline in which they also take a substantial number of classes, but not so many as would be necessary to complete a major.
[2] Before that, all students receiving an undergraduate degree were required to study the same slate of courses aimed at a comprehensive "liberal education".
After the American Civil War, higher educational systems in Europe began to move toward a stricter specialization in studies, offering eight options that included ancient languages, anatomy, and medicine.
[2] In the United States, concentrated fields of study at the undergraduate level began to flourish and gain popularity in the second half of the 19th century, but the familiar term "major" did not appear until 1877, in a Johns Hopkins University catalogue.
From 1880 to 1910, baccalaureate-granting American institutions widely embraced a free-elective system, in which students were given greater freedom to explore their intellectual curiosities.
[3] In the 1980s and 1990s, "interdisciplinary studies, multiculturalism, feminist pedagogy, and a renewed concern for the coherence and direction of the undergraduate program began to assail the Baccalaureate degree dominated by the academic major."
Generally, proponents of the major and departmental system "argue that they enable an academic community to foster the development, conservation and diffusion of knowledge."
In contrast, critics "claim that they promote intellectual tribalism, where specialization receives favor over the mastery of multiple epistemologies, where broader values of liberal learning and of campus unity are lost, and where innovation is inhibited due to parochial opposition to new sub-specialties and research methods."
If a person applies to an impacted major, the school can raise the minimum requirements as much as needed to weed out the students it cannot accommodate.