History of computer science

[5] In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical.

Analog mechanical devices were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī[8] and the torquetum by Jabir ibn Aflah.

[9] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.

[13] When John Napier discovered logarithms for computational purposes in the early 17th century,[14] there followed a period of considerable progress by inventors and scientists in making calculating tools.
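
The computational appeal of logarithms is that they reduce multiplication to addition, which can be done by hand with a table of logarithms; the worked identity below is a standard textbook illustration, not taken from the source.

```latex
% Logarithms turn multiplication into addition: look up the logs, add them,
% then take the antilogarithm. (Illustrative numbers, not from the source.)
\log_{10}(xy) = \log_{10} x + \log_{10} y, \qquad
\log_{10}(3.2 \times 45) \approx 0.5051 + 1.6532 = 2.1583,
\quad\text{so}\quad 3.2 \times 45 \approx 10^{2.1583} \approx 144.
```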

The Analytical Engine had expandable memory, an arithmetic unit, and logic-processing capabilities that enabled it to interpret a programming language with loops and conditional branching.

In 1702, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system.

Leibniz simplified the binary system and articulated logical properties such as conjunction, disjunction, negation, identity, inclusion, and the empty set.

"[22] But it took more than a century before George Boole published his Boolean algebra in 1854 with a complete system that allowed computational processes to be mathematically modeled.

[28] Following Babbage, although at first unaware of his earlier work, was Percy Ludgate, a clerk to a corn merchant in Dublin, Ireland.

In his Essays on Automatics (1914), Leonardo Torres Quevedo designed an analytical electromechanical machine that was controlled by a read-only program and introduced the idea of floating-point arithmetic.

[34] Vannevar Bush's paper Instrumental Analysis (1936) discussed using existing IBM punch card machines to implement Babbage's design.

In the same year he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.

Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for creating the first modern electronic AND gate in 1924.

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor.

From 1934 to 1936, Akira Nakashima, Claude Shannon, and Victor Shestakov published a series of papers showing that the two-valued Boolean algebra can describe the operation of switching circuits.

Switching circuit theory provided the mathematical foundations and tools for digital system design in almost all areas of modern technology.

[43] While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems.

His thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
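
As a rough modern illustration of Shannon's observation (a sketch, not from the source; the function names are hypothetical), two-valued Boolean algebra maps directly onto switching circuits: switches wired in series behave as conjunction, and switches wired in parallel behave as disjunction.

```python
# Sketch: a switch is closed (True) or open (False). Switches in series
# conduct only when both are closed (AND); switches in parallel conduct
# when either is closed (OR). Names here are illustrative, not historical.

def series(a: bool, b: bool) -> bool:
    return a and b      # conjunction

def parallel(a: bool, b: bool) -> bool:
    return a or b       # disjunction

# A small circuit: the lamp lights if s1 is closed AND (s2 OR s3) is closed.
def lamp(s1: bool, s2: bool, s3: bool) -> bool:
    return series(s1, parallel(s2, s3))

for s1 in (False, True):
    for s2 in (False, True):
        for s3 in (False, True):
            print(s1, s2, s3, "->", lamp(s1, s2, s3))
```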

The Church–Turing thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Stated another way, the thesis holds that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.
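
A classic example of an effective method in this sense is Euclid's algorithm for the greatest common divisor: a finite list of instructions a clerk could follow mechanically with pencil and paper. The sketch below is an illustration, not taken from the source.

```python
# Euclid's algorithm as an "effective method": a short, mechanical list of
# instructions that requires no ingenuity to carry out.
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b   # replace (a, b) with (b, a mod b) until b reaches 0
    return a

print(gcd(1071, 462))  # -> 21
```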

[52] In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.

[52][60] Turing's design for ACE had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer, which was enormous by the standards of his day.

[61] The invention of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict with at least these details.

[62] From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman."

The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space.

Operations can be carried out as simple arithmetic (these are performed by the ALU and include addition, subtraction, multiplication, and division) or as conditional branches (these are more commonly seen now as if statements or while loops, acting as go to statements).
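
As a rough illustration of the stored-program idea (a toy sketch, not from the source; the instruction set and encoding are hypothetical), instructions and data can occupy the same memory while a simple fetch–decode–execute loop carries out arithmetic and conditional branches.

```python
# Toy stored-program machine: instructions and data share one memory list,
# and the loop below fetches, decodes, and executes. Hypothetical encoding.

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]          # fetch the instruction at pc
        pc += 1
        if op == "LOAD":              # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":             # simple arithmetic (ALU)
            acc = acc + memory[arg]
        elif op == "STORE":           # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JUMP_IF_ZERO":    # conditional branch (not taken in the
            if acc == 0:              # program below; shown to illustrate it)
                pc = arg
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions, cells 4-5 hold data: the program adds the
# two data cells and stores the result back into cell 5.
memory = [
    ("LOAD", 4),
    ("ADD", 5),
    ("STORE", 5),
    ("HALT", 0),
    2,     # data
    40,    # data
]
print(run(memory)[5])   # -> 42
```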

[67] On August 31, 1955, a research project was proposed by John McCarthy, Marvin L. Minsky, Nathaniel Rochester, and Claude E. Shannon.

The official project began in 1956 and consisted of several significant parts that they felt would help them better understand the makeup of artificial intelligence.

The concept behind this was to look at how humans understand their own language and the structure of how they form sentences, with different meanings and rule sets, and to compare that to a machine process.

[72] They wanted to see if a machine could take a piece of incomplete information and improve upon it to fill in the missing details as the human mind can do.

John Napier (1550–1617), the inventor of logarithms
Gottfried Wilhelm Leibniz (1646–1716) developed logic in a binary number system and has been called the "founder of computer science".[19]
Charles Babbage (1791–1871), one of the pioneers of computing
Ada Lovelace (1815–1852) predicted the use of computers in symbolic manipulation
Leonardo Torres Quevedo (1852–1936) proposed a consistent manner to store floating-point numbers
Charles Sanders Peirce (1839–1914) described how logical operations could be carried out by electrical switching circuits
Alan Turing (1912–1954), English computer scientist, mathematician, logician, and cryptanalyst (photographed c. 1930)
John V. Atanasoff (1903–1995) created the first electric digital computer, known as the Atanasoff–Berry computer
Konrad Zuse (1910–1995), inventor of the modern computer[55][56]
Claude Shannon (1916–2001) created the field of information theory
Norbert Wiener (1894–1964) created the term cybernetics
John von Neumann (1903–1957) introduced the computer architecture known as the von Neumann architecture
John McCarthy (1927–2011) is considered one of the founding fathers of artificial intelligence