History of computing

By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for the systematic computation of numbers.

Tools for computation have existed for thousands of years and across multiple civilizations. The earliest known tool for use in computation is the Sumerian abacus, which is thought to have been invented in Babylon c. 2700–2300 BC.

In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes Palimpsest § The Method of Mechanical Theorems) to solve mathematical problems, such as calculating the number of grains of sand that would fill the universe (The Sand Reckoner), a problem that also required a recursive notation for large numbers (e.g., the myriad myriad).

[8] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.
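
As a minimal sketch of the idea behind frequency analysis (a modern illustration, not Al-Kindi's own procedure), counting how often each letter appears in a ciphertext reveals the letter statistics of the underlying language, which can then be matched against known plaintext frequencies:

```python
# Toy frequency analysis of a substitution cipher: in a simple substitution,
# the most frequent ciphertext letters tend to stand for the most frequent
# letters of the plaintext language, so counting them is the first step of
# an attack.
from collections import Counter

# Hypothetical ciphertext: "THIS IS A SIMPLE SUBSTITUTION CIPHER" with each
# letter shifted forward by 3 (a Caesar shift).
ciphertext = "WKLV LV D VLPSOH VXEVWLWXWLRQ FLSKHU"

counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(3))  # [('L', 6), ('V', 5), ('W', 4)] -> plaintext I, S, T
```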

Llull's machines were never actually built; they were more of a thought experiment for producing new knowledge in systematic ways, and although they could perform simple logical operations, they still required a human being to interpret the results.

Despite this, Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further and built several calculating tools using them.

[13] Babbage's more advanced Analytical Engine combined concepts from his previous work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer, such as an internal "scratch memory" equivalent to RAM; multiple forms of output, including a bell, a graph-plotter, and a simple printer; and a programmable input-output "hard" memory of punched cards, which it could modify as well as read.

This was a fundamental shift in thought: previous computational devices served only a single purpose and, at best, had to be disassembled and reconfigured to solve a new problem.

Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and could act upon previous calculations within the same series of instructions.

Following Babbage, although unaware of his earlier work, Percy Ludgate[14][15] in 1909 published the second of the only two designs for mechanical analytical engines in history.

In his Essays on Automatics (1914), Leonardo Torres Quevedo presented the design of an electromechanical calculating machine and introduced the idea of floating-point arithmetic.
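
As a rough, modern illustration of the floating-point idea (not Torres Quevedo's own scheme), a value is stored as a significand scaled by an exponent, so very large and very small magnitudes can share one representation:

```python
# Sketch of the floating-point idea: represent a value as
# significand * 2**exponent, as modern binary hardware does
# (Torres Quevedo's 1914 proposal used its own concrete format).
import math

value = 123.456
significand, exponent = math.frexp(value)   # value == significand * 2**exponent
print(significand, exponent)                # ~0.9645, 7
print(math.ldexp(significand, exponent))    # 123.456, reconstructed exactly
```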

[21][22] Vannevar Bush's paper Instrumental Analysis (1936) discussed using existing IBM punch card machines to implement Babbage's design.

In the same year, he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.

In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits.

Walther Bothe, inventor of the coincidence circuit, shared the 1954 Nobel Prize in Physics for creating the first modern electronic AND gate in 1924.

The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.

[29] From 1934 to 1936, NEC engineer Akira Nakashima, Claude Shannon, and Victor Shestakov published papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations.
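
As a minimal sketch of the correspondence that switching circuit theory formalized (an illustration, not taken from those papers), switches wired in series behave like Boolean AND, and switches wired in parallel behave like Boolean OR:

```python
# A closed switch is True, an open switch is False. A series circuit
# conducts only if every switch is closed (AND); a parallel circuit
# conducts if any switch is closed (OR).
def series(a: bool, b: bool) -> bool:
    return a and b        # two switches in series

def parallel(a: bool, b: bool) -> bool:
    return a or b         # two switches in parallel

for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))
```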

The first digital electronic computer was developed in the period April 1936 – June 1939 in the IBM Patent Department, Endicott, New York, by Arthur Halsey Dickinson.

IBM's competitor was the digital electronic computer NCR3566, developed at NCR in Dayton, Ohio, by Joseph Desch and Robert Mumma in the period April–August 1939.

In December 1939, John Atanasoff and Clifford Berry completed their experimental model to prove the concept of the Atanasoff–Berry computer (ABC), which had begun development in 1937.

The Manchester Baby, the first electronic stored-program computer, was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn, and Geoff Tootill, and ran its first program on 21 June 1948.

[51] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications.

[77] While at the University of New Mexico, David Bader sought to build a supercomputer running Linux using consumer off-the-shelf parts and a high-speed low-latency interconnection network.

The prototype utilized an Alta Technologies "AltaCluster" of eight dual-processor, 333 MHz Intel Pentium II computers running a modified Linux kernel.

[78][79] Though Linux-based clusters using consumer-grade parts, such as Beowulf, existed before the development of Bader's prototype and RoadRunner, they lacked the scalability, bandwidth, and parallel computing capabilities to be considered "true" supercomputers.

[78] Today, supercomputers are still used by governments and educational institutions for computations such as simulations of natural disasters, searches within a population for genetic variants related to disease, and more.

The first computerized weather forecast was performed in 1950 by a team composed of American meteorologists Jule Charney, Philip Duncan Thompson, Larry Gates, and Norwegian meteorologist Ragnar Fjørtoft, applied mathematician John von Neumann, and ENIAC programmer Klara Dan von Neumann.

[83] By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses.

[90][91][92] Professor Janet Abbate, in her book Recoding Gender, writes:

Yet women were a significant presence in the early decades of computing.

Some female programmers of the 1950s and 1960s would have scoffed at the notion that programming would ever be considered a masculine occupation, yet these women’s experiences and contributions were forgotten all too quickly.

A Smith chart is a well-known nomogram.