Algorithm

As an effective method, an algorithm can be expressed within a finite amount of space and time[3] and in a well-defined formal language[4] for calculating a function.

Around 825 AD, Persian scientist and polymath Muḥammad ibn Mūsā al-Khwārizmī wrote kitāb al-ḥisāb al-hindī ("Book of Indian computation") and kitab al-jam' wa'l-tafriq al-ḥisāb al-hindī ("Addition and subtraction in Indian arithmetic").

In the early 12th century, Latin translations of these al-Khwārizmī texts, covering the Hindu–Arabic numeral system and arithmetic, appeared: for example, Liber Alghoarismi de practica arismetrice, attributed to John of Seville, and Liber Algorismi de numero Indorum, attributed to Adelard of Bath.

One informal definition is "a set of rules that precisely defines a sequence of operations",[11] which would include all computer programs (including programs that do not perform numeric calculations), as well as any prescribed bureaucratic procedure[12] or cookbook recipe.

Boolos and Jeffrey (1974, 1999) define an algorithm to be an explicit set of instructions for determining an output that can be followed by a computing machine, or by a human who could carry out only specific elementary operations on symbols.

However, algorithms are also implemented by other means, such as in a biological neural network (for example, the human brain performing arithmetic, or an insect looking for food), in an electrical circuit, or in a mechanical device.

A Sumerian clay tablet found in Shuruppak near Baghdad and dated to c. 2500 BC describes the earliest division algorithm.[16]

During the Hammurabi dynasty, c. 1800 – c. 1600 BC, Babylonian clay tablets described algorithms for computing formulas.

Babylonian clay tablets describe and employ algorithmic procedures to compute the time and place of significant astronomical events.[20]: Ch 9.1

Examples of ancient Indian mathematics included the Shulba Sutras, the Kerala School, and the Brāhmasphuṭasiddhānta.

Although the full implementation of Babbage's second device was not realized for decades after her lifetime, Lovelace has been called "history's first programmer".[30][31]

In 1928, a partial formalization of the modern concept of algorithms began with attempts to solve the Entscheidungsproblem (decision problem) posed by David Hilbert.

Pseudocode, flowcharts, drakon-charts, and control tables are structured expressions of algorithms that avoid common ambiguities of natural language.

An implementation description describes the general manner in which the Turing machine moves its head and stores data to carry out the algorithm, but does not give exact states.[34]

In the most detail, a formal description gives the exact state table and list of transitions of the Turing machine.[34]
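To make the distinction concrete, a formal description can be written out as an explicit transition table. The following is a minimal sketch in Python (the machine, its state names, and the tape encoding are illustrative choices, not taken from the source): a Turing machine that adds 1 to a binary number, given as a complete state table plus a small simulator.

# Hypothetical example: complete transition table of a Turing machine that
# increments a binary number; the head starts on the leftmost digit.
BLANK = ' '
# (state, symbol read) -> (symbol to write, head move, next state)
TABLE = {
    ('right', '0'):   ('0', +1, 'right'),    # scan right to the end of the number
    ('right', '1'):   ('1', +1, 'right'),
    ('right', BLANK): (BLANK, -1, 'carry'),  # hit the blank: turn around
    ('carry', '1'):   ('0', -1, 'carry'),    # 1 plus carry gives 0, carry moves left
    ('carry', '0'):   ('1', 0, 'halt'),      # 0 plus carry gives 1, done
    ('carry', BLANK): ('1', 0, 'halt'),      # ran off the left end: write the carry
}

def run(tape_str, state='right'):
    tape = dict(enumerate(tape_str))  # sparse tape: position -> symbol
    head = 0
    while state != 'halt':
        write, move, state = TABLE[(state, tape.get(head, BLANK))]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return ''.join(tape.get(i, BLANK) for i in range(lo, hi + 1)).strip()

print(run('1011'))  # prints '1100' (11 + 1 = 12)

An implementation description of the same machine, by contrast, would say only "scan right to the blank, then move left turning 1s into 0s until a 0 or blank is found, and write a 1", without naming the exact states.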

The graphical aid called a flowchart offers a way to describe and document an algorithm (and a computer program corresponding to it).

It has four primary symbols: arrows showing program flow, rectangles (SEQUENCE, GOTO), diamonds (IF-THEN-ELSE), and dots (OR-tie).
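For illustration (the example is ours, not drawn from the source), these flowchart primitives map directly onto ordinary control structures in code; a flowchart of the function below would use two rectangles, one diamond, and one OR-tie where the branches rejoin.

def classify(n):
    doubled = n * 2       # SEQUENCE: rectangles executed one after another
    if doubled > 10:      # IF-THEN-ELSE: a diamond with two outgoing arrows
        label = 'large'
    else:
        label = 'small'
    return label          # the two branches rejoin here (the OR-tie dot)

print(classify(7))  # prints 'large'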

Speedups of this magnitude enable computing devices that make extensive use of image processing (like digital cameras and medical equipment) to consume less power.

The design of algorithms is part of many solution theories, such as divide-and-conquer or dynamic programming within operations research.
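The two techniques can be sketched with standard textbook examples (the examples that follow are illustrative choices, not from the source): merge sort for divide-and-conquer, and an iterative Fibonacci computation for dynamic programming.

def merge_sort(xs):
    """Divide and conquer: split the input, solve each half, merge results."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

def fib(n):
    """Dynamic programming: build up answers to overlapping subproblems."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(merge_sort([5, 2, 9, 1]))  # prints [1, 2, 5, 9]
print(fib(10))                   # prints 55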

In the United States, a claim consisting solely of simple manipulations of abstract concepts, numbers, or signals does not constitute a "process" (USPTO 2006), so algorithms are not patentable (as in Gottschalk v. Benson).

For example, in Diamond v. Diehr, the application of a simple feedback algorithm to aid in the curing of synthetic rubber was deemed patentable.

From this follows a simple algorithm, which can be described in plain English as follows.

High-level description: In a loop, subtract the smaller number from the larger number. Halt the loop when the subtraction would make a number negative. Then assess the two numbers: if one of them equals zero, take the other as the greatest common divisor; if not, put the two numbers through the subtraction loop again.

(Quasi-)formal description: Written in prose but much closer to the high-level language of a computer program, the following is the more formal coding of the algorithm in pseudocode or pidgin code:
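One such rendering, sketched here in Python rather than pidgin code (the variable names r and s follow the flowchart caption below):

def gcd(r, s):
    while r != 0 and s != 0:
        if r > s:
            r -= s          # subtract the smaller number from the larger
        else:
            s -= r
    return r + s            # one of them is now 0; the other is the GCD

print(gcd(48, 18))  # prints 6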
Flowchart of using successive subtractions to find the greatest common divisor of numbers r and s
Ada Lovelace's diagram from "Note G", the first published computer algorithm