
Algorithm

In mathematics and computer science, an algorithm is a step-by-step procedure for performing a calculation. Algorithms are used for computation, automated reasoning, and data processing.

An algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and an initial input, the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing “output” and terminating in a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
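To make the state-and-transition vocabulary concrete, consider Euclid’s algorithm for the greatest common divisor, the same procedure traced in the flowchart pictured below. The following is a minimal sketch in Python (the language choice is ours, not part of the procedure itself): the state is the pair (a, b), each loop iteration is one well-defined transition, and termination is guaranteed because b strictly decreases toward zero.

    def gcd(a: int, b: int) -> int:
        # State: the pair (a, b). Initial input becomes the initial state.
        while b != 0:
            # One well-defined transition to the successor state.
            a, b = b, a % b
        # Final ending state reached; the output is the value of a.
        return a

    print(gcd(1071, 462))  # prints 21

Because each transition here is fully determined by the current state, this is a deterministic algorithm; a randomized algorithm would additionally consume random input at some step.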

Though al-Khwarizmi’s algorism referred to the rules of performing arithmetic with Hindu-Arabic numerals and to the systematic solution of linear and quadratic equations, a partial formalization of what would become the modern algorithm began with attempts to solve the Entscheidungsproblem posed by David Hilbert in 1928. Subsequent formalizations were framed as attempts to define “effective calculability” or “effective method”; these included the Gödel-Herbrand-Kleene recursive functions of 1930, 1934, and 1935, Alonzo Church’s lambda calculus of 1936, Emil Post’s “Formulation 1” of 1936, and Alan Turing’s Turing machines of 1936 to 1937 and 1939. Giving a formal definition of algorithms that corresponds to the intuitive notion remains a challenging problem.

Image Caption: Euclid’s algorithm for computing the greatest common divisor of two numbers, drawn as a flowchart. Credit: Wvbailey/Wikipedia (CC BY 3.0)
