Algorithm - Wikipedia


Algorithm

In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of
mathematically rigorous instructions, typically used to solve a class of specific problems or to
perform a computation.[1] Algorithms are used as specifications for performing calculations and
data processing. More advanced algorithms can use conditionals to divert the code execution
through various routes (referred to as automated decision-making) and deduce valid inferences
(referred to as automated reasoning), eventually achieving automation. Using human
characteristics as descriptors of machines in metaphorical ways was already practiced by Alan
Turing with terms such as "memory", "search" and "stimulus".[2]

Flowchart of using successive subtractions to find the greatest common divisor of numbers r and s

In contrast, a heuristic is an approach to problem solving that may not be fully specified or may
not guarantee correct or optimal results, especially in problem domains where there is no well-
defined correct or optimal result.[3] For example, social media recommender systems rely on
heuristics and so, although widely characterized as "algorithms" in 21st-century popular media,
cannot deliver correct results due to the nature of the problem.

As an effective method, an algorithm can be expressed within a finite amount of space and
time[4] and in a well-defined formal language[5] for calculating a function.[6] Starting from an
initial state and initial input (perhaps empty),[7] the instructions describe a computation that,
when executed, proceeds through a finite[8] number of well-defined successive states, eventually
producing "output"[9] and terminating at a final ending state. The transition from one state to the
next is not necessarily deterministic; some algorithms, known as randomized algorithms,
incorporate random input.[10]
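The successive-subtraction procedure shown in the flowchart above is a simple example of such an effective method: starting from two inputs, each step yields a well-defined next state, and the process terminates. A minimal Python sketch (the function name and structure are ours, following the flowchart):

```python
def gcd_by_subtraction(r, s):
    """Compute the greatest common divisor of two positive integers
    by repeatedly subtracting the smaller from the larger, as in the
    flowchart of successive subtractions."""
    while r != s:
        if r > s:
            r = r - s
        else:
            s = s - r
    return r  # when r == s, that shared value is the gcd
```

Each pass through the loop is one well-defined state transition, and the loop is guaranteed to terminate because the larger value strictly decreases; for example, `gcd_by_subtraction(1071, 462)` reaches 21.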

Etymology

Around 825, Persian scientist and polymath Muḥammad ibn Mūsā al-Khwārizmī wrote kitāb al-
ḥisāb al-hindī ("Book of Indian computation") and kitab al-jam' wa'l-tafriq al-ḥisāb al-hindī ("Addition
and subtraction in Indian arithmetic"). Both of these texts are now lost in the original Arabic;
his other book, on algebra, survives.[1]

In the early 12th century, Latin translations of said al-Khwarizmi texts involving the Hindu–Arabic
numeral system and arithmetic appeared: Liber Alghoarismi de practica arismetrice (attributed to
John of Seville) and Liber Algorismi de numero Indorum (attributed to Adelard of Bath).[2] Hereby,
alghoarismi or algorismi is the Latinization of Al-Khwarizmi's name; the text starts with the phrase
Dixit Algorismi ("Thus spoke Al-Khwarizmi").[3]

The English word algorism is attested around 1230 and then used by Chaucer in 1391; English
adopted the term from French.[4][5] In the 15th century, under the influence of the Greek word
ἀριθμός (arithmos, "number"; cf. "arithmetic"), the Latin word was altered to algorithmus.

Definition

One informal definition is "a set of rules that precisely defines a sequence of operations",[11]
which would include all computer programs (including programs that do not perform numeric
calculations), and (for example) any prescribed bureaucratic procedure[12] or cook-book
recipe.[13] In general, a program is an algorithm only if it stops eventually[14]—even though infinite
loops may sometimes prove desirable. Boolos and Jeffrey (1974, 1999) define an algorithm to be a
set of instructions for determining an output, given explicitly, in a form that can be followed by
either a computing machine, or a human who could only carry out specific elementary operations
on symbols.[15]

The concept of algorithm is also used to define the notion of decidability—a notion that is central
for explaining how formal systems come into being starting from a small set of axioms and
rules. In logic, the time that an algorithm requires to complete cannot be measured, as it is not
apparently related to the customary physical dimension. From such uncertainties, which
characterize ongoing work, stems the lack of a definition of algorithm that suits both
concrete (in some sense) and abstract usage of the term.

Most algorithms are intended to be implemented as computer programs. However, algorithms
are also implemented by other means, such as in a biological neural network (for example, the
human brain implementing arithmetic or an insect looking for food), in an electrical circuit, or in a
mechanical device.

History

Ancient algorithms

Since antiquity, step-by-step procedures for solving mathematical problems have been attested.
This includes Babylonian mathematics (around 2500 BC),[16] Egyptian mathematics (around
1550 BC),[16] Indian mathematics (around 800 BC and later; e.g. Shulba Sutras, Kerala School,
and Brāhmasphuṭasiddhānta),[17][18] The Ifa Oracle (https://www.jstor.org/stable/3027363)
(around 500 BC), Greek mathematics (around 240 BC, e.g. sieve of Eratosthenes and Euclidean
algorithm),[19] and Arabic mathematics (9th century, e.g. cryptographic algorithms for code-
breaking based on frequency analysis).[20] The first cryptographic algorithm for deciphering
encrypted code was developed by Al-Kindi, a 9th-century Arab mathematician, in A Manuscript On
Deciphering Cryptographic Messages. He gave the first description of cryptanalysis by frequency
analysis, the earliest codebreaking algorithm.[20]

Ancient Near East

The earliest evidence of algorithms is found in the Babylonian mathematics of ancient
Mesopotamia (modern Iraq). A Sumerian clay tablet found in Shuruppak near Baghdad and dated
to c. 2500 BC described the earliest division algorithm.[16] During the Hammurabi dynasty
c. 1800 – c. 1600 BC, Babylonian clay tablets described algorithms for computing formulas.[21]
Algorithms were also used in Babylonian astronomy. Babylonian clay tablets describe and
employ algorithmic procedures to compute the time and place of significant astronomical
events.[22]

Algorithms for arithmetic are also found in ancient Egyptian mathematics, dating back to the
Rhind Mathematical Papyrus c. 1550 BC.[16] Algorithms were later used in ancient Hellenistic
mathematics. Two examples are the Sieve of Eratosthenes, which was described in the
Introduction to Arithmetic by Nicomachus,[23][19]: Ch 9.2 and the Euclidean algorithm, which was
first described in Euclid's Elements (c. 300 BC).[19]: Ch 9.1
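The sieve of Eratosthenes mentioned above can be sketched in Python (a standard modern formulation, not the ancient presentation):

```python
def sieve_of_eratosthenes(n):
    """Return all prime numbers up to and including n."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # cross out every multiple of p, starting at p*p
            # (smaller multiples were crossed out by smaller primes)
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]
```

The algorithm never tests divisibility directly; it only crosses out multiples, which is what makes it an efficient step-by-step procedure.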

Computers

Weight-driven clocks

Bolter credits the invention of the weight-driven clock as "The key invention [of Europe in the
Middle Ages]", in particular, the verge escapement[24] that provides us with the tick and tock of a
mechanical clock. "The accurate automatic machine"[25] led immediately to "mechanical
automata" beginning in the 13th century and finally to "computational machines"—the difference
and analytical engines of Charles Babbage and Countess Ada Lovelace, mid-19th
century.[26] Lovelace is credited with creating the first algorithm intended for processing on
a computer—Babbage's analytical engine, the first device considered a real Turing-complete
computer instead of just a calculator—and is sometimes called "history's first programmer" as a
result, though a full implementation of Babbage's second device would not be realized until
decades after her lifetime.

Electromechanical relay

Bell and Newell (1971) indicate that the Jacquard loom (1801), precursor to Hollerith cards
(punch cards, 1887), and "telephone switching technologies" were the roots of a tree leading to
the development of the first computers.[27] By the mid-19th century the telegraph, the precursor
of the telephone, was in use throughout the world, its discrete and distinguishable encoding of
letters as "dots and dashes" a common sound. By the late 19th century the ticker tape (c. 1870s)
was in use, as was the use of Hollerith cards in the 1890 U.S. census. Then came the teleprinter
(c. 1910) with its punched-paper use of Baudot code on tape.

Telephone-switching networks of electromechanical relays (invented 1835) were behind the work
of George Stibitz (1937), the inventor of the digital adding device. While working at Bell
Laboratories, he observed the "burdensome" use of mechanical calculators with gears. "He went
home one evening in 1937 intending to test his idea... When the tinkering was over, Stibitz had
constructed a binary adding device".[28] The mathematician Martin Davis argued for the particular
importance of the electromechanical relay.[29]

Formalization

Ada Lovelace's diagram from "Note G", the first published computer algorithm

In 1928, a partial formalization of the modern concept of algorithms began with attempts to
solve the Entscheidungsproblem (decision problem) posed by David Hilbert. Later formalizations
were framed as attempts to define "effective calculability"[30] or "effective method".[31] Those
formalizations included the Gödel–Herbrand–Kleene recursive functions of 1930, 1934 and
1935, Alonzo Church's lambda calculus of 1936, Emil Post's Formulation 1 of 1936, and Alan
Turing's Turing machines of 1936–37 and 1939.

Representations

Algorithms can be expressed in many kinds of notation, including natural languages,
pseudocode, flowcharts, drakon-charts, programming languages or control tables (processed by
interpreters). Natural language expressions of algorithms tend to be verbose and ambiguous and
are rarely used for complex or technical algorithms. Pseudocode, flowcharts, drakon-charts and
control tables are structured ways to express algorithms that avoid many of the ambiguities
common in statements based on natural language. Programming languages are primarily
intended for expressing algorithms in a form that can be executed by a computer, but they are
also often used as a way to define or document algorithms.
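For instance, the same simple algorithm can be written in pseudocode or directly in a programming language; an illustrative sketch (names ours), with the pseudocode shown as comments alongside the executable form:

```python
# Pseudocode:
#   for each item x in list L, in order:
#     if x equals target, return its position
#   otherwise report "not found"
def linear_search(L, target):
    for position, x in enumerate(L):
        if x == target:
            return position
    return None  # "not found"
```

The pseudocode conveys the idea unambiguously to a human reader, while the programming-language form is additionally executable by a computer.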

Turing machines

There is a wide variety of representations possible and one can express a given Turing machine
program as a sequence of machine tables (see finite-state machine, state-transition table and
control table for more), as flowcharts and drakon-charts (see state diagram for more), or as a
form of rudimentary machine code or assembly code called "sets of quadruples" (see Turing
machine for more). Representations of algorithms can also be classified into three accepted
levels of Turing machine description: high-level description, implementation description, and
formal description.[32] A high-level description describes qualities of the algorithm itself, ignoring
how it is implemented on the Turing machine.[32] An implementation description describes the
general manner in which the Turing machine moves its head and stores data in order to carry out
the algorithm, but does not give exact states.[32] In the most detail, a formal description gives the
exact state table and list of transitions of the Turing machine.[32]
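To illustrate the "formal description" level, here is a hypothetical one-state machine given as an exact transition table, together with a minimal simulator. The machine simply flips every bit of its input and halts at the first blank; the table and names are our own illustration, not from the source:

```python
# Formal description: the exact transitions of a tiny Turing machine.
# Each entry maps (state, read symbol) -> (symbol to write, head move, next state)
TRANSITIONS = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", " "): (" ", 0, "halt"),  # blank cell: stop
}

def run(tape, state="flip", head=0):
    """Simulate the machine on a string tape until it halts."""
    cells = list(tape) + [" "]  # append a blank end marker
    while state != "halt":
        write, move, state = TRANSITIONS[(state, cells[head])]
        cells[head] = write
        head += move
    return "".join(cells).rstrip()
```

A high-level description of the same machine would just say "complement each bit"; the table above is the formal description, listing every state and transition explicitly.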

Flowchart representation

The graphical aid called a flowchart offers a way to describe and document an algorithm (and a
computer program corresponding to it). Like the program flow of a Minsky machine, a flowchart
always starts at the top of a page and proceeds down. Its primary symbols are only four: the
directed arrow showing program flow, the rectangle (SEQUENCE, GOTO), the diamond (IF-THEN-
ELSE), and the dot (OR-tie). The Böhm–Jacopini canonical structures are made of these primitive
shapes. Sub-structures can "nest" in rectangles, but only if a single exit occurs from the
superstructure. The symbols and their use to build the canonical structures are shown in the
diagram.[33]
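The Böhm–Jacopini result implies that any flowchart built from these shapes can be rendered using only sequence, selection, and iteration, each entered and exited at a single point. A hypothetical sketch of a routine composed solely of those canonical structures:

```python
def count_above_below(values, threshold):
    """Illustrative routine built only from the three canonical
    structures: SEQUENCE, IF-THEN-ELSE selection, and WHILE iteration,
    each with a single entry and a single exit."""
    above = 0                       # SEQUENCE: straight-line steps
    below = 0
    i = 0
    while i < len(values):          # iteration, single exit at the top
        if values[i] > threshold:   # IF-THEN-ELSE selection
            above += 1
        else:
            below += 1
        i += 1                      # SEQUENCE inside the loop body
    return above, below
```

Note that every block nests cleanly inside its enclosing structure with one exit, mirroring the rule that sub-structures may "nest" in rectangles only if a single exit leaves the superstructure.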
