Computer Reading
A computer is a programmable machine that receives input, stores and manipulates data, and provides
output in a useful format.
While a computer can, in theory, be made out of almost anything (see misconceptions section), and
mechanical examples of computers have existed through much of recorded human history, the first
electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the
size of a large room, consuming as much power as several hundred modern personal computers (PCs).
Modern computers based on integrated circuits are millions to billions of times more capable than the
early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile
devices, and can be powered by a small battery. Personal computers in their various forms are icons of
the Information Age and are what most people think of as "computers". However, the embedded
computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots
are the most numerous.
1. Misconceptions
A computer does not need to be electric, nor does it need a processor, RAM, or even a hard
disk. The minimal definition of a computer is anything that transforms information in a
purposeful way.
Historically, computers evolved from mechanical computers and, eventually, from vacuum
tubes to transistors.
There is active research into making computers out of many promising new types of
technology, such as optical computing, DNA computers, neural computers, and quantum
computers. Some of these could tackle problems that modern computers cannot; quantum
computers, for example, may be able to break some modern encryption algorithms by
quantum factoring.
2. History of computing
The first use of the word "computer" was recorded in 1613, referring to a person who carried
out calculations, or computations, and the word continued to be used in that sense until the
middle of the 20th century. From the end of the 19th century onwards, though, the word began
to take on its more familiar meaning, describing a machine that carries out computations.
The history of the modern computer begins with two separate technologies—automated
calculation and programmability—but no single device can be identified as the earliest
computer, partly because of the inconsistent application of that term. Examples of early
mechanical calculating devices include the abacus, the slide rule and arguably the
astrolabe and the Antikythera mechanism, an ancient astronomical computer built by the
Greeks around 80 BC.[4] The Greek mathematician Hero of Alexandria (c. 10–70 AD)
built a mechanical theater which performed a play lasting 10 minutes and was operated
by a complex system of ropes and drums that might be considered to be a means of
deciding which parts of the mechanism performed which actions and when.[5] This is the
essence of programmability.
In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series
of punched paper cards as a template which allowed his loom to weave intricate patterns
automatically. The resulting Jacquard loom was an important step in the development of
computers because the use of punched cards to define woven patterns can be viewed as an early,
albeit limited, form of programmability.
It was the fusion of automatic calculation with programmability that produced the first
recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a
fully programmable mechanical computer, his analytical engine. Limited finances and Babbage's
inability to resist tinkering with the design meant that the device was never completed.
In the late 1880s, Herman Hollerith invented the recording of data on a machine readable
medium. Prior uses of machine readable media, above, had been for control, not data. "After
some initial trials with paper tape, he settled on punched cards ..." To process these punched
cards he invented the tabulator, and the keypunch machines. These three inventions were the
foundation of the modern information processing industry. Large-scale automated data
processing of punched cards was performed for the 1890 United States Census by Hollerith's
company, which later became the core of IBM. By the end of the 19th century a number of
technologies that would later prove useful in the realization of practical computers had begun to
appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the
teleprinter.
During the first half of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct mechanical or electrical model
of the problem as a basis for computation. However, these were not programmable and generally
lacked the versatility and accuracy of modern digital computers.
Alan Turing is widely regarded to be the father of modern computer science. In 1936 Turing
provided an influential formalisation of the concept of the algorithm and computation with the
Turing machine, providing a blueprint for the electronic digital computer.[11] Of his role in the
creation of the modern computer, Time magazine, in naming Turing one of the 100 most
influential people of the 20th century, states: "The fact remains that everyone who taps at a
keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of
a Turing machine".[11]
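To make the notion of a Turing machine concrete, the following sketch (added for illustration; the bit-flipping machine and its transition table are invented examples, not something described in the sources cited here) simulates a single-tape Turing machine in Python:

    # Minimal sketch of a single-tape Turing machine (illustration only).
    # transitions maps (state, symbol) -> (new_state, symbol_to_write, head_move).
    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
        cells = dict(enumerate(tape))   # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            symbol = cells.get(head, blank)
            if (state, symbol) not in transitions:   # no applicable rule: halt
                break
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += move
        lo, hi = min(cells), max(cells)
        return "".join(cells.get(i, blank) for i in range(lo, hi + 1)), state

    # Invented example machine: flip every binary digit, halt on a blank cell.
    invert_bits = {
        ("start", "0"): ("start", "1", +1),
        ("start", "1"): ("start", "0", +1),
        ("start", "_"): ("halt", "_", 0),
    }
    print(run_turing_machine(invert_bits, "10110"))   # ('01001_', 'halt')

The point is not the particular machine but the shape of the model: a finite table of rules, a read/write head, and an unbounded tape are enough to express any computation a modern computer can perform.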
The Zuse Z3 (1941) is considered the world's first working programmable, fully automatic
computing machine.
The ENIAC, which became operational in 1946, is considered to be the first general-purpose
electronic computer.
EDSAC was one of the first computers to implement the stored program (von Neumann)
architecture.
The inventor of the program-controlled computer was Konrad Zuse, who built the first working
computer in 1941 and later in 1955 the first computer based on magnetic storage.[12]
George Stibitz is internationally recognized as a father of the modern digital computer. While
working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he
dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to
use binary circuits to perform an arithmetic operation. Later models added greater sophistication
including complex arithmetic and programmability.[13]
A succession of steadily more powerful and flexible computing devices was constructed in the
1930s and 1940s, gradually adding the key features seen in modern computers. The use
of digital electronics (largely invented by Claude Shannon in 1937) and more flexible
programmability were vitally important steps, but defining one point along this road as "the first
digital electronic computer" is difficult (Shannon 1940). Notable achievements include:
Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working
machine featuring binary arithmetic, including floating point arithmetic and a measure of
programmability. In 1998 the Z3 was proved to be Turing complete, making it the
world's first operational computer.[14]
The non-programmable Atanasoff–Berry Computer (1941) which used vacuum tube
based computation, binary numbers, and regenerative capacitor memory. The use of
regenerative memory allowed it to be much more compact than its peers (being
approximately the size of a large desk or workbench), since intermediate results could be
stored and then fed back into the same set of computation elements.
The secret British Colossus computers (1943),[15] which had limited programmability but
demonstrated that a device using thousands of tubes could be reasonably reliable and
electronically reprogrammable. It was used for breaking German wartime codes.
The Harvard Mark I (1944), a large-scale electromechanical computer with limited
programmability.
The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal
arithmetic and is sometimes called the first general purpose electronic computer (since
Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however,
ENIAC had an inflexible architecture which essentially required rewiring to change its
programming.
Several developers of ENIAC, recognizing its flaws, came up with a far more flexible
and elegant design, which came to be known as the "stored program architecture" or von
Neumann architecture. This design was first formally described by John von Neumann in
the paper First Draft of a Report on the EDVAC, distributed in 1945. A number of
projects to develop computers based on the stored-program architecture commenced
around this time, the first of these being completed in Great Britain. The first working
prototype to be demonstrated was the Manchester Small-Scale Experimental Machine
(SSEM or "Baby") in 1948. The Electronic Delay Storage Automatic Calculator
(EDSAC), completed a year after the SSEM at Cambridge University, was the first
practical, non-experimental implementation of the stored program design and was put to
use immediately for research work at the university. Shortly thereafter, the machine
originally described by von Neumann's paper—EDVAC—was completed but did not see
full-time use for an additional two years.
Nearly all modern computers implement some form of the stored-program architecture,
making it the single trait by which the word "computer" is now defined. While the
technologies used in computers have changed dramatically since the first electronic,
general-purpose computers of the 1940s, most still use the von Neumann architecture.
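As a rough illustration of the stored-program principle (a sketch added here, not taken from the article; the three-instruction machine and its opcodes are invented), the Python below keeps instructions and data in one shared memory and interprets them with a fetch-decode-execute loop:

    # Sketch of a stored-program machine: instructions and data share one memory.
    def run(memory):
        acc = 0      # accumulator register
        pc = 0       # program counter
        while True:
            op, arg = memory[pc]          # fetch the next instruction
            pc += 1
            if op == "LOAD":              # decode and execute
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory
            else:
                raise ValueError(f"unknown opcode {op!r}")

    # Cells 0-3 hold the program; cells 4-6 hold data.
    # The program computes memory[6] = memory[4] + memory[5].
    memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
    print(run(memory)[6])   # 5

Because the program is itself data in memory, it can be loaded, replaced, or even modified by another program, which is exactly the flexibility that ENIAC's rewiring lacked.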
Beginning in the 1950s, Soviet scientists Sergei Sobolev and Nikolay Brusentsov
conducted research on ternary computers, devices that operated on a base three
numbering system of −1, 0, and 1 rather than the conventional binary numbering system
upon which most computers are based. They designed the Setun, a functional ternary
computer, at Moscow State University. The device was put into limited production in the
Soviet Union, but supplanted by the more common binary architecture.
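As a sketch of the balanced-ternary arithmetic behind machines like the Setun (added for illustration only; it says nothing about the Setun's actual circuitry or instruction set), the Python below converts ordinary integers to and from digits drawn from −1, 0, and 1:

    # Balanced ternary: every integer is a sum of powers of 3 with digits -1, 0, 1.
    def to_balanced_ternary(n):
        if n == 0:
            return [0]
        digits = []
        while n != 0:
            r = n % 3
            if r == 2:                 # a digit 2 becomes -1 with a carry into the next place
                digits.append(-1)
                n = (n + 1) // 3
            else:
                digits.append(r)
                n = n // 3
        return digits[::-1]            # most significant digit first

    def from_balanced_ternary(digits):
        value = 0
        for d in digits:
            value = value * 3 + d
        return value

    print(to_balanced_ternary(11))            # [1, 1, -1], since 9 + 3 - 1 = 11
    print(from_balanced_ternary([1, 1, -1]))  # 11

One appeal of the representation is that negative numbers need no separate sign bit: the same digit set covers them directly.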
Computers using vacuum tubes as their electronic elements were in use throughout the
1950s, but by the 1960s had been largely replaced by transistor-based machines, which
were smaller, faster, cheaper to produce, required less power, and were more reliable.
The first transistorised computer was demonstrated at the University of Manchester in
1953.[16] In the 1970s, integrated circuit technology and the subsequent creation of
microprocessors, such as the Intel 4004, further decreased size and cost and further
increased speed and reliability of computers. By the late 1970s, many products such as
video recorders contained dedicated computers called microcontrollers, and they started
to appear as a replacement to mechanical controls in domestic appliances such as
washing machines. The 1980s witnessed home computers and the now ubiquitous
personal computer. With the evolution of the Internet, personal computers are becoming
as common as the television and the telephone in the household.
Modern smartphones are fully programmable computers in their own right, and as of
2009 may well be the most common form of such computers in existence.