A computer is an electronic device that stores and processes data according to a set of
instructions in order to perform a wide range of tasks. It accepts data as input, processes it
using a central processing unit (CPU), and produces a result as output. Computers come
in many shapes and sizes, from smartphones to large servers.
Functions: Computers store, retrieve, and process information such as
words, pictures, and numbers. They are used for tasks like creating documents,
sending email, playing games, and browsing the internet.
Components: A computer system is made up of physical components called hardware
and the programs that run on it, known as software.
Processing: Computers process data using a binary system of 1s and 0s and follow a
sequence of instructions to complete tasks.
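The binary representation mentioned above can be made concrete with a short sketch. This is purely illustrative (the helper `to_bits` is not a real API): it shows how the decimal integers a user types correspond to the patterns of 1s and 0s a computer stores internally.

```python
# Illustrative sketch: decimal integers and their internal binary patterns.
def to_bits(n, width=8):
    """Return the binary representation of n as a string of 1s and 0s."""
    return format(n, f"0{width}b")

for n in (5, 12, 255):
    print(n, "->", to_bits(n))  # e.g. 5 -> 00000101
```

Eight bits suffice here because every example value fits in one byte; wider values would simply use a larger `width`.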
Input/Output: The basic operating cycle of a computer is input, process, output, and storage.
Typing on a keyboard is an example of input, while what you see on the screen or a
printed document is an example of output.
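The input, process, output, and storage cycle can be sketched in a few lines. The function and variable names here are illustrative, not part of any real system: a string stands in for keyboard input, a transformation stands in for processing, a list stands in for storage, and the returned value stands in for screen output.

```python
# Minimal sketch of the input -> process -> output -> storage cycle.
def run_cycle(data, storage):
    processed = data.upper()   # process: transform the input data
    storage.append(processed)  # storage: keep the result for later use
    return processed           # output: hand the result back

saved = []
result = run_cycle("hello", saved)  # "hello" plays the role of keyboard input
print(result)                       # prints HELLO, the "screen" output
```

After the call, `saved` still holds the stored copy, mirroring how results written to storage survive for later retrieval.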
A computer is a machine that can be programmed to automatically carry
out sequences of arithmetic or logical operations (computation). Modern digital
electronic computers can perform generic sets of operations known as programs,
which enable computers to perform a wide range of tasks. The term computer
system may refer to a nominally complete computer that includes
the hardware, operating system, software, and peripheral equipment needed for
full operation, or to a group of linked computers that function
together, such as a computer network or computer cluster.
A broad range of industrial and consumer products use computers as control
systems, including simple special-purpose devices like microwave
ovens and remote controls, and factory devices like industrial robots. Computers
are at the core of general-purpose devices such as personal
computers and mobile devices such as smartphones. Computers power
the Internet, which links billions of computers and users.
Early computers were meant to be used only for calculations. Simple manual
instruments like the abacus have aided people in doing calculations since ancient
times. Early in the Industrial Revolution, some mechanical devices were built to
automate long, tedious tasks, such as guiding patterns for looms. More
sophisticated electrical machines did specialized analog calculations in the early
20th century. The first digital electronic calculating machines were developed
during World War II, some electromechanical and others using thermionic valves (vacuum tubes). The
first semiconductor transistors in the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated circuit chip
technologies in the late 1950s, leading to the microprocessor and
the microcomputer revolution in the 1970s. The speed, power, and versatility of
computers have increased dramatically ever since, with transistor
counts growing rapidly (Moore's law observed that counts doubled roughly every
two years), leading to the Digital Revolution of the late 20th and early 21st
centuries.
Conventionally, a modern computer consists of at least one processing element,
typically a central processing unit (CPU) in the form of a microprocessor,
together with some type of computer memory, typically semiconductor
memory chips. The processing element carries out arithmetic and logical
operations, and a sequencing and control unit can change the order of
operations in response to stored information. Peripheral devices include input
devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers,
etc.), and input/output devices that perform both functions (e.g. touchscreens).
Peripheral devices allow information to be brought in from an external source, and
they enable the results of operations to be saved and later retrieved.
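The idea that a control unit can change the order of operations in response to stored information can be illustrated with a toy stored-program machine. The four-instruction set below is entirely hypothetical, chosen only to show the fetch-decode-execute cycle and a conditional jump that alters control flow based on a stored value.

```python
# Toy stored-program machine (hypothetical 4-instruction set) illustrating
# the fetch-decode-execute cycle and data-dependent control flow.
def run(program):
    acc = 0              # accumulator register: holds intermediate results
    pc = 0               # program counter: index of the next instruction
    while pc < len(program):
        op, arg = program[pc]   # fetch and decode the instruction
        pc += 1
        if op == "LOAD":        # put a value in the accumulator
            acc = arg
        elif op == "ADD":       # arithmetic operation
            acc += arg
        elif op == "JNZ":       # jump if accumulator is non-zero:
            if acc != 0:        # the control unit changes the order of
                pc = arg        # operations based on stored information
        elif op == "HALT":
            break
    return acc

# Count down from 3: add -1 and jump back to instruction 1 while non-zero.
prog = [("LOAD", 3), ("ADD", -1), ("JNZ", 1), ("HALT", None)]
print(run(prog))  # prints 0
```

The `JNZ` instruction is the key point: whether execution loops back or falls through to `HALT` depends on the value currently held in the accumulator, which is exactly the "sequencing and control" behavior described above.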