Chapter 1 – The Development of Computers and Information Systems

Pre-Computer Age and Calculating Machines

The abacus is one of the earliest calculating devices, invented over 2,000 years ago
by Asian merchants to speed up calculation. It is a simple hand device for
recording numbers or performing simple calculations.
Calculating machines were first introduced in the 17th century. In 1642, the first
calculating machine that could perform addition and subtraction, a precursor of the
digital computer, was devised by the French scientist, mathematician, and
philosopher Blaise Pascal.
This device employed a series of ten-toothed
wheels, each tooth representing a digit from 0 to
9. The wheels were connected so that numbers
could be added to each other by advancing the
wheels by a correct number of teeth. In the 1670s
the German philosopher and mathematician
Gottfried Wilhelm Leibniz improved on this
machine by devising one that could also multiply.
It was in 1820 that the next generation of calculating devices, the arithmometer,
was invented by Charles Xavier Thomas of France. It combined the features of the
Leibniz calculator with newer engineering techniques.
The first mechanical calculator produced in the US was developed in 1872 by
Frank S. Baldwin. By improving on the Leibniz design, he produced a much smaller
and lighter calculator.

The first commercial calculator that was both a calculating and a listing machine
was developed in 1886 by William Seward Burroughs, an American bank clerk.

Punched Card Information Processing and the Analytical Engine


The French weaver and inventor Joseph-Marie Jacquard designed an automatic
loom (Jacquard’s loom), which used thin, perforated wooden boards to control the
weaving of complicated cloth designs. The concept of recording data in the form of
holes punched in cards was later used in the design of punched card information
processing equipment. Another lesson learned from Jacquard was that work can be
performed automatically if a set of instructions can be given to a machine to direct
it in its operations. This insight was fundamental to the development of computers.
During the 1880s the American statistician Herman Hollerith, who worked in
the US Bureau of the Census, conceived the idea of using perforated cards (punch
cards similar to Jacquard’s boards) for processing data.

Employing a system that passed punched cards over electrical contacts, he devised
the Hollerith punched-card tabulating machine, which he used to speed up the
compilation of statistical information for the 1890 United States census. Hollerith
went on to establish the Tabulating Machine Company to manufacture and market
his invention, which in 1911 merged with other organizations to form the
Computing-Tabulating-Recording Company.
In 1924, after further acquisitions, the Computing-Tabulating-Recording Company
was renamed the International Business Machines Corporation (IBM). Thomas J.
Watson, Sr., who had joined the company in 1914, built the foundering company
into an industrial giant. IBM soon became the country's largest manufacturer of
time clocks and developed and marketed the first electric typewriter. In 1951 the
company entered the computer field. Punched-card technology remained in wide
use until the mid-1950s.

Also in the 19th century, the British mathematician and inventor Charles Babbage
(referred to as the Father of the modern computer) worked out the principles of the
modern digital computer. He conceived a number of machines, such as the
Difference Engine and the Analytical Engine, the forerunners of the modern
computer, that were designed to handle complicated mathematical problems.
One of Babbage’s designs, the Analytical Engine,
had many features of a modern computer. It had an
input stream in the form of a deck of punched cards, a
“store” for saving data, a “mill” for arithmetic operations,
and a printer that made a permanent record. Babbage
failed to put this idea into practice, though it may well
have been technically possible at that date.
Many historians consider Babbage and his associate, the mathematician
Augusta Ada Byron, Countess of Lovelace and daughter of the poet, Lord Byron,
the true pioneers of the modern digital computer. Lovelace provided complete
details as to exactly how the Analytical Engine was to work. Because she
described some of the key elements of computer programming, she has been
referred to as the “world’s first computer programmer”.
Early Computers
Analogue computers began to be built in the late 19th
century. Early models calculated by means of rotating shafts
and gears. Numerical approximations of equations too difficult
to solve in any other way were evaluated with such machines.
Lord Kelvin built a mechanical tide predictor that was a
specialized analogue computer. During World Wars I and II,
mechanical and, later, electrical analogue computing systems
were used as torpedo course predictors in submarines and as
bombsight controllers in aircraft. Another system was designed
to predict spring floods in the Mississippi River basin.

In the United States, a prototype electronic machine had been built as early as
1939 by John Atanasoff and Clifford Berry at Iowa State College. This prototype and
later research, carried out quietly, led to the development of the Atanasoff-Berry
Computer (ABC), which is considered the first electronic computing machine.

It could only perform addition and subtraction, and it never became operational
because its inventors were drawn into US military efforts during World War II.
In 1944, Howard Aiken completed the MARK I computer (also known as the
Automatic Sequence Controlled Calculator), the first electromechanical
computer. It could solve mathematical problems 1,000 times faster than existing
machines.
The first electronic computer to be
made operational was the Electronic
Numerical Integrator and Calculator (ENIAC).
It was built in 1946 for the US Army to perform
quickly and accurately the complex
calculations that gunners needed to aim their
artillery weapons. ENIAC contained 18,000
vacuum tubes and had a speed of several
hundred multiplications per minute, but
originally its program was wired into the
processor and had to be manually altered.

Scientists at the University of Cambridge in England designed the world’s first
electronic computer that stored its program of instructions, the Electronic Delay
Storage Automatic Calculator (EDSAC). This gave more flexibility in the use of the
computer. Two years later (1951), machines were built with program storage,
based on the ideas of the Hungarian-American mathematician John von Neumann
of the University of Pennsylvania. The instructions, like the data, were stored within
a “memory”, freeing the computer from the speed limitations of the paper-tape
reader during execution and permitting problems to be solved without rewiring the
computer. This concept gave birth to the Electronic Discrete Variable Automatic
Computer (EDVAC).

During World War II a team of scientists and mathematicians working at Bletchley
Park, north of London, created one of the first all-electronic digital computers:
Colossus. By December 1943, Colossus, which incorporated 1,500 vacuum tubes,
was operational. It was used by the Bletchley Park codebreakers, among whom
Alan Turing was a leading figure, in the largely successful effort to crack
enciphered German radio messages.
First Generation of Computers
● The first generation of computers (1951-1959) is characterized by the use of the
vacuum tube; the machines were very large in size (a mainframe could occupy a
whole room).
● The first business computer, the Universal Automatic Computer (UNIVAC I),
was developed in 1951. It was invented to improve information processing
in business organizations.
● In 1953, IBM produced the first of its computers, the IBM 701—a machine
designed to be mass-produced and easily installed in a customer’s
building. The success of the 701 led IBM to manufacture many other
machines for commercial data processing. The IBM 650 computer is
probably the reason why IBM enjoys such a healthy share of today’s
computer market. The sales of IBM 650 were a particularly good indicator
of how rapidly the business world accepted electronic data processing.
Initial sales forecasts were extremely low because the machine was
thought to be too expensive, but over 1,800 were eventually made and
sold.
● The invention of the integrated circuit (IC) by Jack S. Kilby of Texas
Instruments in 1958 is considered a great invention that changed how
the world functions. It is the heart of all electronic equipment today.
● Between 1959 and 1961, COBOL was developed by Grace Murray Hopper.
It is a verbose, English-like programming language. Its establishment as a
required language by the United States Department of Defense, its
emphasis on data structures, and its English-like syntax led to its widespread
acceptance and usage, especially in business applications. It is a
champion of standardized programming languages that are hardware
independent. COBOL runs on many types of computers by means of a compiler,
a kind of translator program that Hopper also pioneered.
Second Generation of Computers
● The invention of the transistor marked the start of the second generation of
computers (ca. 1954-1964), which were smaller in size (a mainframe could be
the size of a closet). Second-generation computers used smaller, faster,
and more versatile logical elements than were possible with vacuum-tube
machines. Because transistors use much less power and have a much
longer life, components became smaller, as did inter-component spacing,
and systems became much less expensive to build. The Honeywell 400
computer was the first in the line of second-generation computers.
● In the 1950s and 1960s, only the largest companies could afford the six- to
seven-digit price tags of mainframe computers. Digital Equipment Corporation
introduced the PDP-8, which is generally considered the first successful
transistor-based minicomputer. It was an instant hit, and there was
tremendous demand from business and scientific organizations.
Third Generation of Computers
● Although the integrated circuit (IC) was invented earlier, during the era of
first-generation computers, it was only in the late 1960s that it was introduced
into computers, making it possible for many transistors to be fabricated on one
silicon substrate, with interconnecting wires plated in place. The IC resulted in a
further reduction in price, size, and failure rate. This was the start of third-
generation computers (mid-1960s to mid-1970s).
● Some historians consider the IBM System/360 family of computers the single most
important innovation in the history of computers. It was conceived as a
family of computers with upward compatibility: when a company outgrew
one model, it could move up to the next model without worrying about
converting its data. This made all previous computers obsolete.
● In 1964, Beginner's All-purpose Symbolic Instruction Code (BASIC), a high-
level programming language, was developed by John Kemeny and
Thomas Kurtz at Dartmouth College. BASIC gained its enormous popularity
mostly because it can be learned and used quickly. The language has
changed over the years, from a teaching language into a versatile and
powerful language for both business and scientific applications.
● In 1969, two Bell Telephone Labs software engineers, Dennis Ritchie and Ken
Thompson, who had worked on the multi-user system Multics (Multiplexed
Information and Computing Service), implemented a rudimentary operating
system they named Unics, as a pun on Multics. Somehow, the name became
UNIX. The most notable feature of this operating system is its portability: it
can run on many types of computers, is machine-independent, and supports
multi-user processing, multitasking, and networking. UNIX is used in high-end
workstations and servers. It is written in the C language, which was developed
primarily by Ritchie.

Fourth Generation of Computers


● The introduction of large-scale integration of circuitry (more circuits per unit
of space) marks the beginning of the fourth generation of computers.
The base technology is still the IC, but it saw significant innovations over the
following two decades. The computer industry experienced a mind-boggling
succession of advances in the further miniaturization of circuitry, data
communications, and the design of computer hardware and software. The
microprocessor became a reality in the mid-1970s with the introduction of the
large-scale integrated (LSI) circuit.
● Bill Gates and Paul Allen revolutionized the computer industry. They
developed the BASIC programming language for the first commercially-
available microcomputer, the MITS Altair. After successful completion of
the project, the two formed Microsoft Corporation in 1975. Microsoft is now
the largest and most influential software company in the world. Microsoft
was given an enormous boost when its operating system software, MS-
DOS was selected for use by the IBM PC. Gates, now the wealthiest person
in the world, provides the company’s vision of new product ideas and
technologies.
● One important entrepreneurial venture during the early years was the Apple
II personal computer, introduced in 1977. This event forever changed how
society perceives computers: computing was now available to individuals and
very small companies.

● IBM tossed its hat into the personal computer ring with its release of the IBM
personal computer in 1981. By the end of 1982, 835,000 units had been sold.
When software vendors began to orient their products to the IBM PC, many
companies began offering IBM PC-compatibles or clones. Today, the IBM
PC and its clones have become a powerful standard in the microcomputer
industry.

● In 1982, Mitchell Kapor founded the Lotus Development Corporation. It
introduced an electronic spreadsheet product (Lotus 1-2-3) that gave the IBM PC
credibility in the business marketplace. Sales of the IBM PC and Lotus 1-2-3
soared.

● In 1984, Apple introduced the Macintosh desktop computer with a very
friendly graphical user interface (GUI). This was proof that computers can be
easy and fun to use. GUIs began to change the complexion of the software
industry. They changed the interaction between the user and the computer
from a short, character-oriented exchange modeled on the teletypewriter to
the now famous WIMP interface (WIMP stands for windows, icons, menus, and
pointing devices).

● It was in 1985 when Microsoft adopted the GUI in its Windows operating
system for IBM PC-compatible computers. Windows did not enjoy
widespread acceptance until 1990, with the release of Windows 3.0. It gave
a huge boost to the software industry because larger, more complex
programs could now be run on IBM PC-compatibles. Subsequent releases
made the PC even easier to use, fueling the PC explosion in the 1990s.

● In 1991, Linus Torvalds developed Linux, a reliable and compactly
designed operating system that is an offshoot of UNIX and can run on
many different hardware platforms. It is available free or at very low cost,
and it is used as an alternative to the costly Windows operating system.

● IBM PC-compatible PCs had started out using Intel microprocessor chips,
followed by a succession of even more powerful chips. But not until the Intel
Pentium, introduced in 1993, and its successors did PCs do much with
multimedia (the integration of motion video, animation, graphics, sound, and
so on). The emergence of the high-powered Intel Pentium processors and their
ability to handle multimedia applications changed the way people view and
use PCs.

● It was also in this year that millions of people began to tune into the
Internet for news. The World Wide Web (WWW), one of several Internet-
based applications, came of age as Web traffic grew 341,634%. The Web is
unique in that it enables pages to be linked across the Internet. A
number of Internet browsers were introduced (e.g. Mosaic and Netscape
Navigator, which were developed by Marc Andreessen, and Internet
Explorer by Microsoft Corporation). These browsers enabled users to
navigate the World Wide Web with ease. Today, the WWW is the foundation for
most Internet communications and services. The World Wide Web itself was
created in 1991 by Tim Berners-Lee, a scientist working at CERN in Geneva,
Switzerland.

Fifth Generation of Computers


● The fifth generation of computers is characterized by the very large-scale
integrated (VLSI) circuit (microchip), with many thousands of
interconnected transistors etched into a single silicon substrate. It is also
characterized by network computers of all sizes, the Internet, Intranets, and
Extranets.

● The year 1996 marked the 50th year of computer history. The US Postal
Service issued stamps commemorating the 50th anniversary of ENIAC,
the first full-scale computer, and the 50 years of computer technology that
followed. It was during this year that the handheld computer was
introduced, signaling to the world that tremendous computing power could
now be placed in the palm of your hand. Nowadays, millions of people
rely on handhelds for a variety of personal information management
applications, including e-mail.

● In 1999, the world was threatened by the Y2K problem, also known as
the millennium bug. It may have been one of the biggest challenges ever
to confront the businesses of the world. For most of the 20th century,
information systems used only two digits to represent the year (e.g. 99 for
1999). When the 20th century ended and a new one began, non-compliant
computers would interpret the date 01-01-00, intended as January 1, 2000,
as January 1, 1900. Y2K heightened management’s awareness of how critical
information technology is to the operation of any organization.
● Jack Kilby’s first IC contained a single transistor. Tens of thousands of
engineers around the world have built on his invention, so that each year
our society is the beneficiary of smaller, more powerful, cheaper chips.
● One continuing trend in computer development is microminiaturization,
the effort to compress more circuit elements into smaller and smaller chip
space. In 1999, scientists developed a circuit the size of a single layer of
molecules, and in 2000 IBM announced that it had developed new
technology to produce computer chips that operate five times faster than
the most advanced models to date.
● Researchers are also trying to speed up circuitry functions through the use
of superconductivity, the phenomenon of decreased electrical resistance
observed in certain materials at very low temperatures.
● Whether we are moving into a fifth generation of computing is a subject of
debate since the concept of generations may no longer fit the continual,
rapid changes occurring in computer hardware, software, data, and
networking technologies. But in any case, we can be sure that progress in
computing will continue to accelerate and that the development of
Internet-based technologies and applications will be one of the major
forces driving computing in the 21st century.
Chapter 1 – Part 2 Computer Hardware
Defining Computer Hardware

Computer hardware is the equipment and devices that make up a computer
system, as opposed to the programs that are run on it. A digital computer is not
a single machine: rather, it is a system composed of distinct elements.
● input devices
● central processing unit
● primary storage devices
● secondary storage devices
● output devices
● communication devices
In order for information to flow through a computer system and be in a form
suitable for processing, all symbols, pictures, or words must be reduced to a string
of binary digits. A binary digit is called a bit; it represents the smallest unit of data
in a computer system. It can have only one of two states (e.g. true or false, on or
off), represented by 0 or 1. A byte is a string of eight (8) bits, used to store one
number or character in a computer system.
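To make the bit-and-byte idea concrete, here is a minimal Python sketch (the character chosen is arbitrary):

    # A minimal sketch: how a character reduces to a string of binary digits.
    ch = "A"
    code = ord(ch)               # the numeric code for "A" (65 in ASCII)
    bits = format(code, "08b")   # the same value written as 8 bits, i.e. one byte
    print(code, bits)            # 65 01000001

    # One bit has two states, so a byte of 8 bits can hold 2**8 = 256 values.
    print(2 ** 8)                # 256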

Computers continue to become smaller, faster, more reliable, less costly to
purchase and maintain, and more interconnected within computer networks.
Input devices are moving toward direct data input that is more natural and
easy to use, while output devices are geared toward direct output methods that
communicate naturally, quickly, and clearly.

A. INPUT DEVICES
Input devices enable a computer user to enter data, commands, and programs
into the CPU. Included in this category are the following.

● Keyboard – This is commonly known as the QWERTY keyboard, named after the
six leftmost characters in the top row of alphabetic characters on most
keyboards—the standard layout of most typewriters and computer keyboards. An
alternative layout, the Dvorak keyboard, is considered more efficient, but the
QWERTY keyboard has the advantage of familiarity. This is the most common input
device. Information typed at the typewriter-like keyboard is translated by the
computer into recognizable patterns.
● Mouse – This was invented by Douglas Engelbart and was popularized by its
inclusion as standard equipment with the Apple Macintosh. It helps a user
navigate through a graphical computer interface. It is generally mapped so that
an on-screen cursor may be controlled by moving the mouse across a flat surface.
There are many variations on mouse design, but they all work in a similar manner.
Some mouse units feature a scroller, which provides a better way of scrolling
through documents vertically and/or horizontally. The later optomechanical
mouse eliminates the need for many of the wear-related repairs and
maintenance necessary with purely mechanical mice.
● Joystick – This performs the same function as the mouse. It is favored for
computer games. A joystick usually has a square or rectangular plastic base to
which is attached a vertical stem. Control buttons are located on the base and
sometimes on top of the stem. The stem can be moved in all directions to control
the movement of an object on the screen. The buttons activate various software
features, generally producing on-screen events. A joystick is usually a relative
pointing device, moving an object on the screen when the stem is moved from
the centre and stopping the movement when the stem is released. In industrial
control applications, the joystick can also be an absolute pointing device, with
each position of the stem mapped to a specific on-screen location.
● Trackball – This can be roughly described as a mouse on its back. It consists of
a ball resting on two rollers at right angles to each other, which translate the ball's
motion into vertical and horizontal movement on the screen. It typically has one
or more buttons to initiate other actions. The only functional difference between
a mechanical mouse and a trackball is in how the ball is moved. With a trackball,
the housing is stationary, and the ball is rolled with the hand. A trackball is useful
for fine work because the user can exert fingertip control. Another major
advantage of a trackball is that it takes up little desktop surface. This replaces the
mouse on some laptop computers.
● Graphics tablet – This pointing device is also called a digitizing tablet. It is a flat
plastic rectangle with subsurface electronics, used in conjunction with a pointing
device in many engineering and design applications as well as in illustration work.
When a pointing device, like a puck (or even the finger), is moved on the surface
of the tablet, the location of the device is translated to a specific on-screen cursor
position.

● Puck – This is often used in engineering applications. It is a mouse-like device
with buttons for selecting items or choosing commands and a clear plastic section
extending from one end with cross hairs printed on it. The intersection of the cross
hairs on the puck points to a location on the graphics tablet, which in turn is
mapped to a specific location on the screen. Since the puck's cross hairs are on
a transparent surface, a drawing can easily be traced by placing it between the
graphics tablet and the puck and moving the cross hairs over the lines of the
drawing.

● Scanner – This input device uses light-sensing equipment to read information on
paper or another medium, and translate the pattern of light and dark (or color)
into a digital signal that can be manipulated by either optical character
recognition software or graphics software. A frequently encountered type of
scanner is flatbed, meaning that the scanning device moves across or reads
across a stationary document. Another type of flatbed scanner uses a scanning
element placed in a stationary housing above the document. Other scanners
work by pulling in sheets of paper, which are scanned as they pass over a
stationary scanning mechanism, as in the common office fax machine. Some
specialized scanners, like barcode readers, work with a standard video camera,
translating the video signal into a digital signal for computer processing. Another
popular type of scanner is the hand-held scanner, which is held in the user’s
hand and moved over the document to be scanned.
● Light pen – This is a pointing device in which the user holds a wand, which is
attached to the computer, up to the screen and selects items or chooses
commands on the screen (the equivalent of a mouse click) either by pressing a
clip on the side of the light pen or by pressing the light pen against the surface of
the screen. The light pen doesn't require a special screen or screen coating, as
does a touch screen, but its disadvantage is that holding the pen up for an
extended length of time is tiring to the user.

● Touch screen – This is a computer screen designed or modified to recognize the
location of a touch on its surface. By touching the screen, the user can make a
selection or move a cursor. The touch screen's popularity with personal-computer
users has been limited because users must hold their hands in midair to point at
the screen, which is prohibitively tiring over extended periods. Also, touch screens
do not offer high resolution—the user is not able to touch only a specific point on
the screen. Touch screens are, however, immensely popular in applications such
as information kiosks because they offer pointing control without requiring any
movable hardware and because touching the screen is intuitive.
● Card reader – This is a device that can acquire and process information stored
in electronic cards like ATM cards, ID cards, special privilege cards, credit and
debit cards, and so on. It is commonly found in commercial establishments where
transaction cards are swiped to obtain necessary information about the customer
or client.
● Voice recognition system – It may be any device and software which together,
take spoken words and translate them into digital signals for the computer. A
typical device used in speech recognition is a microphone. Speech recognition
is the ability of a computer to understand the spoken word for the purpose of
receiving commands and data input from the speaker. This method is also fairly
reliable provided the speaker's speech patterns are consistent. Speech
recognition also allows full speech-to-text conversion. Storage devices can also
be used to input data into the processing unit. An example is the transfer of data
from an external storage device to the computer, such as an external disk drive,
digital camera with stored images, or any other external storage device.
B. CENTRAL PROCESSING UNIT

The central processing unit (CPU) is the part of the computer system where
manipulation of data (symbols, numbers, and letters) occurs. It also controls other
parts of the system.

The CPU may be a single chip or a series of chips that perform arithmetic and
logical calculations and that time and control the operations of the other
elements of the system. Contemporary CPUs use semiconductor chips called
microprocessors, common in personal computers, which integrate all the
memory, logic, and control circuits for an entire CPU onto a single chip. The
development of the microprocessor was made possible through miniaturization
and integration techniques. The speed and performance of a computer’s
microprocessor help determine a computer’s processing power. These are based
on the following.
● Word length – This refers to the number of bits that the computer can process at
one time (e.g. a 64-bit chip can process 64 bits, or 8 bytes in a single cycle). The
larger the word length, the greater the computer’s speed.
● Cycle speed – This is measured in megahertz (MHz) or gigahertz (GHz). This
indicates the number of cycles per second (e.g. a 500 MHz Intel Pentium III
processor will have 500 million cycles per second).

● Data bus width – This acts as a superhighway between the CPU, primary storage,
and other devices, and determines how much data can be moved at one time
(e.g. the 8088 chip, with a 16-bit word length but only an 8-bit data bus, can
process data in 16-bit chunks, but the data can only be moved 8 bits at a time).
C. OUTPUT DEVICES
Output devices enable the user to see the results of the computer’s calculations
or data manipulations. They present data in a form the user of the computer can
understand.
The most common output devices deliver either a soft copy or a hard copy of the
data. Devices that render soft copy are the following.

● Video display unit (VDU) – This is commonly known as the monitor, which displays
characters and graphics on a television-like screen. It usually has a cathode ray
tube like an ordinary television set, but small, portable computers use liquid crystal
displays (LCDs) or electroluminescent screens.
● Audio output devices – These are responsible for the sound that the user hears
from the computer. These include the sound card and the speakers. The sound
card is a computer circuit board that allows the computer to receive sound in
digital form and reproduce it through speakers.

● External storage devices – These include floppy disks, compact disks, external
hard disks, etc.

● Interactive multimedia – This is the combination of audio, video, and text on
high-capacity compact discs. CD-I includes such features as image display and
resolution, animation, special effects, and audio. Interactive multimedia includes
the following materials.

⮚ e-books and e-newspapers

⮚ electronic classroom presentation technologies

⮚ full motion videoconferencing

⮚ imaging

⮚ graphic design tools

⮚ video and voice mail

⮚ interactive web pages

⮚ multimedia web sites (they render digitized music and videos)

There are two types of output devices that render hard copy of data.

● Printers – These are computer peripherals that put text or a computer-generated
image on paper or on another medium, such as a transparency. Printers can be
categorized in several different ways. The most common distinction is impact and
non-impact.

⮚ Impact printers physically strike the paper and are exemplified by pin dot-matrix
printers and daisy-wheel printers.

⮚ Non-impact printers include every other type of print mechanism, including
thermal, ink-jet, and laser printers.

● Computer-output microform – These are output devices that render documents
in microscopic format.

D. PRIMARY STORAGE

Primary storage refers to temporary storage of data and program instructions
during processing. It is also known as internal storage since it stores data in the
computer’s memory. There are two types.
● RAM (Random Access Memory) – These are chips that are mounted directly on
the computer’s main circuit board, or in chips mounted on peripheral cards that
plug into the computer’s main circuit board. They are so called because the
computer can directly access any randomly chosen location in the same amount
of time. These RAM chips consist of millions of switches that are sensitive to
changes in electric current. So-called static RAM chips hold their data as long as
current flows through the circuit, whereas dynamic RAM (DRAM) chips need high
or low voltages applied at regular intervals—every two milliseconds or so—if they
are not to lose their information. RAM is used for short-term storage of data or
program instructions. It is volatile – meaning its contents will be lost when the
computer’s electric supply is turned off.

● ROM (Read-Only Memory) – These chips store commands, data, or programs
that the computer needs to function correctly. RAM chips are like pieces of paper
that can be written on, erased, and used again; ROM chips are like a book, with
its words already set on each page. ROM is non-volatile; it can only be read
from, not written to. ROM chips come from manufacturers with
programs already burned in or stored. ROM is used in general-purpose computers
to store important or frequently-used programs. Like RAM, ROM chips are linked
by circuitry to the CPU.
Primary storage has three main functions.
● It stores all or part of the software program that is being executed.
● It stores the operating system programs that manage the operation of the
computer.
● It holds the data that the program is using.

E. SECONDARY STORAGE

Secondary storage stores data and instructions when they are not being used in
processing. It provides relatively long-term, non-volatile storage of data outside
the CPU and primary storage. Secondary storage is also known as external storage
because it does not use the computer’s memory to store data. External storage
devices, which may actually be located within the computer housing, are
external to the main circuit board. These devices store data as charges on a
magnetically sensitive medium such as magnetic tape or, more commonly, on
a disk coated with a fine layer of metallic particles.

The most popular secondary storage devices include the following.

● Magnetic disks – This broad category includes the following.

⮚ Floppy disk – The floppy disk in normal use stores about 800 KB or about 1.4 MB.
⮚ ZIP disk – A ZIP disk is much like a floppy disk but has a greater capacity.
⮚ Hard disk – Hard, or “fixed”, disks cannot be removed from their disk-drive
cabinets, which contain the electronics to read and write data on to the
magnetic disk surfaces. Hard disks currently used with personal computers can
store from several hundred megabytes to several gigabytes.
⮚ RAID (Redundant Array of Inexpensive Disks) – This is a disk storage technology
that boosts disk performance by packing more than 100 smaller disk drives, a
controller chip, and specialized software into a single large unit that delivers data
over multiple paths simultaneously.

● Optical disks – These disks use the same laser techniques that are used to create
audio compact discs (CDs). Under this genre are:

⮚ CD-ROM – This is an acronym for compact disc read-only memory, a form of
storage characterized by high capacity (roughly 600 MB) and the use of laser
optics rather than magnetic means for reading data.
⮚ WORM – This is an acronym for write once, read many. This is very much like the
CD-ROM. This type of optical disc can be read and reread but cannot be altered
after it has been recorded. WORMs are high-capacity storage devices. Because
they cannot be erased and re-recorded, they are suited to storing archives and
other large bodies of unchanging information.
⮚ CD-R and CD-RW – Simply put, these are blank CD-ROMs ready for data
storage. A CD-R, like a WORM, cannot be erased or re-recorded. A CD-RW can
be erased and re-recorded.
⮚ DVD - This is short for digital versatile disc. The group of DVD disc formats includes
various forms of data recording for computer purposes, including discs that
contain pre-recorded data (DVD-ROM) and discs that can be rewritten many
times (DVD-RAM). These are several times the capacity of CD-ROMs. The simple
single-layer version of the DVD holds between 3.7 and 4.38 GB (with double-layer
versions holding 15.9 GB), compared to the 650 MB of CD-ROMs. These higher
capacity discs are used particularly for computer games and in multimedia
applications.
⮚ DVD-R and DVD-RW – These are blank optical disks in DVD format ready for
data storage, just like CD-R and CD-RW.

F. COMMUNICATION DEVICES

Communication devices control the passing of information to and from
communication networks. They consist of both physical devices and software that
link the various pieces of hardware and transfer data from one physical location
to another. Computers and communications equipment can be connected in
networks for sharing voice, data, images, sound, video, or even a combination of
all of these.

The most familiar communication device in a typical computer is the modem, a
device that converts between analogue and digital signals. The modem works
by, and derives its name from, a process of modulating and demodulating.

Some modems have become specialized in terms of function. For instance, one
of the cards available for a PC is a facsimile transmission (fax) modem that allows
the PC to talk directly to fax machines and to send and receive fax messages.
High-speed modems have been developed that work at speeds of 2 megabits
per second. These are used as components in leading-edge communications
services.
Telecommunication is communication over a distance, using technology
to overcome that distance.

Chapter 1 – Part 3 Computer Software

Computer software consists of the detailed programs and instructions that control
the operations of a computer system. They cause the hardware to do work. A
software program is a series of statements or instructions to the computer. The
process of writing or coding programs is termed programming, and individuals who
specialize in this task are programmers.
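As a minimal illustration of "a series of statements or instructions", here is a three-line sketch in Python (the name and greeting are arbitrary):

    # Three statements that tell the hardware what work to do.
    name = "Ada"                   # statement 1: store a piece of data
    greeting = "Hello, " + name    # statement 2: manipulate the data
    print(greeting)                # statement 3: produce output -> Hello, Ada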

Software as a whole can be divided into a number of categories based on the
type of work done. The two primary software categories are:
● system software (operating system and language translators), which control the
workings of the computer, and
● application software, which addresses the multitude of tasks for which people
use computers.

SYSTEM SOFTWARE

System software is a set of generalized programs that manage the computer’s
resources, such as the central processor, communication links, and peripheral
devices. It coordinates the various parts of the computer and mediates between
the application software and the computer hardware.

There are three major types of system software.


● Operating system – This manages and controls the computer’s activities, such as
multiprogramming, multitasking, virtual storage, time sharing, and
multiprocessing. The most common feature of software programs nowadays is the
graphical user interface (GUI), the part of the OS that users interact with, which
uses graphic icons and input pointing devices such as the mouse to issue
commands and make selections. The following is a list of popular operating
systems.

⮚ DOS (Disk Operating System) – This is a 16-bit OS used by earlier PCs. It does not
support multitasking and limits the size of programs that can be run.
⮚ OS/2 – This is an operating system for personal computers which allows multi-
tasking. It can run MS-DOS and Windows-based applications, and can read all
MS-DOS disks. OS/2 was originally developed as a joint project between Microsoft
and IBM.
⮚ Mac OS – Developed for Macintosh computers, this was the first widely used OS
to offer a graphical user interface.
⮚ Linux – Linux is available from many different companies each adding their own
features, such as a graphical installation routine, but all relying on a basic set of
operating system functions.
⮚ UNIX – This is a multi-user operating system that incorporates
multitasking. It was originally developed for use on minicomputers. UNIX exists in
various forms and implementations and is considered a powerful operating
system that is more portable—less machine-specific—than other operating
systems because it is written in C. UNIX is available in several related forms,
including AIX, a version of UNIX adapted by IBM (to run on RISC-based
workstations), A/UX (a graphical version for the Apple Macintosh), and Mach (a
rewritten but essentially UNIX-compatible operating system for the NeXT
computer).
⮚ Microsoft Windows – This is a multitasking graphical user interface environment
that runs on MS-DOS-based computers. Windows provides a standard interface
based on drop-down menus, screen windows, and a pointing device such as a
mouse. Programs must be specially designed to take advantage of these
features. This was released in several versions.

❖ Windows 3.0 (1990)


❖ Windows 3.1 (1992)
❖ Windows NT (1993) – This is an operating system for business environments.
❖ Windows 95 (1995)
❖ Windows 98 (1998) – This featured integrated Internet capabilities.
❖ Windows CE (1999) – This OS was used in devices designed to provide
consumers with integrated cable-television, telephone, and high-speed Internet
services.
❖ Windows ME (Millennium Edition, 1999) – This is very much similar to the features
and capabilities of Windows 98, with some additional upgrades.
❖ Windows 2000 (1999) – This was released as an update for Windows NT,
intended for workstations and network servers.
❖ Windows XP (eXPerience, 2001) – This is the company's first operating system for
consumers that was not based on MS-DOS. It combines the robustness of Windows
2000 with the ease of use of Windows 98 and ME.
❖ Windows Longhorn – This OS by Microsoft is the next generation to Windows XP,
with updated and added features not present in previous versions. It has not been
released to the market and is presently under development.
● Language translators – These are programs that convert the programming
language instructions in a computer program into machine language code. The
program in the high-level language, before translation into machine language, is
called source code. There are two basic types of language translators.

⮚ Assembly languages – These programs substitute mnemonics for numeric
codes. They were popular in second-generation computers.
⮚ Compilers or interpreters – These programs translate high-level language into
machine language.

Some examples are the following. (Third generation programming languages)


❖ COBOL
❖ FORTRAN
❖ BASIC
❖ PASCAL
❖ C and C++
(Succeeding generation programming languages)
❖ PERL interpreter
❖ JAVA compiler

Fourth generation languages are programming languages that can be employed
directly by end users or less-skilled programmers to develop computer
applications more rapidly than conventional programming languages. They are
characterized by the following.
⮚ They are less procedural, or even non-procedural.
⮚ They use a programming language that is very close to human language
(natural language).
⮚ They incorporate software tools that provide immediate on-line answers to
requests for information that is pre-defined (query language).
Fourth generation languages fall into seven categories.
⮚ PC software tools (e.g. WordPerfect, IE, Access)
⮚ Query language (e.g. SQL)
⮚ Report generator (e.g. RPG III)
⮚ Graphics language (e.g. SAS Graph, Systat)
⮚ Application generator (e.g. Focus, Power Builder, MS Front Page)
⮚ Application software packages (e.g. Peoplesoft, HRMS, SAP R/3)
⮚ Very-high-level programming languages (e.g. APL, Nomads2)
● Utility software – These are programs that are used to support, enhance, and
expand existing programs in a computer system. Typical utility software programs
include
⮚ screen savers
⮚ data recovery and back-up utilities
⮚ virus-detection programs
⮚ data compression and disk defragmenter tools
⮚ device drivers
⮚ spooling programs
⮚ internet security programs

APPLICATION SOFTWARE

Application software is a program written for a specific application to perform a
function specified by the end user. It must work through the system software in
order to operate. The following are features common to all application software
programs.
● WYSIWYG (What-You-See-Is-What-You-Get) – What appears on the computer
monitor is exactly what will finally be produced.
● White space – The area where the work is done is commonly white (unless the
user changes the default color).
● Cursor – This indicates where in the document the current and next operations
will be applied.
● Panning – The user can scroll through the document to the left or to the right.
● Desktop – This is usually the entire screen area that is available for GUI.
● Desktop accessories – These are features commonly found on a conventional
office desktop like toolbars, icons, forms, drop-down lists, check boxes, etc.
● Clipart – These are artwork designed for import usually to text documents or
charts.
● Object linking and embedding (OLE) – This lets one embed an object created
using one application into another application. This is common in integrated
software packages.

The following is a categorization of application software commonly used
nowadays.
● Word processing software – These are programs that are used to enter, store,
manipulate, and print text (or sometimes text with images) to produce
documents.
Examples: MS Word
Office Writer
Star Office Writer
● Electronic spreadsheets – These are programs used to create files containing
data and formulas in tabular format. They are capable of easily recalculating
numerical data.
Examples: MS Excel
SPSS
Calc
● Database management software – These are used for creating and
manipulating lists, creating files and databases to store data, and combining
information for reports.
Examples: MS Access
Integrated Library System Software
File Maker Pro
● Presentation graphics software – These are programs that create quality
graphics presentations that can incorporate charts, sound, animation, photos,
and video clips.
Examples: MS PowerPoint
Lotus Freelance Graphics
● Integrated software packages and software suites – These come in bundles of
two or more
applications, which provide easy transfer of data between them. Integrated
software suites have
capabilities for supporting collaborative work on the Web or incorporating
information from the Web
into documents (e.g. MS Office 2000, XP, and 2003).
Examples: MS Office
Lotus Notes
● Personal information management software – These are equipped with
appointment scheduling
systems, calendars, contact lists, e-mail browsers, and other applications used for
organizing personal
data and information.
Examples: MS Outlook
Palm OS
● Electronic mail software – These programs facilitate computer-to-computer
exchange of messages.
Web browsers and PC software suites also have e-mail capabilities.
Examples: MS Outlook
Mozilla
Eudora
Pegasus
● Web browsers – These are easy-to-use software tools for accessing information
in the World Wide
web and the Internet.
Examples: MS Internet Explorer
Netscape Navigator
Opera
● Web authoring software – These are intended for the creation of high-quality
Web pages and Web sites.
They usually apply a WYSIWYG working environment, allowing the less skilled Web
developers to
come up with competitive work results.
Examples: MS FrontPage
Adobe GoLive
● Image processing software – These are intended for producing and editing high-
quality images and
photos, which can be used in other works or can be shared online or through
devices like digital
cameras.
Examples: Adobe Photoshop
CorelDRAW
● Reference suite software – These are the electronic counterpart of the printed
reference sources
known, like encyclopedias, dictionaries, atlases, and so on.
Examples: MS Encarta Reference Suite
Compton’s Interactive Encyclopedia
● Media authoring software – These are intended to be used in producing various
types of media like
video, music, animations, and so on.
Examples: Pinnacle
Cakewalk Studio
Sonic Foundry Acid Pro
Macromedia Flash
● Music notation software – These are chiefly intended for the production of
printed music. Some music
notation software applications are integrated with features that function like
media authoring software.
Examples: Finale
Voyetra MusicWrite
Cakewalk Score Writer
● Media players – These are intended to playback media files like music files (e.g.
audio tracks, mp3
files, MIDI sequences, wav files, etc.), and video files (MPEG files, avi files, etc.).
They are also used
to access the media content of optical discs (e.g. VCD, DVD, audio CD) or other
storage devices that
contain media.
Examples: Windows Media Player
Cyberlink Power DVD
Creative Media Center
● Computer aided design software – These are highly specialized software used
in creating designs like
architectural and engineering designs. They are capable of rendering three-
dimensional images.
Example: Autodesk AutoCAD

Contemporary Tools for Software Development


Here are some of the tools and approaches commonly used nowadays in the
design, creation, and
development of computer software applications.
● Object-oriented programming – This is an approach to software development
that combines data and procedures into a single object. The object combines
data and program code. It has spawned a new programming technology known
as visual programming. Visual Basic (VB) is a widely used visual programming
tool that runs on Windows platforms.
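A minimal sketch of this idea in Python follows; the class, its fields, and its methods are all invented for illustration:

    # An object bundles data (attributes) with the procedures (methods) that act on it.
    class BankAccount:
        def __init__(self, owner, balance=0.0):
            self.owner = owner        # data
            self.balance = balance    # data

        def deposit(self, amount):    # procedure bound to the data
            self.balance += amount

        def withdraw(self, amount):   # procedure bound to the data
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    acct = BankAccount("J. Cruz", 100.0)
    acct.deposit(50.0)
    acct.withdraw(30.0)
    print(acct.owner, acct.balance)   # J. Cruz 120.0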
● JAVA – This is a programming language that can deliver only the software
functionality needed for a particular task, such as a small applet downloaded
from a network. JAVA can run on any computer or operating system.
● HTML (Hypertext Markup Language) – This is the standard text formatting
language for documents
on the World Wide Web since 1989. HTML documents are text files that contain
two parts: content
that is meant to be rendered on a computer screen; and markup or tags,
encoded information that
directs the text format on the screen and is generally hidden from the user. HTML
is a subset of a
broader language called Standard Generalized Markup Language (SGML),
which is a system for
encoding and formatting documents, whether for output to a computer screen
or to be printed on
paper.
● XML (Extensible Markup Language) – This was created to structure, store, and
send electronic
information. In appearance, XML is similar to the familiar HTML used to create
pages on the World
Wide Web. The main difference between the two is that HTML is used to describe
how Web pages
should look while XML is designed to describe what the information on a Web
page actually means.
Put another way, HTML is about displaying information, while XML is about
describing information.
XML is not a replacement for HTML; it was designed for a specific purpose with an
overall intent
that it should complement HTML.
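To make the HTML/XML contrast concrete, here is a small sketch using Python's standard xml.etree module; the book record and its tags are invented for illustration:

    import xml.etree.ElementTree as ET

    # HTML says how the information should LOOK on screen:
    html_fragment = "<p><b>Introduction to Computing</b> by <i>J. Cruz</i></p>"

    # XML says what the information MEANS, using tags we define ourselves:
    xml_fragment = """
    <book>
      <title>Introduction to Computing</title>
      <author>J. Cruz</author>
      <price currency="USD">25</price>
    </book>
    """

    root = ET.fromstring(xml_fragment)
    print(root.find("title").text)             # Introduction to Computing
    print(root.find("price").get("currency"))  # USD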

Data Resources Management

Databases: Some Concepts and Terminologies

The management of data and information in computers typically involves databases. A database
is a collection of data organized for storage in a computer memory and designed for easy access by
authorized users. It serves many applications efficiently by centralizing the data and minimizing
redundant data. The data may be in the form of text, numbers, or encoded graphics.
Since their first, experimental appearance in the 1950s, databases have become so important
in industrial societies that they can be found in almost every field of information. Government, military,
and industrial databases are often highly restricted, and professional databases are usually of limited
interest. A wide range of commercial, governmental, and non-profit databases are available to the
general public and may be used by anyone who owns or has access to the equipment that they require.
The organization of data in databases involves some terminologies.
● character – consists of a single alphabetic, numeric, or other symbol
● field – a grouping of characters into a word, a group of words, or a complete number, such as a
person’s name or age
● record – a group of related fields
● file – a group of records of the same type, or records that are somehow related
● entity – a person, place, thing, or event about which information must be kept
● attribute – a piece of information describing an entity
● key field – a field in a record that uniquely identifies instances of that record so that it can be
retrieved, sorted, or updated
● query – a statement defined by the user that instructs the database management system
(DBMS) to find and retrieve the wanted record or information
● tuple – a row or record in a relational database
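A minimal sketch mapping some of these terms onto plain Python structures (the customer data is invented):

    # A record is a group of related fields; a file is a group of records of one type.
    record = {
        "customer_id": 1001,        # key field: uniquely identifies this record
        "name": "Maria Santos",     # field made up of characters
        "age": 34,                  # field holding a complete number
    }

    customer_file = [
        record,
        {"customer_id": 1002, "name": "Jose Reyes", "age": 41},
    ]

    # A query asks for records matching some condition, here by the key field.
    wanted = [r for r in customer_file if r["customer_id"] == 1002]
    print(wanted[0]["name"])        # Jose Reyes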
Databases: Management, Design, and Structure
A database management system (DBMS) is special software, a computer program that controls the
creation, maintenance, and use of the database of an organization and its end users. It has three (3)
components:
● a data definition language
● a data manipulation language
● a data dictionary
Many database management software packages make use of SQL (Structured Query Language). It is
the most prominent data manipulation language today.
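For illustration, here is a minimal sketch of SQL as a data manipulation language, using Python's built-in sqlite3 module as a stand-in DBMS; the table and rows are invented:

    import sqlite3

    conn = sqlite3.connect(":memory:")          # a throwaway, in-memory database
    conn.execute(
        "CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)"
    )
    conn.executemany(
        "INSERT INTO customer VALUES (?, ?, ?)",
        [(1001, "Maria Santos", "Manila"), (1002, "Jose Reyes", "Cebu")],
    )

    # An SQL query: ask the DBMS to find and retrieve the wanted records.
    for (name,) in conn.execute("SELECT name FROM customer WHERE city = ?", ("Cebu",)):
        print(name)                             # Jose Reyes
    conn.close()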
A typical database consists of several database objects. The following objects are the usual
components of a database. Other database management programs may use a different name for
some of the objects.
● Table
A table is the basic unit for storing a collection of data. A table’s definition consists of a list of
fields, each of which stores a discrete piece of information for a single record.
● Queries
Queries enable the user to extract a subset of data from a single table, from a group of related
tables, or from other queries, using criteria you define. By saving a query as a database object, the
query can be run at any time, using the current contents of the database. They may sometimes look
exactly like a table; the crucial difference is that each row of the query’s results may consist of fields
drawn from several tables. A query may also contain calculated fields, which display results based on
the contents of other fields.
● Forms
Forms enable users to enter, view, and edit information, generally one record at a time. They can
closely resemble paper forms such as invoices and time sheets; or they are organized for data entry
with data validation rules. A form may also include a subform that displays information from a related
table.
● Reports
Reports enable the user to present data from one or more tables or queries in a readable style and
a professional format, generally for printed output. A report may include detailed lists of specific
data, with each row consisting of a single record, or it may provide a statistical summary of a large
quantity of information. A report design can include grouping and sorting options.
● Macro
A macro is a set of one or more actions that perform a particular operation, such as opening a
form or printing a report. Macros can help to automate common tasks. For example, the user can run
a macro that prints a report when a user clicks a command button. A macro can be one macro
composed of a sequence of actions, or it can be a macro group.
● Module
A module is essentially a collection of declarations, statements, and procedures stored together as
one named unit, used to organize Visual Basic code or any other code used by the database that is
generated by other programming languages.
In designing a database, the following steps should be applied.
● Determine the purpose of your database.
The first step in designing a database is to determine its purpose and how it's to be used.
⮚ Talk to people who will use the database.
⮚ Brainstorm about the questions you and they would like the database to answer.
⮚ Sketch out the reports you'd like the database to produce. Gather the forms you currently use
to record your data.
As you determine the purpose of your database, a list of information you want from the database
will begin to emerge. From that, you can determine what facts you need to store in the database and
what subject each fact belongs to. These facts correspond to the fields (columns) in your database,
and the subjects that those facts belong to correspond to the tables.
● Determine the fields you need in the database.
Each field is a fact about a particular subject. For example, you might need to store the following
facts about customers: company name, address, city, state, and phone number. You need to create a
separate field for each of these facts.
● Determine the relationships between tables.
Now that you've divided your information into tables and identified primary key fields, you need
a way to tell the database how to bring related information back together again in meaningful ways.
To do this, you define relationships between tables.
● Refine the design.
After designing the tables, fields, and relationships needed, it's time to study the design and detect
any flaws that might remain. It is easier to change the database design at this point than it will be
after you have filled the tables with data.
● Test the design.
Enter enough sample data in your tables to test the design. To test the relationships in the
database, see whether you can create queries that return the answers you want. Create rough drafts
of forms and reports and see whether they show the data you expect. Look for unnecessary
duplications of data and eliminate them.
● Enter data and create other database objects.
If the table structures meet the design principles described above and are found to serve their
purpose effectively, it's time to add all existing data to the tables. Other database objects, such as
queries, forms, reports, macros, and modules, can then be created.
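As a rough illustration of the "determine the relationships" and "test the design" steps above, the following sketch (again in Python with sqlite3; the borrowers and loans tables are invented examples) defines a one-to-many relationship through a foreign key and then tests it with a query that brings the related information back together.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # ask SQLite to enforce the relationship

# Two tables with a one-to-many relationship: one borrower, many loans.
conn.execute("CREATE TABLE borrowers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE loans (
        id INTEGER PRIMARY KEY,
        borrower_id INTEGER REFERENCES borrowers(id),   -- the relationship
        book_title TEXT
    )
""")

# Sample data, as suggested in the "test the design" step.
conn.execute("INSERT INTO borrowers (id, name) VALUES (1, 'R. Santos')")
conn.execute("INSERT INTO loans (borrower_id, book_title) VALUES (1, 'Noli Me Tangere')")

# A test query that brings related information back together again.
rows = conn.execute("""
    SELECT borrowers.name, loans.book_title
    FROM borrowers JOIN loans ON loans.borrower_id = borrowers.id
""").fetchall()
print(rows)   # [('R. Santos', 'Noli Me Tangere')]
conn.close()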
Classifying Databases
Databases can be classified in different ways. They can be classified by their intended use and
function, or by their structure.
● By intended use and function
⮚ Operational database – This supports an organization's day-to-day operations (e.g. an HR
database, inventory database, or customer database).
⮚ Distributed database – This is a database whose copies or parts are replicated to network
servers at a variety of sites.
⮚ External database – This database is designed to be published on the World Wide Web and
accessed through the Internet, either free of charge or for a fee.
● By structure
⮚ Relational DBMS – This is a logical database model that represents all data in the database as
simple two-dimensional tables called relations. The tables look similar to flat files, but the
information in one table can easily be extracted and combined with information from others.
⮚ Hierarchical DBMS – This is an older logical database model that organizes data in a treelike
structure. A record is subdivided into segments that are connected to each other in one-to-many
parent-child relationships (a small sketch contrasting this with the relational model follows this
list).
⮚ Network DBMS – This is also an older logical database model that is useful for depicting
many-to-many relationships.
⮚ Object-oriented DBMS – This is a database for storing graphics and multimedia that also has
the capabilities of a relational DBMS for storing traditional information.
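The structural difference between the hierarchical and relational models can be seen in a small, purely illustrative Python sketch (the department and employee data below are invented): the hierarchical model nests child records under a single parent, while the relational model keeps everything in flat two-dimensional tables and links them through a shared key.

# Hierarchical model: child records are nested under their one parent,
# forming a treelike, one-to-many structure.
hierarchical = {
    "department": "Cataloguing",
    "employees": [
        {"name": "A. Cruz", "role": "Librarian"},
        {"name": "B. Reyes", "role": "Clerk"},
    ],
}

# Relational model: flat two-dimensional tables (relations) linked by a key.
departments = [{"dept_id": 1, "department": "Cataloguing"}]
employees = [
    {"dept_id": 1, "name": "A. Cruz", "role": "Librarian"},
    {"dept_id": 1, "name": "B. Reyes", "role": "Clerk"},
]

# Information from the two relational tables is combined at query time.
for emp in employees:
    dept = next(d for d in departments if d["dept_id"] == emp["dept_id"])
    print(dept["department"], emp["name"], emp["role"])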
Trends in Database Management
The main driver of trends in database programming and management is the continuous
advancement of information management practices. Listed below are some of these trends.
● Multidimensional data analysis
This is the capability to manipulate and analyze large volumes of data from multiple
perspectives. It is also known as on-line analytical processing (OLAP); a simple cross-tabulation
sketch follows this list.
● Data warehouses
A data warehouse is a database, with reporting and query tools, that stores current and historical
data extracted from various operational systems and consolidated for management reporting and
analysis.
● Data mining
This is the analysis of large pools of data to find patterns and rules that can be used to guide
decision making and predict future behavior.
● Hypermedia databases
These are common on the Web. Hypermedia is an approach to data management that organizes
data as a network of nodes linked in any pattern the user specifies. The nodes can contain text,
graphics, sound, full-motion video, or executable programs.
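As a rough idea of what multidimensional (OLAP-style) analysis looks like, the short Python sketch below summarizes the same invented transaction records along two dimensions, branch and year; real OLAP tools do this over far larger volumes and many more dimensions.

from collections import defaultdict

# Invented detail records: loans by branch and year.
transactions = [
    {"branch": "Main", "year": 2003, "loans": 120},
    {"branch": "Main", "year": 2004, "loans": 150},
    {"branch": "Annex", "year": 2003, "loans": 80},
    {"branch": "Annex", "year": 2004, "loans": 95},
]

# Roll the detail up into a branch-by-year "cube" of totals.
cube = defaultdict(int)
for t in transactions:
    cube[(t["branch"], t["year"])] += t["loans"]

# Slice the same cube from two different perspectives.
by_branch = defaultdict(int)
by_year = defaultdict(int)
for (branch, year), total in cube.items():
    by_branch[branch] += total
    by_year[year] += total
print(dict(by_branch))   # totals per branch: {'Main': 270, 'Annex': 175}
print(dict(by_year))     # totals per year: {2003: 200, 2004: 245}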
Issues Affecting Libraries and Information Centers
Despite the many developments in hardware, software, and network technologies, several IT
issues remain of great concern to libraries and information centers.
● Licensing
As in many other areas of commerce in which licenses are required, licensing also applies to
commercially distributed software. This is a major concern because licensed software can be very
expensive. Some institutions that cannot afford licensed software resort to the use of pirated
software. An alternative to expensive licensed software is the use of shareware
(software that is distributed on the basis of an honor system), or freeware (a computer program given
away free of charge). Most shareware is distributed free of charge but the author usually requests
that you pay a small fee if you like the program or use it on a regular basis. Freeware is often made
available on bulletin boards and through user groups. An independent program developer might offer
a product as freeware either for personal satisfaction or to assess its reception among interested
users.
● Piracy
Software piracy is the unauthorized copying, modification, or distribution of software for private
gain. Software programs are reengineered and redistributed by unauthorized parties for their own
benefit. They usually unlock the software by providing the passwords, serial numbers, or codes
required for installation, or by using cracking tools.
● Computer viruses
A computer virus is a program that “infects” computer files (usually other executable programs)
by inserting copies of itself in those files. This is usually done in such a manner that the copies will
be executed when the file is loaded into memory, allowing them to infect still other files, and so on.
Viruses often have damaging side effects, sometimes intentionally, sometimes not. PC users can
safeguard their files using anti-virus software packages such as Norton AntiVirus, McAfee
VirusScan, AVG Anti-Virus, and so on. These programs can detect viruses, and often repair the damage
done by them.
The increase in transactions over the Internet has greatly increased the chance of virus infection
and other attacks, so security measures have been introduced to support the growth of electronic
business. Digital
certificates can be used to validate the identity of people and organizations on the Internet, digital
signatures can prove the identity of an individual, and Secure Electronic Transaction (SET)
mechanisms have been developed to allow safe credit card transactions.
E-mail viruses remain a major threat, however—during 2000, many large organizations were
brought down by a virus attached to an e-mail message entitled "I Love You". In 2002 a new type of
virus appeared that allowed unauthorized users to access private information (such as credit card
details). This virus, known as “Bugbear”, was carried via e-mail and affected many users.
● Data theft
This is a more serious problem than software piracy. Computer system hackers (or crackers)
break the encryption of restricted databanks and databases and make unauthorized use of the
information they contain. These data may then be used for unlawful activities such as theft.
● Spam and junk mails
Spam or unsolicited e-mail is the electronic equivalent of junk mail. People usually send spam in
order to sell products and services, to draw traffic to Web sites, or to promote moneymaking
schemes. Unlike physical junk mail, spam does not stop if it is unsuccessful. When marketing
departments send junk mail they incur some expense, so they give up if they do not succeed. Spam
costs virtually nothing to send, so it persists whatever the recipient does.
Spam can easily be confused with legitimate bulk e-mail. According to Mail Abuse Prevention
System (MAPS), an electronic message is regarded as spam only if the recipient's personal identity is
irrelevant because the message is equally applicable to many others; the recipient has not granted
permission for it to be sent; and the message appears to the recipient to give a disproportionate
benefit to the sender. Spam has become a big problem over the past few years as it consumes large
amounts of the recipient’s time and Internet capacity. It is also an enduring problem as it is virtually
impossible to determine where it originates.
The first spam was sent as long ago as 1978 by a Digital Equipment Corporation sales
representative to advertise a computer equipment demonstration. The initial defense against spam
was to block mail from domains known to be spam senders, but it is relatively easy for spam
senders to move to a new domain. The most effective measure now available is to use one of the
e-mail filters on the market, which saves the user from having to manually sift through his or her
inbox (a toy illustration of how such filtering works appears after this item).
Legislation introduced in the European Union in December 2003 makes it a criminal offence to
send spam unless the recipient has agreed in advance to accept it. Similar legislation was signed into
law in the US in the same month.
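To give a flavour of how the simplest e-mail filters work, here is a deliberately naive Python sketch based on blocked domains and suspect phrases; the domain and keywords are invented examples, and commercial filters rely on far more sophisticated techniques.

# Toy rule-based filter; the blocked domain and keywords are invented examples.
BLOCKED_DOMAINS = {"example-spam.com"}
SUSPECT_KEYWORDS = {"free money", "act now", "winner"}

def looks_like_spam(sender: str, subject: str, body: str) -> bool:
    """Flag a message sent from a blocked domain or containing suspect phrases."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain in BLOCKED_DOMAINS:
        return True
    text = f"{subject} {body}".lower()
    return any(keyword in text for keyword in SUSPECT_KEYWORDS)

print(looks_like_spam("promo@example-spam.com", "You are a WINNER", "Claim now"))   # True
print(looks_like_spam("friend@example.org", "Lunch tomorrow?", "See you at noon"))  # False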
● Obsolescence of hardware and software
The very fast pace of development in computer technology means that computer devices quickly
become obsolete. Both hardware and software are subject to this problem. Software obsolescence
can partly be addressed by downloading updates from the Internet. Unused computers that have
been replaced by newer ones can be donated to charitable institutions so that they become useful
once more. Computers also drain critical resources such as electricity and paper, and they produce
unwanted electrical, chemical, and bulk-waste side effects. As a society, we should adopt a more
environmentally responsible position with respect to the use, manufacture, and disposal of computer
equipment and devices. This is known as green computing (environmentally sensible computing).
● High costs in electricity
A computer will not work without electricity. The electrical consumption of computers becomes
significant when an institution runs many computer units simultaneously. Always set computers to a
mode in which the monitor and the hard drive are automatically turned off when not in use. Green
computing is also a solution to this problem.
● Health issues
Ergonomics (or human factors engineering) is the science and technology that emphasizes the
safety, comfort, and ease of use of human-operated machines such as computers; its goal is to
produce systems that are user-friendly, safe, comfortable, and easy to use. Institutions that make use of
computers in their daily activities should consider using ergonomically correct furniture (e.g. chairs
and tables) and devices (e.g. mouse, keyboard, etc.).
Trends and Future Developments
The following are just some of the trends in the development of information technology.
● Computer system capabilities
Computers continue to become smaller, faster, more reliable, less expensive to purchase and
maintain, and more interconnected within computer networks and other electronic gadgets and
devices.
● Input technology trends
Input devices are becoming more natural and easier to use. Even programming languages are
becoming structured more like human language, making them easier and faster to learn.
● Output technology trends
Output devices are geared toward direct output methods that communicate naturally, quickly, and
clearly.
● Trends in storage media
The capacity of data storage media is continuously growing. Primary storage media are starting to
use microelectronic circuits while secondary storage media are using magnetic and optical media.
One continuing trend in computer development is microminiaturization, the effort to compress more
circuit elements into smaller and smaller chip space. Researchers are also trying to speed up circuitry
functions through the use of superconductivity, the phenomenon of decreased electrical resistance
observed in certain materials at very low temperatures. As the physical limits of silicon-chip
computer processors are being approached, scientists are exploring the potential of the next
generation of computer technology, using, for instance, devices based on deoxyribonucleic acid
(DNA). The fifth-generation computer effort to develop computers that can solve complex problems
in ways that might eventually merit the description “creative” is another trend in computer
development, the ideal goal being true artificial intelligence. One path actively being explored is parallel
processing, which uses many chips to perform several different tasks at the same time.
Parallel processing may eventually be able to duplicate to some degree the complex feedback,
approximating, and assessing functions of human thought. One important parallel processing
approach is the neural network, which mimics the architecture of the nervous system. Another
ongoing trend is the increase in computer networking, which now employs the worldwide data
communications system of satellite and cable links to connect computers globally. There is also a
great deal of research into the possibility of “optical” computers—hardware that processes not
pulses of electricity but much faster pulses of light.