The abacus, one of the earliest calculating machines, was invented over 2,000 years ago
by Asian merchants to speed up calculation. It is a simple hand device for
recording numbers or performing simple calculations.
Calculating machines were first
introduced in the 17th century. In 1642, the first
calculating machine that can perform addition
and subtraction, a precursor of the digital
computer, was devised by the French scientist,
mathematician, and philosopher Blaise Pascal.
This device employed a series of ten-toothed
wheels, each tooth representing a digit from 0 to
9. The wheels were connected so that numbers
could be added to each other by advancing the
wheels by a correct number of teeth. In the 1670s
the German philosopher and mathematician
Gottfried Wilhelm Leibniz improved on this
machine by devising one that could also multiply.
The next generation of calculating devices, the arithmometer, was invented in
1820 by Charles Xavier Thomas of France. It combined the features of the
Leibniz calculator with newer engineering techniques.
The first mechanical calculator produced in the US was developed in 1875 by
Frank S. Baldwin. Improving on the Leibniz design, it was much smaller and
lighter.
● IBM tossed its hat into the personal computer ring with its release of the IBM
personal computer in 1981. By the end of 1982, 835,000 units had been sold.
When software vendors began to orient their products to the IBM PC, many
companies began offering IBM PC-compatibles or clones. Today, the IBM
PC and its clones have become a powerful standard in the microcomputer
industry.
● It was in 1985 that Microsoft adopted the GUI in its Windows operating
system for IBM PC compatible computers. Windows did not enjoy
widespread acceptance until 1990, with the release of Windows 3.0. It gave
a huge boost to the software industry because larger, more complex
programs could now be run on IBM-PC compatibles. Subsequent releases
made the PC even easier to use, fueling the PC explosion in the 1990s.
● By 1993, IBM-PC compatible PCs, which started out using Intel microprocessor
chips, had moved through a succession of even more powerful chips. But not until the Intel
Pentium and its successors did PCs do much with multimedia (the
integration of motion, video, animation, graphics, sound, and so on). The
emergence of the high-powered Intel Pentium processors and their ability
to handle multimedia applications changed the way people view and use
PCs.
● It was also in this year that millions of people began to tune into the
Internet for news. The World Wide Web (WWW), one of several internet-
based applications, came of age as Web traffic grew 341,634%. The Web is
unique in that it enables Web pages to be linked across the Internet. A
number of Internet browsers were introduced (e.g. Mosaic and Netscape
Navigator, which were developed by Marc Andreessen, and Internet
Explorer by Microsoft Corporation). These browsers enabled users to
navigate the World Wide Web with ease. Today, WWW is the foundation for
most Internet communications and services. The World Wide Web was
actually created in 1991 by Tim Berners-Lee, an engineer at CERN in Geneva,
Switzerland.
● The year 1996 marked the 50th year of computer history. The US Postal
service issued stamps that commemorated the 50th anniversary of ENIAC,
the first full-scale computer and the 50 years of computer technology that
followed. It was during this year that the handheld computer was
introduced, signaling to the world that tremendous computing power could
be placed in the palm of your hand. Nowadays, millions of people
rely on handhelds for a variety of personal information management
applications, including e-mail.
● In the year 1999, the world was threatened by the Y2K problem, also known
as the millennium bug. It may have been one of the biggest challenges ever
to confront the businesses of the world. For most of the 20th century,
information systems had only two digits to represent the year (e.g. 99 for
1999). When the 20th century ended and a new one began, non-compliant
computers would interpret the date 01-01-00, intended as January 1, 2000,
as January 1, 1900. Y2K heightened
management’s awareness of how critical information technology is to the
operation of any organization.
● Jack Kilby’s first IC contained a single transistor. Tens of thousands of
engineers around the world have built on his invention, such that each year,
our society is the beneficiary of smaller, more powerful, cheaper chips.
● One continuing trend in computer development is microminiaturization,
the effort to compress more circuit elements into smaller and smaller chip
space. In 1999, scientists developed a circuit the size of a single layer of
molecules, and in 2000 IBM announced that it had developed new
technology to produce computer chips that operate five times faster than
the most advanced models to date.
● Researchers are also trying to speed up circuitry functions through the use
of superconductivity, the phenomenon of decreased electrical resistance
observed in certain materials at very low temperatures.
● Whether we are moving into a fifth generation of computing is a subject of
debate since the concept of generations may no longer fit the continual,
rapid changes occurring in computer hardware, software, data, and
networking technologies. But in any case, we can be sure that progress in
computing will continue to accelerate and that the development of
Internet-based technologies and applications will be one of the major
forces driving computing in the 21st century.
Chapter 1 – Part 2 Computer Hardware
Defining Computer Hardware
Computer hardware is the equipment and devices that make up a computer
system, as opposed to the programs that are used on it. A digital computer is not
a single machine; rather, it is a system composed of distinct elements:
● input devices
● central processing unit
● output devices
● storage devices
● communication devices
In order for information to flow through a computer system and be in a form
suitable for processing, all symbols, pictures, or words must be reduced to a string
of binary digits. A binary digit is called a bit. It represents the smallest unit of data
in a computer system. It can only have one of two states (e.g. true or false, on or
off), represented by 0 or 1. A byte is a string of eight (8) bits, used to store one
number or character in a computer system.
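The bit-and-byte relationship can be seen directly in a few lines of Python; the character shown is just an illustration:

```python
# A byte is eight bits, each 0 or 1. Python can show the bit
# pattern that stores a single character ("A" as an example).
ch = "A"
code = ord(ch)               # character -> number (65 for "A")
bits = format(code, "08b")   # number -> 8-bit binary string
print(ch, code, bits)        # A 65 01000001

# Eight bits give 2**8 = 256 distinct values per byte.
print(2 ** 8)                # 256
```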
A. INPUT DEVICES
Input devices enable a computer user to enter data, commands, and programs
into the CPU. Included in this category are the following.
● Keyboard – This is commonly known as the QWERTY keyboard, named after the
six leftmost characters in the top row of alphabetic characters on most
keyboards—the standard layout of most typewriters and computer keyboards. An
alternative layout, the Dvorak keyboard, is considered more efficient, but the
QWERTY keyboard has the advantage of familiarity. This is the most common input
device. Information typed at the typewriter-like keyboard is translated by the
computer into recognizable patterns.
● Mouse – This was invented by Douglas Engelbart and was popularized by its
inclusion as standard equipment with the Apple Macintosh. It helps a user
navigate through a graphical computer interface. It is generally mapped so that
an on-screen cursor may be controlled by moving the mouse across a flat surface.
There are many variations on mouse design, but they all work in a similar manner.
Some mouse units feature a scroller, which provides a better way of scrolling
through documents vertically and/or horizontally. The newer optomechanical
mouse eliminates the need for many of the wear-related repairs and
maintenance necessary with purely mechanical mice.
● Joystick – This performs the same function as the mouse. It is favored for
computer games. A joystick usually has a square or rectangular plastic base to
which is attached a vertical stem. Control buttons are located on the base and
sometimes on top of the stem. The stem can be moved in all directions to control
the movement of an object on the screen. The buttons activate various software
features, generally producing on-screen events. A joystick is usually a relative
pointing device, moving an object on the screen when the stem is moved from
the centre and stopping the movement when the stem is released. In industrial
control applications, the joystick can also be an absolute pointing device, with
each position of the stem mapped to a specific on-screen location.
● Trackball – This can be roughly described as a mouse on its back. It consists of
a ball resting on two rollers at right angles to each other, which translate the ball's
motion into vertical and horizontal movement on the screen. It typically has one
or more buttons to initiate other actions. The only functional difference between
a mechanical mouse and a trackball is in how the ball is moved. With a trackball,
the housing is stationary, and the ball is rolled with the hand. A trackball is useful
for fine work because the user can exert fingertip control. Another major
advantage of a trackball is that it takes up little desktop surface. This replaces the
mouse on some laptop computers.
● Graphics tablet – This pointing device is also called a digitizing tablet. It is a flat
plastic rectangle with subsurface electronics, used in conjunction with a pointing
device in many engineering and design applications as well as in illustration work.
When a pointing device, like a puck (or even the finger), is moved on the surface
of the tablet, the location of the device is translated to a specific on-screen cursor
position.
CENTRAL PROCESSING UNIT
The central processing unit (CPU) is the part of the computer system where
manipulation of data (symbols, numbers, and letters) occurs. It also controls other
parts of the system.
The CPU may be a single chip or a series of chips that perform arithmetic and
logical calculations and that time and control the operations of the other
elements of the system. Contemporary CPUs use semiconductor chips called
microprocessors, common in personal computers, which integrate all the
memory, logic, and control circuits for an entire CPU onto a single chip. The
development of the microprocessor was made possible through miniaturization
and integration techniques. The speed and performance of a computer’s
microprocessor help determine a computer’s processing power. These are based
on the following.
● Word length – This refers to the number of bits that the computer can process at
one time (e.g. a 64-bit chip can process 64 bits, or 8 bytes in a single cycle). The
larger the word length, the greater the computer’s speed.
● Cycle speed – This is measured in megahertz (MHz) or gigahertz (GHz). This
indicates the number of cycles per second (e.g. a 500 MHz Intel Pentium III
processor will have 500 million cycles per second).
● Data bus width – This acts as a superhighway between the CPU, primary storage,
and other devices, and determines how much data can be moved at one time
(e.g. the 8088 chip, with a 16-bit word length but only an 8-bit data bus width,
could process data in 16-bit chunks but move it only 8 bits at a time).
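The three factors above lend themselves to some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not benchmarks; a minimal Python sketch:

```python
# Illustrative, assumed figures -- not benchmarks.
cycle_speed_hz = 500_000_000   # 500 MHz: 500 million cycles per second
word_length_bits = 64          # bits processed in a single cycle
bus_width_bits = 8             # 8088-style bus: 8 bits moved per transfer

# At 500 MHz, each cycle lasts 2 nanoseconds.
print(1 / cycle_speed_hz)      # 2e-09

# A 64-bit word length means 8 bytes handled per cycle.
print(word_length_bits // 8)   # 8

# A 16-bit word with an 8-bit bus needs two transfers per word,
# which is why the 8088 computed faster than it could move data.
print(16 // bus_width_bits)    # 2
```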
OUTPUT DEVICES
Output devices enable the user to see the results of the computer’s calculations
or data manipulations. They present data in a form the user of the computer can
understand.
Output devices can deliver either a soft copy or a hard copy of the data.
Devices that render soft copy are the following.
● Video display unit (VDU) – This is commonly known as the monitor, which displays
characters and graphics on a television-like screen. It usually has a cathode ray
tube like an ordinary television set, but small, portable computers use liquid crystal
displays (LCDs) or electroluminescent screens.
● Audio output devices – These are responsible for the sound that the user hears
from the computer. These include the sound card and the speakers. The sound
card is a computer circuit board that allows the computer to receive sound in
digital form and reproduce it through speakers.
● External storage devices – These include floppy disks, compact disks, external
hard disks, etc.
There are two main output devices known to render hard copy of data.
⮚ Impact printers physically strike the paper and are exemplified by pin dot-matrix
printers and daisy-wheel printers.
PRIMARY STORAGE
SECONDARY STORAGE
Secondary storage stores data and instructions when they are not being used in
processing. It provides relatively long-term, non-volatile storage of data outside
the CPU and primary storage. Secondary storage is also known as external storage
because it does not use the computer memory to store data. External storage
devices, which may actually be located within the computer housing, are
external to the main circuit board. These devices store data as charges on a
magnetically sensitive medium such as a magnetic tape or, more commonly, on
a disk coated with a fine layer of metallic particles.
⮚ Floppy disk – The floppy disk in normal use stores about 800 KB or about 1.4 MB.
⮚ ZIP disk – A ZIP disk is much like a floppy disk but has a greater capacity.
⮚ Hard disk – Hard, or “fixed”, disks cannot be removed from their disk-drive
cabinets, which contain the electronics to read and write data on to the
magnetic disk surfaces. Hard disks currently used with personal computers can
store from several hundred megabytes to several gigabytes.
⮚ RAID (Redundant Array of Inexpensive Disks) – This is a disk storage technology
that boosts disk performance by packing more than 100 smaller disk drives with a
control chip and specialized software into a single large unit to deliver data over
multiple paths simultaneously.
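The multiple-path idea behind RAID can be sketched as round-robin striping of data blocks across several notional drives. This is a toy illustration, not a real RAID implementation:

```python
# Toy sketch of striping: data blocks are spread round-robin
# across several notional "drives" so transfers can proceed over
# multiple paths at once.

def stripe(blocks, n_drives):
    """Distribute blocks round-robin across n_drives."""
    drives = [[] for _ in range(n_drives)]
    for i, block in enumerate(blocks):
        drives[i % n_drives].append(block)
    return drives

def read_back(drives):
    """Reassemble the original block order from the stripes."""
    out = []
    depth = max(len(d) for d in drives)
    for row in range(depth):
        for d in drives:
            if row < len(d):
                out.append(d[row])
    return out

data = ["b0", "b1", "b2", "b3", "b4"]
striped = stripe(data, 3)
print(striped)                      # [['b0', 'b3'], ['b1', 'b4'], ['b2']]
print(read_back(striped) == data)   # True
```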
● Optical disks – These disks use the same laser techniques that are used to create
audio compact discs (CDs).
COMMUNICATION DEVICES
Some modems have become specialized in terms of function. For instance, one
of the cards available for a PC is a facsimile transmission (fax) modem that allows
the PC to talk directly to fax machines and to send and receive fax messages.
High-speed modems have been developed that work at speeds of 2 megabits
per second. These are used as components in leading-edge communications
services.
Telecommunication is communication over a distance, using technology to
overcome that distance.
Defining Computer Software
Computer software is the detailed programs and instructions that control the
operation of a computer system. Software makes the hardware do work. A
software program is a series of statements or instructions to the computer. The
process of writing programs is termed programming (or coding), and individuals who
specialize in this task are programmers.
SYSTEM SOFTWARE
⮚ DOS (Disk Operating System) – This is a 16-bit OS used by earlier PCs. It does not
support multitasking and limits the size of programs that can be run.
⮚ OS/2 – This is an operating system for personal computers which allows multi-
tasking. It can run MS-DOS and Windows-based applications, and can read all
MS-DOS disks. OS/2 was originally developed as a joint project between Microsoft
and IBM.
⮚ Mac OS – This was the first widely used OS to feature a graphical user
interface; it was developed for Macintosh computers.
⮚ Linux – Linux is available from many different companies each adding their own
features, such as a graphical installation routine, but all relying on a basic set of
operating system functions.
⮚ UNIX – This is a multi-user operating system that incorporates
multitasking. It was originally developed for use on minicomputers. UNIX exists in
various forms and implementations and is considered a powerful operating
system that is more portable—less machine-specific—than other operating
systems because it is written in C. UNIX is available in several related forms,
including AIX, a version of UNIX adapted by IBM (to run on RISC-based
workstations), A/UX (a graphical version for the Apple Macintosh), and Mach (a
rewritten but essentially UNIX-compatible operating system for the NeXT
computer).
⮚ Microsoft Windows – This is a multitasking graphical user interface environment
that runs on MS-DOS-based computers. Windows provides a standard interface
based on drop-down menus, screen windows, and a pointing device such as a
mouse. Programs must be specially designed to take advantage of these
features. This was released in several versions.
APPLICATION SOFTWARE
The management of data and information in computers typically involves databases. A database
is a collection of data organized for storage in a computer memory and designed for easy access by
authorized users. It serves many applications efficiently by centralizing the data and minimizing
redundant data. The data may be in the form of text, numbers, or encoded graphics.
Since their first, experimental appearance in the 1950s, databases have become so important
in industrial societies that they can be found in almost every field of information. Government, military,
and industrial databases are often highly restricted, and professional databases are usually of limited
interest. A wide range of commercial, governmental, and non-profit databases are available to the
general public and may be used by anyone who owns or has access to the equipment that they require.
The organization of data in databases involves some terminologies.
● character – consists of a single alphabetic, numeric, or other symbol
● field – a grouping of characters into a word, a grouping of words, or a complete number; such as a
person’s name or age
● record – a group of related fields
● file – a group of records of the same type, or records that are somehow related
● entity – a person, place, thing, or event about which information must be kept
● attribute – a piece of information describing an entity
● key field – a field in a record that uniquely identifies instances of that record so that it can be
retrieved, sorted, or updated
● query – a statement defined by the user, which instructs the database management system
(DBMS) to find and retrieve the wanted record or information
● tuple – a row or record in a relational database
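The terms above can be modeled with plain Python structures; the field names and values here are invented for illustration:

```python
# A record is a group of related fields; field names here are
# invented for illustration.
record = {"id": 101, "name": "A. Cruz", "age": 34}

# A file is a group of records of the same type.
file_ = [
    {"id": 101, "name": "A. Cruz", "age": 34},
    {"id": 102, "name": "B. Reyes", "age": 28},
]

# The key field ("id") uniquely identifies each record, so a
# single record can be retrieved, sorted, or updated by key.
by_key = {rec["id"]: rec for rec in file_}
print(by_key[102]["name"])   # B. Reyes
```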
Databases: Management, Design, and Structure
A database management system (DBMS) is special software, or a computer program, that controls the
creation, maintenance, and use of a database of an organization and its end users. It has three (3)
components:
● a data definition language
● a data manipulation language
● a data dictionary
Many database management software packages make use of SQL (Structured Query Language). It is
the most prominent data manipulation language today.
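A small taste of SQL as a data manipulation language, using Python's built-in sqlite3 module; the table and column names are invented for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = con.cursor()

# Data definition: create a table.
cur.execute("CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, copies INTEGER)")

# Data manipulation: insert rows, then query them.
cur.execute("INSERT INTO book VALUES (1, 'SQL Basics', 3)")
cur.execute("INSERT INTO book VALUES (2, 'DB Design', 1)")
cur.execute("SELECT title FROM book WHERE copies > 1")
rows = cur.fetchall()
print(rows)   # [('SQL Basics',)]
con.close()
```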
A typical database consists of several database objects. The following objects are the usual
components of a database. Other database management programs may use a different name for
some of the objects.
● Table
A table is the basic unit for storing a collection of data. A table’s definition consists of a list of
fields, each of which stores a discrete piece of information for a single record.
● Queries
Queries enable the user to extract a subset of data from a single table, from a group of related
tables, or from other queries, using criteria you define. By saving a query as a database object, the
query can be run at any time, using the current contents of the database. They may sometimes look
exactly like a table; the crucial difference is that each row of the query’s results may consist of fields
drawn from several tables. A query may also contain calculated fields, which display results based on
the contents of other fields.
● Forms
Forms enable users to enter, view, and edit information, generally one record at a time. They can
closely resemble paper forms such as invoices and time sheets, or they can be organized for data entry
with data validation rules. A form may also include a subform that displays information from a related
table.
● Reports
Reports enable the user to present data from one or more tables or queries in a readable style and
a professional format, generally for printed output. A report may include detailed lists of specific
data, with each row consisting of a single record, or it may provide a statistical summary of a large
quantity of information. A report design can include grouping and sorting options.
● Macro
A macro is a set of one or more actions that perform a particular operation, such as opening a
form or printing a report. Macros can help to automate common tasks. For example, the user can run
a macro that prints a report when a user clicks a command button. A macro can be one macro
composed of a sequence of actions, or it can be a macro group.
● Module
A module is essentially a collection of declarations, statements, and procedures stored together as
one named unit, used to organize Visual Basic code or any other code used by the database that is
generated by other programming languages.

In designing a database, the following steps should be applied.
● Determine the purpose of your database.
The first step in designing a database is to determine its purpose and how it's to be used.
⮚ Talk to people who will use the database.
⮚ Brainstorm about the questions you and they would like the database to answer.
⮚ Sketch out the reports you'd like the database to produce. Gather the forms you currently use
to record your data.
As you determine the purpose of your database, a list of information you want from the database
will begin to emerge. From that, you can determine what facts you need to store in the database and
what subject each fact belongs to. These facts correspond to the fields (columns) in your database,
and the subjects that those facts belong to correspond to the tables.
● Determine the fields you need in the database.
Each field is a fact about a particular subject. For example, you might need to store the following
facts about customers: company name, address, city, state, and phone number. You need to create a
separate field for each of these facts.
● Determine the relationships between tables.
Now that you've divided your information into tables and identified primary key fields, you need
a way to tell the database how to bring related information back together again in meaningful ways.
To do this, you define relationships between tables.
● Refine the design.
After designing the tables, fields, and relationships needed, it's time to study the design and detect
any flaws that might remain. It is easier to change the database design at this point than it will be
after you have filled the tables with data.
● Test the design.
Enter enough sample data in your tables so as to test the design. To test the relationships in the
database, see if you can create queries to get the answers you want. Create rough drafts of forms
and reports and see if they show the data expected. Look for unnecessary duplications of data and
eliminate them.
● Enter data and create other database objects.
If the table structures meet the design principles described and are determined to serve their purpose
effectively, then it's time to go ahead and add all existing data to the tables. Other database objects
can already be created at this point, such as queries, forms, reports, macros, modules, and other
available objects.
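The "determine the relationships between tables" step above can be sketched with two related tables and a key field, again using sqlite3 with invented table and field names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
cur.execute("INSERT INTO customer VALUES (1, 'A. Cruz')")
cur.execute("INSERT INTO orders VALUES (10, 1, 'printer ink')")

# The relationship: orders.customer_id refers to customer.id,
# so a JOIN brings the related information back together.
cur.execute(
    "SELECT customer.name, orders.item "
    "FROM customer JOIN orders ON customer.id = orders.customer_id"
)
rows = cur.fetchall()
print(rows)   # [('A. Cruz', 'printer ink')]
con.close()
```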
Classifying Databases
Databases can be classified in different ways. They can be classified by their intended use and
function, or by their structure.
● By intended use and function
⮚ Operational databases (e.g. HR database, inventory database, customer database)
⮚ Distributed database – This replicates copies or parts of a database to network servers at a
variety of sites.
⮚ External database – This database is designed to be published on the World Wide Web and
accessed through the Internet, either free of charge or for a fee.
● By structure
⮚ Relational DBMS – This is a type of logical database model that represents all data in the
database as simple two-dimensional tables called relations. The tables appear similar to flat files,
but the information in one file can be easily extracted and combined.
⮚ Hierarchical DBMS – This is an older logical database model that organizes data in a treelike
structure. A record is subdivided into segments that are connected to each other in one-to-many
parent-child relationships.
⮚ Network DBMS – This is also an older logical database model that is useful for depicting
many-to-many relationships.
⮚ Object-oriented DBMS – This is a database for storing graphics and multimedia that also has the
capabilities of a relational DBMS for storing traditional information.
Trends in Database Management
A notable factor behind the trends in database programming and management is the continuous
advancement of information management practices. Listed below are some of these trends.
● Multidimensional data analysis
This is the capability for manipulating and analyzing large volumes of data from multiple
perspectives. It is also known as on-line analytical processing (OLAP).
● Data warehouses
A data warehouse is a database, with reporting and query tools, that stores current and historical
data extracted from various operational systems and consolidated for management reporting
analysis.
● Data mining
This is the analysis of large pools of data to find patterns and rules that can be used to guide
decision making and predict future behavior.
● Hypermedia databases
These are common on the Web. Hypermedia is an approach to data management that
organizes data as a network of nodes linked in any pattern the user specifies. The nodes can contain
text, graphics, sound, full-motion video, or executable programs.
Issues Affecting Libraries and Information Centers
Despite the many developments in hardware, software, and network technologies, several issues in
IT are of great concern to libraries and information centers.
● Licensing
As in many areas of commerce in which licenses are required, licensing also applies to
commercially distributed software. This is a major concern, since the cost of licensed software is
very high nowadays. Some institutions that cannot afford to purchase licensed software resort to
the use of pirated software. An alternative to expensive licensed software is the use of shareware
(software that is distributed on the basis of an honor system), or freeware (a computer program given
away free of charge). Most shareware is distributed free of charge but the author usually requests
that you pay a small fee if you like the program or use it on a regular basis. Freeware is often made
available on bulletin boards and through user groups. An independent program developer might offer
a product as freeware either for personal satisfaction or to assess its reception among interested
users.
● Piracy
Software piracy is theft of software for private gain. Software programs are re-engineered and
redistributed by unauthorized parties for their own gain. They usually unlock the software by
providing passwords, serial numbers, or codes required for installation. There are also times at which
they unlock the software by using cracking program tools.
● Computer viruses
A computer virus is a program that “infects” computer files (usually other executable programs)
by inserting copies of itself in those files. This is usually done in such a manner that the copies will
be executed when the file is loaded into memory, allowing them to infect still other files, and so on.
Viruses often have damaging side effects, sometimes intentionally, sometimes not. PC users can
safeguard their files using anti-virus software packages such as Norton AntiVirus, McAfee
VirusScan, AVG Anti-Virus, and so on. These programs can detect viruses, and often repair the damage
done by them.
The increase in transactions over the Internet has greatly increased the chance of virus infection,
so anti-virus measures have been introduced to promote the growth of electronic business. Digital
certificates can be used to validate the identity of people and organizations on the Internet, digital
signatures can prove the identity of an individual, and Secure Electronic Transaction (SET)
mechanisms have been developed to allow safe credit card transactions.
E-mail viruses remain a major threat, however—during 2000, many large organizations were
brought down by a virus attached to an e-mail message entitled "I Love You". In 2002 a new type of
virus appeared that allowed unauthorized users to access private information (such as credit card
details). This virus, known as “Bugbear”, was carried via e-mail and affected many users.
● Data theft
This is a more serious problem than software piracy. Computer system hackers (or crackers)
break the encryption of restricted databanks and databases and make unauthorized use of the
information/data contained in them. The use of these data may be intended for unlawful activities
like theft.
● Spam and junk mails
Spam or unsolicited e-mail is the electronic equivalent of junk mail. People usually send spam in
order to sell products and services, to draw traffic to Web sites, or to promote moneymaking
schemes. Unlike physical junk mail, spam does not stop if it is unsuccessful. When marketing
departments send junk mail they incur some expense, so they give up if they do not succeed. Spam costs
virtually nothing to send and so it persists, whatever the recipient does.
Spam can easily be confused with legitimate bulk e-mail. According to Mail Abuse Prevention
System (MAPS), an electronic message is regarded as spam only if the recipient's personal identity is
irrelevant because the message is equally applicable to many others; the recipient has not granted
permission for it to be sent; and the message appears to the recipient to give a disproportionate
benefit to the sender. Spam has become a big problem over the past few years as it consumes large
amounts of the recipient’s time and Internet capacity. It is also an enduring problem as it is virtually
impossible to determine where it originates.
The first spam was sent as long ago as 1978 by a Digital Equipment Corporation sales
representative to advertise a computer equipment demonstration. The initial defense against spam
was to block mail from domains known to be spam senders, but it is relatively easy for spam
senders to send from a new domain. The most effective measure now available is to use one of the
e-mail filters on the market that save the user from having to manually sift through his or her inbox.
Legislation introduced in the European Union in December 2003 makes it a criminal offence to
send spam unless the recipient has agreed in advance to accept it. Similar legislation was signed into
law in the US in the same month.
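The keyword sifting that simple e-mail filters perform can be sketched in a few lines; the keywords are invented examples, and real filters use far more sophisticated techniques:

```python
# Invented example keywords; real filters are far more elaborate.
SPAM_KEYWORDS = {"free money", "act now", "winner"}

def looks_like_spam(subject):
    """Flag a message whose subject contains a spam keyword."""
    s = subject.lower()
    return any(kw in s for kw in SPAM_KEYWORDS)

inbox = ["Meeting agenda", "You are a WINNER!", "Act now: free money"]
kept = [msg for msg in inbox if not looks_like_spam(msg)]
print(kept)   # ['Meeting agenda']
```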
● Obsolescence of hardware and software
The very fast developments in computer technology mean the very quick obsolescence of
computer devices. Both hardware and software are subject to this problem. This can be resolved by
downloading software updates from the Internet. Unused computers, left behind when newer
ones are purchased, can be donated to charitable institutions so that they become useful once
more. Computers drain critical resources such as electricity and paper. They also produce unwanted
electrical, chemical, and bulk-waste side effects. As a society, we should adopt a more
environmentally sensible position with respect to the use, manufacture, and disposal of computer equipment and
devices. This is known as green computing (environmentally sensible computing).
● High costs in electricity
A computer will never work without electricity. The electrical consumption of computers
becomes a big deal if an institution has many computer units running simultaneously. Always
set the computer to a mode in which the monitor and the hard drive are automatically turned off when
not in use. Green computing is also a solution to this problem.
● Health issues
Ergonomics (or human factors engineering) is the science and technology emphasizing the safety,
comfort, and ease of use of human-operated machines such as computers. Its goal is to produce
systems that are user-friendly, safe, comfortable, and easy to use. Institutions that make use of
computers in their daily activities should consider using ergonomically correct furniture (e.g. chairs
and tables) and devices (e.g. mouse, keyboard, etc.).
Trends and Future Developments
The following are just some of the trends in the development of information technology.
● Computer system capabilities
Computers continue to become smaller, faster, more reliable, less expensive to purchase and
maintain, and more interconnected within computer networks and other electronic gadgets and
devices.
● Input technology trends
Input devices are becoming more natural and easier to use. Even programming languages are
becoming structured more like human language, making them easier and faster to learn.
● Output technology trends
Output devices are geared toward direct output methods that communicate naturally, quickly, and
clearly.
● Trends in storage media
The capacity of data storage media is continuously growing. Primary storage media use
microelectronic circuits, while secondary storage media use magnetic and optical media.

One continuing trend in computer development is microminiaturization, the effort to compress more
circuit elements into smaller and smaller chip space. Researchers are also trying to speed up circuitry
functions through the use of superconductivity, the phenomenon of decreased electrical resistance
observed in certain materials at very low temperatures. As the physical limits of silicon-chip
computer processors are being approached, scientists are exploring the potential of the next
generation of computer technology, using, for instance, devices based on deoxyribonucleic acid
(DNA). The fifth-generation computer effort to develop computers that can solve complex problems
in ways that might eventually merit the description “creative” is another trend in computer
development, the ideal goal being true artificial intelligence. One path actively being explored is parallel
processing computing, which uses many chips to perform several different tasks at the same time.
Parallel processing may eventually be able to duplicate to some degree the complex feedback,
approximating, and assessing functions of human thought. One important parallel processing
approach is the neural network, which mimics the architecture of the nervous system. Another
ongoing trend is the increase in computer networking, which now employs the worldwide data
communications system of satellite and cable links to connect computers globally. There is also a
great deal of research into the possibility of “optical” computers—hardware that processes not
pulses of electricity but much faster pulses of light.