Benlot - Mary Joy - Module 1


Media Environments
Let’s Check It Out!
~ Information technology is the study and use of systems for storing,
processing, and communicating information, including computers,
networks, and related processes.

Let’s Ponder!
1. To what extent has technology changed the way people
communicate?
Ans:
Technology has changed the way people communicate by making
communication easier, quicker, and more efficient.
2. What would life be without modern technology?
Ans:
Without modern technology, life would be much simpler; every day
would pass without any gadgets or technology involved.
3. Do technologies have more pros or cons? Why?
Ans:
Technologies have more pros than cons, because they help us in many
ways; however, they can have a big negative impact on us if we do not
know how to manage them.
4. Is the Internet bringing people closer to each other or separating
them?
Ans:
The internet brings people closer to each other, but it can also
drive them apart when online interaction replaces face-to-face contact.
5. Technology and work: what are the advantages and disadvantages of
technology in workplaces?
Ans:
Advantages of technology in the workplace
● The ability to better serve customers
● Higher employee productivity and satisfaction
● It saves time
Disadvantages of technology in the workplace
● It affects relationships at work
● It poses risks
● It can encourage laziness
Let’s test you!
Activities / Assessment:
1. Define Information Communication Technology.
Ans:
Information and Communication Technology (ICT) is a system used to
control, manage, process, and create information through
telecommunications technology and computers.
2. Write a short essay about the elements of the computer system.
Ans:
The Elements of the Computer System

A computer is an electronic device used for storing and
processing data. Hard work can be done in less time with the help of
computers, and we rely on them for many everyday tasks. Computers
play a big role in modern life.
A computer has several important components, and its
performance depends on the processing speed of its RAM and
processor, which are its main components. The computers and devices
around us depend on RAM (Random Access Memory), ROM (Read Only
Memory), a display (to show us information and data), a processor (to
run all programs), and an OS (Operating System, the interface between
us and the computer).
Computers are very important for making our work easier, but
they can have a negative impact on people who spend too much time on
them. We should use computers in moderation and for good purposes.
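
To make this concrete, here is a minimal Python sketch, using only the standard library, that reports a few of the components described above on whatever machine it runs on; the exact values printed will differ from system to system.

import os
import platform

# Inspect a few of the computer-system elements described above.
# The values depend on the machine this script runs on.
print("Operating System:", platform.system(), platform.release())  # the OS layer
print("Processor:", platform.processor() or "unknown")             # the CPU
print("CPU cores available:", os.cpu_count())                      # processing capacity
print("Machine architecture:", platform.machine())                 # e.g. x86_64, arm64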

3. Write the Classification of Computers.
Ans:
The classifications of computers are:
1. Analog computer
~ An analog computer is a form of computer that uses continuous
physical phenomena such as electrical, mechanical or hydraulic
quantities to model the problem being solved.
2. Digital computer
~ A computer that performs calculations and logical operations with
quantities represented as digits, usually in the binary number system
(see the sketch after this list).
3. Hybrid computer (Analog + Digital)
~ A computer that combines analog and digital features and is capable
of inputting and outputting both digital and analog signals.
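
As a small illustration of the binary representation mentioned under the digital computer, here is a minimal Python sketch showing how an ordinary number can be written and manipulated as binary digits; it is illustrative only, not a model of any particular machine.

# A digital computer represents quantities as binary digits (bits).
n = 42
print(bin(n))            # '0b101010': 42 written in base 2
print(int("101010", 2))  # 42: converting the binary digits back
# Logical operations work directly on those bits:
print(n & 0b1111)        # 10: keep only the low four bits
print(n | 0b1)           # 43: set the lowest bit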

4. List down the Capabilities and Limitations of Computers.
Ans:
Capabilities of computers (see the sketch after these lists)
1. Speed
2. Accuracy
3. Reliability
4. Adaptability
5. Storage
Limitations of computers
1. Lack of common sense
2. Zero IQ
3. Lack of decision-making
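
The sketch below is a minimal Python illustration of the first two capabilities, speed and accuracy: it times a computation over a million numbers and checks the result against the exact closed-form answer. The elapsed time shown will vary from machine to machine.

import time

# Speed: sum the first one million integers and time it.
start = time.perf_counter()
total = sum(range(1_000_000))
elapsed = time.perf_counter() - start
print(f"Summed 1,000,000 integers in {elapsed:.4f} seconds")

# Accuracy: the result matches the exact formula n*(n+1)/2 for n = 999,999.
assert total == 999_999 * 1_000_000 // 2
print("Result is exactly correct:", total)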

5. Make a timeline of the History of Computers, noting the important
events.
Ans:
History of Computers: A Brief Timeline
1801: In France, Joseph Marie Jacquard invents a loom that uses
punched wooden cards to automatically weave fabric designs. Early
computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-
driven calculating machine that would be able to compute tables of
numbers. The project, funded by the English government, is a failure.
More than a century later, however, the world's first computer was
actually built.

1890: Herman Hollerith designs a punch card system to calculate the
1880 census, accomplishing the task in just three years and saving the
government $5 million. He establishes a company that would ultimately
become IBM.

1936: Alan Turing presents the notion of a universal machine, later
called the Turing machine, capable of computing anything that is
computable. The central concept of the modern computer was based on
his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa
State University, attempts to build the first computer without gears,
cams, belts or shafts.

1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in
a Palo Alto, California, garage, according to the Computer History
Museum.

1941: Atanasoff and his graduate student, Clifford Berry, design a
computer that can solve 29 equations simultaneously. This marks the
first time a computer is able to store information in its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly
and J. Presper Eckert, build the Electronic Numerical Integrator and
Computer (ENIAC). Considered the grandfather of digital computers, it
fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and
receive funding from the Census Bureau to build the UNIVAC, the first
commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell
Laboratories invent the transistor. They discovered how to make an
electric switch with solid materials and no need for a vacuum.

1953: Grace Hopper develops one of the first computer languages, which
eventually becomes known as COBOL. Thomas John Watson Jr.,
son of IBM CEO Thomas John Watson Sr., conceives the IBM 701
EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language, an acronym for FORmula
TRANslation, is developed by a team of programmers at IBM led by
John Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known
as the computer chip. Kilby was awarded the Nobel Prize in Physics in
2000 for his work.
1964: Douglas Engelbart shows a prototype of the modern computer,
with a mouse and a graphical user interface (GUI). This marks the
evolution of the computer from a specialized machine for scientists and
mathematicians to technology that is more accessible to the general
public.

1969: A group of developers at Bell Labs produce UNIX, an operating
system that addressed compatibility issues. Written in the C
programming language, UNIX was portable across multiple platforms
and became the operating system of choice among mainframes at large
companies and government entities. Due to the slow nature of the
system, it never quite gained traction among home PC users.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic
Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the
"floppy disk," allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff for Xerox,
develops Ethernet for connecting multiple computers and other
hardware.

1974-1977: A number of personal computers hit the market, including
the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80,
affectionately known as the "Trash 80," and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the
Altair 8800, described as the "world's first minicomputer kit to rival
commercial models." Two "computer geeks," Paul Allen and Bill Gates,
offer to write software for the Altair, using the new BASIC language. On
April 4, after the success of this first endeavor, the two childhood friends
form their own software company, Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April
Fool's Day and roll out the Apple I, the first computer with a single-circuit
board, according to Stanford University.

1977: Radio Shack's initial production run of the TRS-80 was just 3,000.
It sold like crazy, and it was one of the first machines whose
documentation was intended for non-geeks. For the first time, non-geeks
could write programs and make a computer do what they wished.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the
first West Coast Computer Faire. It offers color graphics and
incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first
computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International
releases WordStar. "The defining change was to add margins and word
wrap," said creator Rob Barnaby in email to Mike Petrie in 2000.
"Additional changes included getting rid of command mode and adding a
print function. I was the technical brains — I figured out how to do it, and
did it, and documented it."

1981: The first IBM personal computer, code-named "Acorn," is
introduced on Aug. 12. It uses Microsoft's MS-DOS operating system. It
has an Intel chip, two floppy disks and an optional color monitor. Sears &
Roebuck and Computerland sell the machines, marking the first time a
computer is available through outside distributors. It also popularizes the
term PC.

1983: Apple's Lisa is the first personal computer with a GUI. It also
features a drop-down menu and icons. It flops but eventually evolves
into the Macintosh. The Gavilan SC is the first portable computer with
the familiar flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia
Britannica. This was the company's response to Apple's GUI.
Commodore unveils the Amiga 1000, which features advanced audio
and video capabilities.

1985: The first dot-com domain name is registered on March 15, years
before the World Wide Web would mark the formal beginning of Internet
history. The Symbolics Computer Company, a small Massachusetts
computer manufacturer, registers Symbolics.com. More than two years
later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture
provides speed comparable to that of mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics
laboratory in Geneva, develops HyperText Markup Language (HTML),
giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and
music on PCs.

1994: PCs become gaming machines as "Command & Conquer," "Alone
in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big
Adventure" are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at
Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at
the time, ending Apple's court case against Microsoft in which it alleged
that Microsoft copied the "look and feel" of its operating system.

1999: The term Wi-Fi becomes part of the computing language and
users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides
protected memory architecture and pre-emptive multi-tasking, among
other benefits. Not to be outdone, Microsoft rolls out Windows XP, which
has a significantly redesigned GUI.
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to
the consumer market.

2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the
dominant Web browser. Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires
Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core
mobile computer, as well as an Intel-based iMac. Nintendo's Wii game
console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin
applications to the taskbar and advances in touch and handwriting
recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media
and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google
Chrome OS.

2012: Facebook gains 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until
now, there hasn't been any quantum-computing platform that had the
capability to program new algorithms into their system. They're usually
each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the
University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is
developing a new "Molecular Informatics" program that uses molecules
as computers. "Chemistry offers a rich set of properties that we may be
able to harness for rapid, scalable information storage and processing,"
Anne Fischer, program manager in DARPA's Defense Sciences Office,
said in a statement. "Millions of molecules exist, and each molecule has
a unique three-dimensional atomic structure as well as variables such
as shape, size, or even color. This richness provides a vast design
space for exploring novel and multi-value ways to encode and process
data beyond the 0s and 1s of current logic-based, digital architectures."
6. Design the futuristic computer for 2050. Name your computer
and list down its capabilities.
Ans:
MJ’s Computer Invention
● Faster memory speed
● Automation
● Larger storage capacity
● Faster wireless connectivity
● Touchscreen
● Quick processing speed
● Multitasking
● Versatility

7. Differentiate Web 2.0 and Web 3.0.
Ans:
Web 2.0 is the “Writable” phase of the web, where users can
interact with sites and with each other, while Web 3.0 is the
“Executable” phase of the web, where computers can interpret
information like humans and generate personalized content for users.
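
As a toy illustration of this difference (not how any real Web 3.0 platform works), the minimal Python sketch below contrasts a user-written post, which a program can only treat as opaque text, with the same post carrying structured, machine-readable metadata in the spirit of semantic-web formats; every name and field here is made up for the example.

# Web 2.0 style: user-generated content is free text that a program
# cannot easily interpret on its own.
post = "Just read a great article about quantum computers!"

# Web 3.0 style: the same content carries machine-readable structure,
# loosely in the spirit of semantic-web formats such as JSON-LD.
structured_post = {
    "type": "Article",
    "topic": "quantum computers",
    "sentiment": "positive",
    "text": post,
}

# With structure, a program can "interpret" the post and personalize:
user_interests = {"quantum computers", "robotics"}
if structured_post["topic"] in user_interests:
    print("Recommended for you:", structured_post["text"])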

8. Make 10 Rules in Staying Safe Online and 10 Rules for Online
Ethics and Etiquette.
Ans:
10 Rules in Staying Safe Online
1. Create strong passwords (see the sketch after these lists)
2. Keep your computer updated
3. Double-check online information
4. Be careful what you post
5. Keep your passwords private
6. Don’t overshare on social media
7. Beware of strangers
8. Keep your social media accounts secure
9. Be careful what you download
10. Keep your personal information private
10 Rules for Online Ethics and Etiquette
1. Use respectful words
2. Choose friends wisely
3. Respect other people’s privacy
4. Make yourself look good online
5. Be ethical
6. Be responsible all the time
7. Respect other people’s data
8. Be friendly to people
9. Share your expertise
10. Be forgiving of others
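
For rule 1 of the safety list, here is a minimal Python sketch that generates a strong random password with the standard library’s secrets module; the 16-character length and the character set are just reasonable example choices, not requirements.

import secrets
import string

# Build a strong password from letters, digits, and punctuation.
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(16))
print(password)  # prints a different strong password on every run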

9. Write the steps on how to use Google Scholar.
Ans:
Steps to use Google Scholar
Step 1: Create your basic profile
Step 2: Add publications
Step 3: Make your profile public
Step 4: Add co-authors
Step 5: Add missing articles
Step 6: Clean up your Google Scholar profile data

Let’s Wrap Everything Up!
~ I learned that Web 2.0 is the “Writable” phase of the web, while
Web 3.0 is the “Executable” phase of the web.
