GE Lite Module
LEARNING MODULES
Introduction
The quickening pace of technological change is evident in this era: technology seems to be
progressing faster than ever. Year after year, its evolution brings staggering promise and
opportunity, as well as uncertainty. Technology, in some form, has existed for as long as
people have needed to communicate, and as long as there are people, information technology
will exist alongside them. The future may be unknown, but digital advancement continues to
reshape our world in ways that encourage people to form new habits, find new ways to work
together, and become better human beings. In most cases, these changes translate into a range
of opportunities and disruptions across every industry. Humans have always been quick to adapt
technologies for better and faster communication.
Objectives
After successful completion of this module, the student should be able to:
• Demonstrate a sense of readiness for the upcoming semester;
• Identify their learning outcomes and expectations for the course;
• Recognize their capacity to create new understandings from reflecting on
the course;
• Know the role and importance of ICT.
Although there is no single, universal definition of ICT, the term is generally accepted
to mean all devices, networking components, applications, and systems that, combined,
allow people and organizations (i.e., businesses, nonprofit agencies, governments,
and even criminal enterprises) to interact in the digital world.
Communication
We all know that ICT plays a major role in the way we communicate. In the past, our
parents wrote letters and sent them through the post. Now, with the help of ICT, it is
much easier to keep in touch with our loved ones: we can use cellular phones designed
for communicating with other people even when they are miles away.
Nowadays, people stay in touch with the help of ICT. Through chat, e-mail, voice
mail, and social networking, people communicate with each other, and these are among
the cheapest means of communication.
ICT allows students to monitor and manage their own learning, think critically and
creatively, solve simulated real-world problems, work collaboratively, engage in ethical
decision-making, and adopt a global perspective towards issues and ideas. It also
provides students from remote areas access to expert teachers and learning
resources, and gives administrators and policy makers the data and expertise they
need to work more efficiently.
Job Opportunities
ICT has also changed the nature of many jobs. For example, many pharmacies use robot
technology to assist with picking prescribed drugs. This allows highly trained pharmaceutical
staff to focus on jobs requiring human intelligence and interaction, such as dispensing and
checking medication.
Nowadays, employers expect their staff to have basic ICT skills. This expectation even
applies to job roles where ICT skills may not have been an essential requirement in
the past.
Finding a job is also different now: you can search and apply using your smartphone,
laptop, desktop, or any other available gadget, in the comfort of your home.
Education
Information and Communications Technology (ICT) can impact student learning when
teachers are digitally literate and understand how to integrate it into curriculum.
Schools use a diverse set of ICT tools to communicate, create, disseminate, store,
and manage information. In some contexts, ICT has also become integral to the
teaching-learning interaction, through such approaches as replacing chalkboards with
interactive digital whiteboards, using students’ own smartphones or other devices for
learning during class time, and the “flipped classroom” model where students watch
lectures at home on the computer and use classroom time for more interactive
exercises.
When teachers are digitally literate and trained to use ICT, these approaches can lead
to higher order thinking skills, provide creative and individualized options for students
to express their understandings, and leave students better prepared to deal with
ongoing technological change in society and the workplace.
Socializing
Social media has changed the world. The rapid and vast adoption of these
technologies is changing how we find partners, how we access information from the
news, and how we organize to demand political change.
The internet and social media provide young people with a range of benefits, and
opportunities to empower themselves in a variety of ways. Young people can maintain
social connections and support networks that otherwise wouldn't be possible and can
access more information than ever before. The communities and social interactions
young people form online can be invaluable for bolstering and developing young
people's self-confidence and social skills.
As ICT has become ubiquitous, faster, and increasingly accessible to non-technical
communities, social networking and collaborative services have grown rapidly,
enabling people to communicate and share interests in many more ways. Sites like
Facebook, Twitter, LinkedIn, YouTube, Flickr, Second Life, Delicious, blogs, wikis, and
many more let people of all ages rapidly share their interests with others everywhere,
but Facebook remains the leading place where people communicate and share their
opinions. What a change! "Nothing is permanent, but change" (as the Greek
philosopher Heraclitus observed). The Internet can be seen as the international
network of interconnected computer networks; its main purposes are the quest for
information (browsing), electronic mail, newsgroups, file transfer, and access to and
use of other computers. Socialization can be seen as the process by which a child
adapts behavior to become an effective member of society, which can only be
achieved through learning or education.
Assessment 1
References
Objectives
Definition of Computer
• Computer is a programmable machine.
• Computer is an electronic device that manipulates information, or data. It has
the ability to store, retrieve, and process data.
• Computer is a machine that manipulates data according to a list of instructions
(program).
• Computer is any device which aids humans in performing various kinds of
computations or calculations.
1. Business
Almost every business uses computers nowadays. They can be employed to store
and maintain accounts, personnel records, manage projects, track inventory, create
presentations and reports. They enable communication with people both within and
outside the business, using various technologies, including e-mail. They can be used
to promote the business and enable direct interaction with customers.
2. Education
Computers can be used to give learners audio-visual packages, interactive exercises,
and remote learning, including tutoring over the internet. They can be used to access
educational information from intranet and internet sources, or via e-books. They can
be used to maintain and monitor student performance, including through the use of
online examinations, as well as to create projects and assignments.
3. Healthcare
Healthcare continues to be revolutionized by computers. As well as digitized medical
information making it easier to store and access patient data, complex information can
also be analyzed by software to aid discovery of diagnoses, as well as search for risks
of diseases. Computers control lab equipment, heart rate monitors, and blood
pressure monitors. They enable doctors to have greater access to information on the
latest drugs, as well as the ability to share information on diseases with other medical
specialists.
4. Government
Various government departments use computers to improve the quality and efficiency
of their services. Examples include city planning, law enforcement, traffic, and tourism.
Computers can be used to store information, promote services, communicate
internally and externally, as well as for routine administrative purposes.
5. Marketing
Computers enable marketing campaigns to be more precise through the analysis and
manipulation of data. They facilitate the creation of websites and promotional
materials. They can be used to generate social media campaigns. They enable direct
communication with customers through email and online chat.
6. Science
Scientists were one of the first groups to adopt computers as a work tool. In science,
computers can be used for research, sharing information with other specialists both
locally and internationally, as well as collecting, categorizing, analyzing, and storing
data. Computers also play a vital role in launching, controlling, and maintaining space
craft, as well as operating other advanced technology.
7. Publishing
Computers can be used to design pretty much any type of publication. These might
include newsletters, marketing materials, fashion magazines, novels, or newspapers.
Computers are used in the publishing of both hard-copy and e-books. They are also
used to market publications and track sales.
8. Communication
Computers have made real-time communication over the internet easy, thanks to
software and videoconferencing services such as Skype. Families can connect with
audio and video, businesses can hold meetings between remote participants, and
news organizations can interview people without the need for a film crew. Modern
computers usually have microphones and webcams built-in nowadays to facilitate
software like Skype. Older communications technologies such as email are also still
used widely.
9. Transport
Road vehicles, trains, planes, and boats are increasingly automated with computers
being used to maintain safety and navigation systems, and increasingly to drive, fly,
or steer. They can also highlight problems that require attention, such as low fuel
levels, oil changes, or a failing mechanical part. Computers can be used to customize
settings for individuals, for example, seat setup, air-conditioning temperatures.
10. Navigation
Navigation has become increasingly computerized, especially since computer
technology has been combined with GPS technology. Computers combined with
satellites mean that it's now easy to pinpoint your exact location, know which way that
you are moving on a map, and have a good idea of amenities and places of interest
around you.
11. Military
Computers are used extensively by the military. They are used for training purposes.
They are used for analyzing intelligence data. They are used to control smart
technology, such as guided missiles and drones, as well as for tracking incoming
missiles and destroying them. They work with other technologies such as satellites to
provide geospatial information and analysis. They aid communications. They help
tanks and planes to target enemy forces.
12. Robotics
Robotics is an expanding area of technology which combines computers with science
and engineering to produce machines that can either replace humans, or do specific
jobs that humans are unable to do. One of the first uses of robotics was in
manufacturing to build cars. Since then, robots have been developed to explore areas
where conditions are too harsh for humans, to help law enforcement, to help the
military, and to assist healthcare professionals.
Earliest Computers
Originally, calculations were computed by humans, whose job title was "computer."
a) Tally sticks
A tally stick was an ancient memory aid device to record and document
numbers, quantities, or even messages.
b) Abacus
An abacus is a mechanical device used to aid an individual in performing mathematical
calculations.
• The abacus was invented in Babylonia in 2400 B.C.
• The abacus in the form we are most familiar with was first used in China in
around 500 B.C.
• It was used to perform basic arithmetic operations.
c) Napier’s Bones
• Invented by John Napier in 1614.
• Allowed the operator to multiply, divide and calculate square and cube roots
by moving the rods around and placing them in specially constructed
boards.
d) Slide Rule
• Invented by William Oughtred in 1622.
• Is based on Napier's ideas about logarithms.
• Used primarily for – multiplication – division – roots – logarithms –
Trigonometry
• Not normally used for addition or subtraction.
e) Pascaline
• Invented by Blaise Pascal in 1642.
• Its use was limited to addition and subtraction.
• It was too expensive.
f) Stepped Reckoner
• Invented by Gottfried Wilhelm Leibniz in 1672.
• The machine that can add, subtract, multiply and divide automatically.
g) Jacquard Loom
• The Jacquard loom is a mechanical loom, invented by Joseph-Marie
Jacquard in 1804.
• It is an automatic loom controlled by punched cards.
h) Arithmometer
i) Tabulating Machine
• Invented by Herman Hollerith in 1890.
• To assist in summarizing information and accounting.
j) Harvard Mark 1
• Also known as IBM Automatic Sequence Controlled Calculator (ASCC).
• Invented by Howard H. Aiken in 1943
• One of the first large-scale electro-mechanical computers.
k) Z1
• The first programmable computer.
• Created by Konrad Zuse in Germany from 1936 to 1938.
• To program the Z1 required that the user insert punch tape into a punch
tape reader and all output was also generated through punch tape.
l) ENIAC
• ENIAC stands for Electronic Numerical Integrator and Computer.
• It was the first electronic general-purpose computer.
• Completed in 1946.
• Developed by John Presper Eckert and John Mauchly.
m) UNIVAC 1
• The UNIVAC I (UNIVersal Automatic Computer 1) was the first commercial
computer.
• Designed by John Presper Eckert and John Mauchly.
n) EDVAC
• EDVAC stands for Electronic Discrete Variable Automatic Computer
• The First Stored Program Computer
• Its design followed John von Neumann's 1945 stored-program proposal.
• It has a memory to hold both a stored program as well as data.
a. Premechanical
The premechanical age is the earliest age of information technology. It can be defined
as the time between 3000B.C. and 1450A.D. We are talking about a long time ago.
When humans first started communicating, they would try to use language or simple
picture drawings known as petroglyphs which were usually carved in rock. Early
alphabets were developed such as the Phoenician alphabet.
As alphabets became more popular and more people were writing information down,
pens and paper began to be developed. Writing started off as just marks in wet clay,
but later paper was created from the papyrus plant. The most popular kind of paper
was probably that made by the Chinese, who made paper from rags.
Now that people were writing a lot of information down, they needed ways to keep it
all in permanent storage. This is where the first books and libraries are developed.
You’ve probably heard of Egyptian scrolls which were popular ways of writing down
information to save. Some groups of people were actually binding paper together into
a book-like form.
Also during this period came the first numbering systems. Around 100 A.D., the first
1-9 numbering system was created in India. However, it wasn't until 875 A.D.
(775 years later) that the number 0 was invented. And now that numbers existed,
people wanted things to do with them, so they created calculators. The calculator
was the very first sign of an information processor. The popular model of that time
was the abacus.
b. Mechanical
The mechanical age is when we first start to see connections between our current
technology and its ancestors. The mechanical age can be defined as the time between
1450 and 1840. A lot of new technologies are developed in this era as there is a large
explosion in interest with this area. Technologies like the slide rule (an analog
computer used for multiplying and dividing) were invented. Blaise Pascal invented the
Pascaline which was a very popular mechanical computer. Charles Babbage
developed the difference engine which tabulated polynomial equations using the
method of finite differences.
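To give a sense of how a difference engine could tabulate a polynomial using nothing but
repeated additions, here is a small illustrative sketch in Python (the polynomial is an
arbitrary example, not one taken from the module):

    # Method of finite differences: tabulate f(x) = x^2 + x + 1 using only additions.
    values = [x * x + x + 1 for x in range(3)]   # f(0), f(1), f(2) = 1, 3, 7
    d1 = values[1] - values[0]                   # first difference
    d2 = (values[2] - values[1]) - d1            # second difference (constant for a quadratic)

    f, diff, table = values[0], d1, []
    for _ in range(8):
        table.append(f)    # record the current value of f(x)
        f += diff          # next value obtained by addition only
        diff += d2         # update the running difference
    print(table)           # [1, 3, 7, 13, 21, 31, 43, 57]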
Many different machines were created during this era. While none of them could yet
perform more than one type of calculation in a single device, like our modern-day
calculators, they show how our all-in-one machines got started. Looking at the size of
the machines invented in this time compared to the power they offered, it may seem
(to us) absolutely ridiculous why anybody would want to use them, but to the people
living in that time ALL of these inventions were HUGE.
c. Electromechanical
Now we are finally getting close to some technologies that resemble our modern-day
technology. The electromechanical age can be defined as the time between 1840 and
1940. These are the beginnings of telecommunication. The telegraph was created in
the early 1800s. Morse code was created by Samuel Morse in 1835. The telephone
(one of the most popular forms of communication ever) was created by Alexander
Graham Bell in 1876. The first radio was developed by Guglielmo Marconi in 1894. All of
these were extremely crucial emerging technologies that led to big advances in the
information technology field.
The first large-scale automatic digital computer in the United States was the Mark 1
created by Harvard University around 1940. This computer was 8ft high, 50ft long, 2ft
wide, and weighed 5 tons - HUGE. It was programmed using punch cards. How does
your PC match up to this hunk of metal? It was from huge machines like this that
people began to look at downsizing all the parts to first make them usable by
businesses and eventually in your own home.
d. Electronic
The electronic age is what we currently live in. It can be defined as the time between
1940 and right now. The ENIAC was the first high-speed, digital computer capable of
being reprogrammed to solve a full range of computing problems. This computer was
designed to be used by the U.S. Army for artillery firing tables. This machine was even
bigger than the Mark 1 taking up 680 square feet and weighing 30 tons - HUGE. It
mainly used vacuum tubes to do its calculations.
There are four main generations of digital computing. The first was the era of vacuum tubes
and punch cards, like the ENIAC and Mark 1. Rotating magnetic drums were used for
internal storage. The second generation replaced vacuum tubes with transistors,
punch cards were replaced with magnetic tape, and rotating magnetic drums were
replaced by magnetic cores for internal storage. Also, during this time high-level
programming languages were created such as FORTRAN and COBOL. The third
generation replaced transistors with integrated circuits, magnetic tape was used
throughout all computers, and magnetic core turned into metal oxide semiconductors.
An actual operating system showed up around this time along with the advanced
programming language BASIC. The fourth and latest generation brought in CPUs
(central processing units) which contained memory, logic, and control circuits all on a
single chip. The personal computer was developed (Apple II). The graphical user
interface (GUI) was developed.
Examples of second-generation (transistor-based) computers: UNIVAC III, RCA 501,
Philco Transact S-2000, NCR 300 series, IBM 7030 Stretch, and the IBM 7070, 7080,
and 7090 series.
Fourth-generation computers also saw the development of GUIs, the mouse,
and handheld devices.
Assessment 2
Instruction: Write your answers on a separate sheet of paper.
7) John Mauchly and J. Presper Eckert are the inventors of ______ computer.
a) UNIAC
b) ENIAC
c) EDSAC
d) Mark 1
9) In the late ______ Herman Hollerith invented data storage on punched cards that
could then be read by a machine.
a) 1860
b) 1900
c) 1890
d) 1880
a) Transistors
b) Integrated Circuits
c) Vacuum Tubes
d) Microprocessor
References
• https://ftms.edu.my/v2/wp-content/uploads/2019/02/csca0201_ch01.pdf
• https://www.sutori.com/story/history-of-ict-information-and-communications-technology--N7J51bQqSU7vLWcVfdn5M9qa
• https://www.livescience.com/20718-computer-history.html
• https://www.explainthatstuff.com/historyofcomputers.html
Introduction
Objectives
At the end of this lesson, the student should be able to:
• Explore the current breakthrough technologies and disruptive innovations
that have emerged over the past few years.
• Identify and analyze various emerging technologies.
• Explore the evolution of the internet.
• Identify and understand the different uses of the internet in today’s generation.
• Discuss the fundamental terms and definitions used in the internet.
Web 1.0
Web 1.0 is the old, read-only web: the first stage of the World Wide Web, made up of
web pages connected by hyperlinks. The web was used as an "information portal," and
tables were used to position and align elements on a page.
Disadvantages
• Read only web
• Limited user interaction
• Lack of standards
Web 2.0
A term used to describe a new generation of Web services and applications with an
increasing emphasis on human collaboration.
• It is a platform that gives users the possibility (liberty) to control their data.
• This is about user-generated content and the read-write web.
• People are consuming as well as contributing information through blogs or
sites.
• Allows the user to interact with the page known as DYNAMIC PAGE; instead
of just reading a page, the user may be able to comment or create a user
account. Dynamic page refers to the web pages that are affected by user input
or preference.
• Is focused on the ability for people to collaborate and share information online
via social media, blogging and Web-based communities.
A. Social Networking Sites
Example:
Facebook, Twitter, LinkedIn, Google+, Pinterest, Tumblr, Instagram
B. Blogs
Example:
WordPress, Blogger, Tumblr
C. Wikis
Example:
Wikipedia, Wikibooks, Wikiversity, Wikimedia Commons, Wiktionary, Wikiquote,
Wikivoyage, Wikidata, Wikinews, Wikispecies, MediaWiki
D. Video Sharing Sites - a website that lets people upload and share their video clips
with the public at large or to invited guests.
Example:
YouTube, Facebook, LinkedIn, Flickr, Photobucket, Twitter, Veoh, Dailymotion,
Vimeo PRO, Myspace.com, Metacafe
Web 3.0 (Semantic Web)
• Changing the web into a language that can be read and categorized by the
system rather than humans.
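As a small illustration of the difference (the product and field values below are made up;
the schema.org-style keys are one common convention for machine-readable markup), compare a
sentence written for people with the same facts expressed as structured data a program can
query:

    # A human-readable sentence versus the same facts as machine-readable data.
    human_text = "The Acme X1 phone costs PHP 9,999."

    machine_readable = {
        "@context": "https://schema.org",      # vocabulary the fields come from
        "@type": "Product",
        "name": "Acme X1",
        "offers": {"@type": "Offer", "price": "9999", "priceCurrency": "PHP"},
    }

    # Software can now answer a question directly, without parsing prose.
    print(machine_readable["offers"]["price"])   # 9999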
Types of websites:
• eCommerce Website
An eCommerce website is a website people can directly buy products from.
You've probably used a number of eCommerce websites before; most big brands and
plenty of smaller ones have one. Any website that includes a shopping cart and a way
for you to provide credit card information to make a purchase falls into this
category.
• Business Website
is any website that’s devoted to representing a specific business. It
should be branded like the business (the same logo and positioning) and
communicate the types of products and/or services the business offers.
• Entertainment Website
If you think about your internet browsing habits, you can probably think
of a few websites that you visit purely for entertainment purposes.
• Portfolio Website
are sites devoted to showing examples of past work. Service providers
who want to show potential clients the quality of the work they provide can use
a portfolio website to collect some of the best samples of past work they’ve
done. This type of website is simpler to build than a business website and more
focused on a particular task: collecting work samples.
• Media Website
collect news stories or other reporting. There’s some overlap here with
entertainment websites, but media websites are more likely to include reported
pieces in addition to or instead of content meant purely for entertainment.
• Brochure Website
are a simplified form of business websites. For businesses that know
they need an online presence, but don’t want to invest a lot into it (maybe you’re
confident you’ll continue to get most of your business from other sources), a
simple brochure site that includes just a few pages that lay out the basics of
what you do and provide contact information may be enough for you.
• Nonprofit Website
In the same way that businesses need websites to be their online
presence, nonprofits do as well. A nonprofit website is the easiest way for many
potential donors to make donations and will be the first place many people look
to learn more about a nonprofit and determine if they want to support it.
• Educational Website
The websites of educational institutions and those offering online
courses fall into the category of educational websites. These websites have the
primary goal of providing learning materials to visitors or information about the
institution itself.
• Infopreneur Website
Infopreneur websites overlap a bit with business and eCommerce websites, but they
represent a unique type of online business. Infopreneurs create and sell
information products. That could be in the form of courses, tutorials, videos or
eBooks.
• Personal Website
Not all websites exist to make money in some way or another. Many
people find value in creating personal websites to put their own thoughts out
into the world. This category includes personal blogs, vlogs, and photo diaries
people share with the world.
• Web Portal
are often websites designed for internal purposes at a business,
organization, or institution. They collect information in different formats from
different sources into one place to make all relevant information accessible to
the people who need to see it. They often involve a login and personalized
views for different users that ensure the information that’s accessible is most
useful to their particular needs.
The Internet plays an important role in our lives. Vinton Gray Cerf ForMemRS is an
American Internet pioneer and is recognized as one of "the fathers of the Internet",
sharing this title with TCP/IP co-developer Bob Kahn.
Common Domain Name Extensions
Name    Entity
.com    commercial
.org    organization
.net    network
.edu    education
.gov    national and state government agencies
.ph     Philippines
.au     Australia
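As a quick illustrative sketch (the address below is hypothetical), a short Python snippet
can pull the domain extension out of a web address using only the standard library:

    from urllib.parse import urlparse

    url = "https://www.example.gov.ph/services"     # hypothetical address
    hostname = urlparse(url).hostname               # "www.example.gov.ph"
    extension = hostname.rsplit(".", 1)[-1]         # last label: "ph" (Philippines)
    print(hostname, "->", extension)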
C. Uses of Internet
• Look for information
• School works, jobs, and home purposes
• Send and receive electronic mail
• Video teleconferencing (video call, video chat)
• Buy and sell products
• Social networking
• Watch & post videos
• Games
• Take college courses
• Monitor home while away
• Financial transactions
• Download music and movies
• HTTPS - is the acronym for Hypertext Transfer Protocol Secure. This indicates
that the web page has a special layer of encryption added to hide your personal
information and passwords from others.
• Router or router-modem combination is the hardware device that acts as the
traffic cop for network signals arriving at your home or business from your ISP.
A router can be wired or wireless or both.
• Encryption - is the mathematical scrambling of data so that it is hidden from
eavesdroppers. Encryption uses complex math formulas to turn private data into
meaningless gobbledygook that only trusted readers can unscramble. (A small
illustrative sketch follows this list of terms.)
• Web Bot - A term that applies to programs/applets (macros and intelligent
agents) used on the Internet. Such bots perform a repetitive function, such as
posting messages to multiple newsgroups or doing searches for information.
• Search Engine - specialized software, such as Google and Yahoo, that lets web
browser users search for information on the web by using keywords and phrases.
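The sketch below, referred to in the Encryption entry above, is a toy illustration in Python
of the scrambling idea. It is a simple Caesar-style shift for teaching purposes only; real
systems such as HTTPS use far stronger algorithms.

    # Toy encryption: shift each printable character by a secret key, wrapping
    # within the printable ASCII range. Only someone who knows the key can undo it.
    def scramble(text, key):
        return "".join(chr((ord(ch) - 32 + key) % 95 + 32) for ch in text)

    def unscramble(text, key):
        return scramble(text, -key)

    secret = scramble("my password", 7)
    print(secret)                  # meaningless-looking characters
    print(unscramble(secret, 7))   # "my password" again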
Assessment 3
References
• Abraham, R., Jas, F., Russell, W. (2005) The Web Empowerment Book: An
Introduction and Connection Guide to the Internet and the World-Wide Web.
Springer-Verlag New York
• Evolution of the web retrieved from
https://www.slideshare.net/sububasistha/web-10-to-web-30-evolution-of-the-web-and-its-various-challenges
Introduction
This module on netiquette and computer ethics discusses the ethical issues in the field
of computing, whether online or in professional practice.
Objectives
At the end of this module, you should be able to:
• Discuss the importance of being a responsible netizen by following the rules
of common courtesy online and the informal “rules of the road” of
cyberspace.
• Discuss the difference between privacy and security.
• Explain various risks to internet privacy.
Lesson 1: Netiquette
What is Netiquette?
What is Netiquette? Simply stated, it's network etiquette – that is, the etiquette of
cyberspace – and "etiquette" means the forms required by good breeding or
prescribed by authority to be observed in social or official life. In other words,
netiquette is a set of rules for behaving properly online.
When you use e-mail, instant messenger, video calls, or discussion boards to
communicate with others online, please be sure to follow the rules of professional
online communications known as netiquette. These rules will help you communicate
with instructors, classmates, and potential employers more effectively and will help
prevent misunderstandings.
REMEMBER THE GOLDEN RULE – Even though you may be interacting with
a computer screen, you are communicating with a real person who will react to
your message. Make a good impression - treat others with the same respect
that you would like to receive and avoid confrontational or offensive language.
• To protect your privacy and safety, do not share online any sensitive
personal information such as:
➢ Your home address or phone number
➢ Personal conversations
➢ Social plans, such as vacations
➢ Financial information
➢ Usernames, passwords, or hints
➢ Anything personal that you would not want shared by others over the
Internet
• If the material you share with others online came from another source, make
every effort to gain permission from the original author or copyright holder.
Copying someone else's work and passing it off as your own is plagiarism. It
damages your reputation and could subject you to serious academic and legal
consequences.
b) Rule 2: Adhere to the same standards of behavior online that you follow in real
life While it can be argued that standards of behavior may be different in the
virtual world, they certainly should not be lower. You should do your best to act
within the laws and ethical manners of society whenever you inhabit
"cyberspace." Would you behave rudely to someone face-to- face? On most
occasions, no. Neither should you behave this way in the virtual world.
c) Rule 3: Know where you are in cyberspace "Netiquette varies from domain to
domain." (Shea, 1994) Depending on where you are in the virtual world, the
same written communication can be acceptable in one area, where it might be
considered inappropriate in another. What you text to a friend may not be
appropriate in an email to a classmate or colleague. Can you think of another
example?
e) Rule 5: Make yourself look good online (writing.colostate.edu). One of the best
things about the virtual world is the lack of judgment associated with your
physical appearance, the sound of your voice, or the clothes you wear (unless
you post a video of yourself singing karaoke in a clown outfit). You will, however,
be judged by the quality of your writing, so keep the following tips in mind:
always check for spelling and grammar errors; know what you're talking about
and state it clearly; be pleasant and polite.
f) Rule 6: Share expert knowledge The Internet offers its users many benefits;
one is the ease in which information can be shared or accessed and in fact, this
"information sharing" capability is one of the reasons the Internet was founded.
So, in the spirit of the Internet's "founding fathers," share what you know! When
you post a question and receive intelligent answers, share the results with
others. Are you an expert at something? Post resources and references about
your subject matter. Recently expanded your knowledge about a subject that
might be of interest to others? Share that as well.
g) Rule 7: Help keep flame wars under control What is meant by "flaming" and
"flame wars?" "Flaming is what people do when they express a strongly held
opinion without holding back any emotion." (Shea, 1994). As an example, think
of the kinds of passionate comments you might read on a sports blog. While
"flaming" is not necessarily forbidden in virtual communication, "flame wars,"
when two or three people exchange angry posts between one another, must be
controlled or the camaraderie of the group could be compromised. Don't feed
the flames; extinguish them by guiding the discussion back to a more productive
direction.
h) Rule 8: Respect other people's privacy Depending on what you are reading in
the virtual world, be it an online class discussion forum, Facebook page, or an
email, you may be exposed to some private or personal information that needs
to be handled with care. Perhaps someone is sharing some medical news about
a loved one or discussing a situation at work. What do you think would happen
if this information "got into the wrong hands?" Embarrassment? Hurt feelings?
Loss of a job? Just as you expect others to respect your privacy, so should you
respect the privacy of others. Be sure to err on the side of caution when deciding
to discuss or not to discuss virtual communication.
i) Rule 9: Don't abuse your power Just like in face-to-face situations, there are
people in cyberspace who have more "power" than others. They have more
expertise in technology or they have years of experience in a particular skill or
subject matter. Maybe it's you who possesses all of this knowledge and power!
Just remember: knowing more than others do or having more power than others
may have does not give you the right to take advantage of anyone. Think of
Rule 1: Remember the human.
j) Rule 10: Be forgiving of other people's mistakes Not everyone has the same
amount of experience working in the virtual world. And not everyone knows the
rules of netiquette. At some point, you will see a stupid question, read an
unnecessarily long response, or encounter misspelled words; when this
happens, practice kindness and forgiveness as you would hope someone
would do if you had committed the same offense. If it's a minor "offense," you
might want to let it slide. If you feel compelled to respond to a mistake, do so in
a private email rather than a public forum.
Lesson 2: Cybercrimes
What is Cyber?
It refers to the characteristics of the culture of computers, information technology, and
virtual reality.
Republic Act No. 10175, the Cybercrime Prevention Act of 2012, is a law in the
Philippines approved on September 12, 2012, which aims to address legal issues
concerning online interactions and the internet.
Republic Act No. 10173, the Data Privacy Act of 2012, is an act protecting individuals'
personal information.
a. Copyright
The exclusive legal right, given to an originator or an assignee to print, publish,
perform, film, or record literary, artistic, or musical material, and to authorize others to
do the same.
b. Plagiarism
An act or instance of using or closely imitating the language and thoughts of another
author without authorization.
c. Computer Addiction
• Offline: generally used when speaking about excessive gaming
behavior, which can be practiced both offline and online.
CRIMINAL ACTIVITIES
a. Hacking
• Unauthorized access of or interference with computer systems, servers, or
other information and communication systems
• Unauthorized access to corrupt, alter, steal, or destroy electronic data using
computers or other information and communication systems without the
computer or system owner’s knowledge and consent
• The introduction of computer viruses resulting in the corruption, alteration,
theft, or loss of such data
• Illegal Access
• Illegal Interception
• Data Interference
• System Interference
• Misuse of Devices
• Infection of IT Systems with Malware – if the act is committed against the critical
infrastructure of the Philippines, the penalty is between 12 and 20 years of
imprisonment (reclusion temporal)
• Penalty for the offenses above: six years up to twelve years of imprisonment, also
known as prision mayor.
c. Electronic theft
• Illegal Downloading
• Obtaining files that you do not have the right to use from the internet.
• Digital Piracy
• Practice of illegally copying and selling digital music, video, computer
software, etc.
• Copyright Infringement
• Penalty of Php 50,000 – 500,000 and/or prision mayor
d. Cyberbullying
• The use of electronic communication to bully a person, typically by sending
a message of an intimidating or threatening nature.
• The Anti-Bullying Act of 2013 (RA 10627)
e. Cybersex
• Willful engagement, maintenance, control, or operation, directly or indirectly
of any lascivious exhibition of sexual organs or sexual activity with the aid
of a computer system for favor or consideration.
• There is a discussion on this matter if it involves “couples” or “people in
relationship” who engage in cybersex.
• Penalty of at least Php 200,000 and/or prision mayor
f. Child Pornography
• Is a form of child sexual exploitation.
• Unlawful or prohibited acts defined and punishable by Republic Act No.
9775 or the Anti- Child Pornography Act of 2009, committed through a
computer system.
• Penalty of 12-20 years of imprisonment or reclusion temporal
g. Cyber Defamation
• Is an unprivileged false statement of fact which tends to harm the reputation
of a person or company.
• Penalty of 6-12 years of imprisonment, or prision mayor.
Hacking
Hacking is a term used to describe actions taken by someone to gain
unauthorized access to a computer. The availability of information
online on the tools, techniques, and malware makes it easier for even
non-technical people to undertake malicious activities.
The process by which cyber criminals gain access to your computer.
Malware
Malware is one of the more common ways to infiltrate or damage your
computer. Malicious software that infects your computer, such as
computer viruses, worms, Trojan horses, spyware, and adware.
Pharming
Pharming is a common type of online fraud.
A means to point you to a malicious and illegitimate website by
redirecting the legitimate URL. Even if the URL is entered correctly, it
can still be redirected to a fake website.
Phishing
Phishing is used most often by cyber criminals because it's easy to
execute and can produce the results they're looking for with very little
effort. Fake emails, text messages and websites created to look like
they're from authentic companies. They're sent by criminals to steal
personal and financial information from you. This is also known as
“spoofing”.
What it does:
• Trick you into giving them information by asking you to update, validate or
confirm your account. It is often presented in a manner that seems official
and intimidating, to encourage you to take action.
• Provides cyber criminals with your username and passwords so that they
can access your accounts (your online bank account, shopping accounts,
etc.) and steal your credit card numbers.
Ransomware
Ransomware is a type of malware that restricts access to your
computer or your files and displays a message that demands payment
in order for the restriction to be removed. The two most common
means of infection appear to be phishing emails that contain malicious
attachments and website pop-up advertisements.
Spam
Spam is one of the more common methods of both sending
information out and collecting it from unsuspecting people.
The mass distribution of unsolicited messages, advertising or
pornography to addresses which can be easily found on the Internet
through things like social networking sites, company websites and
personal blogs.
Trojan Horses
A Trojan horse may not be a term you're familiar with, but there's a
good chance you or someone you know has been affected by one.
A malicious program that is disguised as, or embedded within,
legitimate software. It is an executable file that will install itself and run
automatically once it’s
downloaded.
Viruses
Most people have heard of computer viruses, but not many know
exactly what they are or what they do.
Malicious computer programs that are often sent as an email
attachment or a download with the intent of infecting your computer,
as well as the computers of everyone in your contact list. Just visiting
a site can start an automatic download of a virus.
• If you suspect a problem, make sure your security software is up to date and
run it to check for infection. If nothing is found, or if you are unsure of what to
do, seek technical help.
Wi-Fi Eavesdropping
WiFi eavesdropping is another method used by cyber criminals to
capture personal information.
Virtual “listening in” on information that's shared over an unsecure (not
encrypted) WiFi network.
Worms
Worms are a common threat to computers and the Internet as a
whole.
A worm, unlike a virus, goes to work on its own without attaching itself
to files or programs. It lives in your computer memory, doesn't
damage or alter the hard drive and propagates by sending itself to
other computers in a network – whether within a company or the Internet itself.
Assessment 4
References
• http://www.mccc.edu/~virtcoll/Netiquette
• http://ecampus.matc.edu/student_support/pdfs/7-essential-rules-of-netiquette.pdf
• https://www.getcybersafe.gc.ca/cnt/rsks/cmmn-thrts-eng.aspx
The internet and telecommunication industry in the 1990s changed the way we
connect and exchange information. Digital technology has affected people in many ways:
in how they live, work, learn, and socialize. Digital technology comprises the
electronic tools, devices, and systems that generate, store, and process data. It
enables us to experience the benefits of advanced information technology systems,
such as efficiency and productivity, improved communication and collaboration, and
faster acquisition of information. In this lesson, we will discuss the digital age and its
effect on society, two essential theories of technology, and how technological change
takes place.
Objectives
At the end of this lesson, the students should be able to:
• Explain the role of ICT in social change.
• Identify the strengths, weaknesses, opportunities, and imminent threats of
the digital age.
• Articulate basic, but fundamental definitions of complex issues and
dynamics that humans encounter every day, such as technology, social
progress, development, and digitalization.
The digital age, also known as the information age, is the period in which the computer
and the internet were introduced. It was brought about by the rapid shift from the
traditional industry established by the Industrial Revolution to an economy based on
computerization, and eventually artificial intelligence. These technologies enabled
people to communicate information easily and rapidly. Digital technology became
prevalent and widely used around the world. Information can be accessed easily,
especially with the use of the internet. House chores and jobs are increasingly
automated with the help of machines and equipment. Mobile phones have become
very useful in every area of life: in education, entertainment, jobs, and more. Digital
technologies have radically changed the way people work, consume, and
communicate over a short period of time.
ICTs bring together a wide range of devices, networks, and services built on a complex
array of technical protocols, and they are an indisputably important part of our social
setting today. The term ICTs has been used to embrace the technological innovation and
convergence in information and communication that is transforming our world into
information or knowledge societies. The rapid development of these technologies has
blurred the boundaries between information, communication, and various types of media.
Entertainment
With the advent of new technologies, the world of entertainment is constantly
evolving. Digital broadcasting has completely changed the way we experience
television and radio. Cinema can now be enjoyed in the comfort of your home
through applications such as Netflix, iFlix, etc. We get entertained by the content
that we see in Facebook, YouTube, and Instagram. Computer gaming has also
been an important influence in the development of graphical interfaces.
Technology has been at the forefront of changes in production and distribution
of music. We can now listen to music and podcast using Spotify. These are
some of the many technologies we use for entertainment that arise in the Digital
Age.
Business
The impact of ICT on business is particularly significant. It empowers people to
share knowledge and advice instantaneously and set up an online shop or
website at a low cost, dramatically lowering the barriers to starting a business.
As such, ICT maturity is closely linked to economic growth.
Businesses have benefited greatly from the coming of ICT; its impact cannot be
overemphasized. For example, ICT helps to increase productivity in business
with the use of social media platforms for marketing and promotion. Websites
now allow companies to develop new and cheaper ways of offering customers
the opportunity to buy goods and services at a time convenient to them, and to
enhance the level of customer service. Online platforms have become the
marketplace where people can transact and communicate.
Education
The impact of ICT on teachers, trainers, learners, researchers and the entire
education society is tremendous. It is changing the way of the education
delivery system in the world by enhancing access to information for all. It also
ensures effective and inclusive education. ICT supports the concept of open
learning where the thrust is upon enhanced student access and the
development of student autonomy.
ICT also addresses the need for mobile learning. It offers independent space
and flexibility that comes from working away from the learning institute or tutor.
It makes education accessible to all, irrespective of geographical barriers or
resource constraints. Learners from remote areas, working people who want to
learn further and update their knowledge and differently abled students who
find travelling an issue of concern - benefit from the mobile learning mode.
Digital resources in classrooms can help prepare students for a digital society
and economy.
Jobs
Despite fears of automation, there is little evidence so far that technological
change has led to a net loss of jobs. There are theoretical reasons for which
technological progress may contribute to job creation. Efficiency gains and cost-
savings may induce job creation within industries by expanding the market and
therefore increasing demand. Increased productivity in one sector can also
have positive spillovers in other sectors, if this translates into lower prices and
higher demand across the economy. While these processes may imply short-
term unemployment among displaced workers, they have the potential to
generate economy-wide employment gains.
While the previous section has pointed to the lack of evidence of the negative
effects of technological change on total employment so far, a number of authors
have argued that ICT-based technological change will be more profound than
previous instances of great technological change. This argument is mainly
supported by the observation that the labor- saving potential of digital
technologies is far greater than in the case of previous technological changes.
As a result, automation may, in the future, have much more impactful
consequences on the need for human labor than it has so far. For the moment,
while a shift away from manufacturing jobs has been observed, this has not
translated to overall losses in employment, as middle-skill jobs have been
replaced by new high-skill and low-skill jobs.
Jobs long considered too complex for machines, such as truck driving, are already being threatened by rapid
advances in machine learning and AI. More recent estimates of the potential
job-displacement effects of automation have looked at job tasks rather than
entire job categories. Food preparation assistants, cleaners and helpers,
laborers in mining, construction, manufacturing and transport, and assemblers
are the most likely to see their job tasks automated, while teaching
professionals, health professionals and personal care workers are among the
least likely to lose their job to a machine. Similarly, Schwab (2016) and
Susskind and Susskind (2015) consider that the work of lawyers, financial
analysts, journalists, doctors or librarians could be partially or totally automated.
Schwab (2016) emphasizes that algorithms made available by AI are able to
successfully replace human actions, even creative ones. The author presents
the example of automated narrative generation, in which algorithms can
conceive written texts for particular types of audience.
Teleworking (Work from Home) allows people to save time and combine their
work and personal lives.
Teleworking (Work from Home), on the other hand, may present an opportunity
for work-life balance as it improves time management and may reduce time
spent commuting. A variety of studies have found that employees who engage
in telework have higher job satisfaction. Among positive effects, teleworkers
report reduced commuting times, more flexibility in organizing their working
time, and better overall work-life balance. Evidence from the American Time
Use Survey shows that reductions in the time spent commuting and in-home
production due to the Internet increase labor force participation.
Health
Digitalization can affect people’s health status through the emergence of new
physical and mental health risks and through its impact on the health-care
delivery system. Health risks associated with the digital transformations include
mental health problems associated with the extreme use of digital technologies,
especially among children and teenagers and the crowding out of other
activities such as physical exercise. Health-care delivery is also affected by new
digital technologies, such as electronic records, new treatment options, telecare,
and teleconsultation. An important aspect of digitalization concerns the
production and use of medical data to improve the effectiveness and efficiency
of health systems. As a caveat, the exchange and use of medical and health
data must meet high data protection and data security standards, considering
its sensitivity. How and where care is delivered is also affected by digital
innovations, which challenges the traditional role of care providers, with
implications for interactions among care providers and between providers and
patients. The effects of these changes in healthcare delivery on health
inequalities are potentially large, but also less well documented.
Extreme use of digital technologies may have negative mental health effects, such as
addiction. Extreme Internet use, defined as children who spend more than 6
hours on the Internet outside of school, is becoming more common among
children and teenagers, with time spent online by 15-year-olds increasing by
about 40 minutes between 2012 and 2015 on average. A study also found that
the iGeneration members (the generation that grew up in an environment where
technology is ubiquitous) check their social media accounts on average every
15 minutes. While video games used to be the primary source of extreme use
of digital technologies, the smartphone has extended this risk to a wider range
of applications. A recent study found that 39% of 18- to 29-year-olds in the
United States are online “almost constantly”.
There is evidence of a direct link between extreme Internet use and depression
and anxiety, but the nature of this relationship is disputed and is likely to be bi-
directional, as people with anxiety, depression and other mental health
problems are also potentially more likely to spend time online. A longitudinal
study run on 3,000 children in Singapore found that extreme video game use
and problems such as social phobia, attention deficit disorder, anxiety and
depression often occur together and are likely to be mutually reinforcing.
Theories in Technology
Technological Determinism
It is the theory which holds that technology shapes the culture, values, and social
structures of a society. On this view, the main reason a society progresses is the kind
of technology it has: technological innovation is the cause of social progress.
Technology has control over society, over human actions, culture, and values, and it
greatly influences human thought and action. In other words, society changes
because of technology.
According to Winner, technology is not the slave of the human being but rather
humans are slaves to technology as they are forced to adapt to the
technological environment that surrounds them.
Example: the invention of the stirrup, the foot support for horse riders.
Before its invention, riders could not use swords while riding a horse because
they might lose their balance. When the stirrup was invented, it enabled
armored knights to fight on horseback, and this contributed to the development
of feudal societies, a military rule by nobles or lords.
Example: the invention of the gun. Before, the weapons used were swords and
archery, and soldiers had to be skilled and trained in using these weapons. But
a new invention changed it all: the gun, which requires less effort and can be
used even from far distances. This technology changed the way soldiers are
trained. It also changed how soldiers fight in a war.
Social Constructivism
In contrast to determinism, social constructivism holds that technology does not
by itself determine human action; rather, human action and the social context
shape which technologies are developed and how they are used.
Technological Change
Technology is the most powerful means of wresting power from nature in
all possible ways. It strengthens the facilities of man. Prof. Frankel assumes
that “Technological change is not a mere improvement in the technical
know-how. It means much more than this. It should be preceded by sociological
change also, a willingness and desire on the part of community to modify their
social, political and administrative institutions so as to make them fit with new
techniques of production and faster tempo of economic activity.” Technology,
according to J. P. Dewhurts, in fact, can be thought of as the change in the
production process of material and human skills.
Technological change typically takes the form of a new technique that can produce
goods and services at a lower cost of production.
The process of growth of technical knowledge can be divided into following
stages:
(a) Formulation of scientific principles
(b) Application of these principles to given technical problems
(c) Development of technical inventions to the point of commercial exploitation.
The first stage is the advancement in scientific knowledge, the second is that
of the application of this knowledge to some useful purposes and third is the
commercialization of invention which is called innovation. This has a great
significance in the process of development. Schumpeter has distinguished
between invention and innovation. Invention implies the discovery of new
technique while innovation is practical application of invention in production for
market.
Assessment 5
Essay: Read the statement/s carefully. Answer the given statement/s concisely. Write
your answers on a separate sheet of paper.
1. Discuss how digital technology changed society in each of the following areas. Cite
concrete examples for each item.
a. Business
b. Job
c. Health
d. Lifestyle
e. Entertainment
f. Education
References
GE LITE – LIVING IN THE IT ERA
FINAL COVERAGE
Introduction
Technology is changing every aspect of our lives. The benefits provided by new digital
approaches are having a huge impact on our societies. However, one of the greatest business
challenges is not about the devices, software or solutions – it is about how we manage the
process of cultural change and its effect on our society. In this module we will learn about
the different technological advancements and future trends in technology that could
potentially change and shape the way we live our lives.
Objectives
After successful completion of this module, the student should be able to:
• Identify the current and emerging trends in technology;
• Understand how technology affects culture and society through the different
advancements in technology;
• Assess the positive and negative effects of these advancements.
Lightbulbs, along with refrigerators, coffee makers, microwave ovens, baby monitors,
security cameras, speakers, televisions, and thermostats have, in the past few
decades, transformed from ordinary objects into conduits for the future. Embedded
with sensors that see, hear, and touch the world around them, they can turn physical
information into digital data. Collectively, these devices— and there are billions of them
around the world—make up the “internet of things.”
Just about anything with network connectivity belongs to the internet of things, from
security cameras and speakers to smart watches and denim jackets. In the “smart
home,” these internet-enabled gadgets liberate us from our chores, give us back some
of our time, and add a dash of novelty to ordinary experiences. (“Alexa, turn on the
disco lights.”) But the internet of things is about more than just using your voice to
preheat the oven or using your phone to turn off the lights.
The real promise of the internet of things is making our physical surroundings
accessible to our digital computers, putting sensors on everything in the world and
translating it into a digital format. Internet-connected objects could be the key to
unlocking predictions about everything from consumer behavior to climate events, but
those same objects could invite hackers into personal spaces and leak intimate data.
Depending on who you ask, the growing internet of things either represents the
promise of technology—the thing that will reinvent modern life as we know it—or that
which will be our technological undoing.
The connectivity, networking and communication protocols used with these web-
enabled devices largely depend on the specific IoT applications deployed. IoT can
also make use of artificial intelligence (AI) and machine learning to make data
collection easier and more dynamic.
The internet of things helps people live and work smarter, as well as gain complete
control over their lives. In addition to offering smart devices to automate homes, IoT is
essential to business. IoT provides businesses with a real-time look into how their
systems really work, delivering insights into everything from the performance of
machines to supply chain and logistics operations.
IoT enables companies to automate processes and reduce labor costs. It also cuts
down on waste and improves service delivery, making it less expensive to
manufacture and deliver goods, as well as offering transparency into customer
transactions.
As such, IoT is one of the most important technologies of everyday life, and it will
continue to pick up steam as more businesses realize the potential of connected
devices to keep them competitive.
The new rule for the future is going to be, "Anything that can be connected, will be
connected." But why on earth would you want so many connected devices talking to
each other? There are many examples for what this might look like or what the
potential value might be. Say for example you are on your way to a meeting; your car
could have access to your calendar and already know the best route to take. If the
traffic is heavy your car might send a text to the other party notifying them that you will
be late. What if your alarm clock wakes you up at 6 a.m. and then notifies your coffee
maker to start brewing coffee for you? What if your office equipment knew when it was
running low on supplies and automatically re-ordered more? What if the wearable
device you used in the workplace could tell you when and where you were most active
and productive and shared that information with other devices that you used while
working?
On a broader scale, the IoT can be applied to things like transportation networks in
“smart cities,” which can help us reduce waste and improve efficiency for things such
as energy use, helping us understand and improve how we work and live. The
reality is that the IoT allows for virtually endless opportunities and connections to take
place, many of which we can't even think of or fully understand the impact of today.
1990: John Romkey creates the first IoT device, a toaster that he controls with his
computer.
1999: Kevin Ashton coins the term “internet of things” to describe the eyes and ears
of a computer.
2000: LG introduces its first connected refrigerator, with a $20,000 price tag.
2008: The world’s first IoT conference is held in Zurich, Switzerland.
2010: Tony Fadell founds Nest, maker of the smart thermostat.
2013: The Oxford Dictionary adds the term “internet of things.”
2014: Amazon introduces the Echo speaker, along with the Alexa voice assistant, a
new way to control the smart home.
2016: The Mirai botnet infects over 600,000 IoT devices with malware.
2020: The number of internet-connected devices, by some estimates, exceeds 20
billion.
The first internet-connected “thing” to make use of this new protocol was a toaster.
John Romkey, a software engineer and early internet evangelist, had built one for the
1990 show floor of Interop, a trade show for computers. Romkey dropped a few slices
of bread into the toaster and, using a clunky computer, turned the toaster on. It would
still be a decade before anyone used the phrase “internet of things,” but Romkey’s
magic little toaster showed what a world of internet-connected things might be like. (Of
course, it wasn’t fully automated; a person still had to introduce the bread.) It was part
gimmick, part proof of concept—and fully a preview of what was to come.
The term “internet of things” itself was coined in 1999, when Kevin Ashton put it in a
PowerPoint presentation for Procter & Gamble. Ashton, who was then working in
supply chain optimization, described a system where sensors acted like the eyes and
ears of a computer—an entirely new way for computers to see, hear, touch, and
interpret their surroundings.
As home internet became ubiquitous and Wi-Fi sped up, the dream of the smart home
started to look more like a reality. Companies began to introduce more and more of
these inventions: “smart” coffee makers to brew the perfect cup, ovens that bake
cookies with precision timing, and refrigerators that automatically restocked expired
milk. The first of these, LG’s internet-connected refrigerator, hit the market in 2000. It
could take stock of shelf contents, mind expiration dates, and for some reason, came
with an MP3 player. It also cost $20,000. As sensors became cheaper, these internet-
connected devices became more affordable for more consumers. And the invention of
smart plugs, like those made by Belkin, meant that even ordinary objects could
become “smart”—or, at least, you could turn them on and off with your phone.
Any IoT system today contains a few basic components. First, there’s the thing
outfitted with sensors. These sensors could be anything that collects data, like a
camera inside a smart refrigerator or an accelerometer that tracks speed in a smart
running shoe. In some cases, sensors are bundled together to gather multiple data
points: a Nest thermostat contains a thermometer, but also a motion sensor; it can
adjust the temperature of a room when it senses that nobody’s in it. To make sense of
this data, the device has some kind of network connectivity (Wi-Fi, Bluetooth, cellular,
or satellite) and a processor where it can be stored and analyzed. From there, the data
can be used to trigger an action—like ordering more milk when the carton in the smart
refrigerator runs out, or adjusting the temperature automatically given a set of rules.
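To make the idea of sensors, connectivity, and rule-driven actions more concrete, here is a minimal sketch in Python. It is illustrative only, not any vendor's real API; the device names, metrics, and thresholds are invented for the example.

from typing import Optional
from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str   # which "thing" produced the data
    metric: str      # e.g. "milk_level_percent" or "room_temp_c"
    value: float

def decide_action(reading: SensorReading) -> Optional[str]:
    # Apply a simple rule to one reading and return an action name, or None.
    if reading.metric == "milk_level_percent" and reading.value < 10:
        return "order_milk"
    if reading.metric == "room_temp_c" and reading.value > 26:
        return "turn_on_air_conditioner"
    return None

readings = [
    SensorReading("fridge-01", "milk_level_percent", 4.0),
    SensorReading("thermostat-01", "room_temp_c", 22.5),
]
for r in readings:
    action = decide_action(r)
    if action:
        print(r.device_id, "-> trigger", action)

In a real deployment the readings would arrive over Wi-Fi, Bluetooth, cellular, or satellite, and the rules would usually run on a cloud service rather than on the device itself.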
Most people didn’t start building an ecosystem of “smart” devices in their homes until
the mass adoption of voice controls. In 2014, Amazon introduced the Echo, a speaker
with a helpful voice assistant named Alexa built in. Apple had introduced Siri, its own
voice assistant, four years prior— but Siri lived on your phone, while Alexa lived inside
the speaker and could control all of the “smart” devices in your house. Positioning a
voice assistant as the centerpiece of the smart home had several effects: It demystified
the internet of things for consumers, encouraged them to buy more internet-enabled
gadgets, and encouraged developers to create more “skills,” or IoT commands, for
these voice assistants to learn.
The same year that Amazon debuted Alexa, Apple came out with HomeKit, a system
designed to facilitate interactions between Apple-made smart devices, sending data
back and forth to create a network. These unifying voices have shifted the landscape
away from single-purpose automations and toward a more holistic system of
connected things. Tell the Google Assistant “goodnight,” for example, and the
command can dim the lights, lock the front door, set the alarm system, and turn on
your alarm clock. LG’s SmartThinQ platform connects many home appliances, so you
can select a chocolate chip cookie recipe from the screen of your smart fridge and it’ll
automatically preheat the oven. Manufacturers bill this as the future, but it’s also a
convenient way to sell more IoT devices. If you already have an Amazon Echo, you
might as well get some stuff for Alexa to control.
The internet of things offers several benefits to organizations. Some benefits are
industry-specific, and some are applicable across multiple industries. Some of the
common benefits of IoT for businesses are described below.
IoT encourages companies to rethink the ways they approach their businesses and
gives them the tools to improve their business strategies.
IoT can benefit farmers in agriculture by making their job easier. Sensors can collect
data on rainfall, humidity, temperature and soil content, as well as other factors, that
would help automate farming techniques.
IoT can also help with monitoring operations surrounding infrastructure.
Sensors, for example, could be used to monitor events or changes within structural
buildings, bridges and other infrastructure. This brings benefits with it, such as cost
savings, time savings, quality-of-life workflow changes and paperless workflows.
A home automation business can utilize IoT to monitor and manipulate mechanical
and electrical systems in a building. On a broader scale, smart cities can help citizens
reduce waste and energy consumption.
IoT touches every industry, including businesses within healthcare, finance, retail and
manufacturing.
There are numerous real-world applications of the internet of things, ranging from
consumer IoT and enterprise IoT to manufacturing and industrial IoT (IIoT). IoT
applications span numerous verticals, including automotive, telecom and energy.
In the consumer segment, for example, smart homes that are equipped with smart
thermostats, smart appliances and connected heating, lighting and electronic devices
can be controlled remotely via computers and smartphones.
Wearable devices with sensors and software can collect and analyze user data,
sending messages to other technologies about the users with the aim of making users’
lives easier and more comfortable. Wearable devices are also used for public safety --
for example, improving first responders' response times during emergencies by
providing optimized routes to a location or by tracking construction workers' or
firefighters' vital signs at life-threatening sites.
In healthcare, IoT offers many benefits, including the ability to monitor patients more
closely using an analysis of the data that's generated. Hospitals often use IoT systems
to complete tasks such as inventory management for both pharmaceuticals and
medical instruments.
Smart buildings can, for instance, reduce energy costs using sensors that detect how
many occupants are in a room. The temperature can adjust automatically -- for
example, turning the air conditioner on if sensors detect a conference room is full or
turning the heat down if everyone in the office has gone home.
In agriculture, IoT-based smart farming systems can help monitor, for instance, light,
temperature, humidity and soil moisture of crop fields using connected sensors. IoT is
also instrumental in automating irrigation systems.
In a smart city, IoT sensors and deployments, such as smart streetlights and smart
meters, can help alleviate traffic, conserve energy, monitor and address environmental
concerns, and improve sanitation.
The internet of things brings all the benefits of the internet to items like lightbulbs and
thermostats, but it brings all the problems of the internet, too. Now that people have
their speakers, television sets, refrigerators, alarm clocks, toothbrushes, light bulbs,
doorbells, baby monitors, and security cameras connected to the Wi-Fi, nearly every
device in the house can be compromised, or rendered useless. Consider the whims
of internet connectivity: When your Wi-Fi goes down, so do your devices. Router
problems? That means you can’t turn on the heat with your smart thermostat or unlock
your smart door lock. Things that used to be easy become potentially faulty, if not
impossible, when they require an Alexa command or a smartphone control rather than
a physical button. Many of these devices also run on proprietary software—meaning,
if their manufacturer goes bust, gets sold, or stops issuing software updates, your
clever little gadget becomes a useless hunk of plastic.
Risk of bricking aside, connecting things to the internet also leaves those objects, and
everything else on your Wi-Fi network, more vulnerable to hackers. Laura DeNardis,
in her recent book The Internet in Everything, has called this threat to cybersecurity
the greatest human rights issue of our time. The risk isn’t just that some prankster
breaks into your smart washing machine and upsets the spin cycle.
The threat to internet-connected devices comes not just because they’re connected to
the internet, but because device manufacturers have not always designed their
products with security as a priority. In 2016, malware called Mirai exploited these kinds
of vulnerabilities in over 600,000 IoT devices to create a massive distributed denial of
service (DDoS) attack. The following year, an attack called Krack affected nearly every
internet-connected device running on Wi-Fi. The attack was crippling and difficult to
defend against, in part because the internet of things runs on so many disparate
operating systems. When a phone or a computer gets hit with a virus, software makers
are generally quick to issue a patch. But things like routers or internet-connected
doorbells don’t usually receive software updates needed to protect against
vulnerabilities, and many of them weren’t built with the same kind of security protocols
as computers. After the Krack attack, one security researcher predicted that we would
still “find vulnerable devices 20 years from now.”
Then there’s the question of privacy. If cameras and microphones are studded around
your home, they are definitely watching and listening to you. Everything in the internet
of things collects data— and all that data has value. In a recent study, researchers
found that 72 of the 81 IoT devices they surveyed had shared data with a third party
unrelated to the original manufacturer. That means the finer details of your personal
life—as depicted by your smart toothbrush, your smart TV, or your smart speaker—
can be repackaged and sold to someone else. Google and Apple both admitted, last
year, that the recordings captured by their smart speakers are reviewed by
contractors, including awkward and intimate snippets of audio. Amazon has
partnerships with over 400 police departments, who use the footage from its Ring
doorbell cameras to keep watch on neighborhoods. An ever-expanding internet of
things doesn’t just have consequences for personal privacy. It can create a network of
computer eyes and ears everywhere we go.
Because IoT devices are closely connected, all a hacker has to do is exploit one
vulnerability to manipulate all the data, rendering it unusable. Manufacturers that don't
update their devices regularly -- or at all -- leave them vulnerable to cybercriminals.
Additionally, connected devices often ask users to input their personal information,
including names, ages, addresses, phone numbers and even social media accounts -
- information that's invaluable to hackers.
Hackers aren't the only threat to the internet of things; privacy is another major concern
for IoT users. For instance, companies that make and distribute consumer IoT devices
could use those devices to obtain and sell users' personal data.
Beyond leaking personal data, IoT poses a risk to critical infrastructure, including
electricity, transportation and financial services.
One day, the internet of things will become the internet of everything. The objects in
our world might sense and react to us individually all the time, so that a smart
thermostat automatically adjusts based on your body temperature or the house
automatically locks itself when you get into bed. Your clothes might come with
connected sensors, too, so that the things around you can respond to your movements
in real time. That’s already starting to happen: In 2017, Google announced Project
Jacquard, an effort to create the connected wardrobe of the future.
This vision extends far beyond your clothes, and even your home. You’ll also have
smart offices, smart buildings, smart cities. Smart hospital rooms will have sensors to
ensure that doctors wash their hands, and airborne sensors will help cities predict
mudslides and other natural disasters. Autonomous vehicles will connect to the
internet and drive along roads studded with sensors, and governments will manage
the demands on their energy grids by tracking household energy consumption through
the internet of things. The growth of the internet of things could also lead to new kinds
of cyber warfare; imagine a bad actor disabling every smart thermostat in the dead of
winter, or hacking into internet-connected pacemakers and insulin pumps. It could
create new class systems: those with robot maids, and those without. Or, as Ray
Bradbury described in one short story from 1950, all the people might disappear—but
the smart homes, preparing meals and sweeping the floors, will live on.
If we’re going to get there—whether we like “there” or not—we’re going to need faster
internet. (Enter: 5G.) We’ll also need to keep all those devices from mucking up the
airwaves, and we’ll need to find a better way to secure the data that’s transmitted
across those airwaves. Recently, the Swiss cryptography firm Teserakt introduced an
idea for a cryptographic implant for IoT devices, which would protect the data that
streams from these devices. There are also ideas for creating a better standard for IoT
devices, and plans to help them get along with each other, regardless of which
company makes them or which voice assistant lives inside.
• https://youtu.be/6YaXKxXSli0
• https://youtu.be/mLg95dLm-Gs
Overview
Information technology is an industry on the rise, and business structure, job growth,
and emerging technology will all shift in the coming years. Current trends are
improving and presenting new functions in fields like medicine, entertainment,
business, education, marketing, law enforcement, and more. Still, other much-
anticipated technology is only now coming on the scene.
Innovations in IT change internal company processes, but they are also altering the
way customers experience purchasing and support — not to mention basic practices
in life, like locking up your home, visiting the doctor, and storing files. The following
trends in information technology are crucial areas to watch in 2019 and viable
considerations that could influence your future career choices.
The latest technology methods and best practices of 2019 will primarily stem from
current trends in information technology. Advancements in IT systems relate to what
the industry is leaning toward or disregarding now. Information technology is
advancing so rapidly that new developments are quickly replacing current projections.
a. Cloud Computing
Cloud storage and sharing is a popular trend many companies have adopted
and even implemented for employee interaction. A company-wide network will
help businesses save on information technology infrastructure. Cloud services
will also extend internal functions to gain revenue. Organizations that offer
cloud services will market these for external products and continue their
momentum.
Organizations will transfer their stored files across multiple sources using
virtualization. Companies are already using this level of virtualization, but will
further embrace it in the year to come. Less installation across company
computers is another positive result of cloud computing because the Internet
allows direct access to shared technology and information. The freedom of new
products and services makes cloud computing a growing trend.
b. Mobile Technology
Mobile phones, tablets, and other devices have taken both the business world
and the personal realm by storm. Mobile usage and the number of applications
generated have both skyrocketed in recent years. Now, 77 percent of
Americans own smartphones — a 35 percent increase since 2011. Pew Research
Center also shows that going online via phone has increased, while fewer individuals
use traditional Internet services like broadband.
Experts project mobile traffic to increase even further in 2019, and mobile
applications, consumer capabilities, and payment options will be necessary for
businesses. The fastest-growing companies have already established their
mobile websites, marketing, and apps for maximized security and user-
friendliness. Cloud apps are also available for companies to use for on-the-go
capabilities.
c. Big Data
Big data can be observed for its potential in data management positions and in
helping organizations operate optimally. Database maintenance is a growing sector
of technology careers. To convert various leads into paying customers, big data is an
essential trend to continue following in 2019.
d. Automation
Trends in information technology emerging in 2019 are new and innovative ways for
the industry to grow. These movements in information technology are the areas
expected to generate revenue and increase demand for IT jobs. Pay attention to these
technological changes and unique products that enhance business operations.
a. Artificial Intelligence
Artificial Intelligence, or AI, has already received a lot of buzz in recent years, but it
continues to be a trend to watch because its effects on how we live, work, and play
are only in the early stages. In addition, other branches of AI have developed, including
Machine Learning, which we will go into below. AI refers to computer systems built to
mimic human intelligence and perform tasks such as recognition of images, speech or
patterns, and decision making. AI can do these tasks faster and more accurately than
humans.
Five out of six Americans use AI services in one form or another every day, including
navigation apps, streaming services, smartphone personal assistants, ride-sharing
apps, home personal assistants, and smart home devices. In addition to consumer
use, AI is used to schedule trains, assess business risk, predict maintenance, and
improve energy efficiency, among many other money-saving tasks.
In fact, artificial intelligence is already being used in different organizations to help
solve problems. For example, AI face recognition is beginning to help with missing-
person reports, and it even helps identify individuals in criminal investigations when
cameras have captured their images. According to the National Institute of Standards
and Technology, face recognition is most effective when AI systems and forensic facial
recognition experts team up. AI will continue to promote safety for citizens in the future
as software improvements shape these applications.
Medical AI is another trend that reflects surprising success. Given patient information
and risk factors, AI systems can anticipate the outcome of treatment and even
estimate the length of a hospital visit. Deep learning is one way AI technology gets
applied to health records to find the likelihood of a patient’s recovery and even
mortality. Experts evaluate data to discover patterns in the patient’s age, condition,
records, and more.
Home AI systems are also increasingly popular to expedite daily tasks like listening to
tunes, asking for restaurant hours, getting directions, and even sending messages.
Many problem-solving AI tools also help in the workplace, and the helpfulness of this
technology will continue to progress in 2020.
b. Virtual Reality
Virtual reality (VR) is the use of computer modeling and simulation that enables a person
to interact with an artificial three-dimensional (3-D) visual or other sensory
environment. VR applications immerse the user in a computer-generated environment
that simulates reality through the use of interactive devices, which send and receive
information and are worn as goggles, headsets, gloves, or body suits. In a typical VR
format, a user wearing a helmet with a stereoscopic screen views animated images of
a simulated environment. The illusion of “being there” (telepresence) is effected by
motion sensors that pick up the user’s movements and adjust the view on the screen
accordingly, usually in real time (the instant the user’s movement takes place). Thus,
a user can tour a simulated suite of rooms, experiencing changing viewpoints and
perspectives that are convincingly related to his own head turnings and steps. Wearing
data gloves equipped with force-feedback devices that provide the sensation of touch,
the user can even pick up and manipulate objects that he sees in the virtual
environment.
Immersion is the single biggest difference between immersive Virtual Reality systems
and traditional user interfaces.
For instance, CAVE automatic virtual environments actively display virtual content
onto room-sized screens. While they are fun for people in universities and big labs,
consumer and industrial wearables are the wild west.
With a multiplicity of emerging hardware and software options, the future of wearables
is unfolding but yet unknown. Concepts such as the HTC Vive Pro Eye, Oculus Quest
and Playstation VR are leading the way, but there are also players like Google, Apple,
Samsung, Lenovo and others who may surprise the industry with new levels of
immersion and usability. Whoever comes out ahead, the simplicity of buying a
helmet-sized device that can work in a living-room, office, or factory floor has made
HMDs center stage when it comes to Virtual Reality technologies.
c. Augmented Reality
Augmented reality is a more versatile and practical version of virtual reality, as it does
not fully immerse individuals in an experience. Augmented reality features interactive
scenarios that enhance the real world with images and sounds that create an altered
experience. The most common current applications of this overlay of digital images on
the surrounding environment include the recent Pokémon Go fad.
As it happens, phones and tablets are the way augmented reality gets into most
people's lives. One of the most popular ways AR has infiltrated everyday life is through
mobile games. In 2016, the AR game "Pokémon Go" became a sensation worldwide,
with over 100 million estimated users at its peak, according to CNET. It ended up
making more than $2 billion and counting, according to Forbes. The game allowed
users to see Pokémon characters bouncing around in their own town. The goal was
to capture these pocket monsters using your smartphone camera, and then use them
to battle others, locally, in AR gyms.
Another app called Layar uses the smartphone's GPS and its camera to collect
information about the user's surroundings. It then displays information about nearby
restaurants, stores and points of interest.
Augmented reality can impact many industries in useful ways. Airports are
implementing augmented-reality guides to help people get through their checks and
terminals as quickly and efficiently as possible. Retail and cosmetics are also using
augmented reality to let customers test products, and furniture stores are using this
mode to lay out new interior design options.
This doesn't mean that phones and tablets will be the only venue for AR. Research
continues apace on including AR functionality in contact lenses, and other wearable
devices. The ultimate goal of augmented reality is to create a convenient and natural
immersion, so there's a sense that phones and tablets will get replaced, though it isn't
clear what those replacements will be. Even glasses might take on a new form, as
"smart glasses" are developed for blind people.
Like any new technology, AR has a lot of political and ethical issues. Google Glass,
for example, raised privacy concerns. Some worried that conversations might be
surreptitiously recorded or pictures snapped, or thought that they might be identified
by face recognition software. AR glasses, contacts and more, like the Glass - X and
Google Lens, though, are moving ahead in production and sales.
The possibilities for augmented reality in the future revolve around mobile applications
and health care solutions. Careers in mobile app development and design will be
abundant, and information technology professionals can put their expertise to use in
these interactive experiences.
d. Blockchain Data
Blockchain data, like the cryptocurrency Bitcoin, is a secure method of record-keeping
that will continue to grow in popularity and use in 2019. This system allows you to input
additional data without changing, replacing, or deleting anything. With the influx of
shared data systems like cloud storage and resources, protecting original data without
losing important information is crucial.
The authority of many parties keeps the data accounted for without turning over too
much responsibility to certain employees or management staff. For transaction
purposes, blockchain data offers a safe and straightforward way to do business with
suppliers and customers. Private data is particularly secure with blockchain systems,
and the medical and information technology industries can benefit equally from added
protection.
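The append-only property described above can be illustrated with a very small hash chain. This is a teaching sketch, not a real cryptocurrency or production blockchain; the block contents and amounts are invented for the example.

import hashlib
import json
import time

def make_block(data, previous_hash):
    # Each block records its data plus the hash of the block before it.
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

def is_valid(chain):
    # Check that each block still points at the hash of the block before it.
    return all(curr["previous_hash"] == prev["hash"]
               for prev, curr in zip(chain, chain[1:]))

chain = [make_block("genesis", "0")]
chain.append(make_block({"from": "supplier", "to": "store", "amount": 120}, chain[-1]["hash"]))
chain.append(make_block({"from": "store", "to": "customer", "amount": 1}, chain[-1]["hash"]))

print(is_valid(chain))         # True
chain[1]["hash"] = "tampered"  # try to rewrite history
print(is_valid(chain))         # False

Because each block depends on the hash of the one before it, new records can be added, but existing records cannot be changed or deleted without detection.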
e. Internet of Things
The Internet of Things (IoT) is an emerging movement of products with integrated Wi-
Fi and network connectivity abilities. Cars, homes, appliances, and other products can
now connect to the Internet, making activities around the home and on the road an
enhanced experience. Use of IoT allows people to turn on music hands-free with a
simple command, or lock and unlock their doors even from a distance.
These IoT products can aid in many company procedures. Voice recognition and
command responses will allow you to access stored data on cloud services.
IoT enriches the IT industry, especially in job creation. Within the next few years, IoT-
related careers will increase, and there will be a need for 200,000 additional IT
workers, according to IT Pro Today. Design, troubleshooting, and support of IoT
products need extensive training and a specific set of skills.
f. 5G
5G is the 5th generation mobile network. It is a new global wireless standard after 1G,
2G, 3G, and 4G networks. 5G enables a new kind of network that is designed to
connect virtually everyone and everything together including machines, objects, and
devices.
5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-
low latency, more reliability, massive network capacity, increased availability, and a
more uniform user experience to more users. Higher performance and improved
efficiency empower new user experiences and connect new industries.
5G is a unified, more capable air interface. It has been designed with an extended
capacity to enable next-generation user experiences, empower new deployment
models and deliver new services. With high speeds, superior reliability and negligible
latency, 5G will expand the mobile ecosystem into new realms. 5G will impact every
industry, making safer transportation, remote healthcare, precision agriculture,
digitized logistics — and more — a reality.
Broadly speaking, 5G is used across three main types of connected services, including
enhanced mobile broadband, mission-critical communications, and the massive IoT.
A defining capability of 5G is that it is designed for forward compatibility—the ability to
flexibly support future services that are unknown today.
References
• https://internetofthingsagenda.techtarget.com/definition/Internet-of-Things-IoT
• https://www.wired.com/story/wired-guide-internet-of-things/
• https://www.forbes.com/sites/jacobmorgan/2014/05/13/simple-explanation-internet-things-that-anyone-can-understand/#16d8137b1d09
• https://www.vistacollege.edu/blog/careers/it/trends-in-information-technology-for-2019/
• https://www.simplilearn.com/top-technology-trends-and-jobs-article
• https://thebossmagazine.com/future-virtual-reality/
• https://www.livescience.com/34843-augmented-reality.html
• https://www.qualcomm.com/invention/5g/what-is-5g
Assessment 6
1. Enumerate 5 IoT devices and discuss what smart abilities they have.
2. Differentiate virtual reality technology and augmented reality technology from one
another.
3. What are the positive and negative implications of these emerging trends to our
culture and society?
Overview
Graphic design concepts are almost like building blocks. Each layer is stacked on top
of another to form the base for creating something unbelievable — whether you're
creating a logo, a website, or a unique picture. The basic principles of graphic design
come with different fundamentals to consider. In this module, we will explore some of
them so that we are able to create a good design.
Objectives
Graphic Design is a process in which we use typography, images, colors, icons and
other illustrations to communicate visually. The term was first coined by William
Addison Dwiggins in 1922, when he called himself a “graphic designer”. However, graphic
design is a thousand-year-old craft which dates back to ancient cave drawings. In
today’s era, we use graphic design not just to communicate visually but also to provide
a good user experience (for software developers) and to boost or improve one’s
emotions (with the use of colors). The applications of graphic design vary from
PowerPoint presentations, web/mobile applications, posters, and logos to even
paintings.
Line
A line is a kind of shape which connects two or more points. It is also considered as
one of the essential elements of graphic design. Lines can be thick, thin, curved, or
jagged. Figure 7.1 shows the different styles of a line.
Shape
Figure 7.3 shows the different examples of geometric and organic shapes.
Form
Form refers to objects that have, or appear to have, three dimensions; designers use
shadows or lighting to create an illusion of a form. It also gives the object a sense of
place. Figure 7.5 shows that a ball, if made two-dimensional, is just a circle.
Texture
Texture refers to the physical quality of the surface of an object in an artwork or design.
It also refers to how an object looks or feels. An object might be smooth, rough,
shiny, hard, or soft. It can be in 3D (real texture) or 2D (visual texture). Texture adds
depth and visual interest to flat images or objects.
Balance
Visual balance is the creation of visual equilibrium by relating elements such as line,
shape, color, space or form in terms of their visual weight. Basically, there are two
kinds of visual balance: symmetrical and asymmetrical.
Branding is simply what people think about you, your company, your product or your
service. For example, we think of Albert Einstein as the epitome of intelligence, and
that is how he was branded. Identity or visual identity is the visual representation of a
brand. It can be in the form of an image, a choice of color or typography, and many more.
For example, we quickly recognize a company and its service just by looking at its logo,
typeface, or color combinations.
Let’s cite one example. We can easily associate a school by simply looking at its
color combination. For instance, if we see red and maroon, we can easily
connect it with PUP, or red and green for UP.
Branding and identity are not just for products and services, we can even apply it to
how we work or what type of output we produce. For example, the use of bright and
bold colors is often associated with the famous painter, Vincent Van Gogh.
1. Establish clear purpose and positioning. Recall why you or your company
exists, who your target audience is, and what makes you or your service
unique among your competitors.
2. Conduct thorough market research. Do a deep analysis of your target
audience and their personalities, which will lead you to the next step.
3. Get a personality. Based on your research, determine your brand’s
personality. Brand personality makes a huge impact on the visuals of your
marketing materials.
4. Create a polished logo. In creating your logo, it should be simple, scalable,
and memorable. Observe the logos of famous companies like Amazon, Google,
and IBM. What do these logos have in common?
5. Create an attractive color palette. Your color palette should be simple and
contain one to three primary colors. Once you have established your color
palette, you may play with their color family. For example, if you choose blue
as your primary color, you may use sky blue, baby blue, and other colors under
the blue family to support your primary color.
6. Select professional typography. When selecting fonts, it is important to
consider these things:
a. Do not make it fancy. Fancy typefaces only make your text confusing.
We might be confused sometimes in using lay out instead of layout. Please take note
that these two are not the same. Lay out is a verb phrase which simply means to
arrange something, while layout is a noun which means how things are organized.
Proximity
Proximity is the process of placing related elements together. Elements that are not
related should be separated to show that they do not belong to that group. Take a
family reunion as an example: you are grouped by family, and anyone who does not
have any relationship to your family should be separated. In design, blocks of text or
graphics that are related should be grouped together to make your design easier to
understand.
Fig. 7.9 Sample design which shows the division of related elements.
White Space
White space is not literally the white space that you find in a design but rather
the negative space between lines, paragraphs, and elements of the design. In his
article Importance in White Space in Design, Pratik Hedge described white space as:
2. Focus and attention. Macro white spaces help guide the viewers to the focus
area in the design.
3. Increased interaction rate. If used wisely, white space in design helps the
viewer get the message quickly even without looking at the instructions. Take
a look at Google’s homepage UI. White space helps the viewer get the
message, which is to search.
4. Guide the user through local grouping. White space helps you achieve
proximity in your design.
5. Branding and Design Tone. Looking back at the steps of brand identity
design, the way you apply white space in your design helps you create your
own brand’s personality.
6. Creates a breathing space for users. A lot of people believe that in design, one
must maximize the space by putting content on it. However, this might make
your design look stuffy. Having enough white space lets your eyes rest, helps us
breathe, and keeps us from being overwhelmed with information.
Alignment
Contrast
Contrast means one element is opposite to another element. This does not only apply
to colors, but to the typeface and size of elements as well. Contrast helps you catch the
viewer’s eye, create a direction, or give emphasis to something. For example, if you
use a dark color for your background, you should use a light color for your foreground; or
you may use a different text style to give emphasis to your content.
Repetition
Repetition simply means the use of the same typefaces, color palettes, or other
elements to achieve consistency in your composition. This creates unity in your
composition and makes your projects connected to each other. For example, if you create
a PowerPoint presentation, you should use the same color palette and text styles in
all of your slides.
Lesson 4: Typography
Typography is the art of arranging texts that makes it readable and appealing to the
viewer. It involves font style, typeface, and text structure.
Some people often misuse the term “font” as typeface. So, let us explain first the
difference between font and typeface.
Font refers to the variation of weights of a typeface, while typeface refers to the text
style. Font also refers to the format or storage mechanism of a text like .otf and .ttf.
For example, Arial Narrow, Arial Black, and Arial Rounded are fonts under the Arial
typeface.
Types of Fonts
Serif
Serif fonts are fonts that have little strokes called serif on each end of the letter. They
are typically used in formal or traditional projects. Examples of typefaces with serifs
are Times New Roman, Baskerville Old Face, and Californian FB.
Sans Serif
Sans serif fonts are fonts with no extra strokes. Sans serif simply means “without serifs,” as
sans is a French word for “without.” These fonts are normally found on mobile phone
and computer screens. Examples of this type are Calibri, Arial, and Roboto.
Display
Display fonts are sometimes called fancy or decorative fonts. They can be script,
blackletter or all caps. These types of fonts are used on special occasions like
invitations, titles, or posters. Examples of display fonts are Advertising Script, Bangers,
and Forte.
Whether you are new to or experienced in graphic design, one dilemma that most graphic
designers face is what fonts or typefaces they are going to use. One mistake that
beginners commit is the misuse of fonts or typefaces. The font or typeface you choose
should portray the message that you want to convey to your viewers. In design, fonts
and typefaces do matter.
Fig. 7.16 shows the different interpretations of these notes with the
same message but using different typefaces
There are typefaces that are overused and outdated, like Comic Sans, Papyrus,
Jokerman and Curlz MT. Though there is nothing wrong with using them, these typefaces
are discouraged.
In choosing typefaces on your design, limit yourself to one or two per project, and you
may play with their family of fonts for emphasis or contrast. You may combine serifs
and sans serifs, display and serifs, or display and sans serifs.
Lesson 5: Color
Colors are very essential to your compositions. One may use a combination of one
or more colors. It may be our instinct to choose colors, but there is a science behind it,
called Color Theory. Color Theory describes how different colors relate to each
other and how they appear when they are mixed into other color schemes. Before we
proceed to the different color schemes, let us go over some terminologies used in color.
• Hue. Refers to pure, vibrant colors.
• Saturation. Refers to the intensity of the color. It ranges from black and white
(or grayscale) to vibrant color.
• Value. Refers to the lightness or darkness of a color. For example, from light
blue to dark blue.
Color Schemes
Of course, we can still remember the lessons about color during art lessons. We
have primary colors, then secondary colors and tertiary colors. A circular diagram of
these colors is called a color wheel (Figure 7.17). Using this wheel, we can create
our own color scheme or combination.
This color scheme only focuses on one color, and often using variations by
incorporating saturations or values. For example, if you chose the color blue, then you
may have other colors under the same color family like sky blue, baby blue, navy blue,
or dark blue.
This color scheme only revolves on using desaturated colors like black, gray, and
white.
Analogous color scheme selects a group of three colors that are adjacent in the color
wheel.
These are colors that are directly opposite each other in the color wheel. Usually, this is a
combination of a primary and a secondary color.
Split-complementary color scheme uses the colors on both sides of the opposite color.
This color scheme uses colors that form an equilateral triangle. It may be a
combination of primary, secondary, or tertiary colors.
Also known as double complementary. This color scheme uses two pairs of
complementary colors.
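As a rough sketch of how these schemes relate to the color wheel, the rotations can be expressed in a few lines of Python using the standard colorsys module. The base color and angles below are just examples; hue here is a fraction of a full turn around the wheel.

import colorsys

def rotate_hue(rgb, degrees):
    # Rotate a color's hue around the wheel, keeping saturation and value.
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

base = (0.2, 0.4, 0.8)  # a blue, as (R, G, B) values between 0 and 1

complementary = rotate_hue(base, 180)                        # directly opposite
analogous = [rotate_hue(base, d) for d in (-30, 30)]         # adjacent colors
split_complementary = [rotate_hue(base, d) for d in (150, 210)]
triadic = [rotate_hue(base, d) for d in (120, 240)]          # equilateral triangle

print([round(c, 2) for c in complementary])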
Whenever we open an image editing application or buy a printer ink at the store, we
may observe labels like RGB or CMYK. So, what are they? They are color profiles that
we need to consider if we create designs.
• RGB. This color profile consists of Red, Green, and Blue. You should use this
profile for design that are intended for screen displays.
• CMYK. This color profile consists of Cyan, Magenta, Yellow, and Key (Black).
If you have a printer in your house, you probably see these colors as inks. This
profile is intended for designs that are to be printed.
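To see why the two profiles are handled differently, here is the commonly used approximate formula for converting RGB values to CMYK percentages. Real print workflows rely on calibrated color profiles, so treat this as a sketch rather than an exact conversion.

def rgb_to_cmyk(r, g, b):
    # Convert 0-255 RGB values to approximate CMYK percentages (0-100).
    if (r, g, b) == (0, 0, 0):
        return (0, 0, 0, 100)        # pure black is all Key (K)
    r_, g_, b_ = r / 255, g / 255, b / 255
    k = 1 - max(r_, g_, b_)
    c = (1 - r_ - k) / (1 - k)
    m = (1 - g_ - k) / (1 - k)
    y = (1 - b_ - k) / (1 - k)
    return tuple(round(x * 100) for x in (c, m, y, k))

print(rgb_to_cmyk(255, 0, 0))    # red    -> (0, 100, 100, 0)
print(rgb_to_cmyk(0, 128, 255))  # a blue -> roughly (100, 50, 0, 0)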
Lesson 6: Image
Images are not just limited to photographs; they also include graphics and other
illustrations. Having images in your composition makes it appealing to the eyes of
your viewer. Take a magazine as an example: imagine that your favorite magazine
contains no images. You would not want to read it, right? That is the power of images –
they are not just decorations in your composition.
Finding and placing the right image is not a difficult thing as long as you know what
kind of image you are going to use in your composition. Do you remember using clip
art on your project designs? If yes, then I encourage you not to use it today, as the
clip art era is over. Stock photos are now popular in any project.
Most people are now relying on stock photos as they are free or sometimes cost less.
There are various stock image websites all over the internet. The only thing that you
need to do is to choose pictures/images for your composition.
There are different file types of images, and they are grouped into two categories:
vector and raster.
• Vector. It is a type of image that does not lose its quality when zoomed in.
Your image will not be pixelated when enlarged.
• Raster. Opposite to vector, raster images become pixelated when enlarged.
• Vector Image File Extensions
• Encapsulated Postscript (EPS). This vector format is designed to produce
high-resolution graphics for print. Being a universal file type, EPS files can be
opened in any design editor.
• Adobe Illustrator Document (AI). The most preferred and commonly used image
file type among designers. If you want to create a vector image, Adobe Illustrator is
one of the best tools for you.
Assessment 7
Using the fundamentals of graphic design, enumerate and explain the elements of
this image that need improvement, then create your own version of this design.
References
• https://www.slideshare.net/LeahLewman/elements-of-art-shape-88242623
• https://visme.co/blog/wp-content/uploads/2015/09/designelements.jpg
• https://99designs.com/blog/tips/graphic-design-basics/#4
• https://faculty.washington.edu/farkas/dfpubs/Farkas-Farkas-Graphic%20Design-Ch11Principles%20of%20Web%20Design.pdf
• https://www.interaction-design.org/literature/topics/graphic-design
• http://www.wcs.k12.mi.us/cousino/wcsart/art%20foundatons%20site/texture.html
• https://www.thoughtco.com/definition-of-balance-in-art-182423
• https://writingexplained.org/layout-or-lay-out-difference
• https://www.edgee.net/the-principles-of-graphic-design-how-to-use-proximity-effectively/
• https://www.portlandlocalist.com/blog/how-to-improve-the-design-of-your-email-newsletter
• https://edu.gcfglobal.org/en/beginning-graphic-design/typography/1/
• https://www.shutterstock.com/blog/complete-guide-color-in-design
• https://blog.hubspot.com/insiders/different-types-of-image-files
Introduction
Opening Photoshop® for the first time is like cracking open a fantasy novel that opens
up an entirely new world of strange creatures, opposite natural laws and a completely
new language. That new fantasy world is bursting with exciting possibilities yet bogged
down by so many unknowns.
Objectives
Photography
Photography is an art form like drawing and painting. Photographers use their camera
to make us see life in a different way, feel emotions, and record stories and events.
Photography is Greek for “painting with light” and can be considered both an art and a science.
Photography is a science because there are basic principles of physics that govern
success, and photography is an art because its beauty is subjective.
The world’s first photograph made in a camera was taken in 1826 by Joseph
Nicéphore Niépce. The photograph was taken from the upstairs windows of Niépce’s
estate in the Burgundy region of France. This image was captured via a process known
as heliography, which used Bitumen of Judea coated onto a piece of glass or metal;
the bitumen then hardened in proportion to the amount of light that hit it.
Landscape Photography
Portrait Photography
A portrait photograph is a picture of a person or animal that shows an emotional
connection.
Documentary Photography
Documentary photography tells a story without changing the facts. It can be a portrait
or a landscape. Remember that a good documentary photograph makes you wonder
about the story behind the picture.
Exposure
Overexposed vs Underexposed
Aperture
It is the size of the hole in the diaphragm that allows light into the camera. The larger
the hole, the more light that enters the camera in a given time. Aperture comes from
the Latin for ‘opening’. f/stop values : f/1.0 f/1.1 f/1.2 f/1.4 f/1.6 f/1.8 f/2.0 f/2.2 f/2.5
f/2.8 f/3.2 f/3.5 f/4.0 f/4.5 f/5.0 f/5.6 f/6.3 f/7.1 f/8.0 f/9.0 f/10 f/11 f/13 f/14 f/16 f/18 f/20
f/22 f/25 f/29 f/32
The aperture does more than just control the amount of light that hits the sensor – the
size of the aperture affects the way an image looks as well. Specifically, it affects the
depth of field you can achieve. Depth of field is an expression describing how much
of a photo is in focus. If you use a large aperture (a smaller f-number), you get shallow
depth of field, which means that if you take a portrait photo, your subject will be in
focus, but the background will be out of focus. Aperture values (Av) are measured using
f-stops, shown as f/# (e.g., f/16).
The Av controls the amount of depth of field in an image. The wider the aperture, the
shallower the depth of field, and vice versa.
Av & Tv Together
The wider the aperture used, the less time – i.e. the faster shutter speed – needed to
properly expose the image. Conversely the slower the shutter speed the smaller the
aperture needs to be. For any image, there are a number of aperture and shutter speed
combinations that will make a correct exposure.
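The reciprocity between aperture and shutter speed can be checked numerically: at a fixed ISO, exposure value (EV) is commonly written as log2(N²/t), where N is the f-number and t is the shutter time in seconds. The figures below are illustrative only, not metering advice.

from math import log2

def exposure_value(f_number, shutter_seconds):
    # Same EV means the same amount of light reaches the sensor.
    return log2(f_number ** 2 / shutter_seconds)

print(round(exposure_value(8.0, 1 / 125), 2))  # f/8 at 1/125 s -> about 13 EV
print(round(exposure_value(4.0, 1 / 500), 2))  # f/4 at 1/500 s -> the same EV

Opening up two stops (f/8 to f/4) lets in four times the light, so the shutter must be four times faster (two stops) to keep the exposure unchanged.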
Shutter speed
The function of the shutter mechanism is to admit light into the camera, and onto the
digital media or film for a specific length of time.
B=Bulb
Note: Faster shutter speeds mean less light on the image sensor. Slower shutter
speeds mean more light.
Shutter speed controls the amount of time the shutter or curtain is open. It is measured
in fractions of seconds. Depending on the camera, it may show the shutter speed
without the numerator, i.e., 250 instead of 1/250. A doubling or halving of the time
value (Tv) represents one stop of EV. Like with aperture, shutter speed affects more
than just the amount of light. It also affects motion in a photo, which makes sense when
you think about it. Your camera chip is measuring light as long as the shutter is open.
If the shutter is open for a second and the scene changes in the duration of that second,
the light reflecting off your subject will also move across the frame.
ISO is the measure of the sensitivity of the film or sensor in a camera. It is measured in
values using ISO numbers. ISO refers to the International Organization for
Standardization. With both analog and digital cameras, ISO refers to the same thing:
the light sensitivity of either the film or imaging sensor. ISO numbers are linear in their
relationship. The higher the ISO number, the more sensitive the film/sensor, and also
the more noise or grain in the image.
ISO: 100, 200, 400, 800, 1600, 3200, 6400, 12800
Shutter speed: B 1” 0”8 0”6 0”5 0”4 0”3 1/4 1/5 1/6 1/8 1/10 1/13 1/15 1/20 1/25 1/30
1/40 1/50 1/60 1/80 1/100 1/125 1/160 1/200 1/250 1/320 1/400 1/500 1/640 1/800
1/1000 1/1250 1/1600 1/2000 1/2500 1/3200 1/4000 1/5000 1/6400 1/8000
Stops
The amount of light that strikes the film/chip is measured in stops and is also known as
exposure value (Ev). A difference of 1 stop is a doubling or halving of the light making
the image. Stops are used to measure differences in Ev between apertures, shutter
speeds and film speeds. “Fast” means the camera exposes a photo very quickly.
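Because a stop is a doubling or halving of light, the difference in stops between two settings is just a base-2 logarithm of their ratio, as in this short sketch (the sample values are arbitrary).

from math import log2

def stops_between(value_a, value_b):
    # How many stops apart two linear amounts of light are.
    return log2(value_b / value_a)

print(stops_between(100, 800))               # ISO 100 -> ISO 800: +3 stops
print(round(stops_between(1/60, 1/250), 1))  # 1/60 s -> 1/250 s: about -2 stops (less light)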
Lighting
Light is the essential ingredient of photos. One of the skills that separates
photographers from snap shooters is the ability to solve lighting problems. There are
two primary factors to consider for light: Direction and Color.
Lighting – Direction
The direction the light comes from can make the image seem flat or 3-Dimensional.
Front lighting is easy to photograph, but images are generally flat. Top lighting, such
as from the sun overhead, also makes images flat, with short, dark shadows.
Side lighting will emphasize texture and contours and create long shadows.
Lighting – Color
The color of light is measured by temperature in kelvin (K). Natural light changes
through the day, and humans respond psychologically to different colors; therefore, the
color of a photo will affect emotional responses. Light that is in the reds, oranges and
yellows is said to be “warm”; conversely, “cool” light is blue in tone.
Lighting – Color
When taking a photo with a digital camera, the white balance setting of the camera will
affect the color cast of the image, balancing the lighting of the subject. Typical white
balance settings of a camera include tungsten, fluorescent, shade, sunny, cloudy, flash,
auto and manual. Filters can also be used to affect the color of light in the image.
Move a few steps closer or use the zoom until the subject fills the
viewfinder. You will eliminate background and distractions and show off
the details in your subject.
For small object, use the camera’s macro or ‘flower’ mode to get sharp
close-ups.
5. Take some vertical pictures.
Many subjects look better in a vertical picture, from the Eiffel Tower to
portraits of your friends.
Make a conscious effort to turn your camera sideways and take some
vertical pictures. (Sample Water Falls)
6. Lock the focus.
Lock the focus to create a sharp picture of off-center subjects
1. Center the subject
2. Press the shutter button halfway down
3. Re-frame your picture (while still holding the shutter button)
4. Finish by pressing the shutter button all the way
7. Move it from the middle. (Rule of thirds)
• Bring your picture to life simply by placing your subject off-center
• Imagine a tic-tac-toe grid in your viewfinder. Now place your subject
at one of the intersections of lines.
• Since most cameras focus on whatever’s in the middle remember to
lock the focus on your subject before re-framing the shot.
8. Know your flash range.
• Pictures taken beyond the maximum flash range will be too dark
• For many cameras that’s only ten feet – about four steps away.
Check your manual to be sure.
• If the subject is further than ten feet from the camera, the picture may
be too dark.
9. Watch the light.
• Great light makes great pictures. Study the effects of light in your
pictures
• For people pictures, choose the soft lighting of cloudy days. Avoid
overhead sunlight that casts harsh shadows across faces
• For scenic pictures, use the long shadows and color of early and late
daylight
10. Be a picture director.
• Take an extra minute and become a picture director, not just a
passive picture- taker
• Add some props, rearrange your subjects, or try a different viewpoint
Post Processing
Post processing is the process of editing the data captured by the camera while taking the
photo to enhance the image. The better the data captured when the photo is taken, the
better the enhancement possibilities. More and more cameras that can capture RAW
files have come onto the market. RAW files have much more data at the pixel level,
which helps in post processing and enhancing the image.
Post processing can surely help in enhancing the image but might not be able to
convert a really bad exposure into an excellent one. There are various stages of post
processing, depending on the final result that one wants to achieve.
There are basically two things that are done in post processing:
1. An algorithm is run on all the existing pixel data and minor changes are applied
to the pixel data.
2. Manually selecting and replacing pixel data with totally new data.
There is software by camera manufacturers and specialist software vendors, and there are
also freeware and free software available for taking care of post processing needs.
RAW file handling and conversion is possible in RawTherapee, UFRaw, darktable,
Adobe Lightroom, Adobe Camera Raw, FSViewer and many more. JPG file editing is
generally done in image editors that have various features such as grain and red-eye
removal, for example GIMP (free), Adobe Photoshop, or Photoshop Elements.
There are a lot of changes possible during post processing. The sequence of these changes
is important, as a change made at one stage can affect the effectiveness of the next stage.
Most of the software is also organized in a fashion that guides users through a
smooth workflow.
A few actions can achieve great results when applied to a RAW file. These can also yield
results in JPG files in case you do not have a RAW file.
Some of the actions that may be preferred while editing a RAW file are:
1. Exposure Value adjustment
2. White balance adjustment
3. Hue and tone adjustment
4. Highlight and shadow recovery
5. Vibrance and saturation adjustment
6. Cropping & Rotation
Some of the actions that can be done at the RAW or JPG stage are:
1. Noise reduction
2. Sharpening
Some of the actions that are preferred after conversion to JPG files are:
1. Red-eye removal
2. Local touch up of cloning to erase unwanted object in frame
3. Adding of frame
4. Mixing with other jpg files like changing the background.
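As a rough illustration only (not from the module), the sketch below strings a few of
these adjustments together in Python using the Pillow library; the file name photo.jpg,
the enhancement factors, and the crop box are placeholders chosen for demonstration.

from PIL import Image, ImageEnhance, ImageFilter

img = Image.open("photo.jpg")

# Exposure-like adjustment: brighten the image slightly.
img = ImageEnhance.Brightness(img).enhance(1.1)

# Vibrance/saturation adjustment: make colours a little more vivid.
img = ImageEnhance.Color(img).enhance(1.2)

# Contrast adjustment (a rough stand-in for tone adjustment).
img = ImageEnhance.Contrast(img).enhance(1.05)

# Sharpening, usually one of the last steps before export.
img = img.filter(ImageFilter.UnsharpMask(radius=2, percent=100))

# Rotation and cropping (the angle and box are placeholder values).
img = img.rotate(-1.5, expand=True).crop((50, 50, 1000, 700))

img.save("photo_final.jpg")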
Online Resources
Assessment 8
References
• Post Processing, retrieved from
https://en.wikibooks.org/wiki/Digital_Photography/Post_Processing
• Introduction to Photography, retrieved from
https://carleton.ca/healthy-workplace/wp-content/uploads/Intro-to-Photo-presention-20112.pdf
• Photography Guide, retrieved from
https://www.creativelive.com/photography-guides/post-processing
• What is Photography?, retrieved from
http://163.14.73.3/~chiang/course/CG/ClassNotes/photography.ppt
• History of Photography, retrieved from
https://petapixel.com/2015/05/23/20-first-photos-from-the-history-of-photography/
• Top Ten Tips in Photography, retrieved from
http://www.kodak.com
Nearly two-thirds of the population are visual learners, which means it's important to
put a heavy emphasis on showing, not telling.
An infographic is the perfect way to visually represent important data and information
so that your audience has a greater chance of understanding and retaining it.
And the best way to do that is with an infographic design guide. Infographics can
communicate information in a condensed and highly visual way—when designed well.
Here's one example:
The problem is that for every well-made infographic published, there are a handful of
poorly produced infographic designs circling the web.
Poorly designed infographics can skew and obscure information, rather than make it
easier to understand.
Objectives
Creating a beautiful and effective infographic design isn't hard; it just takes a basic
understanding of infographic design best practices.
Find a story. In every set of data there's a story. Before you begin designing your
infographic, think of the story you are trying to tell. The angle you choose will help you
determine which information to include.
For example, this infographic design tells the story of completing a project from start
to finish:
Because infographics allow for limited space for content, the purpose of your
infographic should be focused. That's why the layout of your infographic should not
only reflect the theme of your information, but also enhance the communication of it.
Before diving into your design, create an infographic outline. In your outline, include
your headers, data, and any design details you don't want to forget.
Come up with a title that is catchy and descriptive. Readers should get a sense of what
the information will be, to engage them and make them want to read further.
Grids and wireframes are the structural base to any design. Designing on a grid allows
you to easily organize elements and information. Grid designs also play an essential
role in keeping objects and elements aligned.
For example, when aligning items, you can reference the same vertical grid line and
then space each list item accordingly.
The below example shows how color palettes are aligned on a graphic design grid.
The Venngage editor also has the option of using "smart guides" which help align
items for you automatically, without using a grid design tool:
For your infographic design, use a grid design system to create margins. It's good
practice to keep enough space in between your elements and the edge of your canvas
to avoid visual tension.
An infographic design can potentially have any size of margin you desire but it's
important to keep the margin consistent all the way along the edge of your canvas.
A good rule of thumb is to keep all objects and elements at least 20px (one square on
the grid) away from the edge of the canvas.
For example, you could use a one-column layout for a minimal infographic, or create
a list infographic by splitting the layout into two columns, like in these examples:
Always start your infographic planning with pen and paper. This way, you can work
through concepts and designs roughly before finalizing a digital copy.
The type of data you are trying to convey will determine which chart type is the best
for your data. To decide which type of chart would best convey your data, you first
have to determine what kind of data you want to present: a single important number?
A comparison between data points? A trend over time? An outlier?
The types of charts most commonly used in infographics are pie charts, bar graphs,
column graphs, and line charts. For example, a column graph is one of the easiest
ways to compare data.
And to convey a trend over time, the most common type of chart to use is a line chart.
But if you can think of a more unusual chart that would convey the data effectively,
don't be afraid to get more creative. For example, you can identify and show trends
such as sales over time, correlations such as sales compared to temperature, or
outliers such as sales in an unusual area.
A good rule of thumb is that your charts should be easy enough to read that it takes
readers ten seconds or less to understand them.
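As a rough illustration only (not part of the original guide), the short Python sketch
below draws the two chart types mentioned above using the matplotlib library; the
monthly sales figures are invented purely for demonstration.

import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sales = [120, 135, 150, 145, 170, 190]   # made-up demonstration data

fig, (ax_bar, ax_line) = plt.subplots(1, 2, figsize=(8, 3))

# Column graph: one of the easiest ways to compare data points.
ax_bar.bar(months, sales, color="#4C72B0")
ax_bar.set_title("Sales by month (comparison)")

# Line chart: the most common way to convey a trend over time.
ax_line.plot(months, sales, marker="o", color="#DD8452")
ax_line.set_title("Sales over time (trend)")

plt.tight_layout()
plt.savefig("charts.png", dpi=150)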
Typography
Typography is a very important element of infographic design, and often your only
way to explain your ideas and information when images, graphs or icons can't.
That's why it's important to choose the best font for your data.
That being said, try to limit the amount of text you include on your infographic.
The best infographics have visual impact, with the text acting as a secondary
explanation for the visual content.
This infographic design relies on icons, dates and headers, with minimal
explanatory text:
First and foremost, make sure your type is legible. In most cases, avoid decorative or
script type as it tends to be hard to read.
If you have to use small text or an elegant font, use it sparingly. It always helps
to increase the line height of small body text if it starts to become hard to read.
To keep infographic designs cohesive, limit your use of fonts to a maximum of three
types, but also don’t stick with just one. A tasteful use of two brand fonts can create a
nice dynamic and hierarchy of information.
Alternatively, a number or statistic in one font type next to subtext in another creates a
hierarchy of information:
The use of photography can be tricky if there is not a photographer available to take
the shots of exactly what you need. Be sure that the photos you use have a consistent
style and lighting. Try to pick photos with the same lighting effects, same backdrops,
same amount of dark areas, etc. It's important to stick to a certain style as images that
clearly don’t fit the set will distract from the information being communicated.
If you're going for a simple modern use of photography, use only images with flat
colour (or white) backdrops. If you're going for a neutral newspaper approach, use
only black and white images.
Avoid photos that take up a majority of the space in your infographic, as they can
distract from the information. This issue can be solved with cropping, such as using
circle frames.
Contrast creates visual impact by placing two strikingly different elements beside each
other. If an infographic has a light background with bold colored shapes, our eyes are
immediately attracted to the bold colors. This allows you to organize information by
having a certain element more prominent than another.
Contrasting visuals
An infographic with visual balance is pleasing to the eyes because everything fits
together seamlessly. A balanced infographic keeps the entire composition cohesive,
especially in a long form infographic.
If there are heavy visuals on the top of an infographic, you should keep the flow going
right to the bottom. There are two types of balance: symmetrical and asymmetrical.
Symmetrical balance is when each side of the composition has equal weight. This
layout is effective in a comparison infographic like this one:
For example, if you are creating timeline infographics, alternate text between both
sides of the timeline for a balanced composition.
Decide on a color scheme before creating your infographic. A good rule of thumb is to
design your infographic with two or three main colors, and to use minor color accents.
When choosing your color scheme, decide on the tone of your infographic. Is it a
business infographic? If so, try using neutral colors like blue or green, or, of course,
your brand colors, especially if you're including your logo.
For fun, eye-popping infographics, use brighter hues, but be careful not to use large
amounts of dark or neon colors, as they can strain the eyes when viewed on the
web.
Color can also be used as a sectional tool. Add blocks of color to section your
infographic, giving the eye some breathing room as viewers scroll down.
In order for your infographic design to flow from start to finish, the design elements
need to be consistent. If you are using icons that are filled in, rather than line art icons,
then keep using the same style throughout the entire infographic.
The same goes for the style of images you use, the font style, and the color palette.
This will prevent your infographic from looking cluttered and will actually make it easier
to read.
Negative space is the blank space surrounding objects in a design. Negative space
has a big impact on your design. If your infographic design is too crowded, it can
overwhelm readers and make it difficult to read the information.
Creating space around the elements in your design allows readers the breathing room
to process the information. Pro tip: if you are using a 16pt font size, the line height
should be no less than 1.2.
Leaving negative space can be as simple as making sure there is enough space
between lines of text. Just look at the difference that a little space makes in the
example below:
This is going to sound cliché, but when it comes to designing infographics, it will
probably take you a couple of tries to get the hang of it. You will need to figure out
what works in a design and what doesn't.
Luckily, this learning process is made a lot easier by infographic templates and guides.
And there are certainly a lot of examples out there for you to draw inspiration from.
When in doubt, ask someone else to look over your design before you publish it--they
will be able to tell you if there is any information that is unclear, or if there is any way
that you could make your design even better.
Online Resources
• https://youtu.be/uQXf_d5Mgjg
• https://youtu.be/tN8_85gKOTc
• https://youtu.be/lCXFJEK_lVk
Assessment 9
Instruction: Using what you have learned in this module, it is now your turn to create
your own infographic. Think of a social issue in our society, build a story around it,
and tell that story through an infographic. Use any application available to you to create it.
References
• https://venngage.com/blog/infographic-design/#1
• https://visme.co/blog/infographic-design-guide/