
From Abacus to Analytical Engine

When you hear the word computer, maybe you think of something like a beefy gaming desktop with flashing lights, 
or maybe you think of a slim and sleek laptop. 
These fancy devices aren't what people had in mind when computers were first created. 
To put it simply, a computer is a device that stores 
and processes data by performing calculations. 
Before we had actual computer devices, 
the term computer was used to refer to someone who actually did the calculation. 
You're probably thinking that's crazy talk. 
A computer lets me check social media, browse the Internet, 
design graphics, how can it possibly just perform calculations? 
Well, friends, in this course, 
we'll be learning how computer calculations are baked into applications, 
social media, games et cetera, 
all the things that you use every day. 
But to kick things off, 
we'll learn about the journey computers took from 
the earliest known forms of computing into the devices that you know and love today. 
In the world of technology, 
and if I'm getting really philosophical, in life, 
it is important to know where we've been in 
order to understand where we are and where we are going. 
Historical context can help you understand why things work the way they do today. 
Have you ever wondered why the alphabet isn't laid out in order on your keyboard? 
The keyboard layout that most of the world uses today is the qwerty layout, 
distinguished by the Q, W, E, R, 
T, and Y keys in the top row of the keyboard. 
The most common letters that you type aren't found on the home row, 
where your fingers hit the most. 
But why? 
There are many stories that claim to answer this question. 
Some say it was developed to slow down typists so they wouldn't jam old mechanical typewriters. 
Others claim it was meant to resolve a problem for telegraph operators. 
One thing is for sure, 
the keyboard layout that millions of people use today isn't the most effective one. 
Different keyboard layouts have even been created to try and make typing more efficient. 
Now that we're starting to live in a mobile-centric world with our smartphones, 
the landscape for keyboards may change completely. 
My typing fingers are crossed. 
In the technology industry, 
having a little context can go a long way to 
making sense of the concepts you'll encounter. 
By the end of this lesson, 
you'll be able to identify some of 
the most major advances in the early history of computers. 
Do you know what an abacus is? 
It looks like a wooden toy that a child would play with, 
but it's actually one of the earliest known computers. 
It was invented in 500 BC to count large numbers. 
While we have calculators like the old reliable TI-89s or the ones in our computers, 
abacuses actually are still used today. 
Over the centuries, humans built 
more advanced counting tools but they still 
required a human to manually perform the calculations. 
The first major step forward was the invention of the mechanical calculator in the 17th century by Blaise Pascal. 
This device used a series of gears and levers 
to perform calculations for the user automatically. 
While it was limited to addition, subtraction, 
multiplication and division for pretty small numbers, 
it paved the way for more complex machines. 
The fundamental operations of 
the mechanical calculator were later applied to the textile industry. 
Before we had streamlined manufacturing, 
looms were used to weave yarn into fabric. 
If you wanted to design patterns on your fabric, 
that took an incredible amount of manual work. 
In the 1800s, a man by the name of Joseph Jacquard invented a programmable loom. 
These looms took a sequence of cards with holes in them. 
When the loom encountered a hole, 
it would hook the thread underneath it. 
If it didn't encounter a hole, 
the hook wouldn't thread anything. 
Eventually this spun up a design pattern on the fabric. 
These cards were known as punch cards. 
And while Mr. Jacquard reinvented the textile industry, he probably didn't realize that his invention would shape the world of computing, and the world itself, today. 
Pretty epic Mr. Jacquard, pretty epic. 
Let's fast forward a few decades and meet a man by the name of Charles Babbage. 
Babbage was a gifted engineer who developed a series of machines that are 
now known as the greatest breakthrough on our way to the modern computer. 
He built what was called a difference engine. 
It was a very sophisticated version of some 
of the mechanical calculators we were just talking about. 
It could perform fairly complicated mathematical operations but not much else. 
Babbage's follow up to 
the difference engine was a machine he called the Analytical Engine. 
He was inspired by Jacquard's use of punch cards to automatically 
perform calculations instead of manually entering them by hand. 
Babbage used punch cards in his Analytical engine to allow 
people to predefine a series of calculations they wanted to perform. 
As impressive as this achievement was, 
the Analytical engine was still just a very advanced mechanical calculator. 
It took the powerful insights of a mathematician named 
Ada Lovelace to realize the true potential of the analytical engine. 
She was the first person to recognize that the machine 
could be used for more than pure calculations. 
She developed the first algorithm for the engine. 
It was the very first example of computer programming. 
An algorithm is just a series of steps that solves a specific problem. 
Because of Lovelace's discovery that algorithms could 
be programmed into the Analytical engine, 
it became the very first general purpose computing machine in history, 
and a great example that women have had some of 
the most valuable minds in technology since the 1800s. 
We've covered a lot of ground already, 
learning about how primitive counting devices like the abacus 
evolved into huge complex devices like the Analytical engine, 
proof that there was life before social media. 
In the next video, we'll learn about how 
these mechanical machines made the leap into modern computing.

The Path to Modern Computers

Welcome back. In this video, we'll be learning how huge devices like the Analytical Engine grew, I mean, shrunk into the computing devices that we use today. 
Computing developed steadily after the invention of the Analytical Engine, but it didn't make a huge leap forward until World War II. 
Back then, research into computing was super expensive, 
electronic components were large and you 
needed lots of them to compute anything of value. 
This also meant that computers took up a ton of space and 
many efforts were underfunded and unable to make headway. 
When the war broke out, governments started pouring money 
and resources into computing research. 
They wanted to help develop technologies that would give 
them advantages over other countries, 
lots of efforts were spun up and advancements were made in fields like cryptography. 
Cryptography is the art of writing and solving codes. 
During the war, computers were used to process 
secret messages from enemies faster than a human could ever hope to do. 
Today, the role cryptography plays in secure communication is 
a critical part of computer security which we'll learn more about in a later course. 
For now, let's look at how computers started to make a dramatic impact on society. 
First up is Alan Turing, 
an English mathematician and now famous computer scientist. 
He helped develop a top-secret code-breaking machine called the Bombe, which helped Allied forces decode Axis messages encrypted by the German Enigma machine during World War II. 
That code-breaking effort is just one example of how governments started to recognize the potential of computation. 
After the war, companies like IBM, Hewlett-Packard, 
and others were advancing their technologies into the academic, 
business, and government realms. 
Lots of technological advancements and computing were made in 
the 20th century thanks to direct interest from governments, 
scientists, and companies left over from World War II. 
These organizations invented new methods to store data in 
computers which fueled the growth of computational power. 
Consider this, until the 1950s punch cards were a popular way to store data. 
Operators would have decks of ordered punch cards that were used for data processing. 
If they dropped the deck by accident and the cards got out of order, 
it was almost impossible to get them sorted again. 
There were obviously some limitations to punch cards, 
but thanks to new technological innovations like magnetic tape and its counterparts, 
people began to store more data on more reliable media. 
A magnetic tape worked by magnetizing data onto a tape. 
Back in the 1970s and 80s, 
people used to listen to music on vinyl records or cassette tapes. 
These relics are examples of how magnetic tapes 
can store information and run that information from a machine. 
This left stacks and stacks of punch cards to collect dust while 
their new magnetic tape counterparts began to revolutionize the industry. 
I wasn't joking when I said early computers took up a lot of space. 
They had huge machines to read data and racks of vacuum tubes that helped move that data. 
Vacuum tubes controlled the electrical voltages in all sorts of electronic equipment, like televisions and radios, but these particular vacuum tubes were bulky and broke all the time. 
Imagine what the work of an I.T. 
support specialist was like in those early days of computing. 
The job description might have included crawling around 
inside huge machines filled with dust and creepy crawly things, 
or replacing vacuum tubes and swapping out those punch cards. 
In those days, doing some debugging might have taken on a more literal meaning. 
Renowned computer scientist Admiral Grace Hopper had 
a favorite story involving some engineers working on the Harvard Mark II computer. 
They were trying to figure out the source of the problems in a relay. 
After doing some investigating, 
they discovered the source of their trouble was a moth, 
a literal bug in the computer. 
The ENIAC was one of the earliest forms of general purpose computers. 
It was a wall-to-wall convolution of massive electronic components and wires. 
It had 17,000 vacuum tubes and took up about 1,800 square feet of floor space. 
Imagine if you had to work with that scale of equipment today. 
I wouldn't want to share an office with 1,800 square feet of machinery. 
Eventually, the industry started using transistors to control electricity voltages. 
This is now a fundamental component of all electronic devices. 
Transistors perform almost the same functions as 
vacuum tubes but they are more compact and more efficient. 
You can easily have billions of transistors in a small computer chip today. 
Throughout the decades, more and more advancements were made. 
The very first compiler was invented by Admiral Grace Hopper. 
Compilers made it possible to translate 
human language via a programming language into machine code. 
In case you didn't totally catch that, 
we'll talk more about compilers later in this course. 
The big takeaway is that this advancement was 
a huge milestone in computing that led to where we are today. 
Now, learning programming languages is accessible for almost anyone anywhere. 
We no longer have to learn how to write machine code in ones and zeros. 
You get to see these languages in action in 
future lessons where you'll write some code yourself. 
Side note, if the thought of that scares you, 
don't worry, we'll help you every step of the way. 
But for now, let's get back to the evolution of computers. 
Eventually, the industry gave way to the first hard disk drives and microprocessors. 
Then, programming languages started becoming 
the predominant way for engineers to develop computer software. 
Computers were getting smaller and smaller, 
thanks to advancements in electronic components. 
Instead of filling up entire rooms like ENIAC, 
they were getting small enough to fit on tabletops. 
The Xerox Alto was the first computer 
that resembled the computers we're familiar with now. 
It was also the first computer to implement 
a graphical user interface that used icons, a mouse, and a window. 
Some of you may remember that the sheer size and cost of 
historical computers made it almost impossible for an average family to own one. 
Instead, they were usually found in military and university research facilities. 
When companies like Xerox started building machines at 
a relatively affordable price and at a smaller form factor, 
the consumer age of computing began. 
Then in the 1970s, 
a young engineer named Steve Wozniak invented the Apple I, 
a single-board computer meant for hobbyists. 
With his friend Steve Jobs, 
they created a company called Apple Computer. 
Their follow up to the Apple I, 
the Apple II, was ready for the average consumer to use. 
The Apple II was a phenomenal success, 
selling for nearly two decades and giving 
a new generation of people access to personal computers. 
For the first time, computers became affordable for 
the middle class and helped bring computing technology into both the home and office. 
In the 1980s, IBM introduced its personal computer. 
It was released with a primitive version of an operating system 
called MS-DOS, or Microsoft Disk Operating System. 
Side note, modern operating systems don't just have text anymore, 
they have beautiful icons, words, 
and images like what we see on our smartphones. 
It's incredible how far we've come from 
the first operating system to the operating systems we use today. 
Back to IBM's PC, 
it was widely adopted and made more accessible to consumers, 
thanks to a partnership with Microsoft. 
Microsoft, founded by Bill Gates, 
eventually created Microsoft Windows. 
For decades it was the preferred operating system in the workplace and 
dominated the computing industry because it could be run on any compatible hardware. 
With more computers in the workplace, the dependence on I.T. 
rose and so did the demand for skilled workers who could support that technology. 
Not only were personal computers entering the household for the first time, 
but a new type of computing was emerging: video games. 
During the 1970s and 80s, 
coin-operated entertainment machines called arcade games became more and more popular. 
A company called Atari developed one of 
the first coin-operated arcade games in 1972 called Pong. 
Pong was such a sensation that people were standing in 
lines at bars and rec centers for hours at a time to play. 
Entertainment computers like Pong launched the video game era. 
Eventually, Atari went on to launch the Video Computer System, which helped bring personal video game consoles into the home. 
Video games have contributed to the evolution of computers in a very real way, 
tell that to the next person who dismisses them as a toy. 
Video games showed people that computers didn't always have to be all work and no play, 
they were a great source of entertainment too. 
This was an important milestone for the computing industry, 
since at that time, 
computers were primarily used in the workplace or at research institutions. 
With huge players in the market like Apple Macintosh and 
Microsoft Windows taking over the operating systems space, 
a programmer by the name of Richard Stallman started 
developing a free Unix-like operating system. 
Unix was an operating system developed by Ken Thompson and Dennis Ritchie, 
but it wasn't cheap and wasn't available to everyone. 
Stallman created an OS that he called GNU. 
It was meant to be free to use with similar functionality to Unix. 
Unlike Windows or Macintosh, 
GNU wasn't owned by a single company, 
its code was open source which meant that anyone could modify and share it. 
GNU didn't evolve into a full operating system, but it set a foundation for the formation of one of the largest open source operating systems, Linux, which was created by Linus Torvalds. 
We'll get into the technical details of Linux later in this course, 
but just know that it's a major player in today's operating systems. As an I.T. 
support specialist, it is very likely that you'll work with open source software. 
You might already be using one like the internet browser Mozilla Firefox. 
By the early 90s, computers started getting even smaller, 
then a real game changer made its way into the scene: 
PDAs or personal digital assistants, 
which allowed computing to go mobile. 
These mobile devices included portable media players, word processors, 
email clients, Internet browsers, 
and more all in one handy handheld device. 
In the late 1990s, 
Nokia introduced a PDA with mobile phone functionality. 
This ignited an industry of pocketable computers or as we know them today, smartphones. 
In mere decades, we went from having computers that weighed tons and 
took up entire rooms to having powerful computers that fit in our pockets. 
It's almost unbelievable, and it's just the beginning. 
If you're stepping into the I.T. 
industry, it's essential that you understand how 
to support the growing need of this ever-changing technology. 
Computer support 50 years ago consisted of 
changing vacuum tubes and stacking punch cards, 
things that no longer exist in today's I.T. 
world. As computers evolved in both complexity and prevalence, so did the knowledge required to support and maintain them. 
In 10 years, I.T. 
support could require working through virtual reality lenses, you never know. 
Who knows what the future holds? 
But right now, it is an exciting time to be at the forefront of this industry. 
Now that we've run down where computers came 
from and how they've evolved over the decades, 
let's get a better grasp on how computers actually work.

Kevin Career

I think I realized I could pursue this, like, IT support, as a career my freshman year of high school. 
So, I took an intro to computer applications class, and that's when you learn a lot of the very, very basics of computers. 
And our teacher always talked about how this is where the world is going. 
This was in 2001, and getting this foundational knowledge at a young age of 14 or 15 was going to help a lot when moving into college, leaving school, and trying to get an actual job. 
Well fortunately enough, my first job was working with Google. 
I started here maybe a month after graduating, and I was doing very, very entry-level tech support. 
One of the best memories, one of the best accomplishments I think I have from my IT support job, was training some of the new people in the program that I was a part of. 
So, I guess, it's a win knowing that not only did I eventually leave the program and go on to other things, but the people that I brought on, helped train, and helped teach have moved on and done better things too.

Computer Language

Remember when I said that a computer is a device that stores and processes data by performing calculations? 
Whether you're creating an artificial intelligence that can beat humans at chess 
or something more simple, like running a video game, 
the more computing power you have access to, the more you can accomplish. 
By the end of this lesson, 
you'll understand what a computer calculates, and how. 
Let's look at this simple math problem. 
0 + 1 equals what? 
It only takes a moment to come up with the answer 1, but 
imagine that you needed to do 100 calculations that were this simple. 
You could do it, and if you were careful, you might not make any mistakes. 
Well, what if you needed to do 1,000 of these calculations? 
How about 1 million? How about 1 billion? 
This is exactly what a computer does. 
A computer simply compares 1s and 0s, but millions or billions of times per second. 
Wowza! 
The communication that a computer uses is referred to as the binary system, also known as the base-2 numeral system. 
This means that it only talks in 1s and 0s. 
You may be thinking, okay, my computer only talks in 1s and 0s. 
How do I communicate with it? 
Think of it like this. 
We use the letters of the alphabet to form words and we give those words meaning. 
We use them to create sentences, paragraphs, and whole stories. 
The same thing applies to binary, except instead of A, B, C, and so on, 
we only have 0 and 1 to create words that we give meaning to. 
In computing terms, we group these binary digits into sets of 8. 
Technically, a bit is a binary digit. 
Historically, we used 8 bits because in the early days of computing, 
hardware utilized the base-2 numeral system to move bits around. 
2 to the 8th, or 256, gave us a large enough range of values to do the computing we needed. 
Back then, any number of bits was used, but 
eventually the grouping of 8 bits became the industry standard that we use today. 
You should know that a group of 8 bits is referred to as a byte. 
So a byte of zeroes and ones could look like 10011011. 
Each byte can store one character, and we can have 256 possible values, 
thanks to the base-2 system, 2 to the 8th. 
In computer talk, this byte could mean something like the letter C.
And this is how computer language was born. 
Let's make a quick table to translate something a computer might see into 
something we'd be able to recognize. 
What does the following translate to?
Did you get hello? 
Pretty cool, right?
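If you'd like to check a translation like that yourself, here's a small sketch in Python (just one possible language; we haven't introduced any programming yet, so treat it as an optional aside). It converts each 8-bit byte into its decimal value and then looks up the matching ASCII character:

    # Translate a sequence of 8-bit bytes into ASCII characters.
    binary_message = ["01101000", "01100101", "01101100", "01101100", "01101111"]

    # int(bits, 2) reads a string of ones and zeros as a base-2 number,
    # and chr() returns the character for that ASCII value.
    decoded = "".join(chr(int(bits, 2)) for bits in binary_message)
    print(decoded)  # prints: hello

Running it prints hello, the same answer you'd get from the lookup table. 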
By using binary, we can have unlimited communication with our computer. 
Everything you see on your computer right now, whether it's a video, 
an image, text or anything else, is nothing more than a 1 or a 0. 
It is important you understand how binary works. 
It is the basis for everything else we'll do in this course, so 
make sure you understand the concept before moving on.

ASCII Character   Hexadecimal   Binary

NUL 00 00000000

SOH 01 00000001

STX 02 00000010

ETX 03 00000011

EOT 04 00000100

ENQ 05 00000101

ACK 06 00000110

BEL 07 00000111

BS 08 00001000

HT 09 00001001

LF 0A 00001010

VT 0B 00001011

FF 0C 00001100

CR 0D 00001101

SO 0E 00001110

SI 0F 00001111

DLE 10 00010000

DC1 11 00010001

DC2 12 00010010

DC3 13 00010011

DC4 14 00010100

NAK 15 00010101

SYN 16 00010110

ETB 17 00010111

CAN 18 00011000

EM 19 00011001

SUB 1A 00011010

ESC 1B 00011011

FS 1C 00011100

GS 1D 00011101

RS 1E 00011110

US 1F 00011111

Space 20 00100000

! 21 00100001

" 22 00100010

# 23 00100011

$ 24 00100100

% 25 00100101

& 26 00100110

' 27 00100111

( 28 00101000

) 29 00101001

* 2A 00101010

+ 2B 00101011

, 2C 00101100

- 2D 00101101

. 2E 00101110

/ 2F 00101111

0 30 00110000

1 31 00110001

2 32 00110010

3 33 00110011

4 34 00110100

5 35 00110101

6 36 00110110

7 37 00110111

8 38 00111000

9 39 00111001

: 3A 00111010

; 3B 00111011

< 3C 00111100

= 3D 00111101

> 3E 00111110

? 3F 00111111

@ 40 01000000

A 41 01000001

B 42 01000010

C 43 01000011

D 44 01000100

E 45 01000101

F 46 01000110

G 47 01000111

H 48 01001000

I 49 01001001

J 4A 01001010

K 4B 01001011

L 4C 01001100

M 4D 01001101

N 4E 01001110

O 4F 01001111

P 50 01010000

Q 51 01010001

R 52 01010010

S 53 01010011

T 54 01010100

U 55 01010101

V 56 01010110

W 57 01010111

X 58 01011000

Y 59 01011001

Z 5A 01011010

[ 5B 01011011

\ 5C 01011100

] 5D 01011101

^ 5E 01011110

_ 5F 01011111

` 60 01100000

a 61 01100001

b 62 01100010

c 63 01100011

d 64 01100100

e 65 01100101

f 66 01100110

g 67 01100111

h 68 01101000

i 69 01101001

j 6A 01101010

k 6B 01101011

l 6C 01101100

m 6D 01101101

n 6E 01101110

o 6F 01101111

p 70 01110000

q 71 01110001

r 72 01110010

s 73 01110011

t 74 01110100

u 75 01110101

v 76 01110110

w 77 01110111

x 78 01111000

y 79 01111001

z 7A 01111010

{ 7B 01111011

| 7C 01111100

} 7D 01111101

~ 7E 01111110

DEL 7F 01111111

 
Character Encoding

Remember from the earlier video that a byte can store only zeros and ones. 
That means we can have 256 possible values. 
By the end of this video, 
you'll learn how we can represent the words, numbers, 
emojis and more we see on our screens, 
from only these 256 possible values. 
It's all thanks to character encoding. 
Character encoding is used to assign 
our binary values to characters so that we as humans can read them. 
We definitely wouldn't want to see all the text in our emails and 
Web pages rendered in complex sequences of zeros and ones. 
This is where character encodings come in handy. 
You can think of character encoding as a dictionary. 
It's a way for your computers to look up 
which human characters should be represented by a given binary value. 
The oldest character encoding standard used is ASCII. 
It represents the English alphabet, 
digits, and punctuation marks. 
The first character in the ASCII-to-binary table, a lowercase 'a', maps to 01100001 in binary. 
This is done for all the characters you can find in 
the English alphabet as well as numbers and some special symbols. 
The great thing with ASCII was that we only needed to use 128 values out of our possible 256. 
It lasted for a very long time, 
but eventually it wasn't enough. 
Other character encoding standards were created to represent different languages, different amounts of characters, and more. 
Eventually they would require more than the 256 values we were allowed to have. 
Then came UTF-8, the most prevalent encoding standard used today. 
Along with having the same ASCII table, it also lets us use a variable number of bytes. 
What do I mean by that? Think of any emoji. 
It's not possible to represent an emoji with a single byte, since a byte can only store one character. 
Instead, UTF-8 allows us to store a character in more than one byte, 
which means endless emoji fun. 
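To see that variable-length idea in action, here's a small Python sketch (again, just an illustration, not part of the lesson). Plain ASCII characters still fit in a single byte, while characters outside ASCII can take two, three, or even four bytes:

    # Show how many bytes UTF-8 uses for a few different characters.
    for text in ["a", "é", "€", "🙂"]:
        encoded = text.encode("utf-8")   # encode the character as UTF-8 bytes
        print(text, len(encoded), "byte(s):", encoded.hex())

The letter a is 1 byte, the same value it has in ASCII, while the emoji takes 4 bytes. 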
UTF-8 is built off the Unicode Standard. 
We won't go into much detail, 
but the Unicode Standard helps us represent character encoding in a consistent manner. 
Now that we've been able to represent letters, numbers, 
punctuation marks and even emojis, 
how do we represent color? 
Well, there are all kinds of color models. 
For now, let's stick to a basic one that's used in a lot of computers: the RGB, or red, green, and blue, model. 
Just like the actual colors, 
if you mix a combination of any of these, 
you'll be able to get the full range of colors. 
In computerland, we use 3 characters, one byte each, for the RGB model. 
Each character represents a shade of its color, and together they set the color of the pixel you see on your screen. 
With just these 8-bit combinations of zeros and ones, 
we're able to represent everything that you see on your computer, 
from a simple letter 'a' to the very video that you're watching right now on the Coursera website. 
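To make that concrete, here's a small Python sketch. The specific color values are made up purely for illustration; the point is that each channel is one byte from 0 to 255, and the three bytes together describe a single pixel:

    # One byte (0-255) per color channel: red, green, and blue.
    red, green, blue = 52, 152, 219              # arbitrary example values

    pixel = bytes([red, green, blue])            # a single pixel is 3 bytes (24 bits)
    print(pixel.hex())                           # 3498db, the same form as a web color code
    print(format(red, "08b"), format(green, "08b"), format(blue, "08b"))
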
Very cool. In the next video, 
we'll discuss how we actually generate the zeros and ones.

Binary

You might be wondering how our computers get these ones and zeros. 
It's a great question. Imagine we have a light bulb and 
a switch that turns the state of the light on or off. 
If we turn the light on, 
we can denote that state as one. 
If the light bulb is off, 
we can represent that state as zero. 
Now imagine eight light bulbs and switches; that represents eight bits, each with a state of zero or one. 
Let's backtrack to the punched cards that were used in Jacquard's loom. 
Remember that the loom used cards with holes in them. 
When the loom reached a hole, it would hook the thread underneath, 
meaning that the loom was on. 
If there wasn't a hole, 
it would not hook the thread, so it was off. 
This is a foundational binary concept. 
By utilizing the two states of on or off, 
Jacquard was able to weave intricate patterns into the fabric with his looms. 
Then the industry started refining the punch cards a little more. 
If there was a hole, 
the computer would read one. 
If there wasn't a hole, it would read zero. 
Then, by just translating the combination of zeros and ones, 
our computer could calculate any possible amount of numbers. 
Binary in today's computers isn't done by reading holes. 
It uses electricity via transistors, allowing electrical signals to pass through. 
If there's an electric voltage, we denote it as one. 
If there isn't, we denote it as zero. 
But just having transistors isn't enough for our computer to be able to do complex tasks. 
Imagine if you had two light switches on opposite ends of a room, 
each controlling a light in the room. 
What if, when you went to turn on the light with one switch, 
the other switch couldn't turn it off? 
That would be a very poorly designed circuit. 
Both switches should be able to turn the light on or off depending on the state of the light. 
Fortunately, we have something known as logic gates. 
Logic gates allow our transistors to do more complex tasks, 
like decide where to send electrical signals depending on logical conditions. 
There are lots of different types of logic gates, 
but we won't discuss them in detail here. 
If you're curious about the role that 
transistors and logic gates play in modern circuitry, 
you can read more about it in the supplementary reading. 
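Still, to get a rough feel for what logic gates do, here's a small Python sketch of three common gates, using 1 for "voltage present" and 0 for "no voltage". Real gates are built from transistors in hardware; this only mimics their behavior. The well-behaved two-switch light described above acts a lot like the XOR gate, where flipping either input flips the output:

    # Three basic logic gates, modeled on single bits (0 or 1).
    def AND(a, b):
        return a & b    # 1 only when both inputs are 1

    def OR(a, b):
        return a | b    # 1 when at least one input is 1

    def XOR(a, b):
        return a ^ b    # 1 when exactly one input is 1

    # Print a small truth table for every combination of inputs.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", "AND:", AND(a, b), "OR:", OR(a, b), "XOR:", XOR(a, b))
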
Now we know how our computer gets its ones and 
zeros to calculate into meaningful instructions. 
Later in this course, we'll talk about how we're able to turn human-readable instructions into the zeros and ones that our computer understands, through a compiler. 
That's one of the very basic building blocks of 
programming that's led to the creation of our favorite social media sites, 
video games, and just about everything else. 
And I'm super excited to teach you how to count in binary, that's up next.

How to Count in Binary

Binary is the fundamental communication block of computers, but it's used to represent more than just text and images. 
It's used in many aspects of computing like computer networking, 
which you'll learn about in a later course. 
It's important that you understand how computers count in binary. 
We've shown you simple lookup tables that you can use like the ASCII to binary table, 
but as an IT support specialist, 
whether you're working on networking or security, 
you'll need to know how binary works. 
So let's get started. You'll probably need a trusty pen and paper, 
a calculator, and some good old-fashioned brain power to help you in this video. 
The binary system is how our computers count using ones and zeros, 
but humans don't count like that. 
When you were a child, you may have counted using ten fingers on your hand. 
That innate counting system is called the decimal form or base-10 system. 
In the decimal system, 
there are 10 possible numbers you can use ranging from zero to nine. 
When we count in binary, 
which only uses zeros and ones, 
we convert it to a system that we can understand: decimal. 
330, 250, 2, 40, 
4 million, they're all decimal numbers. 
We use the decimal system to help us figure out what bits our computer can use. 
We can represent any number in existence just by using bits. That's right. 
And we can represent this number just using ones and zeros. 
So how does that work? 
Let's consider these numbers: 128, 64, 
32, 16, 8, 4, 2, and 1. 
What patterns do you see? 
Hopefully, you'll see that each number is double the previous number, going from right to left. 
What happens if you add them all up? 
You get 255. 
That's kind of weird. I thought we could have 256 values for a byte. Well, we do. 
The zero is counted as a value, 
so the maximum decimal number you can have is 255. 
What number do you think is represented by 00001010? 
Look at where the ones and the zeros are. 
Remember, if our computer sees a one, 
then the value was on. 
If it sees a zero, then the value is off. 
If you add these numbers up, 
you'll get a decimal value. 
If you guessed 10, then you're right. 
Good job. If you didn't get it, 
that's okay too. Take another look. 
The 2 and 8 are on, 
and if we add them up, we get 10. 
Let's look at our ASCII to binary table again. 
The letter h in binary is 01101000. 
Now, let's look at an ASCII to decimal table. 
The letter h in decimal is 104. 
Now, let's try our conversion chart again. 
64 plus 32 plus 8 equals 104. 
Look at that. The math checks out. Now, we're cooking. 
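If you want to double-check a conversion like this yourself, here's a small Python sketch of the same chart-based method, with Python's built-in base-2 conversion as a sanity check:

    # Convert an 8-bit binary string to decimal using the place-value chart.
    bits = "01101000"                              # the letter h from the ASCII table
    place_values = [128, 64, 32, 16, 8, 4, 2, 1]

    # Add up the place values wherever the bit is a 1: 64 + 32 + 8 = 104.
    total = sum(value for bit, value in zip(bits, place_values) if bit == "1")
    print(total)          # 104
    print(int(bits, 2))   # 104 again, using the built-in base-2 conversion
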
Wow! We've gone over all the essentials of 
the basic building blocks of computing and machine language. 
Next, you're going to learn how we build on top of this layer of 
computing to perform the task you'll do day to day.

Abstraction

When we interact with our computers we use our mouse, keyboard or 
even a touch screen. 
We don't tell it the actual zeros and ones it needs to understand something. 
But wait, we actually do. 
We just don't ever have to worry about it. 
We use the concept of abstraction to take a relatively complex system and 
simplify it for our use. 
You use abstraction every day in the real world, and you may not even know it. 
If you've ever driven a car, 
you don't need to know how to operate the transmission or the engine directly. 
There's a steering wheel, some pedals, maybe a gear stick. 
If you buy a car from a different manufacturer, 
you operate it in pretty much the same way 
even though the stuff under the hood might be completely different. 
This is the essence of abstraction. 
Abstraction hides complexity by providing a common interface, the steering wheel, 
pedals, gear stick, and gauges in our car example.
The same thing happens in our computer. 
We don't need to know how it works underneath the hood. 
We have a mouse and a keyboard we can use to interact with it. 
Thanks to abstractions, 
the average computer user doesn't have to worry about the technical details. 
We'll use this "under the hood" metaphor throughout the program to describe 
the area that contains the underlying implementation of the technology. 
In computing, we use abstraction to make a very complex problem, 
like how to make computers work, easier to think about. 
We do that by breaking it apart into simpler ideas that describe single 
concepts or individual jobs that need to be done, and then stack them in layers. 
This concept of abstraction will be used throughout this entire course. 
It's a fundamental concept in the computing world. 
One other simple example of abstraction in an IT role that you might see a lot 
is an error message. 
We don't have to dig through someone else's code and find a bug. 
This has been abstracted out for us already in the form of an error message. 
A simple error message like "file not found" actually tells us a lot of information and 
saves us time figuring out a solution. 
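Here's a rough sketch of that idea in Python; the filename is made up purely for illustration. The layers beneath us handle all the messy disk-level details and hand back a readable message we can act on:

    # Try to open a file that doesn't exist (the filename is hypothetical).
    try:
        open("quarterly_report.txt")
    except FileNotFoundError as error:
        # The low-level disk and operating system details are abstracted away;
        # we just get a readable message like "No such file or directory".
        print(error)
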
Can you imagine if instead of abstracting an error message 
our computer did nothing and we had no clue where to start looking for answers? 
Abstraction helps us in many ways that we don't even realize.

Module Introduction

Isn't the history of computers super interesting? 
I love going back in time and 
seeing how we got to this exciting point in computing. 
You've already taken the first few steps to building your foundational knowledge of 
IT, and before we dive deeper, I'd like to take a moment to introduce myself. 
My name is Devan Sri-Tharan, I've been working in IT for ten years. 
I'm a Corporate Operations Engineer at Google 
where I get to tackle challenging and complex IT issues. 
Thinking back, my first experience with tech began when I was about nine years 
old, when my dad brought home the family's first computer. 
I remember my dad holding a floppy disk and 
telling me that there was a game on it. 
To my dad's amazement I somehow managed to copy the game from disk 
onto the computer's hard drive. 
While it might seem like a trivial task now, this device was just so 
new to us back then. 
Sure, I loved the different games I could play, but what I really loved was 
tinkering with the machine, trying to get it to do what I want it to do. 
While that floppy disk computer might have ignited my passion for technology, it 
was actually my first few job experiences that really started to shape my IT career. 
One was in retail, [LAUGH] selling baby furniture and 
the other was at a postal store, where I helped customers ship their packages, and 
became the one person IT crew. 
It might sound odd that working in retail inspired my career, but 
I've realized I really enjoy communicating with customers, 
trying to understand their needs and offering a solution. 
My first experience working directly in IT was in college 
as an IT support specialist intern. 
From there, I worked as an IT consultant to decommission an entire IT environment. 
This was my first experience working directly with large IT infrastructure, and 
pushing myself outside my comfort level as a college student. 
I bring up these few jobs for a reason. 
These experiences helped shape my career in IT. 
I knew at that time that I wanted to go into tech, but 
I struggled where I wanted to focus my career. 
Starting at Google as an IT generalist allowed me to experience many different 
areas of technology. 
It allowed me to figure out the jobs I didn't want to do, 
before I was able to identify exactly what I did want to do. 
I'm really passionate about IT infrastructure, but 
you can't understand infrastructure until you understand hardware. 
So let's dig in. 
In IT, hardware is an essential topic to understand. 
You might find yourself replacing faulty components or 
even upgrading an entire fleet of machines one day. 
By the end of this lesson, you'll be able to describe all the physical parts of 
a computer and how they work together. 
You'll even be able to build your own computer. 
Once you figure out how one computer works, 
you'll be able to understand how any type of computer works. 
Excited? I am, let's get started.

Introduction to Computer Hardware

Let's face it, computers are everywhere. 
You come into contact with them at home, work, the airport, 
the grocery store, you're using some type of computer to take this course. 
You know what? 
There's probably one in your pocket right now. 
While computers are complex and can seem daunting to learn, 
they ultimately just calculate, process, and store data. 
In this lesson, we're going to take a peek at what's inside of the computer. 
We'll spend the next few lessons explaining how each of 
these components work. 
But for now, let's check out a typical desktop setup. 
Desktops are just computers that can fit on or under our desks. 
So here we have a monitor, a keyboard, a mouse, and a desktop.
Sometimes you might even add a webcam, speakers, or a printer to your setup. 
We'll call these physical components, hardware. 
Let's take a look at the back of the computer.
You can see common connectors here, the power outlet here, and 
the common ports here. 
Ports are connection points that we can connect devices to that 
extend the functionality of our computer. 
We'll go into detail about the ports you see here in a later lesson. 
But here's a quick rundown. 
We have a port here to connect to a monitor, and 
a few ports here to plug your keyboard and mouse.
There's another important one here for our network connection.
With just these ports, 
we're able to have the basic functionality to browse the web and much more. 
Things look pretty similar in a laptop.
Here are some of the same ports.
A built-in monitor and a keyboard.
Play video starting at :1:58 and follow transcript1:58
There are also physical components inside the laptop case that are hidden for 
portability. 
Once you figure out how one computer works, 
you can figure out how any other computer works. 
Okay, this is my favorite part. 
Let's open up this desktop and take a deeper look. 
Let me first clean up my desk.
Get ready for it.
Whoa, it looks pretty complicated, but that's okay. 
We'll take you through it. 
Let's start with a quick tour. 
Then we'll dive deeper into each of these parts in the next lesson. 
Right here, this component, it's a CPU or central processing unit, 
which is covered by this heat sink. 
You could think of the CPU as the brain of our computer. 
The CPU does all the calculations and data processing. 
It communicates pretty heavily with this component right here, 
RAM or Random Access Memory. 
RAM is our computer's short-term memory. 
We use this component when we want to store data temporarily. 
Like let's say, you're typing something into a chat or 
a piece of text in a word processor. 
This information is stored in the RAM. 
Don't worry, we'll cram in more details on RAM in a later lesson. 
When we want to store anything in long-term memory, 
we use this component here, the hard drive. 
The hard drive holds all of our data, 
which can include music, pictures, applications. 
Let me show you something else interesting. 
Have you noticed this large slab here? 
This is our motherboard.
It holds everything in place and lets our components communicate with each other. 
It's the foundation of our computer. 
You can think of the motherboard as the body or 
circulatory system of the computer that connects all the pieces together. 
The last component we'll talk about is our power supply, which converts 
electricity from our wall outlet into a format that our computer can use. 
You know what's interesting? 
All these components make up most computers, even a mobile phone. 
While it might look very different from your laptop, a mobile phone just uses 
a smaller version of the hardware that we saw in the desktop and laptop today. 
So now that we've covered the basic anatomy of the computer, 
we'll go over each of these components in depth in the next few lessons. 
Understanding how computer hardware works is a really helpful skill set in IT 
support, since an IT department maintains the hardware that a company uses. 
A solid understanding of these computer internals will come in handy when 
troubleshooting hardware-related problems, and 
taking things apart to see how they work is just super fun.
