Computer Architecture vs. Computer Organisation
Computer architecture usually refers to those attributes of a computer that guide its design and are visible to the programmer, such as the instruction set, how memory is organised and addressed, and how the processor, memory, and buses interact. For example, deciding which instructions belong in the instruction set is an architectural question.
Computer organization, on the other hand, usually refers to the operational features of the computer, that is, how those architectural features, including the instructions, are actually implemented in hardware.
Making the common case fast will tend to enhance performance better than
optimizing the rare case. Ironically, the common case is often simpler than the rare
case and hence is often easier to enhance. This common sense advice implies that
you know what the common case is, which is only possible with careful
experimentation and measurement. We use a sports car as the icon for making the
common case fast, as the most common trip has one or two passengers, and it's
surely easier to make a fast sports car than a fast minivan.
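A rough worked example in the spirit of Amdahl's Law (the numbers are assumed purely for illustration): suppose the common case accounts for 90% of execution time. Making it 2 times faster gives an overall speedup of
    1 / (0.10 + 0.90/2) ≈ 1.8,
while making the rare 10% even 10 times faster gives only
    1 / (0.90 + 0.10/10) ≈ 1.1,
so the effort clearly pays off more on the common case.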
Since the dawn of computing, computer architects have offered designs that get
more performance by performing operations in parallel. We'll see many examples
of parallelism in this book. We use multiple jet engines of a plane as our icon for
parallel performance.
Following the saying that it can be better to ask for forgiveness than to ask for
permission, the next great idea is prediction. In some cases it can be faster on
average to guess and start working rather than wait until you know for sure,
assuming that the mechanism to recover from a misprediction is not too expensive
and your prediction is relatively accurate. We use the fortune-teller's crystal ball as
our prediction icon.
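As a concrete, deliberately simplified illustration, the C sketch below simulates a 2-bit saturating-counter branch predictor. The outcome sequence and the 2-bit scheme are assumptions chosen for the example; real processors use far more sophisticated predictors, and the recovery machinery itself is not modelled here.

    /* Toy 2-bit saturating-counter branch predictor (illustrative sketch only).
       Counter values 0-1 predict "not taken", 2-3 predict "taken"; the counter
       moves one step toward each actual outcome, so a single anomaly does not
       immediately flip the prediction. */
    #include <stdio.h>

    int main(void) {
        /* Hypothetical branch-outcome history: 1 = taken, 0 = not taken. */
        int outcomes[] = {1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1};
        int n = sizeof outcomes / sizeof outcomes[0];
        int counter = 2;          /* start in the weakly "taken" state */
        int correct = 0;

        for (int i = 0; i < n; i++) {
            int predict_taken = (counter >= 2);
            if (predict_taken == outcomes[i])
                correct++;
            /* Update toward the actual outcome, saturating at 0 and 3. */
            if (outcomes[i] && counter < 3) counter++;
            if (!outcomes[i] && counter > 0) counter--;
        }
        printf("predicted %d of %d branches correctly\n", correct, n);
        return 0;
    }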
7. Hierarchy of memories
Programmers want memory to be fast, large, and cheap, as memory speed often
shapes performance, capacity limits the size of problems that can be solved, and
the cost of memory today is often the majority of computer cost. Architects have
found that they can address these conflicting demands with a hierarchy of
memories, with the fastest, smallest, and most expensive memory per bit at the top
of the hierarchy and the slowest, largest, and cheapest per bit at the bottom. Caches
give the programmer the illusion that main memory is nearly as fast as the top of
the hierarchy and nearly as big and cheap as the bottom of the hierarchy. We use a
layered triangle icon to represent the memory hierarchy. The shape indicates speed,
cost, and size: the closer to the top, the faster and more expensive per bit the
memory; the wider the base of the layer, the bigger the memory.
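The toy direct-mapped cache model below, written in C, hints at how that illusion works: a handful of fast lines stand in front of a large, slow memory, and repeated or nearby accesses are served by the fast level. The cache size, line size, and address trace are made-up values for the sketch, not parameters from the text.

    /* Toy direct-mapped cache model (illustrative sketch only). Each address
       maps to exactly one of 8 lines; a hit means the small, fast level already
       holds the data, a miss means we must go to the slower level below. */
    #include <stdio.h>

    #define NUM_LINES 8    /* lines in the tiny "fast" level */
    #define LINE_SIZE 16   /* bytes per line (block)         */

    int main(void) {
        long tags[NUM_LINES];
        int  valid[NUM_LINES] = {0};
        /* Hypothetical sequence of byte addresses issued by a program. */
        long trace[] = {0, 4, 8, 12, 64, 68, 0, 4, 128, 132, 64, 0};
        int hits = 0, misses = 0;

        for (unsigned i = 0; i < sizeof trace / sizeof trace[0]; i++) {
            long block = trace[i] / LINE_SIZE;      /* which block of memory        */
            int  line  = (int)(block % NUM_LINES);  /* which cache line it maps to  */
            long tag   = block / NUM_LINES;         /* identifies the block held    */
            if (valid[line] && tags[line] == tag) {
                hits++;                             /* served by the fast level     */
            } else {
                misses++;                           /* fetched from the slow level  */
                valid[line] = 1;
                tags[line] = tag;
            }
        }
        printf("%d hits, %d misses\n", hits, misses);
        return 0;
    }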
8. Dependability via redundancy
Computers not only need to be fast; they need to be dependable. Since any physical
device can fail, we make systems dependable by including redundant components
that can take over when a failure occurs and that help detect failures. We use the
tractor-trailer as our icon, since the dual tires on each side of its rear axles allow
the truck to continue driving even when one tire fails. (Presumably, the truck driver
heads immediately to a repair facility so the flat tire can be fixed, thereby restoring
redundancy!)
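One classic way to apply this idea is triple modular redundancy (TMR): run three redundant units and take a majority vote, so a single failure is both masked and detected. The short C sketch below uses made-up values purely for illustration; it is not how any particular system implements redundancy.

    /* Toy triple-modular-redundancy (TMR) voter (illustrative sketch only).
       Three redundant units compute the same result; a majority vote masks a
       single failing unit and also reveals that a failure has occurred. */
    #include <stdio.h>

    int majority(int a, int b, int c) {
        /* Return the value that at least two of the three inputs agree on. */
        return (a == b || a == c) ? a : b;
    }

    int main(void) {
        int unit_a = 42, unit_b = 42, unit_c = 17;  /* unit_c has failed */
        int voted = majority(unit_a, unit_b, unit_c);

        printf("voted result: %d\n", voted);
        if (unit_a != voted || unit_b != voted || unit_c != voted)
            printf("disagreement detected: schedule a repair\n");
        return 0;
    }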