MCQs On Cache Memories
Cache Memory
This set of Embedded Systems Multiple Choice Questions & Answers (MCQs) focuses on
“Cache Memory”.
Answer: b
Explanation: The cache memory is a small random access memory that is faster than normal
RAM. It either has a direct connection to the CPU or is reached over a separate, dedicated bus for
accessing data. When data is required, the processor first checks whether a copy is present in the
cache memory; if so, it accesses the data from the cache instead of main memory.
Answer: a
Explanation: The proportion of accesses that are satisfied from the cache forms the cache hit
rate, which measures the effectiveness of the cache memory.
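As a concrete illustration, the hit rate is the number of hits divided by the total number of accesses. The small C sketch below shows this calculation; the counter names are assumptions made for the example, not part of the original questions.

    /* Hit rate = hits / (hits + misses); returns a value between 0.0 and 1.0. */
    double hit_rate(unsigned long hits, unsigned long misses)
    {
        unsigned long total = hits + misses;
        return total ? (double)hits / (double)total : 0.0;
    }

    /* Example: 950 hits out of 1000 accesses gives a hit rate of 0.95. */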
3. Which of the following determines a high hit rate of the cache memory?
a) size of the cache
b) number of caches
c) size of the RAM
d) cache access
Answer: a
Explanation: As the size of the cache increases, more data can be stored in it, so more accesses
can be satisfied from the cache, which in turn increases the hit rate of the cache memory.
Answer: c
Explanation: The translation lookaside buffer (TLB) is a cache memory found in almost all CPUs
as part of the memory management unit. It improves the speed of virtual-to-physical
address translation.
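The sketch below is a rough software model of how a TLB lookup speeds up translation: the virtual page number is compared against cached entries, and a hit avoids the slower page-table walk. The entry layout, page size, and TLB size are assumptions made for illustration.

    #include <stdint.h>
    #include <stdbool.h>

    #define TLB_ENTRIES 64
    #define PAGE_SHIFT  12          /* assume 4 KiB pages */

    struct tlb_entry {
        bool     valid;
        uint32_t vpn;               /* virtual page number   */
        uint32_t pfn;               /* physical frame number */
    };

    static struct tlb_entry tlb[TLB_ENTRIES];

    /* Returns true and fills *paddr on a TLB hit; on a miss the (slower)
     * page-table walk would be performed instead. */
    bool tlb_translate(uint32_t vaddr, uint32_t *paddr)
    {
        uint32_t vpn = vaddr >> PAGE_SHIFT;
        for (int i = 0; i < TLB_ENTRIES; i++) {
            if (tlb[i].valid && tlb[i].vpn == vpn) {
                *paddr = (tlb[i].pfn << PAGE_SHIFT) | (vaddr & ((1u << PAGE_SHIFT) - 1));
                return true;        /* hit: no page-table walk needed */
            }
        }
        return false;               /* miss: fall back to the page table */
    }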
Answer: b
Explanation: Set associativity describes the number of cache entries that could possibly contain
the required data.
9. Which of the following refers to the number of consecutive bytes which are associated with
each cache entry?
a) cache size
b) associative set
c) cache line
d) cache word
Answer: c
Explanation: The cache line is the number of consecutive bytes associated with each cache entry.
Data is transferred between memory and the cache in units of this size, which is why it is called
the cache line.
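To make the last two answers concrete, the sketch below splits an address into a line offset, a set index, and a tag, assuming a 64-byte cache line and a 4-way set-associative cache with 128 sets; these sizes are illustrative assumptions, not values taken from the questions.

    #include <stdint.h>

    #define LINE_SIZE 64            /* assumed: 64 consecutive bytes per cache line */
    #define NUM_SETS  128           /* assumed: 128 sets                            */
    #define WAYS      4             /* assumed: 4-way set associative               */

    /* Byte position of the address within its cache line. */
    static inline uint32_t line_offset(uint32_t addr) { return addr % LINE_SIZE; }

    /* Which set the line belongs to; the block may live in any of the WAYS
     * entries of that set (that is what "4-way set associative" means). */
    static inline uint32_t set_index(uint32_t addr) { return (addr / LINE_SIZE) % NUM_SETS; }

    /* Tag stored with the entry so the hardware can tell which block is cached. */
    static inline uint32_t tag(uint32_t addr) { return addr / (LINE_SIZE * NUM_SETS); }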
Answer: a
Explanation: Cache performance depends on both the system and the software. For software such
as loops, the processor checks each memory reference against the cache, and if a copy of the
required data is found there, it is accessed immediately.
13. What does DMA stand for?
a) direct memory access
b) direct main access
c) data main access
d) data memory address
Answer: a
Explanation: DMA (direct memory access) allows memory to be read or modified without the help
of the processor. When the DMA controller needs to access memory, it passes a bus request to the
processor; the processor returns an acknowledgement and hands control of the bus over to the
DMA controller.
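As a rough illustration of how a DMA transfer is typically programmed in an embedded system, the sketch below writes the source address, destination address, and length into memory-mapped registers of a hypothetical DMA controller and then waits for completion. The register addresses, names, and bit layout are invented for the example and do not describe any specific device.

    #include <stdint.h>

    /* Hypothetical memory-mapped DMA controller registers (illustrative only). */
    #define DMA_BASE   0x40001000u
    #define DMA_SRC    (*(volatile uint32_t *)(DMA_BASE + 0x00))
    #define DMA_DST    (*(volatile uint32_t *)(DMA_BASE + 0x04))
    #define DMA_LEN    (*(volatile uint32_t *)(DMA_BASE + 0x08))
    #define DMA_CTRL   (*(volatile uint32_t *)(DMA_BASE + 0x0C))
    #define DMA_START  0x1u
    #define DMA_BUSY   0x2u

    /* Program a transfer and let the controller move the data; the CPU only
     * sets up the registers and is free until the transfer completes. */
    void dma_copy(uint32_t src, uint32_t dst, uint32_t len_bytes)
    {
        DMA_SRC  = src;
        DMA_DST  = dst;
        DMA_LEN  = len_bytes;
        DMA_CTRL = DMA_START;            /* controller requests the bus itself */
        while (DMA_CTRL & DMA_BUSY)      /* poll (or use an interrupt) for completion */
            ;
    }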
Answer: b
Explanation: This difference in the speeds of operation within the system makes it inefficient.
Answer: a
Explanation: This means that the cache relies on memory locations that are referenced often,
that is, on locality of reference.
Answer: c
Explanation: None.
Answer: d
Explanation: The spatial aspect of locality of reference states that instructions near a recently
executed instruction are likely to be executed in the near future.
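A small C illustration of spatial locality, assuming a row-major array and a cache that fetches whole lines: traversing the array row by row reuses each fetched line, while traversing it column by column does not. The array size is an arbitrary assumption.

    #define N 1024
    static int a[N][N];

    /* Cache-friendly: consecutive iterations touch consecutive bytes, so most
     * accesses hit in a cache line that was just fetched (spatial locality). */
    long sum_row_major(void)
    {
        long s = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    /* Cache-unfriendly: each access jumps N*sizeof(int) bytes ahead, so
     * consecutive iterations rarely reuse the line that was just fetched. */
    long sum_col_major(void)
    {
        long s = 0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }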
5. The correspondence between the main memory blocks and those in the cache is given by
_________
a) Hash function
b) Mapping function
c) Locale function
d) Assign function
Answer: b
Explanation: The mapping function is used to map the contents of the memory to the cache.
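For example, in a direct-mapped cache the mapping function can be a simple modulo operation: main-memory block j is placed in cache line j mod L, where L is the number of cache lines. The line count below is an assumed value for illustration.

    #define NUM_CACHE_LINES 256     /* assumed number of lines in the cache */

    /* A main-memory block maps to cache line (block mod NUM_CACHE_LINES)
     * in a direct-mapped cache. */
    unsigned cache_line_for_block(unsigned block_number)
    {
        return block_number % NUM_CACHE_LINES;
    }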
6. The algorithm to remove and place new contents into the cache is called _______
a) Replacement algorithm
b) Renewal algorithm
c) Updation
d) None of the mentioned
Answer: a
Explanation: As the cache fills up, older contents are replaced by newer ones; which entries to
evict is decided by the replacement algorithm.
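One widely used replacement algorithm is least recently used (LRU). The sketch below is a simplified software model that keeps a timestamp per entry and evicts the entry used longest ago; real caches implement this in hardware and often only approximate LRU, and the structure and names here are assumptions for the example.

    #include <stdint.h>
    #include <stdbool.h>

    #define WAYS 4                       /* assumed 4 entries per set */

    struct entry {
        bool     valid;
        uint32_t tag;
        uint64_t last_used;              /* timestamp of the most recent access */
    };

    /* Choose which entry of a set to replace: prefer an invalid entry,
     * otherwise evict the least recently used one. */
    int choose_victim(struct entry set[WAYS])
    {
        int victim = 0;
        for (int i = 0; i < WAYS; i++) {
            if (!set[i].valid)
                return i;                /* free slot, no eviction needed */
            if (set[i].last_used < set[victim].last_used)
                victim = i;              /* older timestamp = less recently used */
        }
        return victim;
    }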
8. The bit used to signify that the cache location is updated is ________
a) Dirty bit
b) Update bit
c) Reference bit
d) Flag bit
Answer: a
Explanation: The dirty bit is set when a cache location is updated, signalling to the processor
that the cached copy differs from main memory.
Answer: b
Explanation: This is another way of performing the write operation, wherein the cache is updated
first and then the memory.
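A minimal sketch tying the last two answers together, under the assumption of a write-back policy with a 64-byte line: a write updates only the cache and sets the dirty bit, and main memory is updated when a dirty line is evicted. The data structures and function names are illustrative.

    #include <stdint.h>
    #include <stdbool.h>

    struct cache_line {
        bool     valid;
        bool     dirty;                  /* set when the cached copy differs from memory */
        uint32_t tag;
        uint8_t  data[64];               /* assumed 64-byte line */
    };

    /* A write hits the cache only; main memory is not touched yet. */
    void write_byte(struct cache_line *line, uint32_t offset, uint8_t value)
    {
        line->data[offset] = value;
        line->dirty = true;              /* remember that memory is now stale */
    }

    /* On eviction, a dirty line must be written back to main memory first. */
    void evict(struct cache_line *line,
               void (*write_back_to_memory)(const struct cache_line *))
    {
        if (line->valid && line->dirty)
            write_back_to_memory(line);
        line->valid = false;
        line->dirty = false;
    }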
10. The approach where the memory contents are transferred directly to the processor from the
memory is called ______
a) Read-later
b) Read-through
c) Early-start
d) None of the mentioned
Answer: c
Explanation: None.