W 08 Data Replacement Algorithms for Memory

Bachelor of Science Honours in SE

DES220130 - Computer Architecture

8. Memory Replacement Algorithms


ACKNOWLEDGEMENT

Presented by: Mr. Chandana Deshapriya



RESOURCES

• J. L. Hennessy and D. A. Patterson, Computer Architecture: A Quantitative Approach, 5th ed. Morgan Kaufmann, San Francisco, CA, 2012.
• B. Ram, Fundamentals of Microprocessors and Microcomputers. Dhanpat Rai & Sons, 2012.
• J. L. Hennessy and D. A. Patterson, Computer Architecture: A Quantitative Approach, 6th ed. Morgan Kaufmann, 2017.



AGENDA

8. Memory Replacement Algorithms


8.1 Cache logic
8.2 Examples of caches
8.3 Fully associative and direct mapped



INTENDED LEARNING OUTCOMES (ILO)

By the end of this section, students should be able to:


ILO1: articulate the basics of pipelined execution;
ILO2: explain parallelism and multi-core processors.



Cache logic



Cache Memory Hierarchy
Memory Structure
Cache Read Operation - Flowchart
Cache Addressing
• Where does the cache sit?
• Between the processor and virtual memory or main memory
• A logical cache (virtual cache) stores data using virtual addresses
• The processor accesses the cache directly, not through the MMU
• Cache access is faster, since it occurs before MMU address translation
• Virtual addresses use the same address space for different applications
• A physical cache stores data using main memory physical addresses
Cache size
• Example: a cache of 32 kB
Activity 1
What is the size of the cache memory if there are 12 address lines and 16-bit data words?



Activity 1

• What is the size of the memory if there are 12 address lines and 16-bit data words?

Memory size = 2^12 locations × 16 bits
            = 2^10 × 2^2 × 16 bits
            = 1k × 2^2 × 2 × 8 bits
            = 1k × 4 × 2 bytes
            = 8 kB
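As a quick cross-check of this arithmetic, the same calculation in a short Python sketch (the variable names are illustrative, not taken from the slides):

# Cross-check of Activity 1 (assumed: 12 address lines, 16-bit data words)
address_lines = 12
word_size_bits = 16
locations = 2 ** address_lines                 # 4096 addressable locations
total_bytes = locations * word_size_bits // 8  # 8192 bytes
print(total_bytes, "bytes =", total_bytes // 1024, "kB")  # 8192 bytes = 8 kB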



Memory Replacement Algorithms
• Memory replacement algorithms control how data is managed in cache memory.
• These algorithms ensure the cache delivers optimal performance.

Key Concepts
1. Cache: A small, high-speed storage used to hold frequently
accessed data.
2. Cache Block: The unit of storage in the cache, holding a portion
of memory data.
3. Cache Miss: When data needed by the CPU is not found in the
cache.
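
To make the hit/miss distinction concrete, here is a minimal Python sketch of a cache lookup; the dictionary-based cache and the slow_memory_read() stand-in for main memory are illustrative assumptions, not part of the course material.

# Minimal sketch of a cache lookup: a dict stands in for the cache.
def slow_memory_read(block_address):
    # Stand-in for a main-memory access (much slower than the cache).
    return f"data@{block_address:#x}"

cache = {}  # block_address -> cached data

def read(block_address):
    if block_address in cache:              # cache hit: served from fast storage
        return cache[block_address]
    data = slow_memory_read(block_address)  # cache miss: fetch from main memory
    cache[block_address] = data             # fill the cache block
    return data

read(0x1A)  # first access: miss, the block is brought into the cache
read(0x1A)  # second access: hit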
Cache Logic
Cache logic is the set of rules and mechanisms that control how the cache operates, including data placement, replacement, and eviction.

Core Functions of Cache Logic


1. Placement Policy – where to save data
2. Replacement Policy – which data to delete
3. Write Policy – how to handle modified data
Placement Policy

Determines where data is stored in the cache.


• Examples:
1. Direct Mapping
2. Associative Mapping
3. Set Associative Mapping
Mapping Techniques
1. Direct Mapping
How it works: Each memory block maps to a specific cache line.
Advantages: Simple and cost-effective.
Disadvantages: Collisions occur when multiple blocks map to the same line.
2. Associative Mapping
How it works: Any memory block can occupy any cache line.
Advantages: Reduces collisions significantly.
Disadvantages: Expensive and slower due to search time.
3. Set-Associative Mapping
How it works: Cache is divided into sets; each block maps to a specific set but can
occupy any line within that set.
Advantages: Balances the benefits of direct and associative mapping.
Disadvantages: More complex than direct mapping.
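The index arithmetic behind these mappings can be sketched in a few lines of Python; the geometry below (64 lines, 16-byte blocks, 4-way sets) is assumed purely for illustration.

# Assumed geometry for illustration: 64 lines, 16-byte blocks, 4-way sets.
NUM_LINES = 64
BLOCK_SIZE = 16
WAYS = 4
NUM_SETS = NUM_LINES // WAYS  # 16 sets

def direct_mapped_line(address):
    block = address // BLOCK_SIZE
    return block % NUM_LINES   # each block has exactly one candidate line

def set_associative_set(address):
    block = address // BLOCK_SIZE
    return block % NUM_SETS    # block may occupy any of the WAYS lines in its set

# Fully associative mapping has no index at all: any block may occupy any
# line, so a lookup must compare the tag against every line in the cache.

addr = 0x1F40
print(direct_mapped_line(addr), set_associative_set(addr))  # 52 4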
Replacement Policy
Decides which data to remove when the cache is full.

Example Algorithms:

1. Least Recently Used (LRU)
2. First-In, First-Out (FIFO)
3. Random Replacement
4. Optimal Algorithm
Replacement Algorithms
1. Least Recently Used (LRU): Evicts the data that hasn't been used for the longest time.
2. First-In, First-Out (FIFO): Evicts the oldest data in the cache.
3. Random Replacement: Randomly chooses a block to evict.
4. Optimal Algorithm: Evicts the data that will not be needed for the longest future time (used for theoretical purposes).
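
One way to see how LRU and FIFO behave differently is to simulate them over the same reference string; the sketch below assumes a 3-block cache and simply counts misses (illustrative only).

from collections import OrderedDict, deque

CAPACITY = 3  # assumed cache size in blocks (illustrative)

def simulate_lru(refs):
    cache, misses = OrderedDict(), 0
    for block in refs:
        if block in cache:
            cache.move_to_end(block)        # mark as most recently used
        else:
            misses += 1
            if len(cache) == CAPACITY:
                cache.popitem(last=False)   # evict the least recently used block
            cache[block] = True
    return misses

def simulate_fifo(refs):
    cache, order, misses = set(), deque(), 0
    for block in refs:
        if block not in cache:
            misses += 1
            if len(cache) == CAPACITY:
                cache.remove(order.popleft())  # evict the oldest block
            cache.add(block)
            order.append(block)
    return misses

refs = [1, 2, 3, 1, 4, 1, 2]
print(simulate_lru(refs), simulate_fifo(refs))  # 5 misses vs 6 misses

On this reference string LRU keeps the recently reused block 1 resident, so it misses less often than FIFO, which evicts block 1 simply because it arrived first.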
Write Policy
Controls how data modifications are written back to memory.

1. Write-Through: Updates both the cache and main memory simultaneously.
2. Write-Back: Updates only the cache, writing to memory only on eviction.
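
The contrast between the two policies can be sketched as follows; the toy memory, cache dictionaries, dirty set, and function names are illustrative assumptions, not a real cache controller.

# Toy contrast of the two write policies (dictionaries stand in for storage).
memory = {}    # "main memory": address -> value
cache = {}     # cached copies: address -> value
dirty = set()  # addresses modified in the cache but not yet written back

def write_through(address, value):
    cache[address] = value
    memory[address] = value       # memory is updated on every write

def write_back(address, value):
    cache[address] = value
    dirty.add(address)            # memory is updated only later, on eviction

def evict(address):
    if address in dirty:          # write-back: flush modified data before dropping it
        memory[address] = cache[address]
        dirty.discard(address)
    cache.pop(address, None)

Write-through keeps main memory consistent at the cost of extra memory traffic; write-back reduces traffic but needs the dirty flag so modified data is not lost on eviction.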
Activity 1
What are the differences between Write-Through and Write-Back policies?



Basic Principles of Cache Memory
Activity 2
Question 1
What is the primary purpose of cache memory in a computer system?
A. To store all the data permanently
B. To increase the CPU clock speed
C. To reduce the time needed to access frequently used data
D. To replace main memory

Question 2
Which of the following cache types is the fastest but smallest in size?
A. L1 Cache
B. L2 Cache
C. L3 Cache
D. Main Memory

Question 3
In which cache mapping technique can a memory block be stored in any
cache line?
A. Direct Mapping
B. Associative Mapping
C. Set-Associative Mapping
D. Random Mapping



Basic Principles of Cache Memory
Activity 2
Question 4
Which cache replacement policy replaces the cache line that has not been accessed for the longest time?
A. First In First Out (FIFO)
B. Least Recently Used (LRU)
C. Random Replacement
D. Most Frequently Used (MFU)

Question 5
What is the main challenge of maintaining cache coherence in multi-core processors?
A. Ensuring each core has its separate cache
B. Preventing cache lines from being overwritten
C. Synchronizing data updates across multiple cores
D. Increasing cache size without reducing speed



Answers to Activity 2

1. Answer: C
2. Answer: A
3. Answer: B
4. Answer: B
5. Answer: C



Activity 3
Complete the activity given on the LMS.

