Cache memory

A cache is one or more levels of memory, built from SRAM, inserted between the CPU and main memory. Cache capacity is limited because SRAM is expensive, so it cannot be extended arbitrarily.

Role of the cache: an important technique for solving the speed mismatch between the CPU and main memory.

Cache characteristic: it exploits the spatial locality and temporal locality of memory accesses.

Composition of a cache: SRAM plus control logic. If the cache is outside the CPU chip, its control logic is usually combined with the main memory control logic and called the main memory/cache controller. If the cache is inside the CPU, the CPU provides the control logic.

Data exchange between the CPU and the cache is in units of words, while data exchange between the cache and main memory is in units of blocks. A block consists of several words and has a fixed length.

 

[Figure: cache schematic diagram]

Cache hit rate: to make the average memory access time as close as possible to the cache access time, the cache hit rate should be close to 1.
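As a rough illustration (not from the original text), the effect of the hit rate can be estimated with the simple model in which a hit costs the cache access time and a miss costs the main-memory access time; the timing numbers below are made-up example values.

```python
# Sketch: average access time as a function of the cache hit rate.
# t_cache_ns and t_main_ns are assumed example values, not from the text.
def average_access_time(hit_rate, t_cache_ns, t_main_ns):
    # A hit costs the cache time; a miss costs the main-memory time.
    return hit_rate * t_cache_ns + (1 - hit_rate) * t_main_ns

for h in (0.90, 0.99, 0.999):
    print(h, average_access_time(h, t_cache_ns=1, t_main_ns=100))
```

The closer the hit rate is to 1, the closer the average access time gets to the cache access time, which is exactly the goal stated above.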

Address mapping

Meaning: to place main memory blocks into the cache, some method must be used to map a main memory address to a location in the cache; this is called address mapping.

Address mapping methods: fully associative mapping, direct mapping, and set-associative mapping.

Fully associative mapping

 

Summary:

(1) In a fully associative cache, all tags are implemented with an associative memory, and all data are implemented with ordinary RAM.

(2) Advantages: low conflict rate and high cache utilization.

(3) Disadvantage: the comparator is difficult to design and implement.

(4) Suitable only for small-capacity caches (see the sketch below).
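A minimal software sketch of a fully associative lookup, assuming the cache is modeled as a list of lines and the whole block address is used as the tag; the associative memory is imitated by comparing the requested tag against every stored tag, which is why the hardware comparator grows with the cache size.

```python
# Sketch: fully associative lookup -- a block may live in any line,
# so every stored tag must be compared against the requested tag.
def fa_lookup(lines, block_addr):
    for i, entry in enumerate(lines):
        if entry is not None and entry["tag"] == block_addr:
            return i          # hit: index of the matching line
    return None               # miss: no line holds this block

cache_lines = [None] * 8                        # 8 empty lines (assumed size)
cache_lines[3] = {"tag": 0x2A, "data": b"..."}  # pretend block 0x2A is cached
print(fa_lookup(cache_lines, 0x2A))             # -> 3 (hit)
print(fa_lookup(cache_lines, 0x11))             # -> None (miss)
```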

Direct mapping

Summary:

(1) Advantages: simple hardware and low cost.

(2) Disadvantage: each main memory block can be stored in only one fixed cache line.

(3) High conflict rate: two blocks whose block numbers differ by an integer multiple of m (the number of cache lines) map to the same cache line.

(4) Suitable for large-capacity caches (see the sketch below).
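A minimal sketch of direct-mapped address decomposition, assuming m cache lines and a fixed block size; the line index is the block number modulo m, so blocks whose numbers differ by a multiple of m collide on the same line, as noted in point (3). The constants are illustrative assumptions.

```python
# Sketch: direct mapping -- each block has exactly one candidate line.
NUM_LINES = 16          # m, assumed
BLOCK_WORDS = 4         # words per block, assumed

def direct_map(word_addr):
    block_no = word_addr // BLOCK_WORDS
    line = block_no % NUM_LINES        # the single fixed line for this block
    tag = block_no // NUM_LINES        # stored to tell colliding blocks apart
    offset = word_addr % BLOCK_WORDS   # word position inside the block
    return tag, line, offset

# Two blocks whose numbers differ by a multiple of NUM_LINES share a line:
print(direct_map(0))                          # block 0  -> line 0
print(direct_map(NUM_LINES * BLOCK_WORDS))    # block 16 -> line 0 as well
```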

Set-associative mapping

Summary:

A compromise between "fully associative mapping" and "direct mapping" that combines the strengths of both while trying to avoid the disadvantages of both.
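A minimal sketch of set-associative mapping under assumed parameters (8 sets, 2 ways): the block number selects a set as in direct mapping, and the tag is then compared only against the lines inside that set, as in fully associative mapping.

```python
# Sketch: set-associative lookup -- direct mapping between sets,
# fully associative search within a set.
NUM_SETS = 8    # assumed
WAYS = 2        # assumed (2-way set-associative)

# cache[set_index] holds WAYS entries, each None or {"tag": ...}
cache = [[None] * WAYS for _ in range(NUM_SETS)]

def sa_lookup(block_no):
    set_index = block_no % NUM_SETS
    tag = block_no // NUM_SETS
    for way, entry in enumerate(cache[set_index]):
        if entry is not None and entry["tag"] == tag:
            return set_index, way      # hit
    return set_index, None             # miss: a replacement picks a way

cache[3][1] = {"tag": 5}               # pretend block 43 (= 5*8 + 3) is cached
print(sa_lookup(43))                   # -> (3, 1): hit
print(sa_lookup(11))                   # -> (3, None): miss in the same set
```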

Replacement strategy

Meaning: when a new main memory block needs to be copied into the cache but its candidate line positions are already occupied by other main memory blocks, a replacement takes place.

Applicable address mappings: fully associative mapping and set-associative mapping.

(1) Least Frequently Used (LFU) algorithm

Meaning: evict the line that has been accessed the least over a period of time. Each line has a counter that starts from 0 and is incremented by 1 each time the line is accessed. When a replacement is needed, the line with the smallest count is evicted, and at the same time the counters of all these lines are cleared.

Characteristic: the counting period is limited to the interval between two replacements of these lines (each replacement clears the counters once), so it cannot strictly reflect recent access behavior.
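A minimal sketch of the LFU bookkeeping described above for one set of lines: counters start at 0, a hit increments the hit line's counter, and a replacement evicts the line with the smallest count and clears all counters. The class and tag names are illustrative.

```python
# Sketch: LFU replacement -- evict the line with the smallest access count.
class LFUSet:
    def __init__(self, ways):
        self.tags = [None] * ways
        self.counts = [0] * ways

    def access(self, tag):
        if tag in self.tags:                              # hit
            self.counts[self.tags.index(tag)] += 1
            return "hit"
        victim = self.counts.index(min(self.counts))      # smallest count
        self.tags[victim] = tag
        self.counts = [0] * len(self.counts)              # clear all counters
        return f"miss, replaced way {victim}"

s = LFUSet(ways=2)
for t in ["A", "A", "B", "A", "C"]:
    print(t, s.access(t))
```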

(2) Least Recently Used (LRU) algorithm

Meaning: replace the line that has not been accessed for the longest time. Each line has a counter; on every cache hit, the hit line's counter is reset to 0 while the counters of the other lines are incremented by 1. When a replacement is needed, the line with the largest count is evicted.

Characteristic: this algorithm protects data lines that have just been copied into the cache, and it achieves a high hit rate.
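A minimal sketch of the LRU counter scheme just described: on each access every counter is aged, the accessed line's counter is reset to 0, and the victim is the line with the largest counter. Names are illustrative.

```python
# Sketch: LRU replacement with one "age" counter per line.
class LRUSet:
    def __init__(self, ways):
        self.tags = [None] * ways
        self.age = [0] * ways

    def access(self, tag):
        if tag in self.tags:                          # hit
            i = self.tags.index(tag)
            self.age = [a + 1 for a in self.age]      # age every line...
            self.age[i] = 0                           # ...then reset the hit line
            return "hit"
        victim = self.age.index(max(self.age))        # least recently used line
        self.tags[victim] = tag
        self.age = [a + 1 for a in self.age]
        self.age[victim] = 0
        return f"miss, replaced way {victim}"

s = LRUSet(ways=2)
for t in ["A", "B", "A", "C"]:                        # "B" is least recently used
    print(t, s.access(t))
```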

(3) Random replacement

Meaning: a line is selected at random from the candidate cache line positions and replaced.

Characteristic: it is easy to implement in hardware and faster than the strategies above, but it may reduce the cache hit rate and overall efficiency.

(4) First-In, First-Out (FIFO) algorithm

Meaning: the block that was brought into the cache earliest is always the one replaced; there is no need to keep track of how each block is used over time.

Characteristic: easy to implement and the circuit is simple. However, a frequently used block (such as one holding a loop) may be replaced simply because it was the earliest block brought into the cache.
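A minimal FIFO sketch, assuming the candidate lines are tracked in a queue ordered by the time each block entered the cache; the victim is always the block that entered first, regardless of how recently it was used, which illustrates the shortcoming above.

```python
from collections import deque

# Sketch: FIFO replacement -- evict whichever block entered the cache first.
class FIFOSet:
    def __init__(self, ways):
        self.ways = ways
        self.order = deque()                # oldest block at the left end

    def access(self, tag):
        if tag in self.order:
            return "hit"                    # hits do not change the order
        if len(self.order) == self.ways:
            victim = self.order.popleft()   # earliest block in the cache
            result = f"miss, replaced {victim}"
        else:
            result = "miss, filled empty way"
        self.order.append(tag)
        return result

s = FIFOSet(ways=2)
for t in ["A", "B", "A", "C"]:              # "A" entered first, so "C" evicts it
    print(t, s.access(t))
```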

Cache write policies

Meaning: a CPU write to the cache changes the cache contents. A write policy must be chosen so that, when a modified cache block is eventually replaced, the cache contents and the main memory contents remain consistent.

(1) Write-back

On a CPU write hit, only the cache contents are modified; main memory is not written immediately. The line is written back to main memory only when it is replaced.

Advantage: reduces the number of accesses to main memory.

Disadvantage: there is a risk of inconsistency between the cache and main memory.

Solution: each cache line must have a modify (dirty) bit to indicate whether the line has been modified by the CPU.
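A minimal sketch of write-back behavior, assuming a single cache line modeled as a dict with a dirty (modify) bit; main memory is updated only when a dirty line is evicted. All names and addresses are illustrative assumptions.

```python
# Sketch: write-back -- writes go only to the cache; the dirty bit records
# that the line differs from main memory, and the line is written back on eviction.
main_memory = {0x10: 0}                    # block address -> value (assumed)
line = {"tag": None, "data": None, "dirty": False}

def write(block, value):
    line.update(tag=block, data=value, dirty=True)   # update the cache only

def evict():
    if line["dirty"]:                                # write back only if modified
        main_memory[line["tag"]] = line["data"]
    line.update(tag=None, data=None, dirty=False)

write(0x10, 99)
print(main_memory[0x10])   # still 0: main memory not yet updated
evict()
print(main_memory[0x10])   # 99: written back when the line was replaced
```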

(2) Write-through

On a CPU write hit, the cache and main memory are written at the same time, so the cache and main memory contents stay consistent. On a CPU write miss, the data is written directly to main memory.

Advantage: there is no need for a modify bit and the corresponding checking logic on each cache line.

Disadvantage: the cache provides no buffering for CPU writes to main memory, which reduces the effectiveness of the cache.
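For contrast with the write-back sketch above, a minimal write-through sketch under the same assumed model: a write hit updates the cache and main memory together, and a write miss goes straight to main memory, so no dirty bit is needed.

```python
# Sketch: write-through -- every write also updates main memory,
# so no dirty bit is required.
main_memory = {0x10: 0, 0x20: 0}
line = {"tag": 0x10, "data": 0}            # one cached block (assumed)

def write(block, value):
    if line["tag"] == block:               # write hit: update both copies
        line["data"] = value
    main_memory[block] = value             # hit or miss, main memory is written

write(0x10, 7)    # hit: cache and main memory both become 7
write(0x20, 5)    # miss: written directly to main memory
print(line, main_memory)
```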

(3) Write-once

A write strategy based on both write-back and write-through: write hits and write misses are handled basically the same as in write-back, except that the first write hit to a line also writes to main memory (as in write-through).

Advantage: it makes it easier to maintain cache consistency across the whole system.
