Which memory locations can be cached by which cache locations

The placement policy decides where in the cache a copy of a particular entry of main memory may go. If the policy is free to choose any entry in the cache to hold the copy, the cache is called fully associative. At the other extreme, if each entry in main memory can go in just one place in the cache, the cache is direct mapped. Many caches implement a compromise in which each entry in main memory can go to any one of N places in the cache, and are described as N-way set associative. For example, the level-1 data cache in an AMD Athlon is 2-way set associative, which means that any particular location in main memory can be cached in either of 2 locations in the level-1 data cache.

Associativity is a trade-off. If there are ten places the replacement policy can put a new cache entry, then when the cache is checked for a hit, all ten places must be searched. Checking more places takes more power, chip area, and potentially more time. On the other hand, caches with more associativity suffer fewer misses (see conflict misses, below), so the CPU spends less time servicing those misses. The rule of thumb is that doubling the associativity, from direct mapped to 2-way, or from 2-way to 4-way, has about the same effect on hit rate as doubling the cache size. Associativity increases beyond 4-way have much less effect on the hit rate and are generally done for other reasons (see virtual aliasing, below).

In order of increasing (worse) hit times and decreasing (better) miss rates:

* Direct mapped cache -- the best (fastest) hit times, and so the best trade-off for "large" caches
* 2-way set associative cache
* 2-way skewed associative cache -- "the best tradeoff for .... caches whose sizes are in the range 4K-8K bytes" -- André Seznec[3]
* 4-way set associative cache
* Fully associative cache -- the best (lowest) miss rates, and so the best trade-off when the miss penalty is very high
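A concrete way to see this trade-off is the address arithmetic behind it: the cache splits each address into a block offset, a set index, and a tag, and raising the associativity shrinks the number of sets the index can select. The following Python sketch is illustrative only; the cache size, block size, and function name are assumptions, not details from the answer above.

```python
# Sketch: splitting an address into tag / set index / block offset for a
# cache of a given associativity. Cache size and block size are assumed
# values chosen for illustration, not taken from any particular CPU.

CACHE_SIZE = 32 * 1024   # 32 KiB cache (assumed)
BLOCK_SIZE = 64          # 64-byte cache lines (assumed)

def cache_placement(address, ways):
    """Return (tag, set_index) for an N-way set-associative cache.

    ways == 1                         -> direct mapped (one candidate line)
    ways == CACHE_SIZE // BLOCK_SIZE  -> fully associative (only one set)
    anything in between               -> N-way set associative
    """
    num_lines = CACHE_SIZE // BLOCK_SIZE   # total cache lines
    num_sets = num_lines // ways           # lines are grouped into sets
    block_number = address // BLOCK_SIZE   # discard the byte offset within the line
    set_index = block_number % num_sets    # the only set that may hold this block
    tag = block_number // num_sets         # distinguishes blocks sharing a set
    return tag, set_index

address = 0x12345678
for ways in (1, 2, 4, CACHE_SIZE // BLOCK_SIZE):
    tag, set_index = cache_placement(address, ways)
    print(f"{ways:3d}-way: set {set_index}, tag {tag:#x}")
```

With 1 way every block has exactly one candidate line; with all lines in a single set the index is always 0 and the block may go anywhere, which is the fully associative case.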


Related Questions

What is associative memory explain its advantage?

It is also called content-addressable memory. Content-addressable memory (CAM) is a special type of computer memory used in certain very-high-speed searching applications. It is also known as associative memory, associative storage, or associative array. Its advantage is that a stored word is located by its content rather than by its address, so all entries can be searched at once.
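As a rough software analogue (names and data invented for illustration), a CAM lookup supplies a value and gets back every location that stores it; real CAM hardware performs all of these comparisons in parallel in a single cycle.

```python
# Rough software analogue of a content-addressable memory (CAM) lookup:
# search by content and get back the matching addresses. Hardware CAM
# compares every stored word in parallel; here it is simply a loop.

cam = ["0xDEAD", "0xBEEF", "0xCAFE", "0xBEEF"]   # stored words (example data)

def cam_search(key):
    """Return every address whose stored content matches the search key."""
    return [address for address, word in enumerate(cam) if word == key]

print(cam_search("0xBEEF"))   # -> [1, 3]
print(cam_search("0xF00D"))   # -> [] (no match)
```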


What is memory in digital electronic?

Data and program instructions are stored in primary (electronic) memory.


Explain concept of electronic memory?

Electronic memory is where data and program instructions are stored.


Diagram of auxiliary memory?

I need the diagram of associative memory.


What is the explanation for the working of associative memory?

Associative memory works by comparing a search key against the contents of every stored word in parallel; any word whose content matches the key is flagged and can be read out, so data are located by content rather than by address.


What is a memory of a computer that is suited to parallel searches by data association?

Associative Memory.


Where is Associative memory used?

Associative memory is used in memory allocation structures and is widely used in database management systems, among other applications. When part of the stored data is supplied as a search key, the memory returns the whole matching entry.


What are the difference among direct mapping associative mapping and set associative mapping?

Direct mapping: each block of main memory maps onto a single, unique cache line, determined by the memory address of the block.
Set-associative mapping: each block of main memory maps onto a small set of cache lines; the set is determined by the memory address, but the block may be placed in any line within that set.
Fully associative mapping: a block of main memory may be placed in any cache line.


What are associative memory deficits?

Associative memory deficits refer to difficulties in making connections between pieces of information or remembering how different elements are related to each other. People with associative memory deficits may struggle to recall names of people they have met or remember where they left an object because they have trouble associating the information with the context in which it occurred. These deficits can be caused by various conditions such as brain injury, dementia, or attention disorders.


Mapping function between cache lines and main memory block?

Main memory is much larger than cache memory, and blocks of instructions or data must be transferred to the cache so that the processor can use them frequently, improving performance and reducing the time spent fetching instructions or data (accessing the cache is much faster than accessing RAM). For example, if main memory holds 128 data blocks and the cache holds only 32, some technique is needed to decide where each block is placed: the mapping function. The common mapping techniques are direct mapping, fully associative mapping, and set-associative mapping (usually parameterized as N-way set associative).
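Using the figures from the answer above (128 main-memory blocks, a 32-line cache), a minimal sketch of how these mapping techniques constrain placement might look like this; the function name and the contiguous grouping of lines into sets are assumptions made for illustration.

```python
# Which cache lines may hold memory block b? Worked example with the
# figures from the answer: 128 main-memory blocks, a 32-line cache.

NUM_MEMORY_BLOCKS = 128
NUM_CACHE_LINES = 32

def candidate_lines(block, ways):
    """Cache lines allowed to hold `block` in an N-way set-associative cache."""
    num_sets = NUM_CACHE_LINES // ways
    set_index = block % num_sets
    # Lines of a set are modelled as a contiguous group (an illustration choice).
    return list(range(set_index * ways, (set_index + 1) * ways))

block = 77  # an arbitrary memory block number
print(candidate_lines(block, 1))                # direct mapped: line 77 % 32 = 13 only
print(candidate_lines(block, 4))                # 4-way: any of 4 lines in set 77 % 8 = 5
print(candidate_lines(block, NUM_CACHE_LINES))  # fully associative: any of the 32 lines
```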


Three cache mapping techniques?

There are three cache mapping procedures:
(1) Associative mapping: the fastest and most flexible cache organization. The associative memory stores both the address and the content of each memory word, which permits any word of main memory to be placed in any location of the cache.
(2) Direct mapping: because associative memories are expensive compared to RAMs (due to the matching logic added to each cell), direct mapping instead uses part of the memory address as a cache index, so each memory block maps to exactly one cache line.
(3) Set-associative mapping: a more general method that includes pure associative and direct mapping as special cases. It improves on the direct-mapped organization in that each index address of the cache can hold two or more words of memory. Each data word is stored together with its tag, and the group of tag-data items sharing one index is said to form a set.
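To make the "data word stored together with its tag" idea concrete, here is a small sketch of the hit check in a set-associative cache: the index selects one set, and the requested tag is compared against the tag stored in each way of that set. Every structure, size, and name below is an assumption chosen for illustration.

```python
# Sketch of the hit check in a set-associative cache: the set index selects
# a set, then the requested tag is compared with the tag held by each way
# of that set (hardware performs these comparisons in parallel).

BLOCK_SIZE = 16   # bytes per line (assumed)
NUM_SETS = 4      # number of sets (assumed)
WAYS = 2          # 2-way set associative (assumed)

# cache[set_index] is a list of ways; each way holds [valid, tag, data].
cache = [[[False, None, None] for _ in range(WAYS)] for _ in range(NUM_SETS)]

def lookup(address):
    """Return the cached data on a hit, or None on a miss."""
    block_number = address // BLOCK_SIZE
    set_index = block_number % NUM_SETS
    tag = block_number // NUM_SETS
    for valid, stored_tag, data in cache[set_index]:
        if valid and stored_tag == tag:
            return data          # hit: a way in the selected set holds this tag
    return None                  # miss: no way in the set matched

# Fill one way by hand and probe it.
cache[1][0] = [True, 5, b"example data"]       # tag 5 stored in set 1
hit_address = (5 * NUM_SETS + 1) * BLOCK_SIZE  # decodes back to set 1, tag 5
print(lookup(hit_address))                     # -> b'example data'
print(lookup(0))                               # -> None (miss)
```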