Cache memory is random access memory (RAM) that a computer microprocessor can access more quickly than regular RAM. As the microprocessor processes data, it looks first in the cache memory; if it finds the data there (from a previous read), it avoids the more time-consuming read from the larger main memory.

Cache memory generally comes in much smaller sizes (3 MB, 6 MB, etc.) than RAM (512 MB, 1 GB, 2 GB, and so on).
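
For illustration only, here is a tiny Python sketch of that lookup order. The memory contents, cache size, and direct-mapped layout are all made-up assumptions, not a model of any real CPU; the point is just that read() checks the small cache first and only falls back to the slower main memory on a miss.

```python
# Toy direct-mapped cache in front of a larger "main memory" (illustrative only).
CACHE_LINES = 4                      # tiny cache: 4 lines
main_memory = list(range(100, 164))  # pretend RAM: 64 words

# Each cache line remembers which memory address it currently holds.
cache = [{"tag": None, "data": None} for _ in range(CACHE_LINES)]

def read(address):
    line = cache[address % CACHE_LINES]   # which cache line this address maps to
    if line["tag"] == address:            # cache hit: fast path
        return line["data"], "hit"
    # Cache miss: do the slower read from main memory, then keep a copy.
    value = main_memory[address]
    line["tag"], line["data"] = address, value
    return value, "miss"

print(read(5))   # first access  -> (105, 'miss')
print(read(5))   # repeat access -> (105, 'hit')
```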


Related Questions

Why not make cache larger?

A cache is intended to speed things up. Making a cache larger tends to make it slower (and more expensive), because a larger array takes longer to search and signal. If it became slower to access the cache than to access the memory itself, it would defeat the purpose of having a cache.
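
One way to see the trade-off is the usual average-access-time estimate. The sketch below uses invented round numbers (nanoseconds and hit rates chosen purely for illustration, not measurements of real hardware):

```python
def average_access_time(hit_time, miss_rate, memory_time):
    # Average cost of one access: every access pays hit_time,
    # and the fraction that misses also pays the trip to main memory.
    return hit_time + miss_rate * memory_time

# Small, fast cache vs. a much larger but slower one (illustrative numbers, in ns).
print(average_access_time(hit_time=1,  miss_rate=0.10, memory_time=100))  # 11.0
print(average_access_time(hit_time=20, miss_rate=0.02, memory_time=100))  # 22.0
# Even though the bigger cache misses far less often, its higher hit time
# makes every access slower, so on these assumed numbers it comes out behind.
```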


What cache typically has the most memory?

Caches are generally defined as L1, L2, and L3. If a CPU has any cache memory at all, it will have at least L1 cache. L1 cache is the fastest, and most expensive, type of cache memory, so CPUs usually have only a very small amount of it. L2 is typically larger, less expensive, and slower than L1. L3, if present, is larger, less expensive, and slower than L1 or L2. All three levels of cache memory are far faster than system memory. Given systems with equivalent hardware, including CPUs running at identical speeds, the one with more cache memory will perform better at certain tasks, with L1 cache adding the most performance boost.
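
A sketch of that level-by-level lookup, using rough, assumed sizes and latencies just to show the ordering (real hardware differs and may probe levels in parallel):

```python
# Assumed, illustrative figures for a three-level cache hierarchy.
hierarchy = [
    # name, typical size, access cost (ns, rough)
    ("L1",  "32 KB",    1),
    ("L2",  "512 KB",   4),
    ("L3",  "8 MB",    15),
    ("RAM", "16 GB",  100),
]

def lookup_cost(found_at):
    """Total latency when the data is first found at level `found_at`,
    having checked each faster level on the way down."""
    cost = 0
    for name, _size, latency in hierarchy:
        cost += latency
        if name == found_at:
            return cost
    raise ValueError(found_at)

for level in ("L1", "L2", "L3", "RAM"):
    print(level, lookup_cost(level), "ns")   # 1, 5, 20, 120
```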


What is the significance of a cache miss in computer systems and how does it impact overall performance?

A cache miss occurs when the CPU cannot find the needed data in the cache memory and has to retrieve it from the slower main memory. This impacts performance by causing a delay in processing instructions, as accessing main memory is slower than accessing the cache. This can lead to decreased overall system performance and efficiency.
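
The same hit/miss cost pattern is easy to observe in software with Python's functools.lru_cache (a software cache rather than a CPU cache, but the idea is the same in kind): the first call pays the full penalty, repeats are served from the cache, and cache_info() reports the counts. The 50 ms sleep below is just a stand-in for a slow fetch from main memory.

```python
from functools import lru_cache
import time

@lru_cache(maxsize=128)
def expensive_lookup(key):
    time.sleep(0.05)          # stand-in for the slow trip to "main memory"
    return key * 2

expensive_lookup(7)           # miss: pays the 50 ms penalty
expensive_lookup(7)           # hit: answered immediately from the cache
print(expensive_lookup.cache_info())
# CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```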


Cache memory and register memory?

Registers are smaller than cache memory, and registers are faster to access than cache. Cache memory stores frequently used data from main memory, while registers hold the individual values the processor is working on at that moment.


Is cache memory a removable memory?

No. A cache memory is used to store data that has been needed recently, on the grounds that it will be faster to access if it is needed again. When requested data is found in the cache, that is a cache hit; when it has to be retrieved again from the hard drive (or wherever it was originally stored), that is a cache miss. Retrieving data from the hard drive is much slower than retrieving it from the cache.
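
As a rough software analogue, here is a read-through cache where a dict plays the cache, a deliberately slow function stands in for the hard drive, and the counters correspond to the hits and misses described above (all names are invented for illustration):

```python
import time

def slow_read(key):
    time.sleep(0.1)            # pretend this is a hard-drive read
    return f"data-for-{key}"

cache = {}
stats = {"hits": 0, "misses": 0}

def cached_read(key):
    if key in cache:           # cache hit: reuse the recently fetched copy
        stats["hits"] += 1
        return cache[key]
    stats["misses"] += 1       # cache miss: fall back to the slow storage
    cache[key] = slow_read(key)
    return cache[key]

cached_read("report.txt")      # miss, ~100 ms
cached_read("report.txt")      # hit, effectively instant
print(stats)                   # {'hits': 1, 'misses': 1}
```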


Which one is better, more cache memory or bigger RAM? Why?

Generally speaking, a user will see more benefit from having more RAM than from a processor with a larger cache. Cache is many times faster than main memory, true, but some reads from main memory are inevitable, and more memory lets you actually do more without experiencing significant slowdowns (such as swapping to disk). Cache is special on-die memory that speeds up access to data that has been used recently or is stored nearby. When the CPU needs data, it first looks in the L1 cache, which is tiny (tens of kilobytes), then in the L2 and, if the processor has it, the L3 cache, which are larger but many times slower than L1, and only then does it go to main memory.


What is an example of L2 cache memory size?

Level 2 (L2) cache sizes are commonly in the range of a few hundred kilobytes to a few megabytes, for example 512 KB or 1 MB per core. On older slot-mounted processors, L2 cache was external to the processor die and ran at slower speeds than the processor; on socketed processors, L2 cache is built onto the processor itself. If the processor does not find the desired memory locations in L1 cache, it checks L2 cache next. Processors with larger L2 caches perform most tasks more quickly than processors with smaller L2 caches for two reasons: cache memory is faster than main memory, and the processor checks cache memory for needed information before checking main memory.


Why is cache memory smaller than main memory?

Because cache memory is costlier than main memory, and its physical size also matters. Even ignoring cost, a large cache memory would take up more physical space, and the motherboard (or the processor die, for on-chip cache) would not be able to accommodate it.




Give two reasons why caches are useful?

A cache minimizes the speed mismatch between main memory and the processor. Really, there is one underlying reason for a cache to exist: speed. A cache exists to reduce the time the processor takes to obtain a piece of data. Typically, main memory is many times slower than the processor; in modern computers it can be hundreds of times slower. So caches are made of memory that is more expensive and faster than main memory, and they store copies of parts of main memory in the hope that the data the processor wants will already be available in the cache.


Why is cache memory more expensive than hard disk?

Cache memory is much faster than RAM or a hard disk, but the main reason it is expensive is that cache memory contains a comparator and a storage medium at the same time: the comparator checks whether the address of the value being accessed is present in the associative memory (one part of the cache), while the ordinary memory (the other part of the cache) holds the data itself.
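
A rough software analogue of that comparator-plus-storage structure (entirely illustrative; real hardware compares all tags in parallel and uses smarter replacement): the tags list plays the part of the associative memory and the data list the ordinary storage, and a lookup compares the requested address against every stored tag.

```python
class TinyAssociativeCache:
    """Illustrative fully associative cache: tags are compared against the address."""
    def __init__(self, lines=4):
        self.tags = [None] * lines    # "associative memory": which address each line holds
        self.data = [None] * lines    # "ordinary memory": the cached values themselves
        self.next_victim = 0          # naive replacement: overwrite lines round-robin

    def lookup(self, address):
        for i, tag in enumerate(self.tags):   # hardware would do these compares at once
            if tag == address:
                return self.data[i]           # hit
        return None                           # miss

    def fill(self, address, value):
        i = self.next_victim
        self.tags[i], self.data[i] = address, value
        self.next_victim = (i + 1) % len(self.tags)

cache = TinyAssociativeCache()
cache.fill(0x1A, "value at 0x1A")
print(cache.lookup(0x1A))   # hit  -> 'value at 0x1A'
print(cache.lookup(0x2B))   # miss -> None
```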