A cache is intended to speed things up. The larger a cache is, the longer it takes to access. If accessing the cache becomes slower than accessing the memory behind it, it defeats the purpose of having a cache.
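A rough back-of-the-envelope sketch of that trade-off, in Python with made-up latencies (the 5 ns, 90 ns, and 100 ns figures are illustrative assumptions, not real hardware numbers): the average access time only improves while the cache stays meaningfully faster than the memory behind it.

    def average_access_time(hit_rate, cache_latency, memory_latency):
        # Hits pay only the cache latency; misses pay the cache lookup
        # plus the trip to memory.
        return hit_rate * cache_latency + (1 - hit_rate) * (cache_latency + memory_latency)

    memory_latency = 100.0  # assumed main-memory latency in nanoseconds

    # A small, fast cache is a clear win even with a modest hit rate.
    print(average_access_time(0.90, 5.0, memory_latency))    # about 15 ns on average

    # A huge cache whose lookup is nearly as slow as memory barely helps,
    # even with a near-perfect hit rate.
    print(average_access_time(0.99, 90.0, memory_latency))   # about 91 ns on average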

Wiki User

10y ago

Related Questions

What is hard drive cache, and does it make your computer run faster?

The hard drive cache is used when accessing files on your hard drive and a larger cache does improve file operation speeds.


Is Level 3 cache faster for the CPU to reach and larger in size than Level 1 cache?

False. L3 cache is larger than L1 cache, but it is slower for the CPU to reach, not faster.


What is the difference between cache-memory and primary-memory?

Cache memory is smaller and faster; primary memory is larger and slower.


What is the size of L1 and L2 cache?

Usually the L2 cache is larger than the L1 cache. The CPU checks the L1 cache first; if the data is not found there (a miss), it looks in the L2 cache, and if the data is in neither cache, it goes to main memory.


What if cache size is equal to main memory size?

If the cache were made as large as main memory, the main purpose of a cache (faster access) would be defeated, because the larger a cache is, the slower it is to access.


Does larger cache increase performance?

Yes, a larger cache can increase performance by reducing the time it takes for the CPU to access frequently used data and instructions. With more cache available, the likelihood of a cache hit increases, which minimizes the need to fetch data from slower main memory. However, the performance gain also depends on the specific workload and how well the data fits into the cache. In some cases, diminishing returns may occur if the cache becomes too large relative to the workload's needs.
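One way to see both effects is a small simulation. This is only a sketch: it treats Python's collections.OrderedDict as an LRU cache and replays a synthetic, skewed access trace (the 90/10 hot/cold split and the address ranges are invented for the example).

    import random
    from collections import OrderedDict

    def lru_hit_rate(capacity, accesses):
        # Replay an access trace through an LRU cache of the given capacity
        # and return the fraction of accesses that hit.
        cache = OrderedDict()
        hits = 0
        for addr in accesses:
            if addr in cache:
                hits += 1
                cache.move_to_end(addr)        # mark as most recently used
            else:
                cache[addr] = True
                if len(cache) > capacity:
                    cache.popitem(last=False)  # evict the least recently used
        return hits / len(accesses)

    random.seed(0)
    # 90% of accesses hit a small "hot" set, 10% scatter over a large range.
    trace = [random.randint(0, 63) if random.random() < 0.9 else random.randint(0, 9999)
             for _ in range(100_000)]

    for capacity in (16, 64, 256, 1024, 4096):
        print(capacity, round(lru_hit_rate(capacity, trace), 3))

The hit rate climbs quickly while the hot set does not yet fit, then flattens: once the hot set is resident, extra capacity only captures the long tail of rarely reused addresses.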


What is the difference between cache memory and disk cache?

A memory cache is held in RAM; a disk cache is held on the hard drive. Both are used to make things faster. For instance, Google Earth uses its disk cache to show you images offline.
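As an illustration of the disk-cache idea only (not how Google Earth actually implements it), here is a minimal sketch in Python that keeps downloaded data in a local folder and goes to the network only on a miss; the tile_cache directory name and the function are made up for the example.

    import hashlib
    import urllib.request
    from pathlib import Path

    CACHE_DIR = Path("tile_cache")   # hypothetical on-disk cache location
    CACHE_DIR.mkdir(exist_ok=True)

    def fetch_with_disk_cache(url):
        # Return the bytes behind a URL, using a file on disk as the cache.
        key = hashlib.sha256(url.encode()).hexdigest()
        cached = CACHE_DIR / key
        if cached.exists():                        # cache hit: read from disk
            return cached.read_bytes()
        data = urllib.request.urlopen(url).read()  # cache miss: fetch over the network
        cached.write_bytes(data)                   # keep it so it works offline next time
        return data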


Which type of CPU cache has the highest latency?

The Level 3 (L3) cache has the highest latency. The CPU cache is memory that is used to decrease the time that it takes the CPU to access data. Because the data is cached, it can be accessed more quickly. The CPU cache is often found directly on the CPU or built into the CPU. The L3 cache is usually larger than the L1 and L2 cache, but it is searched last. The CPU searches for data in the following order: L1 cache, L2 cache, L3 cache, RAM.
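A toy model of that search order, as a sketch with invented latencies (the cycle counts are illustrative, not real hardware figures): each miss adds the cost of the level that was checked before moving on.

    # Invented, illustrative latencies in CPU cycles.
    LEVELS = [("L1", 4), ("L2", 12), ("L3", 40), ("RAM", 200)]

    def access_cost(address, contents):
        # Search L1, then L2, then L3, then RAM, accumulating the cost of
        # every level checked along the way.  `contents` maps a level name
        # to the set of addresses currently held at that level.
        total = 0
        for name, latency in LEVELS:
            total += latency
            if name == "RAM" or address in contents.get(name, set()):
                return name, total

    contents = {"L1": {0x10}, "L2": {0x10, 0x20}, "L3": {0x10, 0x20, 0x30}}
    print(access_cost(0x10, contents))  # ('L1', 4)    found immediately
    print(access_cost(0x30, contents))  # ('L3', 56)   paid for missing L1 and L2 first
    print(access_cost(0x40, contents))  # ('RAM', 256) missed every cache level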


Why is cache memory smaller than main memory?

Because cache memory is costlier than main memory, and its physical size also matters. Even ignoring the cost, a large cache memory would take up more physical space, and the motherboard would not be able to accommodate it.


You have configured a DNS caching-only server on Windows 2003, but you can't find where it keeps the cache and you want to make a backup of it. How should you do this?

The DNS Server service keeps its cache in memory rather than in a file on disk (you can view it under Cached Lookups in the DNS console when Advanced view is enabled), so there is no cache file to locate and back up; the cache is discarded and rebuilt from new lookups when the service restarts.


What cache typically has the most memory?

Caches are generally defined as L1, L2, and L3. If a CPU has any cache memory at all, it will have at least L1 cache. L1 cache is the fastest, and most expensive, type of cache memory, and CPUs usually have only a very small amount of it. L2 is typically larger, less expensive, and slower than L1. L3, if present, is larger, less expensive, and slower than L1 or L2. All three levels of cache memory are much faster than system memory. Systems with equivalent hardware, including CPUs with identical speeds, will perform better at certain tasks when more cache memory is present, with L1 cache adding the most performance boost.
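A rough way to observe this from user code is to time random reads over working sets of increasing size. This is only a sketch using NumPy; the exact numbers depend entirely on the machine, and Python overhead blunts the effect, but the cost per read typically steps up once the working set outgrows each cache level.

    import time
    import numpy as np

    def ns_per_random_read(n_elements, n_reads=2_000_000):
        # Time random reads over an int64 array of the given size and
        # report the average cost per read in nanoseconds.
        data = np.arange(n_elements, dtype=np.int64)           # the working set
        idx = np.random.randint(0, n_elements, size=n_reads)   # random access pattern
        start = time.perf_counter()
        data[idx].sum()                                        # gather + reduce
        return (time.perf_counter() - start) / n_reads * 1e9

    # Working sets sized to land roughly in L1, L2, L3, and main memory.
    for n in (1_000, 100_000, 4_000_000, 50_000_000):
        print(f"{n * 8 // 1024:>7} KiB  {ns_per_random_read(n):6.2f} ns/read")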


Why can't cache be made as large as the device for which it is caching?

Cache cannot be made as large as the device it serves because of cost, speed, and efficiency considerations. Larger caches require more expensive memory technology, which can significantly increase the overall cost of the device. Additionally, as cache size increases, the access time can also increase, negating the speed advantages of caching. Finally, larger caches show diminishing returns: beyond a certain point, the extra capacity adds little to the hit rate because of the nature of typical data access patterns.