yes
NO...The cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. As long as most memory accesses are to cached locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory. Thus cache memory is not the main memory of the computer system. --- from Wiki
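To make that "closer to the cache latency" claim concrete, here is a minimal sketch of the usual average-memory-access-time estimate (hit time plus miss rate times miss penalty). The latency numbers and miss rate below are invented purely for illustration, not real hardware figures:

```python
# Average memory access time (AMAT) = hit_time + miss_rate * miss_penalty.
# All numbers below are illustrative placeholders.
cache_hit_time_ns = 1.0      # assumed cache access latency
main_memory_time_ns = 100.0  # assumed main-memory access latency
miss_rate = 0.05             # assume 95% of accesses hit in the cache

amat_ns = cache_hit_time_ns + miss_rate * main_memory_time_ns
print(f"Average access time: {amat_ns:.1f} ns")  # 6.0 ns -- far closer to 1 ns than to 100 ns
```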
Cache is not an abbreviation. It comes from the French cacher, "to hide"; a cache is a hiding place or storage spot.
Cache memory
While a hard drive does have volatile memory on it in the form of a cache, the user doesn't have access to it.
Cache is a high-speed memory used for the following reason: main memory is not as fast as the CPU, so to compensate for the speed mismatch between the CPU and main memory, a cache is placed between the two. Whenever the CPU asks for data, the cache is checked first; if the data is present, a "cache hit" occurs, otherwise a "cache miss" occurs and the CPU takes the data from main memory, and a copy of that data is sent to the cache for any further operation where the CPU requests the same data again. Anand bhat(mca@kiit-870024)
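A rough sketch of that hit/miss flow, using a plain dictionary as a stand-in for the cache; the function name, addresses, and contents are invented for illustration:

```python
# Toy model of the CPU -> cache -> main memory lookup described above.
main_memory = {addr: addr * 2 for addr in range(1024)}  # pretend memory contents
cache = {}                                              # stand-in for the cache

def read(addr):
    """Return the value at addr, going to main memory only on a cache miss."""
    if addr in cache:
        print(f"cache hit  @ {addr}")
        return cache[addr]
    print(f"cache miss @ {addr}")
    value = main_memory[addr]   # slow path: fetch from main memory
    cache[addr] = value         # keep a copy for later requests
    return value

read(42)   # miss: fetched from main memory and copied into the cache
read(42)   # hit: served from the cache
```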
No. It's a kind of RAM integrated into your processor. The cache memory is usually small and is used to store data the processor is working on, because accessing main RAM directly is slower. In multiple-processor systems, a problem called cache incoherence can occur when two or more processors access the same memory region, modify the data in different ways, and do not immediately write it back to main RAM. Two different versions of the same region then exist simultaneously, causing problems. That is why mutexes or other locks should be used when accessing shared memory.
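A minimal sketch of the locking idea, with Python's threading.Lock standing in for a mutex; the shared counter and thread counts are just example values:

```python
import threading

counter = 0                 # shared state that several threads update
lock = threading.Lock()     # mutex guarding the shared state

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # only one thread may read-modify-write at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 every time; without the lock, updates could be lost
```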
Temporary storage on chips is called memory. Most such solid-state memory is in the form of random-access memory (RAM) chips, usually dynamic RAM (DRAM). The people who write operating systems and the computer architects who design computer systems and CPUs often use many different temporary storage areas, each one with a different name. If you are building a high-speed computer or writing a high-performance operating system, you will learn about temporary storage areas such as the disk page cache, the stack, the heap, and the virtual memory page table, which are (more or less) stored in the main memory DRAM. The CPU has a few temporary locations called registers. Often there are one or more levels of cache (the L1 cache, the L2 cache, etc.) between the CPU and the main memory. High-performance CPUs typically put a cache on the same chip as the CPU; some older personal computers had "external cache" SRAM chips between the CPU chip and the main memory DRAM chips. Many high-performance computers have several levels of successively larger and slower caches -- an extremely fast I-cache, D-cache, and TLB, the L1 cache, the L2 cache, the L3 cache, and main memory.
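On Linux you can peek at this cache hierarchy yourself. A small sketch that reads the per-CPU cache descriptions from sysfs, assuming the usual /sys/devices/system/cpu layout is available (it may be absent on other operating systems or inside some containers):

```python
import glob
import os

# List each cache level reported for cpu0 via Linux sysfs.
for index_dir in sorted(glob.glob("/sys/devices/system/cpu/cpu0/cache/index*")):
    def read_attr(name):
        try:
            with open(os.path.join(index_dir, name)) as f:
                return f.read().strip()
        except OSError:
            return "?"
    print(f"L{read_attr('level')} {read_attr('type'):12s} {read_attr('size')}")

# Example of the kind of output you might see (values vary by CPU):
#   L1 Data         32K
#   L1 Instruction  32K
#   L2 Unified      256K
#   L3 Unified      8192K
```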
There are three types of mapping procedures:
(1) Associative mapping - the fastest and most flexible cache organization uses associative mapping. The associative memory stores both the address and the content of the memory word, which permits any location in the cache to store any word from main memory.
(2) Direct mapping - associative memories are expensive compared to RAMs because of the added logic associated with each cell, so direct mapping uses an ordinary RAM: part of the address (the index) selects the cache word, and the remaining bits are stored with the data as a tag.
(3) Set-associative mapping - a more general method that includes pure associative and direct mapping as special cases. It is an improvement over the direct-mapping organization in that each word of cache can store two or more words of memory under the same index address. Each data word is stored together with its tag, and the number of tag-data items in one word of cache is said to form a set.
With Regards Veer Thakur Chandigarh
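To see how direct and set-associative mapping carve an address into tag and index, here is a small sketch; the line size, line count, and set count are arbitrary example values, not tied to any particular CPU:

```python
# Split a memory address into tag / index for two of the mappings above.
LINE_SIZE = 64        # bytes per cache line (example value)
NUM_LINES = 256       # lines in a direct-mapped cache (example value)
NUM_SETS = 64         # sets in a 4-way set-associative cache (example value)

def direct_mapped(addr):
    block = addr // LINE_SIZE
    index = block % NUM_LINES          # selects exactly one cache line
    tag = block // NUM_LINES           # stored with the data to identify it
    return tag, index

def set_associative(addr):
    block = addr // LINE_SIZE
    set_index = block % NUM_SETS       # selects a set of lines...
    tag = block // NUM_SETS            # ...and the tag picks the block within it
    return tag, set_index

addr = 0x1A2B3C
print("direct-mapped   tag/index:", direct_mapped(addr))
print("set-associative tag/set  :", set_associative(addr))
```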
Processors have internal memory in the form of the three cache levels. L1 cache - usually a very small amount, around 32-128 KB; this is the fastest cache, used to hold small pieces of data for immediate use. L2 cache - the heavy lifter of a processor, usually 1-2 MB per core; it is larger and slower than L1 and holds data that doesn't fit there. L3 cache - relatively new in the processing world, L3 caches are larger, slower banks of memory, upwards of 12-18 MB, shared among the cores and used for basically anything not covered by the first two caches. If you mean internal memory as in RAM, a processor will not work without RAM.
Memory is technically any form of electronic storage. Personal computer systems have a hierarchical memory structure consisting of auxiliary memory (disks), main memory (DRAM), and cache memory (SRAM). A design objective of computer system architects is to have the memory hierarchy work as though it were entirely composed of the fastest memory type in the system.
form factor
The possessive form for the noun memory is memory's.