What does data CACHE mean?
Hi,
How are you? What is meant by the term CACHE?
Please explain in detail as I am awaiting your answers.
Thanks.
Cache is a small, fast memory holding recently accessed data, designed to speed up subsequent access to the same data. The term is most often applied to processor-memory access, but it is also used for a local copy of data accessible over a network, etc. When data is read from, or written to, main memory a copy is also saved in the cache, along with the associated main memory address.
The cache monitors addresses of subsequent reads to see if the required data is already in the cache. If it is (a cache hit) then it is returned immediately and the main memory read is aborted (or not started). If the data is not cached (a cache miss) then it is fetched from the main memory and also saved in the cache. The cache is built from faster memory chips than main memory and so a cache hit takes much less time to complete than a normal memory access.
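To make that concrete, here is a rough Python sketch of the read path just described (the names MAIN_MEMORY and SimpleCache are made up for illustration; real hardware caches work on fixed-size blocks rather than single values, but the flow is the same):
```python
# Minimal sketch of the read path described above (hypothetical names,
# not any real hardware interface). Main memory is modelled as a dict
# of address -> value; the cache keeps copies of recently read entries.

MAIN_MEMORY = {addr: addr * 2 for addr in range(1024)}  # pretend backing store

class SimpleCache:
    def __init__(self):
        self.entries = {}   # address -> cached value
        self.hits = 0
        self.misses = 0

    def read(self, address):
        if address in self.entries:          # cache hit: answer immediately
            self.hits += 1
            return self.entries[address]
        self.misses += 1                     # cache miss: go to main memory
        value = MAIN_MEMORY[address]
        self.entries[address] = value        # save a copy for next time
        return value

cache = SimpleCache()
cache.read(42)   # miss: fetched from MAIN_MEMORY, then cached
cache.read(42)   # hit: served from the cache, main memory not touched
print(cache.hits, cache.misses)              # -> 1 1
```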
The cache may be located on the same integrated circuit as the CPU in order to further reduce the access time. In this case it is often known as primary cache, since there may be a larger, slower secondary cache outside the CPU chip. The most important characteristic of a cache is its hit rate – the fraction of all memory accesses which are satisfied from the cache. This in turn depends on the cache design but mostly on its size relative to the main memory.
The size is limited by the cost of fast memory chips. The hit rate also depends on the access pattern of the particular program being run (the sequence of addresses being read and written). Caches rely on two properties of the access patterns of most programs: temporal locality (recently accessed data is likely to be accessed again soon) and spatial locality (data near recently accessed data is likely to be accessed soon).
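As a rough illustration of hit rate, the Python sketch below (a toy hit_rate function, not a real cache model) replays two access patterns: one that keeps revisiting a small working set and one that never reuses an address; the numbers are illustrative only:
```python
# Illustrative only: hit rate = hits / total accesses. A loop that walks
# the same small working set repeatedly (good temporal locality) gets a
# much higher hit rate than one that never revisits an address.

def hit_rate(addresses, cache_size=64):
    cache, hits = set(), 0
    for addr in addresses:
        if addr in cache:
            hits += 1
        else:
            if len(cache) >= cache_size:
                cache.pop()          # evict something (policy irrelevant here)
            cache.add(addr)
    return hits / len(addresses)

looping   = [a % 32 for a in range(10_000)]   # small working set, revisited
streaming = list(range(10_000))               # every address seen only once
print(hit_rate(looping))    # close to 1.0
print(hit_rate(streaming))  # 0.0, since nothing is ever reused
```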
When the processor wants to write to main memory, the data is first written to the cache on the assumption that the processor will probably read it again soon. Various policies are used. In a write-through cache, data is written to main memory at the same time as it is cached. In a write-back cache it is only written to main memory when it is forced out of the cache. If all accesses were writes then, with a write-through policy, every write to the cache would necessitate a main memory write, thus slowing the system down to main memory speed. However, statistically, most accesses are reads and most of these will be satisfied from the cache.
Write-through is simpler than write-back because an entry that is to be replaced can just be overwritten in the cache, as it will already have been copied to main memory, whereas write-back requires the cache to initiate a main memory write of the flushed entry followed (for a processor read) by a main memory read. However, write-back is more efficient because an entry may be written many times in the cache without a main memory access.
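Here is a hedged Python sketch of the two write policies (the class names and the memory_writes counters are hypothetical, just to make the difference in main-memory traffic visible):
```python
# Sketch of the two write policies described above (hypothetical classes,
# counting main-memory writes so the difference is visible).

MAIN_MEMORY = {}

class WriteThroughCache:
    def __init__(self):
        self.entries, self.memory_writes = {}, 0

    def write(self, address, value):
        self.entries[address] = value     # update the cache...
        MAIN_MEMORY[address] = value      # ...and main memory every time
        self.memory_writes += 1

class WriteBackCache:
    def __init__(self):
        self.entries, self.dirty, self.memory_writes = {}, set(), 0

    def write(self, address, value):
        self.entries[address] = value     # update only the cache
        self.dirty.add(address)           # remember it must be written later

    def evict(self, address):
        if address in self.dirty:         # write back only when forced out
            MAIN_MEMORY[address] = self.entries[address]
            self.memory_writes += 1
            self.dirty.discard(address)
        self.entries.pop(address, None)

wt, wb = WriteThroughCache(), WriteBackCache()
for i in range(100):
    wt.write(0x10, i)                     # 100 main-memory writes
    wb.write(0x10, i)                     # none so far
wb.evict(0x10)                            # one write, final value only
print(wt.memory_writes, wb.memory_writes) # -> 100 1
```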
Hello,
Here is a brief description of cache. A cache is a component of a computer system that stores data so that future requests for that data can be served faster.
Here's a scenario:
If you have never opened a website before, for example Facebook.com, you will probably notice that it takes some time to load its home page. On the home page there are register and login forms. Assuming you are already registered, you log in (note that the username field is completely empty, so you have to type your username in full). After logging in and browsing a few pages, you log out and close your browser. Now that you have opened that site once, parts of it are stored in a web cache by your browser. To test this, open the same browser again and go to facebook.com. The page loads noticeably faster than before, and as soon as you type the first letter of your username, a drop-down appears or the field auto-completes the rest, because the browser has kept that information from your previous visit.
The same applies to the applications on your computer, like Word, Excel, PowerPoint, Photoshop, a media player, etc. Try it with Photoshop from Adobe (if you have it) or any application you have not opened before: open it, close it, then open it again, and you will see the difference in speed.
Hi John!
Literally, a cache is short-term storage. It is used to speed up certain computer operations by temporarily placing data where it can be accessed more quickly than normal. For instance, data from a storage disk may be cached temporarily in high-speed memory so that it can be read and written more rapidly than if it had to come directly from the disk itself, or a microprocessor may use an on-board memory cache to store temporary data for use during operations.
'Cache' is derived from the French word for a hiding place and so is pronounced like 'cash'. This particular kind of memory acts as a link between the main physical RAM and the CPU. The cache holds data that is frequently used by the CPU so that it is immediately available.
If the required data is not in the cache, it is transferred from main memory. Let me know if you still need more clarification on this.
Cache Operations:
The CPU requests the contents of a memory location, which entails:
- checking the cache for that data;
- if it is present (a hit), delivering it from the cache (fast);
- if it is not present (a miss), reading the required block from main memory into the cache and then delivering it to the CPU.
The cache includes tags to identify which block of main memory is in each cache slot.
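As a rough sketch of how those tags are used, here is a toy direct-mapped cache in Python (the block size, slot count, and helper names are made up for illustration, not taken from any real CPU):
```python
# Hedged sketch of tags in a direct-mapped cache: each slot remembers
# which main-memory block it currently holds, so a lookup compares the
# stored tag with the tag of the requested address.

BLOCK_SIZE = 16    # bytes per block (illustrative sizes only)
NUM_SLOTS  = 8

slots = [None] * NUM_SLOTS          # each entry: (tag, block_data) or None

def lookup(address, read_block_from_memory):
    block_number = address // BLOCK_SIZE
    slot_index   = block_number % NUM_SLOTS    # which slot this block maps to
    tag          = block_number // NUM_SLOTS   # identifies the block in that slot
    entry = slots[slot_index]
    if entry is not None and entry[0] == tag:  # tag matches: cache hit
        return entry[1]
    block = read_block_from_memory(block_number)   # miss: fetch the whole block
    slots[slot_index] = (tag, block)               # replace whatever was there
    return block

# Example usage with a fake memory reader:
data = lookup(0x1234, lambda blk: f"block {blk}")
print(data)   # first access is a miss; repeating it would be a hit
```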
Gawjus, you are awesome at explaining things. You helped me with other questions as well, and you are too good!
Thank you.