Inside Cache

Main memory
The main memory in your system, even though you may not think of it this way, is also a cache – the largest, and slowest, in your system. Off the CPU die and across the motherboard’s buses, main memory is a great deal slower, but in turn you can load it up with RAM that is cheap compared with the cost of building memory onto the CPU itself, and store all the running data of your operating system and applications for the CPU to fetch as needed.

And we all know what happens when our operating system (gosh, here’s looking at you Vista) demands more memory than the system has installed – it’s off to the hard drive.


Hard drives
Hard drives are also a cache of a particular kind – in literal terms they are long-term, non-volatile storage, and are both larger and slower than main memory by orders of magnitude. They continue the ‘bigger but slower’ pattern, and as the last level in the hierarchy they have, if you think about it, a hit rate of 100 per cent.

Really, if RAM were non-volatile, voluminous and cheap as chips, you wouldn’t need a hard drive. This is of course the premise behind SSDs, but they are only just emerging, and it will be a while before they can match the capacities of a traditional hard drive.



Then, on the drives, we frequently have the page file (or swap), which roughly translates as RAM-on-your-disk. When main memory is full, the operating system maps data in and out of the swapfile, using it as an extension of main memory. So here we have another cache, a virtual one, caching data from main memory and stored as a file on the drive itself.

Finally, of course, hard drives have their own on-board RAM that’s used to buffer data in transit to and from the system. These buffers empty almost as fast as they can be filled, and while their benefit for read caching is debatable, write caching can have a big impact. Here the drive can tell the system that a write has completed even if the data hasn’t actually been written to the platters yet, and in doing so allow the operating system to move on to the next write or task.
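That acknowledge-now, write-later trick is simple enough to sketch. The class and names below are hypothetical; the point is only the write-back pattern – the caller gets an immediate ‘ok’ while the data sits in a buffer until a later flush commits it to the backing store.

```python
class WriteBackBuffer:
    """Toy write-back cache: writes land in an in-memory buffer and are
    acknowledged immediately; flush() later commits them for real."""

    def __init__(self):
        self.buffer = {}     # sector -> data still waiting to hit the platters
        self.platter = {}    # the "real" on-disk contents

    def write(self, sector, data):
        self.buffer[sector] = data
        return "ok"          # acknowledged before the data is actually durable

    def flush(self):
        self.platter.update(self.buffer)  # commit everything pending
        self.buffer.clear()
```

The catch, of course, is the gap between the ‘ok’ and the flush: lose power in between and the acknowledged write is gone – which is why battery-backed caches and write barriers exist.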


Applications
Much like using hard drive space as a buffer for memory with the page file, the reverse is also true: the operating system and applications use main memory to buffer access to the disk. In one sense this effectively extends the drive’s own cache, but it also keeps data travelling to and from the drive in main memory where, if it’s needed again, it’s on hand rather than requiring another trip to the hard drive.

All operating systems set aside a portion of main memory as a disk cache, although technically it’s a buffer. Here, the operating system can perform writes to the disk cache in memory, get an immediate response that the task is complete, and move on while the cache is later flushed to disk by the cache manager (usually handled by the operating system’s kernel).
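You can see this layering from any programming language. In the Python sketch below (the file path is just a throwaway example), `write()` returns as soon as the data is buffered; it takes a `flush()` to push it to the operating system’s disk cache and an `os.fsync()` to ask the OS to force its cache out to the drive itself.

```python
import os
import tempfile

# A throwaway demo file in the system temp directory.
path = os.path.join(tempfile.gettempdir(), "cache_demo.txt")

with open(path, "w") as f:
    f.write("hello cache\n")   # lands in an in-memory buffer; returns immediately
    f.flush()                  # push the application's buffer to the OS disk cache
    os.fsync(f.fileno())       # ask the OS to flush its cache through to the drive
```

Most software never calls `fsync` and simply trusts the cache manager to flush in due course – databases, which can’t afford to lose an acknowledged write, are the notable exception.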

Finally, it’s not just the operating system that will use main memory for caching – applications do as well. Take your browser, for example, which will use a small portion of memory for caching objects in addition to its disk cache. Other applications that use memory caching include web servers, proxies, BitTorrent clients, video software, media players, and more.
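Application-level caching can be as simple as memoising an expensive call. Python’s standard-library `functools.lru_cache` does exactly this; the fake ‘fetch’ below stands in for real work such as a network request, and the call counter shows the second lookup never does the slow work.

```python
from functools import lru_cache

calls = 0  # counts how many times the "slow" work actually runs

@lru_cache(maxsize=128)
def fetch(url):
    """Stand-in for an expensive operation, e.g. a network fetch."""
    global calls
    calls += 1
    return f"<response for {url}>"

fetch("http://example.com/a")  # miss: does the slow work
fetch("http://example.com/a")  # hit: served straight from the in-memory cache
```

Note the same trade-off as every other cache in this article: `maxsize=128` bounds the memory used, and once the cache fills, the least recently used entries are evicted first.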

All up, the cache is a concept and mechanism integral to the operation of your machine, and a means to maximise performance wherever possible in any given subsystem.
