Abstract
Buffers, caches, and queues are part of every data center architecture, and they play a critical role in performance, both improving it and hindering it. A well-implemented buffer can mean the difference between a finely tuned system and a troubleshooting nightmare. Understanding how buffers and queues work in storage can help make your storage system shine.
However, there is something of a mystique surrounding these data center components, as many people don't realize exactly how they are used and why. In this episode of the "Too Proud To Ask" series, we demystify this very important aspect of data center storage. You'll learn:
- What are buffers, caches, and queues, and why should you care about the differences?
- What’s the difference between a read cache and a write cache?
- What does “queue depth” mean?
- What's the difference between a buffer, a ring buffer, and a host memory buffer, and why does it matter?
- What happens when things go wrong?