Buffers and queues are part of every data center architecture, and a critical part of performance, capable of both improving it and hindering it. A well-implemented buffer can mean the difference between a finely tuned system and a troubleshooting nightmare. Knowing how buffers and queues work in storage can help make your storage system shine.
However, there is something of a mystique surrounding these data center components, as many people don't realize just how they're used and why. Join our team of carefully selected experts on February 14th for the next live webcast in our "Too Proud to Ask" series, "Everything You Wanted to Know About Storage But Were Too Proud To Ask – Part Teal: The Buffering Pod," where we'll demystify this very important aspect of data center storage. You'll learn:
- What are buffers, caches, and queues, and why should you care about the differences?
- What's the difference between a read cache and a write cache?
- What does "queue depth" mean?
- What's a buffer, a ring buffer, and a host memory buffer, and why does the difference matter?
- What happens when things go wrong?
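As a small preview of the ring-buffer question above, here is a minimal sketch (in Python, with hypothetical names) of a fixed-capacity ring buffer that wraps around and overwrites its oldest entry when full. This is the same basic structure commonly used for hardware submission and completion queues; a real implementation would add synchronization and backpressure handling.

```python
class RingBuffer:
    """A fixed-capacity FIFO buffer: when full, new writes overwrite the oldest entry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = [None] * capacity
        self.head = 0   # index of the oldest entry (next to read)
        self.count = 0  # number of entries currently stored

    def push(self, item):
        # Write at the slot just past the newest entry, wrapping around the end.
        tail = (self.head + self.count) % self.capacity
        self.data[tail] = item
        if self.count == self.capacity:
            # Buffer was full: advance head, silently dropping the oldest entry.
            self.head = (self.head + 1) % self.capacity
        else:
            self.count += 1

    def pop(self):
        # Return and remove the oldest entry, or None if the buffer is empty.
        if self.count == 0:
            return None
        item = self.data[self.head]
        self.head = (self.head + 1) % self.capacity
        self.count -= 1
        return item


rb = RingBuffer(3)
for block in ["A", "B", "C", "D"]:
    rb.push(block)       # pushing "D" overwrites "A", the oldest entry
print(rb.pop())          # oldest surviving entry: "B"
```

The wrap-around indexing is what makes it a "ring": the buffer never grows, so memory use stays bounded even under sustained load, at the cost of dropping the oldest data when the consumer falls behind.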