Your Cache is Overdue a Revolution: MRCs for Cache Performance and Isolation

Webinar

Author(s)/Presenter(s):

Irfan Ahmad

Library Content Type

Podcast

Presentation

Tutorial

Library Release Date

Focus Areas

Physical Storage

Storage Management

Abstract

It is well known that cache performance is non-linear in cache size, and the benefit of caches varies widely by workload. Whether the cache sits in a storage system, a database, or an application tier, no two real workload mixes have the same cache behavior! Existing techniques for profiling workloads don't measure data reuse, nor do they predict how performance changes as cache allocations are varied. Recently, a revolutionary new set of techniques has emerged for online cache optimization. Based on work published at top academic venues (FAST '15 and OSDI '14), we will discuss how to 1) perform online selection of cache parameters, including cache block size and read-ahead strategy, to tune the cache to actual customer workloads; 2) dynamically partition the cache to improve hit ratios without adding hardware; and 3) size caches and troubleshoot field performance problems in a data-driven manner. With average performance improvements of 40% across a large number of real, multi-tenant workloads, these analytical techniques are well worth learning about.

Learning Objectives

Storage cache performance is non-linear; the benefit of caches varies widely by workload mix
Why working set size estimates don't work for caching
Miss ratio curves (MRCs) for online cache analysis and optimization
How to dramatically improve your cache using online MRCs, partitioning, and parameter tuning
How to implement QoS and performance SLAs/SLOs in caching and tiering systems using MRCs
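The miss ratio curve at the heart of the abstract can be illustrated with Mattson's classic stack algorithm, on which online MRC techniques build. The sketch below is purely expository and is not the presenter's algorithm from the cited papers; the function name and toy trace are invented for illustration.

```python
def miss_ratio_curve(trace, max_size):
    """One-pass Mattson stack algorithm: exact LRU miss ratios for
    every cache size from 1 to max_size blocks simultaneously.

    trace: iterable of block identifiers in reference order.
    Returns mrc, where mrc[c-1] is the miss ratio of an LRU cache
    holding c blocks.
    """
    stack = []                   # LRU stack; index 0 = most recently used
    hist = [0] * (max_size + 1)  # hist[d] = reuses seen at stack distance d
    total = 0
    for block in trace:
        total += 1
        if block in stack:
            # Stack distance: number of distinct blocks (including this
            # one) touched since the previous reference to `block`.
            d = stack.index(block) + 1
            if d <= max_size:
                hist[d] += 1
            stack.remove(block)
        # First references (cold misses) miss at every cache size.
        stack.insert(0, block)
    # A cache of c blocks hits exactly the reuses with distance <= c.
    mrc, hits = [], 0
    for c in range(1, max_size + 1):
        hits += hist[c]
        mrc.append((total - hits) / total)
    return mrc

# A cyclic trace over 3 blocks: an LRU cache of 1 or 2 blocks never
# hits, while 3 blocks capture every reuse after the cold misses.
print(miss_ratio_curve(list("abcabc"), 3))   # -> [1.0, 1.0, 0.5]
```

This exact construction costs O(N·M) time and O(M) space for N references over M distinct blocks, which is why the online techniques the talk covers rely on sampling and approximation to track the same curve cheaply in production.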
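The dynamic cache partitioning point (item 2 above) can likewise be sketched: once each tenant's MRC is known, an allocator can hand capacity to whichever tenant benefits most at the margin. This is a hypothetical greedy sketch, not the talk's partitioning algorithm; all names and the example curves are invented.

```python
def partition_cache(mrcs, total_blocks):
    """Greedy MRC-guided partitioning: hand out cache blocks one at a
    time, each to the tenant whose curve predicts the largest drop in
    miss ratio for one more block.

    mrcs: per-tenant miss ratio curves, where mrcs[i][c-1] is tenant
    i's miss ratio with c blocks. Returns per-tenant block counts.
    """
    def mr(i, c):
        # Miss ratio of tenant i at size c (flat beyond the curve's end).
        curve = mrcs[i]
        return 1.0 if c == 0 else curve[min(c, len(curve)) - 1]

    alloc = [0] * len(mrcs)
    for _ in range(total_blocks):
        # Marginal benefit of one extra block for each tenant.
        best = max(range(len(mrcs)),
                   key=lambda i: mr(i, alloc[i]) - mr(i, alloc[i] + 1))
        alloc[best] += 1
    return alloc

# Tenant 0 plateaus at a 50% miss ratio after one block; tenant 1
# keeps improving, so the greedy pass gives tenant 1 the larger share.
print(partition_cache([[0.5, 0.5, 0.5], [0.2, 0.1, 0.0]], 3))  # -> [1, 2]
```

Greedy allocation is only guaranteed optimal when the curves are convex; real MRCs often have cliffs and plateaus, which is one reason partitioning over full curves, rather than point estimates like working set size, matters in practice.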