Current enterprise storage devices have to service many diverse and continuously evolving application workloads (e.g., OLTP, Big Data/Analytics, and Virtualization). These workloads, combined with enterprise storage services such as deduplication, compression, snapshots, clones, replication, and tiering, result in complex I/O to the underlying storage. Traditional storage system tests rely on benchmarking tools that generate a fixed, constant workload composed of one or a few I/O access patterns, which is not sufficient for enterprise storage testing. Commercially available workload-simulation tools come with their own challenges, such as cost, learning curve, and limited workload support. As a result, generating, debugging, and reproducing these workloads has become a major challenge, and gaps here can surface as customer-found defects. This creates the need for a robust testing methodology that closely emulates production environments and helps identify issues early in testing. In our solution testing lab, we have been working on a unique test framework design that leverages software-defined services and helps uncover complex production issues while reducing their turnaround time. In this talk, we will show how we built our test framework using containers and various open-source tools, and the role it plays in our solution testing efforts for our next-generation storage products.
- Enterprise storage test challenges
- Impact of enterprise storage services on the underlying storage
- Benefits of our testing methodology
- Enterprise storage workloads
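To give a flavor of the container-based approach, the sketch below shows one way a mixed enterprise workload (small random OLTP-like I/O alongside large sequential analytics I/O) could be driven against a test volume using containerized open-source tools. This is a minimal illustration under stated assumptions, not the actual framework: it assumes Docker and fio are available, and the image name `fio-runner:latest` and mount path `/mnt/target` are hypothetical placeholders.

```python
# Minimal sketch: launch several containerized fio jobs concurrently to
# approximate a mixed OLTP + analytics I/O profile on a shared test volume.
# The image "fio-runner:latest" and the path "/mnt/target" are assumptions,
# not part of the framework described in the abstract.
import subprocess

WORKLOADS = [
    # (job name,    access pattern, block size, queue depth)
    ("oltp-like",   "randrw",       "8k",       32),  # small random mixed I/O
    ("analytics",   "read",         "1m",       8),   # large sequential reads
    ("ingest",      "write",        "256k",     16),  # streaming writes
]

def launch(name, rw, bs, iodepth, target="/mnt/target"):
    """Start one fio workload inside a container; returns the Popen handle."""
    cmd = [
        "docker", "run", "--rm",
        "-v", f"{target}:{target}",
        "fio-runner:latest",                       # assumed image bundling fio
        "fio", f"--name={name}", f"--rw={rw}", f"--bs={bs}",
        f"--iodepth={iodepth}", "--direct=1", "--time_based",
        "--runtime=300", "--size=10g", f"--filename={target}/{name}.dat",
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    procs = [launch(*w) for w in WORKLOADS]        # run the whole mix at once
    for p in procs:
        p.wait()
```

Running the patterns concurrently, rather than one benchmark at a time, is what lets the mix start to resemble the blended I/O a production array actually sees.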