SNIA Developer Conference September 15-17, 2025 | Santa Clara, CA

Storage Devices for the AI Data Center

Abstract

The transformational launch of GPT-4 has accelerated the race to build AI data centers for large-scale training and inference. While GPUs and high-bandwidth memory are well-known critical components, the essential role of storage devices in AI infrastructure is often overlooked. This presentation will explore the AI processing pipeline within data centers, emphasizing the crucial role of storage devices such as SSDs in compute and storage nodes. We will examine the characteristics of AI workloads to derive specific requirements for flash storage devices and controllers.

Learning Objectives

- Flash storage devices are essential components inside AI compute nodes as well as in the external storage tiers.
- Storage is nevertheless the slowest component in the AI data path compared to compute and memory.
- Training and inference workloads impose different requirements on storage devices inside the AI data center.
- AI is driving the adoption of next-generation interfaces for storage devices in the AI data center.
- To meet the performance requirements of AI workloads, optimizations in the following areas need to be considered for flash storage: NAND media, SSD controllers, host interfaces, and form factors.