The transformational launch of GPT-4 has accelerated the race to build AI data centers for large-scale training and inference. While GPUs and high-bandwidth memory are widely recognized as critical components, the essential role of storage devices in AI infrastructure is often overlooked. This presentation will explore the AI processing pipeline within data centers, highlighting the role of storage devices in both compute and storage nodes. We will examine the characteristics of AI workloads to derive specific requirements for storage devices and controllers.