
SNIA Developer Conference September 15-17, 2025 | Santa Clara, CA

David Flynn
CEO and Co-Founder, Hammerspace

Hammerspace founder and Chief Executive Officer David Flynn is a recognized leader in IT innovation who has been architecting disruptive computing platforms since his early work in supercomputing and Linux systems. David pioneered the use of flash for enterprise application acceleration as founder and former CEO of Fusion-io, which was acquired by SanDisk in 2014. He served as Fusion-io President and CEO until May 2013 and as a board member until July 2013. Previously, David served as Chief Architect at Linux Networx, where he was instrumental in the creation of the OpenFabrics stack and designed several of the world's largest supercomputers leveraging Linux clustering, InfiniBand, and RDMA-based technologies. David holds more than 100 patents in areas ranging from web browser technologies, mobile device management, and network switching and protocols to distributed storage systems. He earned a bachelor's degree in computer science at Brigham Young University and serves on the boards of several organizations and startup companies.

Standards-Based Parallel Global File Systems and Automated Data Orchestration with NFS


High-performance computing applications, web-scale storage systems, and modern enterprises increasingly need a data architecture that unifies data at the edge, in data centers, and in clouds. Organizations with massive-scale data requirements need the performance of a parallel file system coupled with a standards-based solution that is easy to deploy on machines with diverse security and build environments.

Standards-Based Parallel Global File System - No Proprietary Clients
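To make the "no proprietary clients" point concrete, here is a minimal sketch of mounting a pNFS-capable NFSv4.2 export using only the stock Linux NFS client. The server name and export path are placeholders, and flex-files layout support depends on the server and kernel in use.

```shell
# Mount an NFSv4.2 export with the in-kernel Linux client --
# no vendor-specific client software required.
# "server.example.com" and "/export" are hypothetical placeholders.
sudo mount -t nfs -o vers=4.2,proto=tcp server.example.com:/export /mnt/data

# Confirm the negotiated protocol version for the mount:
mount | grep /mnt/data
```

Because the client ships with the Linux kernel, a deployment like this sidesteps the kernel-module builds and security reviews that proprietary parallel file system clients typically require.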

Introducing the Need for the NFS-SSD: the Ethernet Direct-Attached SSD that Natively Speaks Network File System


As one of the inventors of NVMe at Fusion-io, David has long been a thought leader in the SSD space. Data processing took a large step forward when NVMe was embedded in the server, bringing large quantities of high-performance data into direct contact with processing. However, data-driven workload requirements have since changed dramatically: 1) larger quantities of data are being created, analyzed, and processed; 2) demands for performance are not just growing but accelerating; and 3) most importantly, data needs to be more usable.

Standards-based Data Platforms for HPC and AI Data Architectures


The rapid evolution of GPU computing in the enterprise has led to unprecedented demand for robust and scalable data platforms. This session will explore the critical role that standardized frameworks and protocols play in optimizing data creation, processing, collaboration, and storage within these advanced computing environments. Attendees will gain insights into how adopting standards can enhance interoperability, move data to compute, facilitate efficient data exchange, and ensure seamless integration across diverse systems and applications.
