This presentation examines the critical role of storage solutions in optimizing AI workloads, with a primary focus on storage-intensive training. We will highlight how AI models interact with storage systems during training, focusing on data loading and checkpointing mechanisms, explore how AI frameworks like PyTorch use different storage connectors to access various storage solutions, and finally delve into the use of file-based storage and object storage in the context of AI training.
Attendees will:
- Gain a clear understanding of the critical role of storage in AI model training workloads
- Understand how AI models interact with storage systems during training, focusing on data loading and checkpointing mechanisms
- Learn how AI frameworks like PyTorch use different storage connectors to access various storage solutions
- Explore how file-based storage and object storage are used in AI training (see the sketch following this list)
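To make these interactions concrete, here is a minimal PyTorch sketch of the two storage touchpoints above: streaming training samples through a Dataset/DataLoader and periodically writing checkpoints with torch.save. The local sample directory, toy model, and checkpoint names are illustrative assumptions; an object-storage connector (for example, an fsspec/s3fs-backed path) could stand in for the local directory, but that wiring is not shown.

```python
# Minimal sketch (assumptions: a local "./train_data" directory of .pt files,
# each holding a pre-serialized (input, label) pair, and a toy model).
import os
import torch
from torch import nn
from torch.utils.data import Dataset, DataLoader

class FileSampleDataset(Dataset):
    """Loads one pre-serialized (input, label) tensor pair per file."""
    def __init__(self, root):
        self.paths = [os.path.join(root, f) for f in sorted(os.listdir(root))]

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # Each read hits the storage backend; read throughput here gates GPU utilization.
        return torch.load(self.paths[idx])

def save_checkpoint(model, optimizer, epoch, path):
    # Checkpointing is a large sequential write; its frequency trades restart
    # cost against storage bandwidth.
    torch.save({"epoch": epoch,
                "model_state": model.state_dict(),
                "optimizer_state": optimizer.state_dict()}, path)

if __name__ == "__main__":
    model = nn.Linear(128, 10)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loader = DataLoader(FileSampleDataset("./train_data"),
                        batch_size=32, num_workers=4, shuffle=True)
    for epoch in range(3):
        for inputs, labels in loader:
            optimizer.zero_grad()
            loss = nn.functional.cross_entropy(model(inputs), labels)
            loss.backward()
            optimizer.step()
        save_checkpoint(model, optimizer, epoch, f"checkpoint_epoch{epoch}.pt")
```

The same pattern shows why read throughput during data loading and write bandwidth during checkpointing are the storage metrics that matter most for training.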

Unlocking a Sustainable Future for Data Storage
In a world of surging data demands, how can we reduce the environmental toll of storage solutions? Discover the power of the circular economy to reshape the storage industry for a greener tomorrow with this webinar, featuring Jonmichael Hands (Co-Chair SNIA SSD SIG and Board Member, Circular Drive Initiative) and Shruti Sethi (Sr. PM at Microsoft and Leadership Team, Open Compute Project-Sustainability).
Key Highlights:
- Circular Drive Initiative: Rethink the lifecycle of storage devices—from design to end-of-life—to unlock significant environmental benefits.
- Media Sanitization Best Practices: Securely erase data to enable reuse, extend device life, and cut down on e-waste. Explore techniques like the following (see the sketch after this abstract):
  - Cryptographic erase
  - Block erase
  - Overwrite methods
- Compliance & Transparency: Learn how standards like IEEE 2883-2022 and ISO/IEC 27040:2024 guide secure data disposal, with organizations like SERI R2 and ADISA leading the charge in setting industry benchmarks.
- Carbon Accounting in Storage: Understand how tracking and reducing carbon emissions in storage aligns with global sustainability goals.
This session is your roadmap to driving real change by adopting circular economy principles, embracing advanced sanitization methods, and leveraging carbon accounting to reduce the industry’s environmental footprint.
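As a rough, hedged illustration of the sanitization techniques listed above, the sketch below wraps nvme-cli's format command, whose Secure Erase Settings field selects a user-data erase or a cryptographic erase. The device path is an assumption, option spellings vary across nvme-cli versions, and a production workflow would also verify the erase and capture an audit record, so treat this as an outline rather than a validated procedure.

```python
# Hedged sketch: issue a secure erase on an NVMe namespace via nvme-cli.
# Assumptions: nvme-cli is installed, /dev/nvme0n1 is the target namespace,
# and the drive supports the selected erase mode. Verify flags against your
# nvme-cli version before use; this is illustrative, not a validated procedure.
import subprocess

ERASE_MODES = {
    "user-data-erase": "1",  # overwrite-style erase of user data
    "crypto-erase": "2",     # discard the media encryption key
}

def nvme_format_erase(device: str, mode: str = "crypto-erase") -> None:
    """Run `nvme format` with the Secure Erase Settings (--ses) field set."""
    cmd = ["nvme", "format", device, f"--ses={ERASE_MODES[mode]}"]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # DESTRUCTIVE: erases all data on the namespace.
    nvme_format_erase("/dev/nvme0n1", "crypto-erase")
```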

The key to optimal SAN performance is managing the impact of congestion. In 2021, the Fibre Channel industry introduced Fabric Notifications as a key resiliency mechanism for storage networks, designed to combat congestion, link-integrity issues, and delivery errors. These functions have been implemented across the ecosystem and have enhanced the overall user experience of deploying Fibre Channel SANs. This webinar explores the evolution of Fabric Notifications and the solutions available for this exciting new technology. In this webinar, you will learn the following:
- The state of Fabric Notifications as defined by the Fibre Channel standards.
- The mechanisms and techniques for implementing Fabric Notifications.
- The currently available solutions deploying Fabric Notifications.

This session explores the evolution of transactions, their implementation challenges, and how they play out in distributed database environments. Whether you're a database specialist or simply a tech enthusiast, this presentation offers valuable insights into the world of database management, and will include:
- Historical perspective of transactions
- Implementing transactions
- Challenges and trade-offs in ACID properties
- Distributed transactions in modern databases like Amazon Aurora, DynamoDB, and Google Spanner
Key Takeaways: an understanding of the evolution of transactions in databases, insights into the challenges of implementing ACID properties, and an exploration of distributed transaction models in leading database systems.
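To ground the ACID discussion, here is a minimal single-node sketch using Python's built-in sqlite3 module: a transfer either commits as a whole or rolls back, illustrating atomicity and consistency. The schema and amounts are illustrative; the distributed systems covered in the session (Aurora, DynamoDB, Spanner) layer replication and commit protocols on top of this same basic contract.

```python
# Minimal single-node illustration of an atomic transaction with sqlite3.
# (Illustrative schema and values; distributed databases add coordination
# such as two-phase commit or timestamp ordering on top of this contract.)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: both updates apply, or neither does."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            (balance,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                      (src,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")  # triggers rollback
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
    except ValueError as exc:
        print("Transfer aborted:", exc)

transfer(conn, "alice", "bob", 70)    # succeeds
transfer(conn, "alice", "bob", 1000)  # aborts; balances unchanged
print(dict(conn.execute("SELECT name, balance FROM accounts")))
```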

Large language models (LLMs) based on the Transformer architecture have demonstrated state-of-the-art performance on code generation benchmarks such as MBPP and HumanEval. In this talk, we will demonstrate how we have used open-source LLMs to develop a code generation workflow that can be trained internally on on-premises infrastructure and used to improve developer productivity by aiding in tasks such as unit test generation, code documentation, code refactoring, code translation, search, and code alignment.
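As a hedged illustration of this kind of workflow (not the actual pipeline described in the talk), the sketch below uses the Hugging Face transformers text-generation pipeline with an open code model to draft a unit test for a given function; the model name, prompt format, and decoding settings are assumptions.

```python
# Hedged sketch of an on-prem code-assist step: ask an open-source code LLM
# to draft a unit test. The model name ("bigcode/starcoder2-3b") and prompt
# format are assumptions for illustration; the internal training, retrieval,
# and review steps of the described workflow are not shown.
from transformers import pipeline

generator = pipeline("text-generation", model="bigcode/starcoder2-3b")

source_function = '''
def slugify(title: str) -> str:
    """Lowercase a title and replace spaces with hyphens."""
    return "-".join(title.lower().split())
'''

prompt = (
    "# Write pytest unit tests for the following function.\n"
    f"{source_function}\n"
    "import pytest\n\n"
    "def test_slugify():\n"
)

draft = generator(prompt, max_new_tokens=128, do_sample=False)[0]["generated_text"]
print(draft)  # review before committing: generated tests still need human validation
```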

The digital landscape is in hyperdrive, demanding an IT metamorphosis that transcends mere tools. Enter AIOps – not just a technological upgrade, but a paradigm shift redefining how we approach IT operations. This presentation delves beyond the nuts and bolts, unveiling AIOps as a revolution that infuses AI's intelligence into the very fabric of IT thinking and processes.

A decade ago, the market was aggressively embracing public cloud storage because of its agility and scalability. People are now rethinking that approach and moving toward on-premises storage with cloud consumption models. The new on-premises cloud native architecture promises the traditional data center’s security and reliability together with cloud agility and scalability. In this webinar, we will describe how Ceph is uniquely qualified to satisfy this architecture and how the technology community is investing to realize the vision of “Ceph, the Linux of Storage Today”.
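For a flavor of how applications consume Ceph directly, here is a minimal sketch using the librados Python binding to write and read an object. The pool name, object name, and configuration path are assumptions, and a running cluster with valid credentials is required.

```python
# Minimal sketch of talking to a Ceph cluster through the librados Python
# binding. Assumptions: a running cluster, /etc/ceph/ceph.conf with valid
# credentials, and an existing pool named "demo-pool".
import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()
try:
    ioctx = cluster.open_ioctx("demo-pool")
    try:
        ioctx.write_full("hello-object", b"Ceph, the Linux of Storage Today")
        print(ioctx.read("hello-object"))
    finally:
        ioctx.close()
finally:
    cluster.shutdown()
```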

Hear from industry experts Jeff Janukowicz, Research Vice President at IDC; Brian Beeler, Owner and Editor in Chief, StorageReview.com; and Cameron T. Brett, SNIA STA Forum Chair, on new storage trends developing in the coming year, the applications and other factors driving these trends, and market data that illustrates the assertions.

Training large language models (LLMs) is a complex task that requires substantial computational resources and infrastructure. Fine-tuning LLMs on domain-specific data has emerged as a crucial technique for enhancing their performance in specialized tasks and industries. In this talk we give an overview of the basic concepts of LLMs and their pre-training process, highlighting the transfer learning paradigm that forms the basis of fine-tuning.
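As a hedged sketch of the fine-tuning step the talk introduces (not the speaker's setup), the example below attaches a LoRA adapter to a small open causal language model using Hugging Face transformers and peft and trains it on a tiny stand-in "domain" corpus; the model name, dataset, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch: parameter-efficient fine-tuning of an open causal LM with a
# LoRA adapter. Model name, dataset, and hyperparameters are illustrative
# assumptions, not the speaker's configuration.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # stand-in for a larger open LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model so only a small set of low-rank adapters is trained.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Tiny illustrative "domain" corpus; in practice this is your curated dataset.
texts = ["Domain-specific sentence one.", "Domain-specific sentence two."]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=64),
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("lora-out/adapter")  # only the adapter weights are saved
```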

The latest buzz around generative artificial intelligence (AI) ignores the massive costs to run and power the technology. Without any guardrails in place, what are the impacts of AI on sustainability and costs across our technology resources? This webinar will offer insights into the potentially hidden technical and infrastructure costs associated with generative AI, along with best practices and potential solutions to consider.
