
On Demand Webinars

10:00 am PT / 1:00 pm ET

Artificial intelligence and machine learning (AI/ML) are hot topics in every business at the moment, and there is a growing dialog about what constitutes an open model. Is it the weights? Is it the data?

Those are important questions, but equally important is ensuring that the tooling and frameworks used to train, validate, fine-tune, and perform inference are open source. Storage systems are a crucial component of these workflows, so how can open source solutions address the need for both high capacity and high performance?

Data is key to any AI/ML workflow: without it there would be nothing to use as input for model training, nothing with which to re-evaluate and refine models, and nowhere to securely store models once training is complete, especially if they have taken weeks to produce!

Open source solutions like Ceph can provide almost limitless scaling of both performance and capacity. In this webinar, learn how Ceph can be used as the backing store for AI/ML workloads; a short illustrative sketch follows the topic list below.

We’ll cover:

  • The demands of AI on storage systems
  • How open source Ceph storage fits into the picture
  • How to approach Ceph cluster scaling to meet AI’s needs
  • How to get started with Ceph
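As a taste of what the webinar covers, here is a minimal, hypothetical sketch of storing a training checkpoint in Ceph through its S3-compatible RADOS Gateway (RGW) using Python and boto3. The endpoint, credentials, bucket, and object names are placeholders, not anything prescribed by the webinar.

    import boto3

    # Ceph's RADOS Gateway (RGW) exposes an S3-compatible API, so standard S3
    # clients such as boto3 can talk to it. All values below are placeholders.
    s3 = boto3.client(
        "s3",
        endpoint_url="http://ceph-rgw.example.com:7480",
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # Keep training artifacts in a dedicated bucket.
    s3.create_bucket(Bucket="ml-checkpoints")
    s3.upload_file("model_epoch_42.pt", "ml-checkpoints", "resnet50/model_epoch_42.pt")

    # A later training or inference job can pull the checkpoint back down.
    s3.download_file("ml-checkpoints", "resnet50/model_epoch_42.pt", "model_epoch_42.pt")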

Download PDF

Read Q&A Blog

Ceph Storage in a World of AI/ML Workloads
10:00 am PT / 1:00 pm ET

New Memories like MRAM, ReRAM, PCM, or FRAM are vying to replace embedded flash and, eventually, even embedded SRAM.  Are there other memory technologies threatened with similar fates?  What will the memory market look like in another 20 years?

Catch up on the latest in new memory technologies in this fast-paced, entertaining panel, as we explain what these new memory technologies are, the applications that have already adopted them in the marketplace, their impact on computer architectures and AI, the outlook for important near-term changes, and how economics dictate success or failure.  

A Deep Look at New Memories
10:00 am PT / 1:00 pm ET

The SNIA Cloud Storage Technologies Initiative (CSTI) conducted a poll early in 2024 during the live webinar “Navigating Complexities of Object Storage Compatibility,” in which 72% of organizations reported encountering incompatibility issues between various object storage implementations. Those results prompted a call to action for SNIA to create an open expert community dedicated to resolving these issues and building best practices for the industry.

Since then, the SNIA CSTI has partnered with the SNIA Cloud Storage Technical Working Group (TWG) and successfully organized, hosted, and completed the first SNIA Cloud Object Storage Plugfest (multi-vendor interoperability testing), co-located with the SNIA Developer Conference (SDC) in September 2024 in Santa Clara, CA. Participating Plugfest companies included engineers from Dell, Google, Hammerspace, IBM, Microsoft, NetApp, VAST Data, and Versity Software. Three days of Plugfest testing discovered and resolved issues and included a Birds of a Feather (BoF) session to gain consensus on next steps for the industry. Plugfest contributors are now planning two Plugfest events in 2025: Denver in April and Santa Clara in September.

This webinar will share insights into industry best practices, explain the benefits your implementation may gain from improved compatibility, and invite your client and server cloud object storage teams to join us in building momentum. Join us for a discussion on the following (a brief illustrative probe is sketched after the list):

  • Implications on client applications
  • Complexity and variety of APIs
  • Access control mechanisms
  • Performance and scalability requirements
  • Real-world incompatibilities found in various object storage implementations
  • Missing or incorrect response headers
  • Unsupported API calls and unexpected behavior
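To make the kinds of incompatibilities above concrete, here is a rough sketch, under assumed endpoint and credentials, of probing an S3-compatible object store with Python's boto3 and inspecting the raw response headers and error codes where such issues typically surface.

    import boto3
    from botocore.exceptions import ClientError

    # Placeholder endpoint and credentials for an S3-compatible store under test;
    # the bucket is assumed to already exist.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://objectstore.example.com",
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    # Missing or incorrect response headers show up in the raw HTTP metadata.
    resp = s3.put_object(Bucket="compat-test", Key="probe.txt", Body=b"hello")
    print(resp["ResponseMetadata"]["HTTPHeaders"])

    # Unsupported API calls usually surface as error codes rather than clean failures.
    try:
        s3.put_object_tagging(
            Bucket="compat-test",
            Key="probe.txt",
            Tagging={"TagSet": [{"Key": "team", "Value": "storage"}]},
        )
    except ClientError as err:
        print("Object tagging unsupported or misbehaving:", err.response["Error"]["Code"])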

Download PDF

Read Q&A Blog

Building Community to Tackle Cloud Object Storage Incompatibilities
9:00 am PT / 12:00 pm ET
Join us for an insightful webinar on the transformative impact of AI on networking. This session will delve into the various use cases of AI, the nature of traffic for different workloads, and the network impact of these workloads. We will explore the multiple networking challenges posed by AI and how Ethernet is evolving to meet these demands. Special focus will be given to congestion issues during model training, the role of the Ultra Ethernet Consortium (UEC), and the specific requirements of training large language models (LLMs) and other use cases.
 
Learning Objectives:
 
  • Understand the different types of Network Topologies typically used with AI workloads.
  • Identify the nature of traffic for various AI workloads and their impact on networks.
  • Learn about the challenges Ethernet faces with AI workloads and the solutions being implemented.
  • Explore a specific use case to see how Ethernet addresses bandwidth and congestion issues.

Ethernet in the Age of AI: Adapting to New Networking Challenges
10:00 am PT / 1:00 pm ET

This presentation examines the critical role of storage solutions in optimizing AI workloads, with a primary focus on storage-intensive AI training workloads. We will highlight how AI models interact with storage systems during training, focusing on data loading and checkpointing mechanisms. We will explore how AI frameworks like PyTorch utilize different storage connectors to access various storage solutions. Finally, the presentation will delve into the use of file-based storage and object storage in the context of AI training.

Attendees will:

  • Gain a clear understanding of the critical role of storage in AI model training workloads
  • Understand how AI models interact with storage systems during training, focusing on data loading and checkpointing mechanisms
  • Learn how AI frameworks like PyTorch use different storage connectors to access various storage solutions.
  • Explore how file-based storage and object storage are used in AI training (a brief sketch follows this list)
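As an illustration of the data loading and checkpointing mechanisms mentioned above, here is a minimal PyTorch sketch with a toy dataset and model; the checkpoint path is a placeholder that could just as easily point at an NFS mount or an object-store connector.

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    # Toy dataset and model. In real training, the DataLoader is where storage
    # throughput matters: each epoch streams the dataset from a file system or
    # an object store through a storage connector.
    dataset = TensorDataset(torch.randn(1024, 16), torch.randint(0, 2, (1024,)))
    loader = DataLoader(dataset, batch_size=64, shuffle=True)

    model = torch.nn.Linear(16, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for epoch in range(2):
        for features, labels in loader:
            optimizer.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(features), labels)
            loss.backward()
            optimizer.step()

        # Checkpointing: persist model and optimizer state so a long-running job
        # can resume after a failure. The path is a placeholder; it could equally
        # be a path served by a FUSE or S3 storage connector.
        torch.save(
            {"epoch": epoch, "model": model.state_dict(), "optimizer": optimizer.state_dict()},
            f"checkpoint_epoch_{epoch}.pt",
        )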

Download PDF

The Critical Role of Storage in Optimizing AI Training Workloads
9:00 am PT / 12:00 pm ET

Unlocking a Sustainable Future for Data Storage
In a world of surging data demands, how can we reduce the environmental toll of storage solutions? Discover the power of the circular economy to reshape the storage industry for a greener tomorrow in this webinar, featuring Jonmichael Hands (Co-Chair, SNIA SSD SIG, and Board Member, Circular Drive Initiative) and Shruti Sethi (Sr. PM at Microsoft and Leadership Team, Open Compute Project-Sustainability).

Key Highlights:

  • Circular Drive Initiative: Rethink the lifecycle of storage devices—from design to end-of-life—to unlock significant environmental benefits.
  • Media Sanitization Best Practices: Securely erase data to enable reuse, extend device life, and cut down on e-waste. Explore techniques like the following (the overwrite approach is sketched at the end of this description):
    • Cryptographic erase
    • Block erase
    • Overwrite methods
  • Compliance & Transparency: Learn how standards like IEEE 2883-2022 and ISO/IEC 27040:2024 guide secure data disposal, with organizations like SERI R2 and ADISA leading the charge in setting industry benchmarks.
  • Carbon Accounting in Storage: Understand how tracking and reducing carbon emissions in storage aligns with global sustainability goals.

This session is your roadmap to driving real change by adopting circular economy principles, embracing advanced sanitization methods, and leveraging carbon accounting to reduce the industry’s environmental footprint.
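For illustration only, here is a minimal sketch of the simplest of the sanitization techniques listed above, a single-pass overwrite of a block device, in Python. The device path is a placeholder and the code is destructive; cryptographic erase and block erase are normally issued through the drive's own sanitize commands rather than host writes.

    import os

    DEVICE = "/dev/sdX"          # placeholder; do not run against a real device casually
    CHUNK = 4 * 1024 * 1024      # write in 4 MiB chunks

    def overwrite_pass(path: str) -> None:
        """Overwrite every byte of the device with zeros (one pass)."""
        zeros = bytes(CHUNK)
        with open(path, "rb+", buffering=0) as dev:
            size = dev.seek(0, os.SEEK_END)   # block devices report their size via seek
            dev.seek(0)
            written = 0
            while written < size:
                written += dev.write(zeros[: min(CHUNK, size - written)])
            os.fsync(dev.fileno())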

Download PDF

Advancing Sustainable Storage: The Impact of the Circular Economy, Media Sanitization Policies, and Carbon Accounting
10:00 am PT / 1:00 pm ET

The key to optimal SAN performance is managing the performance impacts of congestion. The Fibre Channel industry introduced Fabric Notifications in 2021 as a key resiliency mechanism for storage networks, combating congestion, link-integrity problems, and delivery errors. These functions have been implemented across the ecosystem and have enhanced the overall user experience when deploying Fibre Channel SANs. This webinar explores the evolution of Fabric Notifications and the solutions now available for this exciting technology. You will learn the following:

  • The state of Fabric Notifications as defined by the Fibre Channel standards.
  • The mechanisms and techniques for implementing Fabric Notifications.
  • The currently available solutions deploying Fabric Notifications.

Download PDF

The Evolution of Congestion Management in Fibre Channel
3:00 pm IST / 2:30 am PDT

This session explores the evolution of transactions, their implementation challenges, and insights into distributed database environments. Whether you're a database specialist or simply a tech enthusiast, this presentation offers valuable insights into the world of database management, and will include:

  • Historical perspective of transactions
  • Implementing transactions
  • Challenges and trade-offs in ACID properties
  • Distributed transactions in modern databases like Amazon Aurora, DynamoDB, and Google Spanner

Key Takeaways: an understanding of the evolution of transactions in databases, insights into the challenges of implementing ACID properties, and an exploration of distributed transaction models in leading database systems.
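As a small, self-contained illustration of the ACID properties discussed above (not specific to any of the databases named), here is a Python/SQLite sketch in which a transfer that violates a consistency rule is rolled back atomically.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100), ("bob", 50)])
    conn.commit()

    try:
        # "with conn" opens a transaction: it commits on success and rolls back
        # if an exception escapes the block, keeping both updates atomic.
        with conn:
            conn.execute("UPDATE accounts SET balance = balance - 170 WHERE name = 'alice'")
            conn.execute("UPDATE accounts SET balance = balance + 170 WHERE name = 'bob'")
            (balance,) = conn.execute(
                "SELECT balance FROM accounts WHERE name = 'alice'"
            ).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")   # consistency rule violated
    except ValueError:
        pass  # both updates were rolled back together

    print(dict(conn.execute("SELECT name, balance FROM accounts")))
    # -> {'alice': 100, 'bob': 50}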

Navigating Transactions: ACID Complexity in Modern Databases
11:00 am IST / 10:30 pm PT (April 24)

Large language models (LLMs) based on the Transformer architecture have demonstrated state-of-the-art performance on code generation benchmarks such as MBPP and HumanEval. In this talk, we will demonstrate how we have used open source LLMs to develop a code generation workflow that can be trained internally on on-prem infrastructure and used to improve developer productivity by aiding in tasks such as unit test generation, code documentation, code refactoring, code translation, search, and code alignment.
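As a rough sketch of the kind of workflow described above (not the speakers' actual pipeline), the snippet below prompts an open source code LLM through the Hugging Face transformers library to draft a unit test; "open-code-model" is a placeholder for whichever model is deployed on the on-prem infrastructure.

    from transformers import pipeline

    # "open-code-model" is a placeholder identifier, not a real checkpoint name.
    generator = pipeline("text-generation", model="open-code-model")

    prompt = (
        "Write a Python unit test for the following function:\n\n"
        "def add(a, b):\n"
        "    return a + b\n\n"
        "Unit test:\n"
    )

    result = generator(prompt, max_new_tokens=200, do_sample=False)
    print(result[0]["generated_text"])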

Empowering Developers: Exploring LLM Models for Code Generation
10:00 am PT / 1:00 pm ET

The digital landscape is in hyperdrive, demanding an IT metamorphosis that transcends mere tools. Enter AIOps – not just a technological upgrade, but a paradigm shift redefining how we approach IT operations. This presentation delves beyond the nuts and bolts, unveiling AIOps as a revolution that infuses AI's intelligence into the very fabric of IT thinking and processes.

Download PDF

AIOps: Reactive to Proactive – Revolutionizing the IT Mindset