Embrace Hyper-scale Storage Architecture to Drastically Increase Reliability and Lower Costs

Author(s)/Presenter(s): Sudhakar Mungamoori
Publish Date: Wednesday, June 15, 2016
Abstract: 

The tightly coupled architecture used by the vast majority of enterprises today is archaic, and a new approach is needed to manage the explosion of data in a world of shrinking IT budgets. Enter: hyper-scale IT architecture.

Sudhakar Mungamoori will explain why a modern, software-driven, loosely coupled architecture is required for hyper-scale IT. He will highlight how this architectural approach mitigates complexity, improves agility and reliability through on-demand IT resources, and reduces costs by as much as 10X.

Mungamoori will highlight how enterprises can learn from companies like Google and Facebook, which built their own loosely coupled IT architectures to capitalize on their advantages. He will discuss use cases and best practices for IT departments that cannot similarly build their own, but that are strategically looking to adopt loosely coupled architectures in order to remain competitive without blowing their budgets in the face of today's data deluge.

Learning Objectives

Why tightly coupled storage array architectures don’t meet the scale, performance or economic requirements of today’s enterprise, not to mention the future.
How distributed, hyper-scale software architectures provide increased scale, availability and resilience across many different workload profiles.
The challenges IT departments face when deploying hyper-scale solutions, and best practices for overcoming them.
How to migrate application workloads from legacy storage arrays to software-defined architectures using tiering and quality of service for guaranteed performance delivery.
How de-duplication, journaling and space efficient snapshots provide superior data protection with granular recovery across distributed nodes to minimize data loss and storage consumption.
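The tiering objective above, placing workloads so that performance is guaranteed, can be illustrated with a toy admission check. All names and thresholds here are hypothetical and are not taken from the talk; this is a sketch of the general idea, not the presented system:

```python
from dataclasses import dataclass

@dataclass
class Tier:
    """One storage tier with a fixed aggregate IOPS budget."""
    name: str
    max_iops: int       # total IOPS the tier can sustain
    used_iops: int = 0  # IOPS already promised to workloads

def place(workload_iops: int, tiers: list) -> str:
    """Place a workload on the fastest tier that can still guarantee
    its required IOPS; fall back to the next tier otherwise."""
    for tier in tiers:  # tiers are ordered fastest-first
        if tier.used_iops + workload_iops <= tier.max_iops:
            tier.used_iops += workload_iops
            return tier.name
    raise RuntimeError("no tier can guarantee the requested IOPS")

tiers = [Tier("ssd", max_iops=100_000), Tier("hdd", max_iops=10_000)]
t1 = place(95_000, tiers)  # fits on the flash tier
t2 = place(8_000, tiers)   # flash budget exhausted, falls back to disk
print(t1, t2)              # -> ssd hdd
```

The key design point is that placement is an admission decision against a performance budget, which is what turns "tiering" into "guaranteed performance delivery" rather than best-effort caching.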
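Likewise, the data-protection objective, de-duplication combined with space-efficient snapshots, can be sketched as content-addressed blocks plus reference-counted snapshot metadata. Again, this is an illustrative toy, not the architecture discussed in the session:

```python
import hashlib

class DedupStore:
    """Toy content-addressed block store: identical blocks are stored once."""
    def __init__(self):
        self.blocks = {}     # digest -> block bytes
        self.refcounts = {}  # digest -> number of references

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blocks:
            self.blocks[digest] = data  # only new unique blocks consume space
        self.refcounts[digest] = self.refcounts.get(digest, 0) + 1
        return digest

class Volume:
    """A volume is an ordered list of block digests; a snapshot copies
    only that metadata, never the block data itself."""
    def __init__(self, store: DedupStore):
        self.store = store
        self.digests = []

    def write(self, data: bytes):
        self.digests.append(self.store.put(data))

    def snapshot(self) -> list:
        # Space-efficient: shares every block with the live volume.
        for d in self.digests:
            self.store.refcounts[d] += 1
        return list(self.digests)

store = DedupStore()
vol = Volume(store)
vol.write(b"A" * 4096)
vol.write(b"A" * 4096)      # duplicate block: deduplicated, not stored twice
snap = vol.snapshot()
print(len(store.blocks))    # -> 1 unique block backs two writes and a snapshot
```

Because a snapshot is just a reference-counted digest list, recovery can be as granular as a single block, and storage consumption grows only with unique data, which is the combination the learning objective describes.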