Storing 85 Petabytes of Cloud Data without Going Broke

Webinar

Author(s)/Presenter(s):

Gleb Budman

Library Content Type

Presentation

Abstract

One way to store data, especially bulk data, is to outsource the storage to someone else. For 85 Petabytes, that would cost at least a couple of million dollars a month with a service such as Amazon S3. On the other end of the spectrum, you can build your own storage, deploy it to a colocation facility, and then staff the operation and management of everything. In this session we’ll compare these two alternatives by covering the challenges and benefits of rolling your own data storage versus outsourcing the entire effort. The insights presented are based on real-world observations and decisions made in the process of growing a data center from 40 Terabytes to 85 Petabytes over a five-year period.
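The abstract's headline figure can be sanity-checked with a quick back-of-envelope calculation. The sketch below uses an assumed illustrative per-GB list price (actual rates vary by storage class, region, and volume discounts, and are not quoted in the talk):

```python
# Back-of-envelope check of the abstract's claim: what storing 85 PB
# might cost per month on a hosted object store such as Amazon S3.
# The per-GB rate below is an assumption for illustration only.

PETABYTE_GB = 1024 ** 2             # 1 PB = 1,048,576 GB (binary units)
capacity_pb = 85
assumed_price_per_gb_month = 0.023  # assumed USD rate, roughly a standard-tier list price

capacity_gb = capacity_pb * PETABYTE_GB
monthly_cost = capacity_gb * assumed_price_per_gb_month

print(f"{capacity_gb:,} GB -> ${monthly_cost:,.0f}/month")
```

Under these assumptions the bill lands around two million dollars a month, consistent with the abstract's "at least a couple of million dollars a month."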

Learning Objectives

Identify different types of data (transactional, bulk, et al.) and the storage approaches each merits
Understand the requirements of storing and managing 85 Petabytes of bulk data
Outline the financial implications of choosing a given storage model
Compare different strategies for storing 85 Petabytes of data