So When Did We Move to Petabytes?

Editor’s Note: Hope you’ve been following all the buzz from #DellWorld today! Our next guest blogger is Zophar Sante of NTP. He has nearly 30 years of experience in technology and was VP of Product Marketing and Market Development at Nexsan, founder and VP of Marketing at SANRAD, and founder and President of Disk Imaging Systems.

A few years ago, many enterprises were talking about terabytes of data: dozens or hundreds, but still terabytes. Now the conversations are around petabytes and moving toward exabytes. A good indicator of why data is growing faster than ever is what’s happening with the capacity of consumer products. A 20-gigabyte laptop hard drive was considered the norm eight to ten years ago. Now you can buy a laptop with 1 terabyte of capacity for about $1,200, which is 50 times more capacity. And since a petabyte is only 1,000 terabytes (the capacity of 1,000 new laptops), you can easily see why large businesses with thousands of employees are talking about storage in the petabyte realm.

But there is a problem with all this data. It needs to be stored, protected, and kept accessible, and that’s not cheap, especially if you are looking at enterprise storage. To compound the problem, we are data “pack rats.” When I ask IT departments about deleting old data, the standard answer is “we keep everything.” The only time data is deleted is when it’s mandated by compliance and corporate governance; otherwise, data is kept “just in case.” And that makes sense, because IT doesn’t know the value of the data, so the best practice is to simply store it indefinitely until told otherwise.

The theme of this year’s Dell World 2011 is “Unlocking Innovation in the Virtual Era.” NTP is constantly looking for ways to unlock innovation in our approach to file storage and data movement. All of that data needs to be stored, protected, and kept accessible. But how?

Leading analysts say 90% of file data is used once and never accessed again, yet it still needs to be kept. The key to storing that much data is to assess it, clean it, and move it. “Assess it” lets you know what you have; “clean it” lets you delete duplicate and non-business-related data. Once that’s done, you “move it”: migrate the data to a more appropriate storage tier. With the Dell DX Object Storage Platform and NTP Software, you can archive large amounts of data over long periods of time while potentially reducing long-term storage costs and resources.
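To make “assess, clean, move” concrete, here is a minimal sketch in Python of what a simple, script-based version of such a policy might look like. It is purely illustrative: the paths, the 180-day cutoff, and the function names are hypothetical assumptions, and NTP’s Precision Tiering applies continuous real-time policies rather than a one-off script like this.

```python
import hashlib
import shutil
import time
from pathlib import Path

# Hypothetical locations and policy thresholds -- adjust for your environment.
PRIMARY_TIER = Path("/mnt/primary")   # expensive, fast storage
ARCHIVE_TIER = Path("/mnt/archive")   # cheap, long-term storage tier
STALE_AFTER_DAYS = 180                # "used once and never again" cutoff

def file_digest(path: Path) -> str:
    """Assess: fingerprint a file so duplicates can be detected."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def tier_files() -> None:
    seen = set()  # digests already encountered on this pass
    cutoff = time.time() - STALE_AFTER_DAYS * 86400
    for path in PRIMARY_TIER.rglob("*"):
        if not path.is_file():
            continue
        digest = file_digest(path)
        if digest in seen:
            # Clean: remove exact duplicates instead of archiving them twice.
            path.unlink()
            continue
        seen.add(digest)
        if path.stat().st_atime < cutoff:
            # Move: migrate cold files to the cheaper tier, preserving layout.
            dest = ARCHIVE_TIER / path.relative_to(PRIMARY_TIER)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))

if __name__ == "__main__":
    tier_files()
```

A production tiering engine would, of course, also weigh factors such as file type, ownership, and business relevance, and would typically target an object store like the Dell DX platform rather than a second file system.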

If you’re interested in how this works and the money you can potentially save by using continuous real-time policies to manage the cost of storing data, check out “Precision Tiering” at the Innovation in Data Management – Solutions Experience at Dell World (http://www.dellworld.com), or visit http://www.ntpsoftware.com/ and ask about Precision Tiering.
