Re: Big Data analytics with Hadoop

From: joel garry <joel-garry_at_home.com>
Date: Thu, 12 Jul 2012 08:46:30 -0700 (PDT)
Message-ID: <50034ca1-3166-4bec-b378-d4df26f5722f_at_f8g2000pbf.googlegroups.com>



On Jul 12, 4:23 am, janette.pellet..._at_yahoo.com.au wrote:
> Full Link
>
> http://www.nowillnevergetthesmelloffthosefish.com/YFCommunityNews-Addressing-Big-Data-analyt...
>
> Brief
>
> According to a recent research report by the International Data
> Corporation (IDC), global data will grow to 2.7 zettabytes in 2012 –
> up 48% on 2011. IDC predicts this figure will balloon to eight
> zettabytes of data by 2015. And if that’s not astonishing, maybe
> a quick calculation will help: One zettabyte =
> 1,000,000,000,000,000,000,000 bytes. Not only is data creation
> accelerating, but the capacity to store it has doubled every 3.5
> years since 1980, according to Hilbert and López’s 2011 journal
> article The World’s Technological Capacity to Store, Communicate, and
> Compute Information. IBM claims that some 2.5 quintillion bytes of
> data have been created every day since the beginning of 2012.

And how much of that data is spam? You've added to it.

Let's see: with headers, your message is about 2,000 characters, copied to thousands of computers all over the world.

I think that zettabyte figure is probably overestimated by at least an order of magnitude once you remove redundant data, and most of what's left is useless.
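
Quick back-of-envelope, if anyone wants to check my math (a rough sketch in Python; the 10,000-server count is just a guess standing in for "thousands of computers", not a measurement):

ZETTABYTE = 10**21                   # one zettabyte in bytes, per the quoted definition

# Figures from the quoted article
global_2012 = 2.7 * ZETTABYTE        # IDC: 2.7 ZB of global data in 2012
global_2011 = global_2012 / 1.48     # "up 48% on 2011" works out to roughly 1.8 ZB in 2011
daily_creation = 2.5e18              # IBM: 2.5 quintillion bytes created per day
print(global_2012 / daily_creation)  # ~1080 days of creation at that daily rate

# And the spam point: one ~2,000-character post replicated everywhere
msg_bytes = 2000                     # rough size of the message with headers
servers = 10_000                     # assumed number of news servers carrying it
print(msg_bytes * servers)           # ~20 MB of copies of one redundant message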

jg

--
_at_home.com is bogus.
I'm gonna take my sponge and go home.
http://insidetv.ew.com/2012/07/12/viacom-yanks-free-online-programming-dispute-continues-with-directv/?cnn=yes
Received on Thu Jul 12 2012 - 10:46:30 CDT
