How much data does Google handle?

Google data center (image from Google)


This is one of those questions whose answer can never be exact. It is a bit like a child asking how many stars there are in the sky, and "how much data does Google handle?" deserves a similarly honest answer: nobody outside Google knows precisely.

A typical PC holds about 500 GB of storage and a smartphone about 32 GB, though newer devices keep shipping with more. Google seems able to answer almost any question we throw at it, so it is natural to wonder: how much data must Google handle to answer all those questions?

It certainly takes an enormous amount of data to answer whatever you ask it, but Google does not publish numbers on how much data it stores.

Google now processes over 40,000 search queries every second on average, which translates to over 3.5 billion searches per day and 1.2 trillion searches per year worldwide.
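The per-day and per-year figures above follow directly from the per-second rate; a quick sanity check (the 40,000 queries/second figure is the article's, the arithmetic is just unit conversion):

```python
# Convert the quoted per-second search rate into daily and yearly totals.
QUERIES_PER_SECOND = 40_000
SECONDS_PER_DAY = 60 * 60 * 24  # 86,400

per_day = QUERIES_PER_SECOND * SECONDS_PER_DAY
per_year = per_day * 365

print(f"{per_day:,} searches per day")    # 3,456,000,000 -> ~3.5 billion
print(f"{per_year:,} searches per year")  # 1,261,440,000,000 -> ~1.26 trillion
```

The result lines up with the "over 3.5 billion per day and 1.2 trillion per year" claim.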

The place where Google stores and processes all this data is the data center. Google does not operate the single largest data centers in the world, but it still handles a huge amount of data; a large data center typically holds petabytes to exabytes of data.

Google currently processes over 20 petabytes of data per day through an average of 100,000 MapReduce jobs spread across its massive computing clusters. As of September 2007, the average MapReduce job ran across approximately 400 machines, consuming approximately 11,000 machine-years of computation in a single month.
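Dividing the two figures above gives a feel for the average job size (a rough estimate assuming the 20 PB/day and 100,000 jobs/day numbers, and binary units where 1 PB = 1024² GB):

```python
# Back-of-the-envelope: average data processed per MapReduce job.
PB_IN_GB = 1024 ** 2          # 1 PB = 1,048,576 GB
data_per_day_gb = 20 * PB_IN_GB
jobs_per_day = 100_000

gb_per_job = data_per_day_gb / jobs_per_day
print(f"~{gb_per_job:.0f} GB per job")  # ~210 GB
```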

Google is also very interested in collecting user data, such as photos, to improve its ad delivery system.



Now, what are these new terms, petabytes and exabytes? The largest unit most of us have heard of is the terabyte (TB).

1 Petabyte (PB) = 1024 Terabytes (TB)
1 Exabyte (EB) = 1024 Petabytes (PB)

An exabyte is therefore roughly 1 million terabytes, which gives a sense of the scale involved. Google stores its data in its own data centers and also collaborates with other data centers, and each data center can cover an area the size of 20 football fields combined. It is hard to measure this amount of data directly, but with some educated guessing based on capital expenditures at remote locations, electricity consumption at each data center, and the number of servers at each, we can estimate that Google holds 10-15 exabytes of data. That equals the data of roughly 30 million PCs combined. So the next time someone asks you how much data Google handles, you can answer with some confidence: about 10-15 exabytes.
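The unit conversions and the "millions of PCs" comparison above can be checked with a few lines of arithmetic (assuming binary units and the 500 GB-per-PC figure mentioned earlier):

```python
# Convert exabytes to gigabytes and compare against 500 GB PCs.
GB_PER_TB = 1024
TB_PER_PB = 1024
PB_PER_EB = 1024

def exabytes_to_gb(eb):
    return eb * PB_PER_EB * TB_PER_PB * GB_PER_TB

for eb in (10, 15):
    pcs = exabytes_to_gb(eb) / 500  # 500 GB per PC
    print(f"{eb} EB ~ {pcs / 1e6:.0f} million PCs")
# 10 EB ~ 21 million PCs, 15 EB ~ 32 million PCs
```

So 10-15 EB works out to roughly 20-30 million PCs' worth of data, consistent with the estimate above.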

Google processes its data on a standard machine cluster node consisting of two 2 GHz Intel Xeon processors with Hyper-Threading enabled, 4 GB of memory, two 160 GB IDE hard drives, and a gigabit Ethernet link. This type of machine costs approximately $2,400 through providers such as Penguin Computing or Dell, or approximately $900 a month through a managed hosting provider such as Verio (for startup comparisons).

Google uses this data to improve products such as its search engine and Google Maps.


The average MapReduce job runs across a $1 million hardware cluster, not including bandwidth fees, data center costs, or staffing.
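The $1 million cluster figure roughly lines up with the per-machine numbers quoted earlier; as a rough cross-check (an illustration, not an official Google costing, and ignoring networking, racks, and other overhead):

```python
# Hardware-only cost of an average 400-machine MapReduce cluster,
# using the ~$2,400-per-machine figure quoted above.
machines = 400
cost_per_machine = 2_400

purchase_cost = machines * cost_per_machine
print(f"${purchase_cost:,}")  # $960,000 -- in the ballpark of $1 million
```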

The January 2008 MapReduce paper provides new insights into the Google hardware and software that crunch tens of petabytes of data per day. Google converted its search indexing systems to MapReduce in 2003, and currently processes over 20 terabytes of raw web data. It is large-scale processing that makes your head spin, and it reflects years of distributed-computing fine-tuning applied to today's large problems.