Zettabyte Era

The Zettabyte Era or Zettabyte Zone is a period of human and computer science history that started in the mid-2010s. The precise starting date depends on whether it is defined as when global IP traffic first exceeded one zettabyte, which happened in 2016, or when the amount of digital data in the world first exceeded a zettabyte, which happened in 2012. A zettabyte is a multiple of the unit byte that measures digital storage, and it is equivalent to 1,000,000,000,000,000,000,000 (10²¹) bytes.

According to Cisco Systems, an American multinational technology conglomerate, global IP traffic reached an estimated 1.2 zettabytes in 2016, an average of 96 exabytes (EB) per month. Global IP traffic refers to all digital data that passes over an IP network, which includes, but is not limited to, the public Internet. The largest contributing factor to the growth of IP traffic is video traffic (including online streaming services like Netflix and YouTube).

The Zettabyte Era can also be understood as an age of growth of all forms of digital data that exist in the world which includes the public Internet, but also all other forms of digital data such as stored data from security cameras or voice data from cell-phone calls. Taking into account this second definition of the Zettabyte Era, it was estimated that in 2012 upwards of 1 zettabyte of data existed in the world and that by 2020 there would be more than 40 zettabytes of data in the world at large.

The Zettabyte Era presents difficulties for data centers trying to keep up with the explosion of data consumption, creation and replication. In 2015, the Internet and all its components consumed 2% of total global power, so energy efficiency with regard to data centers has become a central problem of the Zettabyte Era.

IDC forecasts that the amount of data generated each year will grow to 175 zettabytes by 2025. It further estimates that a total of 22 zettabytes of digital storage will be shipped across all storage media types between 2018 and 2025, with nearly 59 percent of this capacity being provided by the hard drive industry.

The zettabyte
A zettabyte is a digital unit of measurement. One zettabyte is equal to one sextillion bytes or 10²¹ (1,000,000,000,000,000,000,000) bytes; equivalently, one zettabyte is equal to a trillion gigabytes. To put this into perspective, consider that "if each terabyte in a zettabyte were a kilometre, it would be equivalent to 1,300 round trips to the moon and back (768,800 kilometers)". As former Google CEO Eric Schmidt puts it, from the very beginning of humanity to the year 2003, an estimated 5 exabytes of information was created, which corresponds to 0.5% of a zettabyte. In 2013, that amount of information (5 exabytes) took only two days to create, and that pace is continuously growing.
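Since each SI prefix is a power of 1,000, conversions among these units reduce to simple arithmetic. A minimal sketch in Python (the unit table and function names here are illustrative, not from any cited source):

```python
# SI (decimal) byte units, as used in this article: powers of 1,000.
SI_BYTES = {
    "kilobyte": 10**3,
    "megabyte": 10**6,
    "gigabyte": 10**9,
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte": 10**18,
    "zettabyte": 10**21,
}

def convert(value, from_unit, to_unit):
    """Convert a quantity between SI byte units."""
    return value * SI_BYTES[from_unit] / SI_BYTES[to_unit]

# One zettabyte is a trillion gigabytes:
print(convert(1, "zettabyte", "gigabyte"))  # 1e12
# Schmidt's 5 exabytes is 0.5% of a zettabyte:
print(convert(5, "exabyte", "zettabyte") * 100)  # 0.5
```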

Definitions
The concept of the Zettabyte Era can be separated into two distinct categories:
 * 1) In terms of IP traffic: This first definition refers to the total amount of data to traverse global IP networks such as the public Internet. In Canada, for example, there was an average growth of 50.4% in data downloaded by residential Internet subscribers from 2011 to 2016. According to this definition, the Zettabyte Era began in 2016 when global IP traffic surpassed one zettabyte, estimated to have reached roughly 1.2 zettabytes.
 * 2) In terms of all forms of digital data: In this second definition, the Zettabyte Era refers to the total amount of all the digital data that exists in any form, from digital films to transponders that record highway usage to SMS text messages. According to this definition, the Zettabyte Era began in 2012, when the amount of digital data in the world surpassed one zettabyte.

Cisco report – The Zettabyte Era: Trends and Analysis
In 2016, Cisco Systems stated that the Zettabyte Era was now a reality when global IP traffic reached an estimated 1.2 zettabytes. Cisco also provided predictions of future global IP traffic in its report The Zettabyte Era: Trends and Analysis. This report uses current and past global IP traffic statistics to forecast trends between 2016 and 2021. Here are some of the report's predictions for 2021:
 * Global IP traffic will triple and is estimated to reach 3.3 ZB per annum.
 * In 2016 video traffic (e.g. Netflix and YouTube) accounted for 73% of total traffic. In 2021 this will increase to 82%.
 * The number of devices connected to IP networks will be more than three times the global population.
 * It would take one person 5 million years to watch the entirety of video that will traverse global IP networks in one month.
 * Smartphone traffic will exceed PC traffic: PCs will account for 25% of total IP traffic, while smartphones will account for 33%.
 * There will be a twofold increase in broadband speeds.

Factors that led to the Zettabyte Era
There are many factors that brought about the rise of the Zettabyte Era. Increases in video streaming, mobile phone usage, broadband speeds and data center storage are all contributing factors that led to the rise (and continuation) of data consumption, creation and replication.

Increased video streaming
Large and ever-growing consumption of multimedia on the Internet, including video streaming, has contributed to the rise of the Zettabyte Era. In 2011 it was estimated that roughly 25%–40% of IP traffic was taken up by video streaming services. Since then, video IP traffic has nearly doubled to an estimated 73% of total IP traffic. Furthermore, Cisco has predicted that this trend will continue into the future, estimating that by 2021, 82% of total IP traffic will come from video traffic.

The amount of data used by video streaming services depends on the quality of the video. Android Central breaks down how much data is used (on a smartphone) at different video resolutions. According to their findings, video between 240p and 320p resolution uses roughly 0.3 GB per hour, while standard-definition video, at 480p, uses approximately 0.7 GB per hour.
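To see how these per-hour rates accumulate, the short sketch below estimates monthly usage; it is illustrative only, using the approximate figures quoted above:

```python
# Approximate data rates from the figures above (GB per hour of video).
GB_PER_HOUR = {
    "240p-320p": 0.3,  # low resolution
    "480p": 0.7,       # standard definition
}

def monthly_usage_gb(resolution, hours_per_day, days=30):
    """Estimate the data a viewer uses per month at a given resolution."""
    return GB_PER_HOUR[resolution] * hours_per_day * days

# Two hours of standard-definition video a day comes to about 42 GB a month:
print(monthly_usage_gb("480p", hours_per_day=2))
```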

Netflix and YouTube are at the top of the list in terms of the most globally streamed video services online. In 2016, Netflix represented 32.72% of all video streaming IP traffic, while YouTube represented 17.31%. Amazon Prime Video takes the third spot, with 4.14% of global data usage.

Netflix
Currently, Netflix is the largest video streaming service in the world, accessible in over 200 countries and with more than 80 million subscribers. Streaming high-definition video content through Netflix uses roughly 3 GB of data per hour, while standard definition takes up around 1 GB of data per hour. In North America, during peak bandwidth consumption hours (around 8 PM), Netflix uses about 40% of total network bandwidth. This vast amount of data marks an unparalleled period in time and is one of the major contributing factors that have led the world into the Zettabyte Era.

YouTube
YouTube is another major video streaming (and video uploading) service, whose data consumption across both fixed and mobile networks remains quite large. In 2016, the service was responsible for about 20% of total Internet traffic and 40% of mobile traffic. As of 2018, 300 hours of YouTube video content were uploaded every minute.

Increased wireless and mobile traffic
The usage of mobile technologies to access IP networks has resulted in an increase in overall IP traffic in the Zettabyte Era. In 2016, the majority of devices that moved IP traffic and other data streams were hard-wired devices. Since then, wireless and mobile traffic have increased and are predicted to continue to increase rapidly. Cisco predicts that by the year 2021, wired devices will account for 37% of total traffic while the remaining 63% will be accounted for through wireless and mobile devices. Furthermore, smartphone traffic is expected to surpass PC traffic by 2021; PCs are predicted to account for 25% of total traffic, down from 46% in 2016, whereas smartphone traffic is expected to increase from 13% to 33%.

According to the Organisation for Economic Co-operation and Development (OECD), mobile broadband penetration rates are ever-growing. Between June 2016 and December 2016, mobile broadband penetration rates increased by an average of 4.43% across OECD countries. Poland had the largest increase, at 21.55%, while Latvia's penetration rate declined the most, by 5.71%. The OECD calculated that there were 1.27 billion total mobile broadband subscriptions in 2016; 1.14 billion of these subscriptions included both voice and data in the plan.

Increased broadband speeds
Broadband is what connects Internet users to the Internet; thus broadband speed is directly correlated with IP traffic – the greater the broadband speed, the more traffic can traverse IP networks. Cisco estimates that broadband speeds will double by 2021. In 2016, average global fixed broadband speeds reached 27.5 Mbit/s and are expected to reach 53 Mbit/s by 2021. Between the fourth quarter of 2016 and the first quarter of 2017, average fixed broadband speeds globally equated to 7.2 Mbit/s. South Korea topped the list in terms of broadband speeds, which increased 9.3% in that period.

High-bandwidth applications need significantly higher broadband speeds. Certain broadband technologies, including fiber-to-the-home (FTTH), high-speed digital subscriber line (DSL) and cable broadband, are paving the way for increased broadband speeds. FTTH can offer speeds ten times (or even a hundred times) faster than DSL or cable.

Internet service providers in the Zettabyte Era
The Zettabyte Era has affected Internet service providers (ISPs) with the growth of data flowing from all directions. Congestion occurs when too much data flows in and the quality of service (QoS) weakens. In China, for example, some ISPs store and handle exabytes of data. The response by certain ISPs is to implement so-called network management practices in an attempt to accommodate the never-ending data surge of Internet subscribers on their networks. Furthermore, the technologies being implemented by ISPs across their networks are evolving to address the increase in data flow.

Network management practices have brought about debates relating to net neutrality in terms of fair access to all content on the Internet. According to The European Consumer Organisation, network neutrality can be understood as an aim that "all Internet should be treated equally, without discrimination or interference. When this is the case, users enjoy the freedom to access the content, services, and applications of their choice, using any device they choose".

According to the Canadian Radio-television and Telecommunications Commission (CRTC) Telecom Regulatory Policy 2009-657, there are two forms of Internet traffic management practices (ITMPs) in Canada. The first are economic practices, such as data caps; the second are technical practices, like bandwidth throttling and blocking. According to the CRTC, technical ITMPs are put in place by ISPs to address and solve congestion issues in their networks; however, the CRTC states that ISPs are not to employ ITMPs for preferential or unjustly discriminatory reasons.

In the United States, by contrast, under the Obama administration, the Federal Communications Commission's (FCC) 15–24 policy put three bright-line rules in place to protect net neutrality: no blocking, no throttling, no paid prioritization. On 14 December 2017, the FCC voted 3–2 to remove these rules, allowing ISPs to block, throttle and give fast-lane access to content on their networks.

To aid ISPs in dealing with the large data flows of the Zettabyte Era, in 2008 Cisco unveiled a new router, the Aggregation Services Router (ASR) 9000, which at the time was claimed to offer six times the speed of comparable routers. In one second, the ASR 9000 would, in theory, be able to process and distribute 1.2 million hours of DVD traffic. By 2011, with the coming of the Zettabyte Era, Cisco had continued work on the ASR 9000, which could by then handle 96 terabytes a second, up significantly from the 6.4 terabytes a second it could handle in 2008.

Energy consumption
Data centers attempt to accommodate the ever-growing rate at which data is produced, distributed, and stored. Data centers are large facilities used by enterprises to store immense datasets on servers. In 2014 it was estimated that in the U.S. alone there were roughly 3 million data centers, ranging from small centers located in office buildings to large complexes of their own. Increasingly, data centers are storing more data than end-user devices. By 2020 it is predicted that 61% of total data will be stored via cloud applications (data centers) in contrast to 2010 when 62% of data storage was on end-user devices. An increase in data centers for data storage coincides with an increase in energy consumption by data centers.

In 2014, data centers in the U.S. accounted for roughly 1.8% of total electricity consumption, which equates to 70 billion kWh. Between 2010 and 2014, electricity consumption by data centers increased by 4%, and this upward trend is predicted to continue through 2014–2020. In 2011, energy consumption from all data centers equated to roughly 1.1% to 1.5% of total global energy consumption. Information and communication technologies, including data centers, are responsible for creating large quantities of emissions.

Google's green initiatives
The energy used by data centers does not go solely to powering their servers. In fact, most data centers spend about half of their energy on non-computing functions such as cooling and power conversion. Google's data centers have been able to reduce non-computing costs to 12%. Furthermore, since 2016 Google has used its artificial intelligence unit, DeepMind, to manage the electricity used to cool its data centers, resulting in a cost reduction of roughly 40%. Google claims that its data centers use 50% less energy than ordinary data centers.

According to Google's Senior Vice President of Technical Infrastructure, Urs Hölzle, Google's data centers (as well as its offices) would reach 100% renewable energy for their global operations by the end of 2017. Google plans to accomplish this milestone by buying enough wind and solar electricity to account for all the electricity its operations consume globally. The reason for these green initiatives is to address climate change and Google's carbon footprint. Furthermore, these green initiatives have become cheaper, with the cost of wind energy falling by 60% and the cost of solar energy by 80%.

To improve a data center's energy efficiency, reduce costs and lower the impact on the environment, Google recommends five best practices for data centers to implement:
 * 1) Measure the power usage effectiveness (PUE), the industry ratio of total facility energy to the energy used for computing, to track a data center's energy use.
 * 2) Using well-designed containment methods, try to stop cold and hot air from mixing. Also, use backing plates for empty spots on the rack and eliminate hot spots.
 * 3) Keep the aisle temperatures cold for energy savings.
 * 4) Use free cooling methods to cool data centers, including a large thermal reservoir or evaporating water.
 * 5) Eliminate as many power conversion steps as possible to lower power distribution losses.
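The first practice, measuring PUE, amounts to a single ratio: total facility energy divided by the energy delivered to computing equipment, so a PUE of 1.0 would mean no energy spent on non-computing overhead. A minimal sketch with made-up figures:

```python
def power_usage_effectiveness(total_facility_kwh, it_equipment_kwh):
    """PUE = total facility energy / IT equipment energy (1.0 is ideal)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,500 kWh overall while its IT equipment draws 1,200 kWh:
print(power_usage_effectiveness(1500, 1200))  # → 1.25
```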

The Open Compute Project
In 2010, Facebook launched a new data center designed in such a way that allowed it to be 38% more efficient and 24% less expensive to build and run than the average data center. This development led to the formation of the Open Compute Project (OCP) in 2011. The OCP members collaborate to build new technological hardware that is more efficient, economical and sustainable in an age where data is ever-growing. The OCP is currently working on several projects, including one specifically focused on data centers. This project aims not only to guide the way new data centers are built, but also to help existing data centers improve thermal and electrical efficiency and maximize mechanical performance. The OCP's data center project focuses on five areas: facility power, facility operations, layout and design, facility cooling, and facility monitoring and control.