The future of data centers
Publication date: 02-07-2020

With 175 zettabytes of data expected worldwide by 2025, data centers will continue to play a critical role in ingesting, computing, storing, and managing information.

Often hidden from sight, data centers are the core of our Internet. They transport, store, and deliver the information we create every day. The more data we create, the more vital data centers become.

Today, many data centers are impractical, inefficient, and outdated. To keep them in working order, data center operators, from FAMGA (Facebook, Amazon, Microsoft, Google, and Apple) to colocation providers, are modernizing them to keep pace with an ever-changing world.

In this report, we dive into many aspects of the future of data centers and their evolution: where and how they are built, how they use energy, and what hardware runs inside them.

Location

Access to fiber-optic networks, energy prices, and the surrounding environment all play an important role in selecting locations for data centers.

According to some estimates, the global data center construction market could reach $57B by 2025. The opportunity is big enough that commercial real estate giant CBRE has launched an entire division specializing in data centers.

Map of data centers


Construction near cheap sources of energy

Locating energy-intensive data centers near cheap power sources makes them more affordable to run. And with emissions rising and major technology companies drawing heavily on dirty energy to power data centers, green energy sources have become an important factor in site selection.

Apple and Facebook have built data centers near hydroelectric resources. In central Oregon, Apple acquired a hydroelectric project to supply green electricity to its data center in Prineville. The company has said that Oregon's deregulated electricity markets are a main reason it built several data centers in the area: deregulation allows Apple to buy electricity directly from third-party providers that use renewable energy sources, rather than only from local public utilities.

In Luleå, Sweden, Facebook built a mega data center next to a hydroelectric plant. Data centers have been appearing across northern Sweden because, besides the cool climate and low earthquake risk, the region offers many renewable energy sources (hydropower and wind).

Facebook's data center under construction in Luleå, Sweden, in 2012. Source: Facebook

Several large data centers have already moved to the region, and others are eyeing it. In December 2018, Amazon Web Services announced the opening of its AWS Europe (Stockholm) Region, located about an hour's drive from Stockholm. The same month, Microsoft acquired 130 acres of land in two neighboring Swedish towns, Gävle and Sandviken, with the aim of building data centers there.

Besides proximity to cheap, environmentally friendly energy, companies building data centers also look for cool climates. Locations near the Arctic Circle, such as northern Sweden, let data centers save on cooling.

Telecommunications company Altice Portugal says its Covilhã data center cools its servers with outside air 99% of the time, while Google's older data center in Hamina, Finland, uses seawater from the Gulf of Finland to cool the facility and reduce energy consumption.

Verne Global has opened a campus in Iceland that taps local geothermal and hydroelectric sources. It sits on a former NATO base, between Europe and North America, the two largest data markets in the world.

Construction in emerging economies

Locating a data center at the point where Internet traffic is growing reduces load and increases the speed of data transmission in the region.

For example, in 2018, Chinese company Tencent, which owns WeChat and holds a major stake in Fortnite maker Epic Games, set up data centers in Mumbai, a sign that the region is increasingly online and that Tencent's gaming platforms are growing in popularity there.

Building data centers in areas where Internet use is intensifying is also a strategic business move: as local businesses grow, they will consider moving their operations to the nearest data center.

Data centers like these often function as colocation facilities: they provide the building, cooling, power, bandwidth, and physical security, then rent out the space to customers who install their own servers and storage there. Colocation services usually target companies with smaller needs and help them save money on infrastructure.

Tax incentives and local legislation

Data centers are an important new source of revenue for electricity producers, so governments attract large companies with various incentives.

Starting in January 2017, the Swedish government cut the tax on electricity used by data centers by 97%. Electricity in Sweden is relatively expensive, and the tax cut put Sweden on par with other Scandinavian countries.

In December 2018, Google secured a 100% sales tax exemption for 15 years on its $600 million data center in New Albany, Ohio, with a possible extension to 2058. In September 2018, France, hoping to attract global talent and capital after Brexit, announced plans to lower taxes on electricity consumption for data centers.

In some cases, building local data centers is one way companies can continue operating in countries with strict regulatory regimes.

In February 2018, Apple began storing data in a center in Guizhou, China, to comply with local laws. Whereas Chinese authorities previously had to go through the US legal system to access information on Chinese citizens stored in US data centers, the local data center gives them easier and faster access to Chinese citizens' information stored in Apple's cloud.

Fiber-optic connectivity and security

Two of the most important factors in siting data centers are fiber-optic connectivity and strong security.

For many years, fiber-optic networks have been a key factor in choosing where to build data centers. Although data is often delivered to mobile devices over cellular networks or local Wi-Fi, for most of its journey to and from storage it travels through fiber-optic cables, which connect data centers with cell antennas, home routers, and other storage devices.

Map of submarine cables

Ashburn, Virginia, and its neighboring regions have become a major data center market, largely thanks to the extensive fiber-optic infrastructure that Internet company AOL created when it built its headquarters there. As other companies, such as Equinix, moved into the region and built their own data centers, the area's fiber networks kept growing, attracting still more data centers.

Facebook has invested almost $2B in its data center in Henrico, Virginia, and in January 2019 Microsoft received a $1.5 million grant, its sixth, to expand its data center in Southside, Virginia.

That said, new alternatives to existing fiber-optic networks are emerging as big tech companies build their own connectivity infrastructure.

In May 2016, Facebook and Microsoft announced a joint project to lay a submarine cable between Virginia Beach, Virginia, and Bilbao, Spain. In 2017, Facebook announced plans to build its own 200-mile underground fiber-optic network in Los Lunas, New Mexico, to connect its New Mexico data center to other server farms. The underground system will give data traveling to Los Lunas three distinct network routes.

Beyond connectivity, another key consideration is security, especially for data centers that store sensitive information.

For example, Norwegian financial giant DNB partnered with data center operator Green Mountain to create its own data center. Green Mountain placed DNB's data center in a high-security facility: a converted bunker inside a mountain. The company claims the site is protected against all hazards, including terrorist attacks, volcanic eruptions, storms, and earthquakes.

Datacenter "Swiss Fort Knox" is located under the Swiss Alps, the door is disguised as a rock. Its internal complex system includes a lot of tunnels to which access is only possible with appropriate clearance. Data center protected the emergency diesel engines and the air pressure system that prevents entry into the room of poisonous gas.

Swiss Fort Knox is physically and digitally hyper-secured. Source: Mount10

Because of how sensitive some servers are, there are many data centers whose locations have never been made public. Information about where these centers sit could be used as a weapon.

In October 2018, WikiLeaks published an internal list of AWS facilities. Among the revelations: Amazon was a contender to build a private cloud worth roughly $10B for the US Department of Defense.

WikiLeaks itself has struggled to find suitable data centers to host its information. AWS stopped hosting WikiLeaks, saying the organization had violated its terms of service by releasing documents it had no rights to and putting people's lives at risk.

WikiLeaks subsequently moved between a variety of data centers. At one point, it was even rumored to be considering hosting its data offshore, in the unrecognized micronation of Sealand in the North Sea.

Structure

While location is probably the most important factor in reducing the risk of data center failure, a data center's structure also plays an important role in its reliability and durability.

Proper construction can make a data center resistant to seismic activity, floods, and other natural disasters, and designs can be adapted to allow expansion and to reduce energy consumption.

Across sectors, from healthcare to finance to manufacturing, companies rely on data centers to support growing data consumption. In some cases these data centers are private and on-premises; in others they are shared and remote.

Either way, data centers sit at the center of a growing technological world and continue to undergo their own physical transformation. According to CB Insights' Market Sizing tool, the global data center services market is estimated to reach $228B by 2020.

One of the most notable changes in data center construction is size. Some data centers have become smaller and more distributed (so-called edge data centers), while others have become bigger and more centralized than ever (mega data centers).

In this section, we look at both edge data centers and mega data centers.

Edge data centers

Small distributed data centers, called edge data centers, are being created to provide hyper-local storage and processing.

While cloud computing has traditionally served as a reliable and cost-effective way to connect many devices to the Internet, the continuous growth of IoT and mobile computing has strained data center network bandwidth.

Edge computing technology now offers an alternative. (Edge computing is a paradigm of distributed computation carried out close to end devices. It is used to reduce network response times and to make more efficient use of network bandwidth.)

It involves placing computing resources close to where data originates (i.e., motors, generators, or other equipment), reducing the need to move data to centralized computing locations such as the cloud.

Although this technology is still in its infancy, it already offers a more efficient way to process data in many use cases, including autonomous vehicles. Tesla cars, for example, have powerful on-board computers that process the data collected by dozens of peripheral sensors with low latency (near real time), allowing the vehicle to make timely decisions.
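To see why proximity matters, here is a minimal sketch comparing round-trip decision latency at the edge versus a distant cloud; all figures are illustrative assumptions, not measured vendor numbers:

```python
# Hypothetical latency comparison of processing a sensor reading on an
# edge node vs. a distant cloud region. All figures are illustrative
# assumptions, not measured values from any vendor.

EDGE_RTT_MS = 2.0     # assumed round trip to a nearby/on-board computer
CLOUD_RTT_MS = 80.0   # assumed round trip to a centralized cloud region
COMPUTE_MS = 5.0      # assumed processing time, identical in both cases

def total_latency_ms(rtt_ms: float, compute_ms: float = COMPUTE_MS) -> float:
    """Network round trip plus processing time for one reading."""
    return rtt_ms + compute_ms

for label, rtt in (("edge", EDGE_RTT_MS), ("cloud", CLOUD_RTT_MS)):
    ms = total_latency_ms(rtt)
    # Distance a vehicle at 30 m/s covers while waiting on the answer:
    # ~0.2 m on the edge path vs. ~2.6 m on the cloud path.
    print(f"{label}: {ms:.0f} ms per decision, ~{30 * ms / 1000:.2f} m traveled")
```

Under these assumptions the cloud path costs an order of magnitude more waiting distance per decision, which is why time-critical processing stays local.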


However, other edge technologies, such as wireless medical devices and sensors, lack the processing power to handle large streams of complex data themselves.

As a result, smaller modular data centers are being deployed to provide hyper-local storage and processing. According to CB Insights' Market Sizing tool, the global edge computing market will reach $34B by 2023.

These data centers, typically the size of a shipping container, are placed at the base of cell towers or as close as possible to the data source.

Beyond transportation and healthcare, these modular data centers are used in industries such as manufacturing, agriculture, energy, and utilities. They also help mobile network operators (MNOs) deliver content to mobile subscribers faster, while many technology companies use these systems to store (or cache) content closer to their end users.

Vapor IO is one company offering colocation services by placing small data centers at the base of cell towers. It has a strategic partnership with Crown Castle, the largest wireless infrastructure provider in the United States.


Other data center companies, such as EdgeMicro, offer micro data centers that connect mobile network operators (MNOs) with content providers (CPs). EdgeMicro's founders draw on experience from leaders of organizations such as Schneider Electric, one of Europe's largest energy companies, and CyrusOne, one of the largest and most successful data center providers in the United States.
The company recently unveiled its first production unit and plans to sell its colocation services to content providers such as Netflix and Amazon, which would benefit from faster, more reliable content delivery. These colocation services are ideal for companies that want to own, but not manage, their data infrastructure.

Source: EdgeMicro

Startups aren't the only players in the edge data center market. Large companies such as Schneider Electric not only work with startups but also develop their own data center products. Schneider offers several prefabricated modular data centers, well suited to industries that need hyper-local computing and storage.

Source: Huawei
These integrated solutions combine power, cooling, fire suppression, lighting, and control systems in a single package. They are designed for rapid deployment, reliable operation, and remote monitoring.

Mega data centers

At the other end of the spectrum are mega data centers: facilities of at least 1 million sq. ft. These facilities are large enough to serve the needs of tens of thousands of organizations at once and benefit substantially from economies of scale.

Although such mega data centers are expensive to build, their price per square foot is far better than that of an average data center.

One of the largest projects is a 17.4 million sq. ft. facility built by Switch Communications, which provides companies with housing, cooling, power, bandwidth, and physical security for their servers.

Source: Switch

In addition to this huge "Citadel" campus in Reno/Tahoe, the company operates a 3.5 million sq. ft. data center in Las Vegas, a 1.8 million sq. ft. campus in Grand Rapids, and a 1 million sq. ft. campus in Atlanta. The Citadel campus is the world's largest data center colocation campus, according to the company's website.

Big tech is also actively building mega data centers in the US, with Facebook, Microsoft, and Apple all constructing facilities to support their growing data storage needs.

For example, Facebook is building a 2.5 million sq. ft. data center in Fort Worth, Texas, to process and store its users' personal data. The data center was originally expected to occupy only 750,000 sq. ft., but the social network decided to triple its size.

Source: Facebook


The new data center will cost about $1 billion and sit on a 150-acre plot, leaving room for future expansion. In May 2017, the first 440,000 sq. ft. went into operation.

One of Microsoft's latest investments is a data center in West Des Moines, Iowa, which cost the company $3.5 billion. Together, this cluster of data centers offers 3.2 million sq. ft. of space, with the largest single facility spanning 1.7 million sq. ft. The project, called Project Osmium, sits on 200 acres of land and is expected to be completed by 2022.

In recent years, Iowa has become a popular data center destination thanks to energy prices that are among the lowest in the country and a low risk of natural disasters.

Apple is also building in Iowa: a 400,000 sq. ft. facility that will cost $1.3 billion.

Source: Apple Newsroom

While some Apple facilities are built from the ground up to handle everything from app content storage to its streaming music service, iCloud storage, and user data, Apple also converted a former solar panel factory in Mesa, Arizona, into a 1.3 million sq. ft. data center, which opened in August 2018. The new data center runs on 100% green energy, thanks to a nearby solar farm.

Outside the US, one region that has attracted mega data centers is Northern Europe, a popular construction spot for tech giants thanks to its cool temperatures and tax incentives.

Hohhot, China, located in Inner Mongolia, is also well positioned for mega data centers, with access to cheap local energy, cool temperatures, and university talent (from Inner Mongolia University).

While large data centers have been appearing across China, Inner Mongolia has become a hub for such developments. China Telecom (10.7 million sq. ft.), China Mobile (7.8 million sq. ft.), and China Unicom (6.4 million sq. ft.) have all built mega data centers in the region.

Source: World's Top Data Centers

Data center innovation

Beyond edge and mega data centers, new data center structures designed around environmental benefits are another emerging trend.

Today, several organizations are experimenting with data centers that operate in or on the ocean, using the surrounding environment and its resources to cool servers naturally at very low cost.

One of the most advanced examples of using the ocean for natural cooling is Microsoft's Project Natick. Following a 2014 patent application titled "Submerged datacenter," Microsoft sank a small cylindrical data center beneath the waters off the coast of Scotland.


The data center runs on 100% locally produced renewable electricity from onshore wind and solar as well as tidal and wave sources, and uses the surrounding ocean to carry heat away from the infrastructure.

Source: Microsoft

Though still in its early stages, the project has made great strides and could pave the way for ocean-based data centers in the future.
Google has also experimented with ocean data centers. In 2009, the company filed a patent application for a "water-based data center" that would place data centers aboard floating barges.


Like Microsoft's design, the structure would use the surrounding water to cool the facility; in addition, the barge could generate energy from ocean currents.

Although Google has not said whether it has tested such structures, the company is believed to have been behind a barge floating in San Francisco Bay in 2013.

Startup Nautilus Data Technologies launched with a very similar idea, raising $58 million in recent years to realize its floating data center concept.

Source: Business Chief

The company focuses less on the vessel itself and more on the technology needed to power and cool data centers. Nautilus's main goals: lower computing costs, lower energy consumption, an end to water consumption, less air pollution, and reduced greenhouse gas emissions.

Energy efficiency and cost effectiveness

Data centers currently account for 3% of total global electricity consumption, and that share will only grow. This electricity isn't always "clean": according to the United Nations (UN), the information and communication technology (ICT) sector, fueled largely by data centers, produces the same volume of greenhouse gases as the aviation sector does from fuel.

While the ICT sector has managed to contain the growth of its electricity consumption, partly by closing older, inefficient data centers, this strategy can only go so far as Internet use and data production keep growing.

Going forward, data centers have two paths to reducing emissions: improving energy efficiency inside the data center and ensuring that the energy they use is clean.

Reducing consumption and improving energy efficiency

In 2016, DeepMind and Google built an AI recommendation system to help make Google's data centers more energy efficient. The focus was on small improvements: even minor changes were enough to deliver significant energy savings and emission reductions. However, once the recommendations were in place, implementing them demanded too much effort from operators.

Data center operators asked whether the recommendations could be implemented autonomously. However, according to Google, no AI system is yet ready to take full control of a data center's heating and cooling processes.

Google's current AI control system acts on specific measures, but can only perform them under limited circumstances, prioritizing safety and reliability. According to the company, the new system delivers average energy savings of 30%.

PUE (power usage effectiveness) over a typical day with ML control turned on and off.
Source: DeepMind
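For reference, PUE is defined as a facility's total energy draw divided by the energy consumed by its IT equipment alone; the closer to 1.0, the less overhead. Here is a minimal sketch of the metric, with made-up readings rather than DeepMind's data:

```python
# PUE (power usage effectiveness) = total facility energy / IT equipment energy.
# 1.0 would be a perfect facility; typical real-world values run ~1.1-2.0.
# The samples below are illustrative, not DeepMind's measurements.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Instantaneous PUE from facility-wide and IT-only power draws."""
    return total_facility_kw / it_equipment_kw

# Hypothetical hourly samples: (total facility kW, IT equipment kW)
ml_off = [(1250, 1000), (1280, 1010), (1300, 1005)]
ml_on  = [(1150, 1000), (1160, 1010), (1170, 1005)]

for label, samples in (("ML off", ml_off), ("ML on", ml_on)):
    avg = sum(pue(total, it) for total, it in samples) / len(samples)
    print(f"{label}: average PUE = {avg:.3f}")
```

In this toy example the "ML on" samples spend less on cooling overhead for the same IT load, which is exactly the effect the DeepMind chart illustrates.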

Google has also used AI to tune efficiency at a Midwest data center during a tornado watch. While a human operator might have focused on storm preparations, the AI system took advantage of its knowledge of tornado conditions, such as drops in atmospheric pressure and changes in temperature and humidity, to adjust the data center's cooling for maximum efficiency throughout the watch. In winter, the AI control system adapts to the weather to reduce the energy needed for cooling.

However, this approach has its flaws. Gaps in AI technology make it hard to guarantee that the system can reliably make effective decisions, and the AI is difficult to scale: each Google data center is unique, so the tool cannot simply be rolled out everywhere at once.

Another way to boost efficiency is to change how the hottest parts of a data center, such as servers or certain chips, are cooled. One method is to use liquid instead of air. Google CEO Sundar Pichai has said the company's recently released chips are so powerful that it had to submerge them in liquid to cool them sufficiently.

Some operators are experimenting with submerging entire data centers underwater to simplify cooling and increase energy efficiency. Underwater placement gives constant access to naturally cool deep water, letting equipment heat flow out into the surrounding ocean. And because such a data center can sit off any coast, operators can choose sites with clean energy connections, like Microsoft's 40-foot Project Natick, which runs on the Orkney Islands' wind-powered grid.

Smaller underwater data centers also have the advantage of being modular, which makes them easier to deploy than new centers on land.

Despite these benefits, some experts remain wary of underwater data centers. Because the cooling process draws in cool water and releases warm water into the surrounding area, these data centers could warm the sea and affect local marine flora and fauna. And although underwater data centers such as Project Natick are designed to run without human supervision, any problems that do arise can be difficult to fix.

In the future, data centers could also contribute to clean energy and efficiency by recycling some of the energy they consume. Some projects are already exploring heat reuse: data center operator DigiPlex has partnered with heating and cooling supplier Stockholm Exergi to build a heat reuse system. The concept is to collect excess heat from the data center and send it into the local district heating system, potentially warming up to 10,000 Stockholm residents.

Buying environmentally friendly energy

According to the IEA, the ICT industry has made progress on clean energy and currently accounts for almost half of all corporate power purchase agreements for renewable electricity.

As of 2016, Google was the biggest corporate buyer of renewable energy on the planet.

Source: Google Sustainability

In December 2018, Facebook purchased 200 MW of energy from a solar power producer. In Singapore, where land is limited, Google announced plans to buy 60 MW of rooftop solar power.

But while large companies like Google and Facebook have the resources to purchase renewable energy at scale, doing so can be difficult for smaller companies that may not need energy in such large volumes.

Another, more expensive option for producing energy is a microgrid: installing an independent energy source inside the data center.

One example of this strategy is fuel cells, like those sold by newly public company Bloom Energy. The cells use hydrogen as fuel to create electricity. While these alternative sources are most often used as backup power, a data center can rely on such a system as its primary supply, though it is an economically expensive option.

Another way data centers pursue clean energy is through renewable energy credits (RECs), which represent a fixed amount of clean energy and are typically used to offset the "dirty" energy a company uses. For any amount of dirty energy consumed, a REC certifies that an equivalent amount of clean energy was generated somewhere in the world; those amounts are then sold back into the renewable energy market.
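To make the offset arithmetic concrete, here is a minimal sketch; the consumption figures and grid mix are assumptions for illustration, and one REC per MWh is the conventional unit:

```python
# Hypothetical REC offset accounting. One REC conventionally certifies
# 1 MWh of renewable generation; the other figures are made up.

MWH_PER_REC = 1

annual_consumption_mwh = 50_000   # assumed data center draw from the grid
grid_renewable_share = 0.20       # assumed share of the grid that is already clean

dirty_mwh = annual_consumption_mwh * (1 - grid_renewable_share)
recs_needed = dirty_mwh / MWH_PER_REC

print(f"Dirty energy to offset: {dirty_mwh:,.0f} MWh")
print(f"RECs to purchase:       {recs_needed:,.0f}")
# Note: buying RECs offsets consumption on paper only -- it does not mean
# the data center itself physically runs on clean electricity.
```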

The REC model has its problems, though. For one, RECs only compensate for dirty energy; they do not mean the data center actually runs on clean power. (Using RECs is, however, usually easier than finding accessible sources able to supply enough energy for a data center's needs.) Another downside is that RECs are typically out of reach for small businesses, which don't always have the capital to bet on fluctuations in solar or wind output.

In the future, technology could facilitate closer collaboration between small producers and buyers. Marketplace models and portfolios of renewable energy sources (analogous to those built by mutual funds) are emerging, making it possible to assess and quantify fluctuations in renewable energy supply.

Storage

Solid-state drives (SSDs)

Solid-state drives, or SSDs, are storage devices that support reading and writing data and retain it without power, which is known as persistent storage. This distinguishes them from temporary storage such as random-access memory (RAM), which holds information only while the device is running.

SSDs compete with HDDs (hard disk drives), another form of mass storage. The main difference is that SSDs have no moving parts, which makes them more durable and lighter.

However, SSDs remain more expensive than traditional HDDs, a gap likely to persist absent a breakthrough in SSD manufacturing.
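To make the price gap concrete, here is a minimal sketch comparing cost per terabyte; the prices are assumptions for illustration, not quoted market figures:

```python
# Illustrative $/TB comparison between SSD and HDD. Prices are assumptions
# for the sketch, not quoted market figures for any specific product.

drives = {
    "HDD 3.5-inch": {"capacity_tb": 8, "price_usd": 200},
    "SATA SSD":     {"capacity_tb": 8, "price_usd": 800},
}

for name, d in drives.items():
    per_tb = d["price_usd"] / d["capacity_tb"]
    print(f"{name}: ${per_tb:.0f}/TB")
# At a multi-x premium per terabyte, bulk data center storage stays on
# HDDs even as SSDs win in laptops and phones.
```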

Source: Silicon Power Blog

Although SSDs have become standard in laptops, smartphones, and other thin-profile devices, their higher price makes them less practical, and therefore less prevalent, in large data centers.

Source: Seagate & IDC

According to IDC, through the end of 2025 more than 80% of enterprise storage capacity will still be on hard disks. After 2025, however, SSDs may become the preferred storage medium for enterprises and their data centers.

As SSDs spread through consumer electronics and IoT devices, growing demand may drive up supply and ultimately bring down costs.

Cold storage

In contrast to newer SSDs, cold storage relies on older technologies, such as CD-Rs and magnetic tape, to store data using as little energy as possible.

With cold storage, accessing data takes far longer than with hot storage (e.g., SSDs), so such systems should hold only infrequently used data.

"Hot" cloud work for SSD and hybrid drives that provide the fastest possible access to information. Cold storage — solutions more economical and slow. In "Cold clouds" to store information that is not required for immediate reference. After a request, data can be uploaded from several minutes to tens of hours.

The temperature metaphor is also literal: the faster the cloud, the more heat its equipment gives off inside the data center.
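To illustrate how such tiers get chosen, here is a minimal sketch of a tiering rule; the thresholds and tier labels are assumptions for illustration, not any provider's actual policy:

```python
# Hypothetical storage-tier picker based on how often data is read and
# how long a retrieval is allowed to take. Thresholds are illustrative only.

def pick_tier(reads_per_month: float, max_wait_seconds: float) -> str:
    """Choose a hot / cool / cold storage tier for a dataset."""
    if reads_per_month >= 10 or max_wait_seconds < 1:
        return "hot (SSD/hybrid: instant access, highest cost)"
    if reads_per_month >= 1 or max_wait_seconds < 3600:
        return "cool (HDD: slower access, mid cost)"
    return "cold (tape/optical: minutes-to-hours retrieval, lowest cost)"

print(pick_tier(reads_per_month=300, max_wait_seconds=0.1))     # live user data
print(pick_tier(reads_per_month=2, max_wait_seconds=600))       # monthly reports
print(pick_tier(reads_per_month=0.01, max_wait_seconds=86400))  # old archives
```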

Yet the world's largest technology companies, such as Facebook, Google, and Amazon, use cold storage to retain even the most granular user data, and cloud providers such as Amazon, Microsoft, and Google offer cold storage services to customers who want to store data at competitive prices.

According to CB Insights' Market Sizing tool, the cold storage market is expected to reach nearly $213B by 2025.

No business wants to fall behind the curve on data collection. Most organizations maintain that it's better to collect as much data as possible today, even if they haven't decided how it will be used tomorrow.

This type of unused data is called dark data: data that is collected, processed, and stored but typically never used for a specific purpose. IBM estimates that approximately 90% of data collected from Internet-connected sensors is never used.

In the future, there may be more effective ways to reduce the amount of dark data collected and stored. For now, though, even with advances in artificial intelligence and machine learning, companies remain keen to collect and store as much data as possible for future use.

For the foreseeable future, then, cold storage offers the best way to store data at the lowest price, a trend that will continue as long as users generate more data and organizations keep collecting it.

Other forms of data storage

Beyond SSDs, hard drives, CDs, and magnetic tape, a number of new storage technologies promise greater capacity per device.

One of the most promising is heat-assisted magnetic recording (HAMR). HAMR dramatically increases how much data can be stored on devices such as hard disks by heating the disk surface with a precision laser while writing. This allows more accurate and stable recording, which increases storage density.

The technology is expected to enable 20 TB 3.5-inch hard drives by 2020, with capacity growing 30%-40% annually thereafter. Seagate has already built a 16 TB 3.5-inch drive, announced in December 2018.
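As a quick check on what that growth rate implies, here is a minimal sketch compounding the figures above (illustrative projection only, not a vendor roadmap):

```python
# Project HAMR drive capacity from the 20 TB / 2020 baseline cited above,
# compounding at the forecast 30%-40% annual growth. Illustrative only.

BASE_YEAR, BASE_TB = 2020, 20

def capacity_tb(year: int, annual_growth: float) -> float:
    """Projected per-drive capacity in TB under compound growth."""
    return BASE_TB * (1 + annual_growth) ** (year - BASE_YEAR)

for year in (2021, 2023, 2025):
    low, high = capacity_tb(year, 0.30), capacity_tb(year, 0.40)
    print(f"{year}: {low:.0f}-{high:.0f} TB per 3.5-inch drive")
# If the 30-40% pace held, 2025 would land at roughly 74-108 TB per drive.
```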

Over the past decade, Seagate has actively patented HAMR technology; in 2017, the company filed a record number of HAMR-related patents.

(Note: there is a 12-18 month delay between patent filing and publication, so 2018 may turn out to have an even larger number of HAMR patents.)

Source: https://habr.com/ru/company/ua-hosting/blog/508746/

