Data centers have played a central role in creating the smaller, more interconnected world we now live in. Their history goes back to the mid-1940s, when the first purpose-built computer rooms housed large military machines set to work on specific data tasks.
By the 1960s, the mainframe computer had arrived, with IBM at the forefront of filling dedicated mainframe rooms at very large companies and government agencies. In some cases, these increasingly powerful machines needed their own free-standing buildings: the first true data centers.
The 1980s saw the launch of PCs that were commonly networked with remote servers to enable them to access large data files. By the time the internet became widely available in the 1990s, internet exchange (IX) buildings had sprung up in key international cities to serve the needs of the World Wide Web. These IX buildings were the most important data centers of their time in terms of serving the needs of the most people, and most remain in place to this day.
The number of data centers boomed as internet use grew, and a dot-com bubble formed as companies of every size rushed to get online with their own websites. Data centers hosted these companies' websites by providing the remote servers that kept them running.
These internet data centers housed large numbers of servers, along with cables from different telephone companies and network operators. If a website ran into technical problems, the operator could swap the server or switch the connection supporting it to keep the site running properly, which was a major selling point.
After handing over the operation of their website to an outsider, the next logical step for many organisations was to migrate other data operations to a data center. Known as colocation, this approach saw firms either house their own servers at a provider's data center or rent server space from that provider to run and access applications remotely, such as email, data storage or backup.
For the last ten years or so, cloud services have grown from the likes of Microsoft, Google, Amazon, IBM, Oracle, SAP, Salesforce, China’s Alibaba and many others.
The market was originally kick-started by Amazon through its Amazon Web Services (AWS) subsidiary, which offered companies hosting solutions built around "infrastructure-as-a-service" (IaaS). IaaS allows companies to flexibly access remote servers, owned by providers like AWS, through the cloud and on demand, according to their data processing and management needs.
There is a myth that AWS began offering IaaS because Amazon had excess server capacity left over from its rapidly growing e-commerce business. That is not the case: AWS was specifically created and scaled up to offer IaaS to anyone who wanted it. IaaS allows startups and smaller companies to compete against larger organisations, which typically have more extensive computing capacity at their disposal.
The growth of cloud data centers took off when companies started to remotely access some or most of their key business software applications through the cloud, instead of deploying and managing them on servers located at their own sites, known as on-premises deployment.
This transition was encouraged by the promise of lower software costs, through the ability to scale usage up and down among employees in response to business demand. This usually works out cheaper than permanently installing feature-rich software on on-premises company servers, which carries a higher fixed cost.
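The cost trade-off described above can be sketched with some simple arithmetic. All figures below are purely hypothetical, chosen only to illustrate the fixed-cost versus pay-per-use comparison; real licensing and cloud subscription prices vary widely.

```python
# Hypothetical figures for illustration only -- not real vendor pricing.
ON_PREM_SEATS = 100         # licences installed up front, a fixed commitment
ON_PREM_COST_PER_SEAT = 30  # monthly cost per installed on-premises seat
CLOUD_COST_PER_SEAT = 35    # monthly cost per *active* cloud seat (higher unit price)

# Active users fluctuate with business demand over six months.
monthly_active_users = [100, 80, 60, 90, 50, 70]

# On-premises: every installed seat is paid for each month, used or not.
on_prem_total = ON_PREM_SEATS * ON_PREM_COST_PER_SEAT * len(monthly_active_users)

# Cloud: only the seats actually in use each month are billed.
cloud_total = sum(users * CLOUD_COST_PER_SEAT for users in monthly_active_users)

print(f"On-premises fixed cost: {on_prem_total}")   # 18000
print(f"Cloud pay-per-use cost: {cloud_total}")     # 15750
```

Even at a higher per-seat price, paying only for active seats comes out cheaper whenever average utilisation is low enough: here the cloud option wins because average usage (75 seats) sits well below the 100 installed licences.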
As demand for such services has dramatically increased, the size and number of the data centers built to host them have grown accordingly. These facilities have become known as hyperscale data centers. They are owned both by cloud service providers themselves and by other firms that build them to rent the space to the household names providing those services.
Industry research firm Synergy recently reported that the worldwide total number of large data centers operated by hyperscale providers – like Microsoft, Google and Amazon and firms that rent space to them – had increased to 541 at the end of the second quarter of 2020. That is more than double the mid-2015 count.
But while the growth of the colocation and cloud data center market is clearly pronounced, there are challenges ahead that are not related to technology.
The 2020 COVID-19 pandemic has tested the capacity of data centers. There was a huge rise in home working and a greater need to access business applications that support it through data centers and the cloud.
There have been occasional, if brief, outages or slowdowns in some services as a result, and if providers want to keep delivering a reliable service in the new normal (as many people continue to work from home after the pandemic), they will have to ramp up their data center capacity quickly.
Then there is environmental pressure. The data center industry already consumes around 1% of the world's electricity [https://www.datacenterknowledge.com/energy/study-data-centers-responsible-1-percent-all-electricity-consumed-worldwide], and public awareness of this is growing. Last year, much of the world seemed to wake up to a climate emergency, with regular mass demonstrations and leaders making promises about carbon reduction targets.
For the industry as a whole, tools to reach such targets include low-carbon electricity, efficient buildings, low-carbon heating, electric vehicles, diversion of biodegradable waste from landfill and renewable energy resources. While many data centers already use some of these alternatives, most operators will now have to embed climate and carbon goals into new projects.
The issue of data sovereignty is also growing in importance around the world, as governments, businesses and consumers call for data to stay in the country where it was generated, for both security and national-interest reasons.
Countries like India, Turkey, Russia, China, Indonesia and many others insist that data generated by their citizens must stay within their borders, on security grounds and also for them to benefit economically from using that data to support local business and more efficiently deliver public services.
As a result, the data center industry has to make sure it breaks out of its main markets of North America and Europe by having facilities located in developing markets.
On wider compliance grounds too, government agencies, large companies and their customers increasingly demand that data centers be built within their borders, to meet data protection and privacy legislation such as the European Union's General Data Protection Regulation (GDPR).
Partly as a result of this, the top FLAP (Frankfurt, London, Amsterdam and Paris) data center markets are coming under increasing pressure from rival locations as more data centers are being built elsewhere. Dublin, Warsaw, Copenhagen, Stockholm and Milan are among those cities taking data center share from FLAP.
One thing is for sure: as the industry continues to evolve, a new challenge is always set to appear.