Data is central to much of today’s innovation. From augmented reality (AR) to cryptocurrency to the metaverse, data centers are powering these mega trends.
However, the IT infrastructure needed to power this digital ecosystem of the future is enormous. So large, in fact, that without sufficient planning and technological investment, data center energy consumption could quickly rise to untenable levels.
At the turn of the century, as data centers' physical footprints expanded rapidly to support dot-com era speculation, many predicted that data centers could surge to 20% of global electricity demand, up from less than 1%. Thankfully, Moore's law has held since the 1970s, and the general rule that "processor speeds, or overall processing power for computers, will double every two years at the same power use" helped keep power consumption in check.
In the mid-'90s, a new technology called virtualization took off. With virtualization, three to 10 virtual servers could be consolidated onto one physical server, so shrinking data centers' physical footprints, and their associated power use, became commonplace. The original form of virtualization was "heavyweight," relying on hypervisors that consumed considerable computing resources to operate. Newer, lightweight virtualization using containers, shared kernels, and microservices runs far more efficiently and uses less energy.
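The power case for consolidation is simple arithmetic. As a rough sketch (the server counts, consolidation ratio, and wattage below are illustrative assumptions, not measured figures):

```python
# Rough sketch of how server consolidation cuts power draw.
# All figures are illustrative assumptions, not measurements.

PHYSICAL_SERVERS = 100    # standalone servers before virtualization
CONSOLIDATION_RATIO = 8   # virtual servers hosted per physical server
WATTS_PER_SERVER = 400    # assumed average draw per physical server

# Ceiling division: hosts needed to carry all the former workloads.
servers_after = -(-PHYSICAL_SERVERS // CONSOLIDATION_RATIO)

power_before_kw = PHYSICAL_SERVERS * WATTS_PER_SERVER / 1000
power_after_kw = servers_after * WATTS_PER_SERVER / 1000

print(f"Servers: {PHYSICAL_SERVERS} -> {servers_after}")
print(f"Power:   {power_before_kw:.1f} kW -> {power_after_kw:.1f} kW")
# With these assumed numbers: 100 servers -> 13, 40.0 kW -> 5.2 kW
```

Even under conservative assumptions, an 8:1 consolidation ratio cuts the server count, and the corresponding IT power draw, by nearly an order of magnitude; cooling savings typically follow.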
Fast-forward to today, and there is already talk of quantum computing expanding data center power needs well beyond current demands. To prepare for the inevitable next phase of digital transformation, corporations should examine many aspects of their data centers, including sustainable power supply options as well as edge computing and IoT functionality, which can help boost both insights and sustainability.
Looking forward, technical advancements already in play will require significant investments in data center expansion. Just as technological innovation is changing the world, it is also changing data center operations. For example, building data centers around general-purpose processors was the norm for decades, until about three years ago. That's when machine learning ramped up, demanded faster speeds, and leaned on graphics processing units (GPUs).
But GPUs used too much power, so the industry started moving to application-specific integrated circuits (ASICs): integrated circuits customized for a particular use rather than intended for general-purpose needs. For AI, ASICs made for deep neural networks, known as tensor processing units (TPUs), are now 10 times more powerful and effective than standard GPUs. But processing high volumes of data in hyperscale data centers usually requires transmitting data over long physical distances, which can cause congestion. This is one reason some clients are moving processing workloads to edge data centers.
The speed of this change means your plans should include considerations for what's next. And what is next? One of the most exciting future storage innovations takes its cue from DNA and could dramatically reduce the energy used for storage. Some very smart people are bullish on encoding and decoding binary data to and from synthesized strands of DNA.
Crystal ball gazing aside, working with an automation partner who understands your specific sector requirements and technology plans is essential. Data will underpin all scaling ambitions, so ensuring a data plan reflects business goals is paramount.
In almost every case, businesses should start by analyzing and understanding the equipment and infrastructure currently in place. Industrial digital transformation involves more than ripping and replacing production and supply chain processes; often, existing equipment only needs to be upgraded to support automation and digitalization, not replaced.
Many legacy technologies within industrial environments have recently been connected to digital systems for monitoring, data collection, and analysis. These connections not only help achieve new agility and data efficiency in real time but also support smart conclusions that inform data infrastructure planning. The monitoring software can provide insights not only on optimal environmental conditions but also on the peak workloads to plan around.
By using only the required resources, data center professionals can have a real impact on the sustainability of their facilities. But, to have this intelligence, IT and OT need to be unified as thoroughly as possible. This means making every device and touchpoint as intelligent as possible. And, it means doing it securely, as digital enablement raises the specter of sophisticated cyberattacks. It’s important that your data centers have no weak points that cybercriminals can target. The right software and partner can marry these regulatory and cybersecurity needs.
The exponential growth of data will inevitably mean the building of more data centers. The good news is that today’s data centers have made tremendous strides in sustainability measures and produce a significantly lower ecological footprint.
However, not every business has the ability to develop its own data centers, which means optimizing current infrastructures and growing these smartly is the best way to help the business and environment. And, for this, the smart integration of software throughout the OT infrastructure is paramount.
One thing is for certain: Data centers will continue to shape the digital transformation in the U.S. and abroad. For large corporations, the time is now to invest heavily in technologies that are energy efficient and resilient.
Steven Carlini is vice president of innovation and data centers at Schneider Electric (TBC).