
Communications of the ACM

ACM News

The Shuttering of Corporate Datacenters



Many companies are now migrating their in-house datacenters to the cloud.

Credit: Racksolutions.com

Corporate datacenters are in a period of transition, and are being decommissioned at a rapid rate. In a blog post, market research firm Gartner forecast that 80% of enterprises will have shut down their traditional datacenters by 2025, versus 10% today.

Many companies are now migrating their in-house datacenters to the cloud. Oracle predicts 80% of enterprise workloads will move to the cloud by 2025.

According to Richard Villars, vice president for datacenter and cloud at International Data Corporation (IDC), a research firm tracking technology markets, the cloud is only one of the factors contributing to the increase in datacenter decommissioning.

"Virtualization and converged infrastructure allowed many corporations to get a lot more capacity in their datacenters," Villars said. "With these technologies, you could now do the same amount of work on 40 or 50% of the footprint."

The cloud has essentially stopped innovation from happening in corporate datacenters, Villars explains; as a result, most companies facing this issue have watched their datacenter footprints grow smaller and smaller. Most enterprises do not need to keep adding capacity to existing datacenters, and by 2023 will need only about half the datacenter space they did in 2014, according to Villars.

Bill Vasquez, senior vice president of Strategy & Business Development at ITRenew, which specializes in onsite datacenter decommissioning and data erasure services, agrees decommissioning activity is growing and evolving at a rapid pace. One of the main growth drivers is the sheer volume of hardware deployed, which is expanding at an exponential rate.

All of this new hardware is helping to drive the decommissioning of older equipment. Companies retain their servers for four years on average, which is longer than ideal, and a single datacenter can house as many as 50,000 to 80,000 servers. IDC reports enterprise computing is at near-historic highs: next-generation workloads and advanced server innovation (such as accelerated computing, storage-class memory, and next-generation I/O) drove server demand at the end of 2019 to one of its highest levels in 16 years, according to a recent IDC Quarterly Server Tracker. The deployment of these new servers is one of the prime drivers spurring the decommissioning of the older equipment they replace.

"Between the growth in data usage and storage, and the emergence of new technologies that require more and more computing power, like artificial intelligence, machine learning, augmented and virtual reality, and the Internet of Things," Vasquez comments, "decommissioning shows no sign of slowing down."

Secure Data Destruction

Considering all the data that resides on the servers from those decommissioned datacenters, Vasquez suggests the best practice is to eradicate all data, as privacy laws are on the rise and data breaches can be very costly.

"The best way to verify data has been destroyed is to wipe it with 100% sector-verified erasure, and electronically capture the serial number of both the host unit and the media itself with a solution like Teraware," Vasquez says, pointing to ITRenew's data-wiping software. The wiped drives can be reconciled against an asset inventory system for further verification, he adds, and enterprises in industries with higher security requirements may require the units to be shredded after they have been wiped.

Some companies and IT asset disposition providers rely solely on physically scanning and shredding the drives, but that approach is subject to human error, such as overlooking a drive or missing a scan of a serial number. Given how damaging this sort of data exposure can be to a Fortune 1000-type company, even one missed drive can have major consequences. For instance, if a decommissioned hard drive were to turn up with hospital patient information still on it, that hospital could be fined thousands of dollars for HIPAA violations for failing to destroy the data properly.

"At the scale of our clients, who have hundreds of thousands to millions of drives that must be decommissioned each year," Vasquez says, "introducing even a small degree of human error into the equation would virtually guarantee data leakage. That's why we recommend 100% sector-verified erasure and serial capture."
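The reconciliation step Vasquez describes, matching the electronically captured serials of wiped drives against an asset inventory, can be sketched as below. This is a minimal illustration of the idea, not Teraware's actual software or data model; all field and function names are assumptions.

```python
# Sketch of reconciling wipe records against an asset inventory.
# Record formats and names are illustrative assumptions, not
# Teraware's actual API or data model.

def reconcile(inventory_serials, wipe_records):
    """Compare drive serials in the asset inventory against captured
    wipe records; return serials with no verified wipe, and wiped
    serials that were never in the inventory."""
    wiped = {r["media_serial"] for r in wipe_records if r["verified"]}
    missing = set(inventory_serials) - wiped      # no verified wipe on record
    unexpected = wiped - set(inventory_serials)   # wiped, but not in inventory
    return missing, unexpected

inventory = ["SN-1001", "SN-1002", "SN-1003"]
records = [
    {"host_serial": "HOST-A", "media_serial": "SN-1001", "verified": True},
    {"host_serial": "HOST-A", "media_serial": "SN-1002", "verified": True},
]
missing, unexpected = reconcile(inventory, records)
print(missing)  # {'SN-1003'}: one drive unaccounted for
```

Because every drive is checked in both directions, a drive that was overlooked or mis-scanned surfaces as a discrepancy rather than silently leaving the facility.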

Validating Data Destruction

"How we actually destroy the media that has the information on it is where the rubber meets the road from a certitude perspective," says Bob Johnson, CEO of the National Association for Information Destruction (NAID), the standard-setting body advocating for best practices in secure data destruction.

"Some data centers' internal IT staff may do their own wiping and disassembly before disposal," Johnson says, "but in a large percentage of cases, they are turning over their equipment and relying on a third party to perform the data destruction, as well as the equipment removal and recycling."

Certificates of destruction are one of the primary methods of validating that removal and recycling were done appropriately. For instance, through its certification services, NAID verifies that secure data destruction companies comply with data protection laws via audits by accredited security professionals, fulfilling customers' regulatory due-diligence obligations.

"The certificate of destruction doesn't actually prove anything was destroyed," Johnson says, "but it should link to a contract, and have all the operational procedures involved, so you have airtight documentation." Both clients and service providers need a detailed list of identifiable particulars, such as serial numbers of hard drives destroyed.
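A certificate record of the kind Johnson describes, one that links back to a contract and itemizes the destroyed drives by serial number, might look like the following sketch. The field names and schema are illustrative assumptions, not a NAID-mandated format.

```python
# Minimal sketch of a certificate-of-destruction record that links
# back to its contract and itemizes the drives destroyed. Field
# names are illustrative assumptions, not any standard schema.
import json
from datetime import date

def build_certificate(contract_id, method, drive_serials):
    return {
        "contract_id": contract_id,       # ties the certificate to its contract
        "date": date.today().isoformat(),
        "method": method,                 # e.g. "sector-verified erasure"
        "drives": sorted(drive_serials),  # the identifiable particulars
        "count": len(drive_serials),
    }

cert = build_certificate("CTR-2020-042", "sector-verified erasure",
                         ["SN-1002", "SN-1001"])
print(json.dumps(cert, indent=2))
```

The itemized serial list is what turns the certificate from a bare attestation into the "airtight documentation" Johnson calls for, since each entry can be traced back to a specific wiped or shredded drive.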

The data owner (a bank or a hospital, for example) is always responsible for the protection of its data, as well as for regulatory compliance. "They are not able to contract that away," Johnson says. To hold the data owner accountable, regulators make it legally responsible for the information. "They're on the hook," he says.

Vasquez agrees that the client, the owner of the hardware, is ultimately responsible for the data and its destruction, which is why it is of paramount importance for them to complete thorough due diligence when deciding on a potential data destruction partner. "Only partners who provide the most stringent security solutions should be trusted with this work," he says.

John Delaney is a freelance writer based in New York City, NY, USA.


 
