My latest Critical Thinking column over at Data Center Knowledge is part of the site’s focus on all things AI in the datacenter industry this month.
The hope is that AI-driven management software (likely cloud-based) will monitor and control IT and facilities infrastructure, as well as applications, seamlessly and holistically – potentially across multiple sites. Cooling, power, compute, workloads, storage, and networking will flex dynamically to achieve maximum efficiency, productivity, and availability.
While it’s easy to get caught up in the exciting and disruptive potential of AI, it’s also important to reflect on the reality of how most data centers continue to be designed, built, and operated. The fact is that a lot of the processes – especially on the facilities side – are still firmly rooted in the mundane and manual.
And as Google nearly found to its cost, the answers and actions delivered by AI systems may not always be what was originally anticipated.
Just as Skynet in the film The Terminator took a dispassionate, logical view of preventing conflict and concluded that mankind itself was the problem, Google’s algorithm reached a very simple and accurate conclusion about improving the efficiency of its sites:
The model’s first recommendation for achieving maximum energy conservation was to shut down the entire facility, which, strictly speaking, wasn’t inaccurate, but wasn’t particularly helpful either.
For more, head over to DCK.