Artificial Intelligence is gearing up to power more advanced applications, but it comes with greater thermal challenges.
Cool Artificial Intelligence is Making a Big Splash this Summer
It's hard to ignore all of the attention directed toward Artificial Intelligence (AI) recently. But AI isn't a new thing. IBM's Watson whipped two Jeopardy champions a decade ago. Deep Blue, the chess-playing computer, beat the reigning world chess champion 20 years ago (already?!). Those are just a couple of high-profile accomplishments in the AI field. Artificial intelligence research groups started in the mid-1950s. So why is cool artificial intelligence making such a big splash now?
Watson Courtesy of Peter Barnum and Brad Neuman from TechCrunch
Artificial intelligence has developed to the point where big-name software and hardware companies are realizing the opportunity AI opens for them. Artificial intelligence has become more than a research effort to play complex games like chess or Jeopardy. These companies are pouring huge amounts of research into developing their AI capabilities to expand their products and services. Cloud and enterprise companies like Google and Amazon utilize AI to improve search algorithms for websites and products. But with all the hype, have we really stopped to think about how we'll cool artificial intelligence?
Aluminum Extrusion Machinery Diagram Courtesy of Aluminum Extruders Council
New Applications for Artificial Intelligence
According to the TechRadar™: Artificial Intelligence Technologies, Q1 2017 report conducted by Forrester, many companies haven't realized how artificial intelligence can help them. In the survey, 42% of respondents have no defined business case for AI, 39% aren't clear on what it can be used for, and 33% don't have the required skills to implement artificial intelligence in their business.
This general hesitation in the market isn't keeping artificial intelligence pioneers, from tech giants to spry startups, from jumping headfirst into development investments. These AI entrepreneurs are creating some of the newest applications. Artificial intelligence is exploding with new interest and opportunities for development, and it's difficult to capture them all in such a small space. We'll only scratch the surface of some of the latest applications in this post.
Even companies outside of the traditional computer science world are tapping into the potential of AI. Automotive manufacturers like Ford, BMW, and Tesla, names you typically wouldn't associate with high-tech computer systems, are pioneering eMobility systems and self-driving cars with the help of traditional computer chip makers. Apple is getting into the fray of autonomous driving platforms with Project Titan. Waymo, a former division of Google, is partnering with Chrysler to make fully self-driving minivans. And these are just a few of the big names now developing self-driving cars with sophisticated systems that can sense, process, plan, and react in a complex navigation environment. We'll cover this in its own blog post, since it's a huge topic in itself.
Facebook, known primarily for its social networking platform, has looked into improving interpersonal communication with the help of AI. Its Facebook AI Research (FAIR) group conducted a study of learning programs that can negotiate. During this study, the programs learned complex negotiating tactics like feigning interest, and even developed their own non-human language to communicate between negotiating programs. Amazon created its own home assistant, Alexa. Alexa can process our voice commands and react to our questions and requests within seconds. This personal assistant relies on cool artificial intelligence that can recognize the user's speech, process its meaning, and plan out an appropriate response.
Watson, after his glorious victory on Jeopardy, has turned to the medical industry. This IBM artificial intelligence has been tutored in the fields of genomics and oncology to help find the right treatment for cancer patients. In the 60 Minutes interview with the medical research team lead, we learn that in a parallel diagnosis of cancer patients, the top doctors and Watson agreed on the treatment 99% of the time. The extraordinary part of this research: in 30% of the cases, Watson found something in the patient profile that the doctors didn't, opening additional treatment doors.
A Solid Foundation for Artificial Intelligence
In the same 60 Minutes episode as Watson's new career in the medical field, John Kelly, the "Godfather" of Watson, took Charlie Rose on a tour of Watson's brain. Watson's brain looks similar to any other data center, albeit with some sweet colored lighting. The computer that originally housed Watson contained 90 servers and 15 terabytes of memory. Watson needed all of that hardware to read and process millions of books per second. What you couldn't see in the interview was the immense power running Watson's brain.
"Charlie Rose: You can feel the heat already.
John Kelly: You can feel the heat -- the 85,000 watts – you can hear the blowers cooling it, but this is the hardware that the brains of Watson sat in."
85,000 watts in a dedicated server room! One thing we know about electronics is that we're going to want more processing power in a smaller space. IBM has already decreased the size of the hardware since Watson's appearance on Jeopardy. But the major proof of this demand lies in our pockets, filled with smartphones. As more industries find uses for artificial intelligence, businesses and their consumers are going to want this processing power in smaller packages and available anywhere. This is where we need to develop not only efficient software, but also the processing components and the thermal management solutions that support them.
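Why does packing more power into a smaller package matter so much for cooling? A quick back-of-envelope estimate shows it. The steady-state temperature of a chip is roughly the ambient temperature plus the dissipated power times the thermal resistance of its cooling path. The numbers below are hypothetical, chosen only to illustrate the relationship:

```python
# Back-of-envelope junction temperature estimate:
#   T_junction = T_ambient + P * R_theta
# where R_theta is the total junction-to-ambient thermal resistance
# of the cooling path (heat sink, interface material, airflow).

def junction_temp_c(ambient_c: float, power_w: float, r_theta_c_per_w: float) -> float:
    """Steady-state junction temperature for a given heat load."""
    return ambient_c + power_w * r_theta_c_per_w

# A hypothetical 250 W AI accelerator with a 0.25 C/W cooling path
# inside a 35 C enclosure:
print(junction_temp_c(35.0, 250.0, 0.25))  # 97.5 -- close to typical silicon limits
```

Shrink the package and the available thermal resistance shrinks with it, so the same wattage pushes the junction temperature higher. That's the squeeze every smaller, more powerful AI package puts on its cooling solution.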
Developing New Hardware to Support Artificial Intelligence
Intel, as one of the largest chip makers, has been steadfastly supporting and developing new processors to make artificial intelligence systems better and more accessible to their users. It wasn't until this year that they formed a dedicated group for artificial intelligence.
Graphics Processing Units (GPUs) have become the preferred chip type for AI applications because they have many smaller cores. This multitude of cores can handle more parallel processing than Central Processing Units (CPUs), which have fewer, albeit larger, cores. Companies like Nvidia are leveraging their position as a prominent GPU manufacturer to gain a foothold in the AI industry. Nvidia's Jetson TX2 is designed to have all the resources to conduct the high-level computations that support artificial intelligence applications without connecting to the vast processing power of the cloud. This means it has a GPU, a CPU, Random Access Memory (RAM), and memory storage all on a single board. These sorts of units are perfectly positioned to support self-driving cars, since they are self-contained and need no internet access.
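The reason those many small cores help is that most AI workloads are data parallel: each output element depends on only one input element, so thousands of cores can each handle one element at once. Here's a minimal CPU-side sketch of that decomposition, with a thread pool standing in for the GPU's cores (an illustration of the pattern, not of how GPUs actually execute):

```python
from concurrent.futures import ThreadPoolExecutor

def relu(x: float) -> float:
    # A toy neuron activation, computed independently for each element --
    # exactly the kind of per-element work a GPU core handles.
    return max(0.0, x)

inputs = [-2.0, -1.0, 0.5, 3.0]

# Sequential: one big core walks the whole list.
sequential = [relu(x) for x in inputs]

# Parallel: many workers take one element each -- the GPU execution model,
# with a thread pool standing in for thousands of small cores.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(relu, inputs))

print(parallel)  # [0.0, 0.0, 0.5, 3.0] -- same result, element by element
```

Because the elements never depend on each other, adding more cores speeds things up almost linearly, which is why GPUs (and all those cores dissipating heat side by side) dominate AI workloads.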
Google has also been developing its own hardware to support artificial intelligence endeavors. As we build learning programs, we need to teach each of these programs how to function. Watson needed tutors to show him the difference between normal and abnormal medical scans. To speed up this kind of training and execution, Google created Tensor Processing Units (TPUs) to train and run these deep learning programs.
With each new hardware development for AI comes a thermal cost. We're asking for more operations per second, faster training periods for learning programs, and quicker results from each of these AI programs.
Wholesale Approach to Cool Artificial Intelligence
Since Watson can have a remote interface, it isn't much of a problem to have a dedicated room for all of his hardware. It also makes him easier to cool, since specialized thermal management solutions can be designed into the room that keeps Watson's brain. Google has taken this approach with their artificial intelligence program. Instead of selling the TPUs and boards, Google houses all of their AI hardware in their own facility and makes it accessible to users through the cloud. This is how they can have those sweet but tall zipper fin stack heat pipe heat sinks for cooling the TPU chips on a board. From the images they've shared of their data center, you can see how packed those racks are with TPU boards. Since all of this processing power is packed in together, Google can deploy more overarching thermal management solutions.
Cool Artificial Intelligence on the Go
On-board solutions like the Nvidia Jetson TX2, though, don't have the luxury of huge data centers designed to cool their processors. Self-contained units require self-contained thermal management solutions. As these on-board AI solutions become more power hungry and complex, cooling solutions will need to increase in capacity as well, while still being efficient. Fan-cooled heat sinks may be enough for the time being, but more advanced cooling will be needed.
These mobile solutions will need to be the best combination of small, low-power, effective, rugged, and reliable. This is especially true if we're going to put our lives in the hands of artificial intelligence in self-driving cars. Eventually we'll need to make the switch from air-cooled solutions to liquid-cooled solutions, since air doesn't have the same cooling capacity that liquid does.
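The gap between air and liquid can be put in rough numbers. The volumetric flow a coolant needs to carry away a given heat load is Q = P / (ρ · c_p · ΔT). Plugging in textbook room-temperature properties for air and water (approximate values, with a hypothetical 250 W module) shows why liquid eventually wins:

```python
# Volumetric coolant flow needed to remove a heat load:
#   Q = P / (rho * cp * dT)
# rho = density [kg/m^3], cp = specific heat [J/(kg*K)],
# dT = allowed coolant temperature rise [K]. Properties are approximate
# room-temperature textbook values; the 250 W load is hypothetical.

def flow_m3_per_s(power_w: float, rho_kg_m3: float, cp_j_kgk: float, dt_k: float) -> float:
    """Volumetric flow needed to carry away power_w with a dt_k rise."""
    return power_w / (rho_kg_m3 * cp_j_kgk * dt_k)

POWER_W = 250.0  # hypothetical on-board AI module
DT_K = 10.0      # allowed coolant temperature rise

air = flow_m3_per_s(POWER_W, 1.2, 1005.0, DT_K)      # ~0.021 m^3/s of air
water = flow_m3_per_s(POWER_W, 998.0, 4186.0, DT_K)  # ~6e-6 m^3/s of water

print(air / water)  # water needs on the order of 3500x less volume flow
```

That factor of a few thousand is just the heat-capacity term; liquid also transfers heat to surfaces far more effectively than air, which is why liquid cooling becomes attractive as on-board power climbs.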
In reality, each new application of artificial intelligence is going to need its own specialized cooling solution. Currently, each application for AI is so specialized that there isn't a one-size-fits-all solution for these AI chips. This is the most fun part, where AI finds all these nooks and crannies and applies itself as best as its developers can manage. We just need to make sure it's cool artificial intelligence, so it's as fast and efficient as we need it to be.
Need help with your HPC or AI cooling system? Contact us
and we can walk you through the right thermal management solution for your application!