Computer Cooling History | The History of Cooling off Your Data

Since the late 1940s, when computers were building-filling, heat-producing, electricity-consuming entities, companies have had a love-hate relationship with them. IBM was among the first to downsize the computer, from house size to room size and then smaller still. The mainframe of the 1950s and 1960s, still powered by vacuum tubes, continued to be a heat producer, often requiring refrigeration to hold down the temperature.

In operation, the temperature of a computer’s components will rise until the heat transferred to the surroundings is equal to the heat produced by the component. For reliable operation, the temperature must never exceed a specified maximum value unique to each component. As vacuum tubes gave way to integrated circuits and semiconductors, the heat equations changed.
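The balance described above can be sketched with a simple lumped thermal model using Newton's law of cooling, where heat leaves a component at a rate proportional to its temperature rise over ambient. All parameter values below are illustrative, not taken from any real component:

```python
# Minimal sketch of thermal equilibrium, assuming Newton's law of cooling:
# heat leaves the component at a rate h * area * (T - T_ambient).
# Equilibrium is reached when that outflow equals the power produced.

def equilibrium_temp(power_w, h, area_m2, ambient_c):
    """Steady-state temperature where heat out equals heat produced:
    power = h * area * (T_eq - ambient)  =>  T_eq = ambient + power / (h * area)."""
    return ambient_c + power_w / (h * area_m2)

# Hypothetical example: a 100 W part, convection coefficient 25 W/(m^2*K),
# 0.05 m^2 of effective surface area, 25 degrees C ambient air.
print(equilibrium_temp(100, 25, 0.05, 25))  # -> 105.0
```

The component is reliable only if this equilibrium temperature stays below its specified maximum; otherwise the designer must raise `h` (better cooling), increase surface area (heatsinks), or lower ambient temperature.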

Starting in 1965, IBM and other manufacturers of mainframe computers sponsored research into the physics of cooling densely packed integrated circuits. Many air- and liquid-cooling systems were investigated, using methods such as natural and forced air convection, direct air impingement, direct liquid immersion, and forced liquid convection.

Microsoft has been exploring the idea of an underwater data center. Project Natick reflects Microsoft’s ongoing search for cloud data centers that are less resource-intensive and offer rapid provisioning, lower costs, and high agility in meeting customer needs.
