Computer Cooling History | The History of Cooling Off Your Data

Since the late 1940s, when computers were building-filling, heat-producing, electricity-consuming entities, companies have had a love-hate relationship with them. IBM was among the first to downsize the computer, from building size to room size and smaller still. The mainframes of the 1950s and 1960s, still powered by vacuum tubes, remained prodigious heat producers, often requiring refrigeration to hold down the temperature.

In operation, the temperature of a computer’s components will rise until the heat transferred to the surroundings is equal to the heat produced by the component. For reliable operation, the temperature must never exceed a specified maximum value unique to each component. As vacuum tubes gave way to integrated circuits and semiconductors, the heat equations changed.
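The heat-balance principle described above can be sketched numerically with Newton's law of cooling: at steady state, the power produced equals the heat carried away, P = hA(T − T_ambient). The function below illustrates this; all numbers are illustrative assumptions, not specifications of any real component.

```python
def steady_state_temp(power_w, h_w_per_m2k, area_m2, ambient_c):
    """Temperature at which heat removed equals heat produced.

    Heat balance: P = h * A * (T - T_ambient)
    Solving for T: T = T_ambient + P / (h * A)
    """
    return ambient_c + power_w / (h_w_per_m2k * area_m2)

# Hypothetical example: a 65 W component, 0.01 m^2 of cooling surface,
# heat-transfer coefficient h = 50 W/(m^2*K), 25 C ambient air.
t = steady_state_temp(65, 50, 0.01, 25)
print(f"steady-state temperature: {t:.0f} C")  # prints 155 C
```

Doubling the surface area or the heat-transfer coefficient halves the temperature rise above ambient, which is why denser circuits pushed designers from passive air cooling toward forced convection and liquid cooling.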

Starting in 1965, IBM and other manufacturers of mainframe computers sponsored research into the physics of cooling densely packed integrated circuits. Many air- and liquid-cooling systems were investigated, using methods such as natural and forced convection, direct air impingement, and direct liquid immersion.

IBM developed three generations of the thermal conduction module (TCM), which used a water-cooled cold plate in direct contact with integrated circuit packages. Each package had a thermally conductive pin pressed onto it, and helium gas surrounded the chips and pins to improve heat transfer.
