In Episode 5 of the Splashcast podcast, Standard Fluids’ Steve Pignato and Kelvin Cabrera take listeners on a fascinating journey through the history of immersion cooling technology. What many assume is a recent innovation actually traces its roots back more than 50 years. Understanding this evolution helps explain why immersion cooling is positioned to become the dominant thermal management solution for AI-era data centers.
The IBM Origins: 1969
The story begins in 1969 at IBM, where engineer Robert Simons and his colleagues conducted groundbreaking experiments. They submerged a chip in 3M Fluorinert, creating one of the first documented uses of dielectric-fluid immersion cooling for computing hardware.
The early bipolar chips generated significant heat. When the fluid began to break down under extreme thermal stress, the engineers developed heat spreaders to distribute thermal load more evenly. This problem-solving approach established principles that still guide immersion cooling design today. IBM continued using Fluorinert for multi-chip module testing throughout the 1970s, validating the concept for high-performance computing applications.
The Cray Era: 1980s to 1990s
Cray Research brought immersion cooling into production supercomputing. The iconic Cray-2, in service from 1985 into the early 1990s, used single-phase Fluorinert for total immersion of the chips. These systems demonstrated that dielectric fluids could reliably cool the most demanding computational workloads of the era.
The technology worked brilliantly. However, a significant shift in chip architecture changed everything. As Steve Pignato explains in the Splashcast episode, the transition from bipolar design to CMOS (Complementary Metal Oxide Semiconductor) fundamentally altered the thermal landscape. CMOS chips generated considerably less heat than their bipolar predecessors.
With lower heat output, the complexity and cost of liquid cooling no longer made economic sense. Cray and other manufacturers returned to air cooling, and 3M's immersion cooling business essentially disappeared overnight. For nearly two decades, air cooling dominated the data center industry.
The Crypto Catalyst: 2010s
Immersion cooling remained dormant until around 2010, when increasing computational demands began pushing air cooling to its limits. Phil Tuma at 3M started exploring two-phase immersion cooling, recognizing that the phase-change mechanism offered superior heat transfer compared to single-phase convective cooling.
The cryptocurrency mining boom provided the catalyst for renewed commercial interest. Allied Control (now LiquidStack), founded by Kar-Wing Lau, pioneered the use of Novec 7100 fluid for two-phase immersion cooling of crypto mining systems. These deployments ran throughout the 2010s, proving the technology's reliability and efficiency at scale.
Cryptocurrency mining presented ideal conditions for immersion cooling adoption. High chip densities, continuous operation, and direct financial incentive to maximize hash rates created strong economic drivers. The lessons learned from crypto deployments now inform modern data center applications.
The AI Revolution: Present Day
Today's AI accelerators like NVIDIA H100 and upcoming Blackwell GPUs generate heat levels that make the original problem from 1969 look modest. Individual GPUs can produce 700–1,000 watts. A single rack can exceed 100 kilowatts. Air cooling cannot physically remove heat at these densities without consuming enormous energy and creating unacceptable noise levels.
The technology that IBM pioneered and Cray validated has returned because the physics demands it. Modern chips are essentially back to the heat density challenges of bipolar designs, but with far more sophisticated performance requirements. Two-phase immersion cooling using engineered fluids like Standard Fluids™ SF 649™ Engineered Fluid leverages the latent heat of vaporization to remove heat more efficiently than any other approach.
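The latent-heat advantage can be put in rough numbers. Below is a back-of-the-envelope sketch, assuming an illustrative latent heat of vaporization of ~88 kJ/kg for the dielectric fluid (an order of magnitude typical of fluoroketone immersion fluids; the exact SF 649 value is not given here) and a 10 K air-side temperature rise:

```python
# Back-of-the-envelope comparison: mass of coolant needed to move 100 kW
# of rack heat. All values are illustrative assumptions, not vendor specs:
#   - air: specific heat ~1.005 kJ/(kg*K), allowed temperature rise ~10 K
#   - dielectric fluid: latent heat of vaporization ~88 kJ/kg (assumed)

RACK_HEAT_KW = 100.0   # heat load to remove, kW (i.e., kJ/s)

# Sensible cooling with air: Q = m_dot * c_p * dT
AIR_CP = 1.005         # kJ/(kg*K)
AIR_DT = 10.0          # K temperature rise across the rack
air_kg_per_s = RACK_HEAT_KW / (AIR_CP * AIR_DT)

# Two-phase cooling: Q = m_dot * h_fg, where h_fg is the latent heat
H_FG = 88.0            # kJ/kg, assumed latent heat of vaporization
fluid_kg_per_s = RACK_HEAT_KW / H_FG

print(f"air required:  {air_kg_per_s:.2f} kg/s")
print(f"fluid boiled:  {fluid_kg_per_s:.2f} kg/s")
```

Mass for mass, the boiling fluid carries roughly an order of magnitude more heat than air over a modest temperature rise; and because air is also over a thousand times less dense than the liquid, the gap in volumetric flow is far larger still.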
Why History Matters
Understanding immersion cooling’s evolution reveals an important truth. This technology succeeds when chip heat density exceeds what air cooling can handle economically. We’re at that inflection point again. The difference now is that global data center growth, AI workloads, and environmental pressures all converge to demand better thermal solutions.
As Kelvin Cabrera notes in the Splashcast episode, immersion cooling is set for universal adoption. The decades of engineering refinement mean the technology is mature, proven, and ready for widespread deployment. Standard Fluids provides the high-purity engineered fluids that make this next evolution possible.
The journey from Robert Simons’ 1969 IBM lab to today’s AI data centers demonstrates that great engineering solutions don’t disappear. They wait for the right application and the right moment. For immersion cooling, that moment is now.