Melting Ice: A History Of Air Conditioning

The rising temperatures around the world have been sending us running indoors. Getting out of the heat has not always been as simple as pressing a button. As we try to keep cool this summer, keep in mind that many generations of people before us went to great lengths to do so. A long chain of developments over the course of a century led to the modern air conditioner. How did air conditioning change the way we live? What sort of impact did this change have on the environment?

We had a hard time keeping cool before the advent of air conditioning. Without it, the heat of the summer controlled where we went and what we did. In cities, people kept themselves under wet blankets or slept on fire escapes when it was too hot inside. In the suburbs, children ran under yard sprinklers while parents sweated on the porch. In rural areas, people got by in the shade.

People have long beaten the heat indoors with hand-held fans, and since 1882 with electric ones. Those without fans simply depended on the occasional cool breeze. Ventilation was a central consideration in building design: high ceilings and carefully placed windows kept a natural breeze flowing through. Back then, everything was about keeping cool.

Thanks to a couple of inventors, all of that changed.  

In 1851, Dr. John Gorrie patented his ice-making machine, the first device ever designed to lower the temperature indoors. Gorrie's invention used compression to make ice, then blew air over it to cool the room. While it was not a true "air conditioner," because it did not control humidity, the machine worked.

Navy engineers even recreated the device to keep President James Garfield comfortable after he was shot by an assassin in 1881. However, the device consumed enormous quantities of ice: in just two months, the machine went through more than five hundred thousand pounds of it. Investors were not interested, and the idea did not catch on.

Willis Carrier

Real strides in the technology did not come until the 20th century. In 1902, using the same cooling principle, engineer Willis Carrier invented the first true air conditioner. It blew air over cold coils to control both room temperature and humidity. Carrier built it to maintain favorable conditions at a printing and publishing plant, but the range of potential applications was immediately obvious to him. Carrier's "Apparatus for Treating Air" was patented in 1906.

Chlorofluorocarbons (CFCs), invented in 1928, became the standard refrigerant in most air conditioners. CFCs deplete the ozone layer, and they are also potent greenhouse gases that contribute to climate change. They would continue to be used for decades without a second thought.

By the 1930s, air conditioners were commercially available, but they were very expensive, and most consumers had no regular access to them. Theater owners knew that cool air was a major draw and paid princely sums (up to $50,000 at the time, or roughly $600,000 in modern currency) to have the necessary equipment installed. As air conditioning arrived at movie theaters across the country, Americans responded to the summer heat by escaping to the movies.

These devices demanded far more energy than the existing infrastructure was prepared to supply. By 1942, the U.S. had constructed a power plant specifically to handle the additional summer load from air conditioning.

Carrier's invention became more and more popular. Through the 1950s, even stingy employers gradually came around to the idea that comfortable employees are more productive, and by the end of the decade the majority of businesses had air conditioning installed. Over a million units were being sold every year.

By 1980, Americans were consuming more air conditioning than the rest of the world combined. In 1985, the discovery of an ozone hole over Antarctica proved that we were depleting the ozone layer. Two years later, the Montreal Protocol was adopted, a framework for international cooperation to control the use of CFCs. In 1997, the Kyoto Protocol extended that cooperation to greenhouse gas emissions.

Over the past 30 years, the air conditioner has been redesigned for better energy efficiency, but the device itself has remained mostly unchanged. The Montreal and Kyoto Protocols remain in effect today, and nearly all CFCs had been phased out as of 2015.

It is clear that air conditioning became an ever larger part of American life over the course of the century. It has also become one of the heaviest household energy burdens. A common misconception is that every air conditioner must use a lot of energy; in a well-insulated home, a system that is not overused can carry a manageable load. But an uninformed public and poorly insulated housing have led to a massive spike in energy use.
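The gap between careful and careless use is easy to see with simple arithmetic. The sketch below is purely illustrative: the wattages, run times, and season length are assumptions chosen for the example, not figures from the article.

```python
# Back-of-envelope estimate of seasonal air-conditioner energy use.
# All inputs are illustrative assumptions, not measured data.

def seasonal_energy_kwh(power_kw: float, hours_per_day: float, days: int) -> float:
    """Energy consumed over a cooling season, in kilowatt-hours."""
    return power_kw * hours_per_day * days

# A modest window unit (~1 kW) run 4 hours a day in a well-insulated home,
# versus a large central system (~3.5 kW) run 12 hours a day in a leaky house,
# over an assumed 90-day cooling season.
moderate = seasonal_energy_kwh(power_kw=1.0, hours_per_day=4, days=90)
heavy = seasonal_energy_kwh(power_kw=3.5, hours_per_day=12, days=90)

print(f"moderate use: {moderate:.0f} kWh per season")   # 360 kWh
print(f"heavy use:    {heavy:.0f} kWh per season")      # 3780 kWh
print(f"heavy use draws {heavy / moderate:.1f}x the energy")
```

Under these assumed numbers, the heavy-use household draws more than ten times the energy of the moderate one, which is the point the paragraph above is making: usage habits and insulation, not the appliance itself, drive the burden.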

Environmentalists are concerned about the effect this energy use is having on global warming. They promote smarter, more sparing use of air conditioning and energy-efficient technology. Future developments in air conditioning will focus on lowering the energy use of devices and integrating smart-home technology.

While the environment will never be the same after a century of air conditioning, it is exciting to see how the technology might evolve to prevent further damage.