January 4 – Happy Newtonmas!

Today’s factismal: Isaac Newton was born on January 4, 1643.

About a week ago, you may have seen texts flying about from various folks, telling the tale of a baby who was born on Christmas and would go on to change the world – a baby by the name of Isaac Newton. The only problem is that the texts are wrong (sort of). You see, by our calendar, Isaac Newton was born on January 4, not December 25. And therein lies a tale.

Isaac Newton, the man who changed science even if he couldn’t get his birthday right
(Image courtesy Barrington Bramley)

You see, calendars are tricky things. Until the time of Julius Caesar (yes, that Julius Caesar), the Roman calendar was a mess. The original Romulan (no, not those Romulans!) calendar had just ten months and covered just 304 days of the year – the period between December and March was simply considered to be one long, cold winter of despair. Obviously, this wasn’t a very good way of keeping track of time. So Numa, the second king of Rome, changed it.

Numa, the Roman who created the first “good” calendar
(Image courtesy User Hedning on sv.wikipedia)

Numa added two more months between December and March, and brought the length of the year up to 355 days. But, as every schoolkid knows today, the year is actually closer to 365.25 days long. As a result, the calendar slowly slipped ahead of the actual year, with the embarrassing result of the equinox being declared several weeks before it actually happened! In order to fix this, the chief priest (called the Pontifex Maximus or “Chief Bridge Builder”) would slip in after February an intercalendary month known as a Mensis Intercalaris every so often. Because of the extreme difference between the length of the calendar and the length of the actual year, this had to be done roughly every other year.
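The arithmetic behind Numa’s problem is easy to check; here is a quick sketch (using the 365.25-day year as a stand-in for the true year length):

```python
# Numa's calendar vs. the solar year: how fast does a 355-day
# calendar drift, and how much intercalation does it take to fix?
numa_year = 355       # days in Numa's calendar
solar_year = 365.25   # approximate length of the solar year, in days

drift_per_year = solar_year - numa_year  # days of drift per calendar year
print(drift_per_year)                    # 10.25 days per year

# A Mensis Intercalaris of 22 or 23 days every other year roughly
# cancels two years' worth of drift:
two_year_drift = 2 * drift_per_year
print(two_year_drift)                    # 20.5 days, close to one short month
```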

But then Roman politics came into play. You see, the Pontifex Maximus was usually also the person in charge of the government, or one of his friends. As a result, the Pontifex Maximus could add in a Mensis Intercalaris or two when his party was in power (thus making the year, and their term in power, longer) or withhold them when the other guys were in charge (thus making the year shorter). The worst offender was, you guessed it, Julius Caesar, who made the year of his third consulship 445 days long!

Julius Caesar conquered the Gauls and the calendar
(Image courtesy H. F. Helmolt )

This move offended just about everyone. In order to push it through, Caesar had to promise to reform the calendar so that nobody else could play that sort of trick again. (For a more modern example of this sort of political shenanigans, consider FDR’s four presidential terms.) He did it by making the year 365 days long and adding in an intercalendary day at the end of February every four years, starting with the year 45 BCE.

Now this would have been the end of the story, other than the various maneuverings over the names of months, except for one important fact: the year is not 365.25 days long. Instead, it is 365 days, 5 hours, 49 minutes, and 12 seconds long, a difference of 10 minutes and 48 seconds. Though the difference might not seem like much, it added up over the course of a few centuries. By 1582 CE, the calendar was about ten days behind the Sun, throwing everything out of whack.
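You can check the drift yourself; a back-of-the-envelope sketch (the exact day count depends on which year you start counting from):

```python
# How far does the Julian calendar drift by 1582 CE?
julian_year = 365.25                              # days, the Julian average
actual_year = 365 + (5*3600 + 49*60 + 12)/86400   # 365 d 5 h 49 m 12 s, in days

error_per_year = julian_year - actual_year        # days gained per year
print(error_per_year * 24 * 60)                   # ~10.8 minutes per year

# Accumulated drift between Caesar's reform and Gregory's fix.
# (Gregory actually skipped only ten days, because his correction
# realigned the calendar with 325 CE, not with 45 BCE.)
years = 1582 - (-45)                              # no year zero, so approximate
print(error_per_year * years)                     # roughly 12 days
```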

A comparison of the three calendars. Pope Gregory’s calendar comes closest to matching our modern year.

So it was up to the new Pontifex Maximus, Pope Gregory XIII, to fix the mess. He did it by jumping the calendar forward ten days and by changing the number of leap years. Under Gregory, every fourth year would be a leap year unless it fell on a century year (e.g., 1700, 1800); only century years divisible by 400 (e.g., 1600, 2000) would be leap years. That neatly fixed the lagging calendar and patched the problem so well that the calendar won’t drift by a full day for another few thousand years.
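Gregory’s rule is simple enough to write down directly; a minimal sketch:

```python
def is_leap(year: int) -> bool:
    """Gregorian leap-year rule: every fourth year is a leap year,
    except century years, except-except centuries divisible by 400."""
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False
    return year % 4 == 0

# Of these century years, only one is a leap year under Gregory's rule.
print([y for y in (1700, 1800, 1900, 2000) if is_leap(y)])  # [2000]
```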

But Europe in 1582 CE wasn’t the same as Rome in 46 BCE, and though the Pope might call himself the Pontifex Maximus, he didn’t have complete control over the world’s calendars. As a result, many countries didn’t adopt the new calendar until much later. Italy, of course, adopted it immediately. France took up the new calendar less than a year later. But it wasn’t until 1752 that England finally adopted the new calendar. And that is why, though Isaac Newton was born on Christmas day in England, he was really born on January 4 by our calendar. So Happy Newtonmas!

And if you’d like to celebrate, why not do so by lending some of your computer time to LHC@Home? They’ll use your spare computer time to help solve mysteries such as “What is Dark Matter?” and “What would happen if π were exactly 3?” To learn more, page over to:
http://lhcathome.web.cern.ch/

December 16 – Carb-gone 14

Today’s factismal: The oldest age that carbon-14 can give is 114,600 years; the youngest is 570 years.

If you have ever cleaned up a teenager’s room, then you have probably discovered one of the most fundamental tools of archeology: relative dating. (You’ve also discovered what it feels like to excavate a midden pile.) As you dug down through the layers of trash, petrified food, and old homework assignments, you probably noticed that the older stuff was on the bottom and the newer stuff was on the top. But what might surprise you more than the Wall Street Journal hidden under your child’s bed is the fact that until 1949, relative dating was about the only method that archeologists had for dating really old artifacts.

A household in Pompeii; we know how old this is thanks to records the Romans kept
(My camera)

That’s because very few artifacts come with a date on them. And even those that do come with an identifying mark, such as coins, only give you an approximate range of dates; for example, a coin with Julius Caesar’s face on it was probably minted sometime between 50 BCE (when he conquered Gaul) and 44 BCE (when his political opponents put a permanent end to his ambitions). And the older something is, the less likely it is to have any sort of identifying mark; an empty, twelve-hundred-year-old clam shell from the Spiro Mound looks an awful lot like an empty, fifteen-thousand-year-old clam shell from Siberia.

This coin provides an approximate date
(Image courtesy Australian Centre for Ancient Numismatic Studies)

Archeologists have come up with several methods for working around this problem (e.g., by counting tree rings), but those methods fail more often than not (after all, how many tree rings are there in a clay pot?). They needed something more. They needed a method that would work on almost everything and that could be easily verified. And, in 1949, a chemist by the name of Willard Libby gave it to them. He realized that by comparing the amount of carbon-14 in an object to the amount of carbon-12, he could tell how old the object was. But, because carbon-14 has a half-life of 5,730 years, it can only be used to measure things that are between 0.1 and 20 half-lives old; that is, things that are no younger than about 570 years old and no older than 114,600 years old.
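Libby’s insight boils down to a one-line formula: if you know what fraction of the original carbon-14 remains, the age is just the half-life times the number of halvings. A minimal sketch:

```python
import math

HALF_LIFE = 5730  # years, the half-life of carbon-14

def c14_age(fraction_remaining: float) -> float:
    """Age in years from the fraction of the original C-14 still present."""
    return HALF_LIFE * math.log2(1 / fraction_remaining)

print(c14_age(0.5))     # one half-life:  5730 years
print(c14_age(0.25))    # two half-lives: 11460 years

# The usable range quoted in the article, 0.1 to 20 half-lives:
print(0.1 * HALF_LIFE)  # 573 years (youngest, rounded to 570 in the text)
print(20 * HALF_LIFE)   # 114600 years (oldest)
```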

But how does carbon dating work? I’ll give you an experiment that you can do at home to understand this basic concept. (Teachers: This works really well in a large class if you add up everyone’s numbers.) To do the experiment, you’ll need 84 pennies, 16 nickels, and 16 dimes. We’ll pretend that the pennies are atoms of carbon-12; because carbon-12 is stable, it doesn’t decay. A carbon-12 atom today will still be a carbon-12 atom 100,000 years from now. And we’ll pretend that the dimes are carbon-14 atoms. Carbon-14 is unstable; in 5,730 years, half of the carbon-14 that is present today will decay into nitrogen-14. (We never run out of carbon-14 because it is always being created by cosmic rays hitting nitrogen-14 in the atmosphere and turning it into carbon-14.)

When a critter dies, the ratio of carbon-14 to carbon-12 is fixed

Now a living thing will take in carbon-12 and carbon-14, so the proportion of the two atoms will be roughly the same as is in the atmosphere. But once it dies, it stops adding new carbon. As a result, when the carbon-14 decays it changes the ratio of the carbon atoms. To see that, we need to do our experiment. Start by placing the pennies in a pile and lining up the dimes, all heads up. (If you want, you can draw a dead critter around the money.) This is what the ratio of carbon-14 to carbon-12 looked like right after the critter died. There were 84 pennies/carbon-12 atoms and 16 dimes/carbon-14 atoms. (This was a very small critter.)

After one half-life about half of the carbon-14 has turned into nitrogen-14

Since we don’t want to wait 5,730 years for the atoms to decay naturally, we’ll flip the dimes, one by one. If the dime comes up heads, put it back in the critter because it didn’t decay. But if it comes up tails, the carbon-14 atom decayed and turned into nitrogen-14 (aka, a nickel). You’ll probably have about eight of the dimes decay, so your new ratio will be 84 pennies/carbon-12 atoms to 8 dimes/carbon-14 atoms. Now flip the remaining dimes again, once more replacing those that come up tails with nickels. Odds are that you’ll lose about four dimes this time and your ratio will be 84 pennies/carbon-12 atoms to 4 dimes/carbon-14 atoms. Do it again and you’ll get something close to 84 pennies/carbon-12 atoms to 2 dimes/carbon-14 atoms.
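If you’d rather not count out 116 coins, here is the same experiment as a sketch in code (your flips, like real decays, will come out a little different each time; the seed just makes this run repeatable):

```python
import random

def flip_half_life(dimes: int) -> int:
    """Flip each remaining 'carbon-14' dime; heads survives, tails decays."""
    return sum(1 for _ in range(dimes) if random.random() < 0.5)

random.seed(42)  # fixed seed so the run is repeatable
pennies = 84     # carbon-12: stable, never changes
dimes = 16       # carbon-14: roughly halves each half-life

for half_life in range(1, 4):
    dimes = flip_half_life(dimes)
    print(f"after half-life {half_life}: {pennies} pennies, {dimes} dimes")
```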

After two half-lives half of the remaining carbon-14 has turned into nitrogen-14

As you can see from the experiment, the ratio can tell you when a critter, such as a possum or a palm tree, died. And if that critter was then used to make something else, such as a shoe or a house, then we know about when the something else was made. So to find out how old something is, all you have to do is measure the ratio of the carbon-14 to the carbon-12 in it. Pretty nifty, huh?

After three half-lives half of the remaining carbon-14 has again turned into nitrogen-14

But I’m willing to bet that your ratios didn’t exactly match mine. That’s because we used only a very few atoms; in most living things, there are quadrillions of carbon atoms instead of just 100. But there are still some variances in the ratios because radioactive decay happens randomly. As a result, most carbon-14 ages have an error of about 3-5% (i.e., a 570-year-old sample is probably somewhere between 540 and 600 years old).
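That scatter is easy to see by repeating the coin experiment many times; a sketch (with only 16 “atoms” the spread is large, which is exactly the point):

```python
import random

random.seed(0)

def survivors_after(dimes: int, half_lives: int) -> int:
    """How many of the starting dimes survive the given number of half-lives."""
    for _ in range(half_lives):
        dimes = sum(1 for _ in range(dimes) if random.random() < 0.5)
    return dimes

# Repeat the three-half-life experiment 1,000 times and tally the outcomes.
counts = [survivors_after(16, 3) for _ in range(1000)]
mean = sum(counts) / len(counts)
print(min(counts), max(counts), round(mean, 2))  # the expected mean is 16/8 = 2
```

Real labs beat this problem with sample sizes in the quadrillions, which is why their errors shrink to a few percent.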

So that’s our experiment on carbon dating. And now that you are a fully-qualified archeologist on par with Indiana Jones, why not start doing some real archeology by becoming a Digital Volunteer at the Smithsonian? You’ll look at old documents, type what you see, and help preserve historical records dating back hundreds of years! To learn more, flip over to:
https://transcription.si.edu/

December 2 – Pile On!

Today’s factismal: The world’s first artificial self-sustaining nuclear chain reaction took place in Chicago on December 2, 1942.

Did you ever wonder why a nuclear reactor is sometimes called a “pile”? The answer to that question was built in Chicago some 74 years ago. During World War II, we knew a lot of things. We knew (thanks to Becquerel) that atoms could split into smaller parts in what would come to be known as an atomic reaction. And we knew (thanks to Einstein) that those reactions could release a lot of energy. And we knew (thanks to our spies) that the Germans were working on ways of turning that energy into an explosive. And we knew (thanks to the way the war was going in 1942 {not well}) that if they developed it, they’d use it. So we decided to develop it first. And so was born the Manhattan Project, the most famous secret project ever.

The world’s first artificial nuclear reactor. The uranium went into the holes in the graphite bricks.
(Image courtesy DoE).

The first stage of the Manhattan Project was discovering if we could control the reaction; if we couldn’t, then there wasn’t any way to build a weapon. So Enrico Fermi (who discovered how to make small atoms out of big ones by bombarding them with neutrons) and Leó Szilárd (who discovered how to make other atoms do the bombardment in a chain reaction) constructed the world’s first artificial nuclear reactor in a squash court under the stands at the University of Chicago. (There was a natural nuclear reactor in Africa some two billion years ago, but they didn’t know about it then.) To make it, they stacked 771,000 pounds of graphite bricks into a rough cube that was 20 feet high and 25 feet wide; as they built, they filled holes in the bricks with 92,990 pounds of uranium pellets and added control rods of indium, cadmium, and silver. The graphite absorbed some of the energy of the neutrons released by the uranium as it split; that made it more likely that the neutrons would be absorbed by other uranium atoms, causing them to split in turn. And the control rods could absorb the neutrons completely, stopping the reaction. By sliding the control rods in and out of what Fermi described as “a crude pile of black bricks and wooden timbers”, they would be able to control the reaction (in theory, at least).

And on December 2, they tested that theory. In front of a group of other physicists who were also working on the Project, they slid the rods out and started the world’s first artificial self-sustaining nuclear chain reaction. Twenty-eight minutes later, they slid the rods back in and stopped the reaction. The test was a success and that meant that the Manhattan Project could go on and we could use it to win the war.

Today, we are still splitting atoms, this time for peace. And we are still trying to learn what happens next. If you’d like to help the physicists at CERN discover what happens when you make tiny ones out of little ones, then why not contribute the idle time on your computer with LHC@home?
http://lhcathome.web.cern.ch/

October 5 – Sweet Nothings

Today’s factismal: Between 1998 and 2001, Takaaki Kajita and Arthur McDonald discovered that the Sun wasn’t going to explode.

If you look at the Sun (which you shouldn’t do because it can cause serious damage to your eyes), then odds are you’ll see it as a bright, burning spot in the ten seconds or so that you have before you do serious damage to your eyes (told you so). But when an astronomer looks at the Sun through a telescope with a strong filter that makes it safe to do so, she sees something different. The astronomer sees both the source of all our power and an amazing set of atomic reactions known as the solar phoenix. In this reaction, six protons combine to form a helium nucleus, two spare protons, two gamma rays, and two anti-electrons (aka positrons). But there is something else created in that reaction; something that is so small and slippery that it is almost impossible to catch: the neutrino.
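You can estimate how much energy the solar phoenix releases from the masses alone; a sketch using standard particle masses (in net, four of the six protons end up in the helium nucleus, and the positrons later annihilate, adding a little more energy):

```python
# Mass bookkeeping for the solar phoenix (proton-proton chain):
# in net, four protons become one helium-4 nucleus plus two
# positrons (and two neutrinos, whose mass is negligible here).
PROTON = 1.007276    # masses in atomic mass units (u)
HELIUM4 = 4.001506   # bare helium-4 nucleus
POSITRON = 0.000549
U_TO_MEV = 931.494   # energy equivalent of 1 u, in MeV

mass_in = 4 * PROTON
mass_out = HELIUM4 + 2 * POSITRON
energy_mev = (mass_in - mass_out) * U_TO_MEV
print(round(energy_mev, 1))  # ~24.7 MeV: about 0.7% of the mass becomes energy
```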

The solar phoenix reaction. The little νs are neutrinos being given off in the first stage.

The neutrino is special because without it, the solar phoenix reaction simply can’t happen. Even though it is so small that it would take a million of them to have the same mass as a single electron, the neutrino is essential to the solar phoenix and many other nuclear reactions. It is created in nuclear reactors as a byproduct of fission; roughly 4.5% of the energy in a nuclear reactor is lost as neutrinos!

And the neutrino can also be created by particle accelerators. When they smash two tiny protons or electrons together, they make even smaller bits, one of which is the neutrino. Neutrinos are made in so many ways that they are the second most common particle in the Universe (after the photon), and may be responsible for the “missing mass” known popularly as Dark Matter.

Neutrinos are very, very, very, very, very, very, very, very, very, very, very, very, very, very small

The neutrino is special in another way, too. It is the only particle that lies behind five Nobel Prizes. The first went to Enrico Fermi in 1938; his 1934 theory of beta decay built on Wolfgang Pauli’s suggestion that a “missing” amount of energy in beta decay was carried off by an unseen particle (work that also led to the discovery of the weak force); amusingly, Fermi’s paper was rejected by the leading scientific journal of the day as being “too remote from reality”. The second was given in 1995 to Frederick Reines, who with Clyde Cowan, F. B. Harrison, H. W. Kruse, and A. D. McGuire had detected the neutrino in 1956 (take THAT, leading scientific journal). The third went to Leon M. Lederman, Melvin Schwartz, and Jack Steinberger in 1988 for their 1962 discovery that there is more than one type of neutrino; physicists refer to the three types as flavors because whimsy. The fourth Nobel Prize for neutrino-related work was given in 2002 to Raymond Davis, Jr. and Masatoshi Koshiba for their detection of neutrinos from the Sun and from a supernova; today, the field they founded is known as neutrino astronomy. And the fifth prize (thus far) was awarded in 2015 to Takaaki Kajita and Arthur McDonald, who proved that neutrinos change flavors as they move.

The Sun generates energy by making big atoms out of little ones (Image courtesy NASA)

That is important because until their discovery, there was serious concern that the Sun might be going out; about half of the astrophysicists thought it would be with a whimper and the other half thought it would be with a bang. That was because we weren’t detecting the right number of neutrinos from the Sun. Even though the neutrino is so small and interacts so weakly with other matter that it is almost impossible to catch, the Sun puts out so many neutrinos (roughly 1.3 x 10^18 each second, or 185 million for every person on Earth) that we can still see some of them. Only we weren’t seeing enough of them. Though we knew that neutrinos had different flavors, the Standard Model in physics said that the neutrinos should stay the same flavor; discovering that they changed flavors would mean that the Standard Model was wrong.

The Sudbury Neutrino Observation detector being installed
(Image courtesy CoolCosmos)

And that is just what happened: using neutrinos created by cosmic rays hitting the atmosphere (measured by Takaaki Kajita’s team in 1998) and neutrinos from the Sun’s core (measured by Arthur McDonald’s team in 2001), the two teams discovered that neutrinos do indeed change flavor. The Standard Model was wrong (and the Sun was saved). Thanks to their work, we are learning more about how these small but vital particles help the Universe go round. And last year, they were awarded the Nobel Prize for their work.

If you’d like to learn more about particle physics and maybe do a little prize-worthy work of your own, why not head over to LHC@Home? This website, offered by the same folks who invented the World Wide Web, has several different ways to get involved in the search for new and even more interesting particles. To learn more, zip on over to:
http://lhcathome.web.cern.ch/

September 2 – An Ill Wind

Today’s factismal: The lightning in a Category 1 hurricane has enough power to run a house for more than 300 years.

If you read the news today, you know that Hurricane Hermine has come ashore in Florida. This ended a long dry spell (though Sandy was a hurricane in 2012, it had been downgraded to a tropical storm before it came ashore); it was the first time in eleven years that Florida was hit by a hurricane. Of course, you don’t have to get a hurricane to get lots of storm damage; just ask the folks who sat through Sandy or Allison. Although it is too early for firm estimates, experts think that the damage from this storm will end up costing the US at least $5 billion.

A satellite image of Sandy showing the temperature differences in the clouds
(Image courtesy NASA)

So what causes all of that damage? The short answer is “energy”. Hurricanes are nature’s way of taking heat from the equator (where it is hot) and moving it to the poles (where it is cold). They do that by using the heat to evaporate water, which forms clouds, which forms storms. Because that heat also causes the air to expand, it drives winds which can drive water in the form of storm surge. Add it all together and you’ve got a lot of energy moving around, looking for something to break – like Florida.

Hurricane Hermine making landfall in Florida
(Image courtesy NOAA)

But how much of the storm’s energy is released by the different parts of a hurricane’s life cycle? Scientists have run the numbers and found that a hurricane typically releases about 0.002% of its energy as lightning. Now that may sound like small potatoes, but for a Category 1 hurricane, it works out to be enough energy to run a typical household for 360 years or so. (The trick is catching the lightning.) Storm surge is what does most of the damage along the coast, and yet it is just 0.02% of the total energy of the hurricane. The winds in a hurricane are what create that lightning and tornadoes and other exciting side-effects. They are understandably much more powerful; they represent about 4% of the total energy in a hurricane. Interestingly, the sheer weight of the water falling from the sky as rain and hail releases about as much energy as the wind does. Thus far we’ve accounted for only about 8% of the energy in a hurricane with the lightning and the storm surge and the winds and the rain. Where is the rest?
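Adding up the numbers quoted above makes the puzzle concrete; a sketch using the article’s own percentages (the figures are rough estimates):

```python
# Energy budget of a hurricane, per the percentages in the text.
budget = {
    "lightning": 0.002,    # percent of total energy
    "storm surge": 0.02,
    "wind": 4.0,
    "rain and hail": 4.0,  # "about as much energy as the wind does"
}

accounted = sum(budget.values())
print(accounted)        # ~8% accounted for
print(100 - accounted)  # ~92% left over: the latent heat of vaporization
```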

Some of the effects of a hurricane
(Image courtesy NOAA)

It is released high in the sky as water vapor condenses into rain drops and is known among meteorology wonks as the latent heat of vaporization (which is just a fancy way of saying “the heat stored {latent} in vapor”). As the water vapor is carried higher into the atmosphere by the rising air currents, conditions change so that water vapor is no longer stable and water is; this is what forms clouds (which are just raindrops that are too small to fall). When the water condenses, it gives back some of the energy that was used to turn it into a gas; the rest of the energy has gone into raising the vapor high into the sky and powering all of the other special effects.

But here’s the odd thing. Even though we can use satellites to track hurricanes and help people get out of their way, we still don’t know how reliable our satellite images of the clouds that make up hurricanes are. And that’s where you come in. NASA has a citizen science program called S’COOL that asks for people like you and me to tell them what clouds are out there when the satellites pass by. To participate, float on over to:
http://scool.larc.nasa.gov/rover.html

May 25 – Let’s Work

Today’s factismal: Marie Curie was awarded her PhD 113 years ago today; she would win her first Nobel prize within the year.

To many scientists, the happiest day of their lives is the day when they finally receive their PhD. Short for “Doctor of Philosophy”, the degree shows that the person has made at least one original contribution to their field. In the case of Marie Curie, that is a bit of an understatement; Marie Curie had been making contributions to the fields of physics and chemistry for nearly a decade before getting her PhD in 1903. And, unlike the typical graduate student, Marie Curie’s contributions were important enough that, along with her two collaborators, she was awarded the Nobel Prize in Physics in 1903.

Marie Curie, discovering yet another element

As is the case with most PhD students, Marie Curie’s PhD actually began with something that her advisor thought was odd but didn’t have enough time to investigate himself. Marie’s advisor was Henri Becquerel, who was investigating X-rays. At the time, all that anyone knew was that uranium could produce an image on a photographic plate even when the plate was covered in paper to prevent all light from getting through. Most scientists thought it was a form of phosphorescence; that is, that the uranium somehow transformed visible light into a different type of light. But Becquerel had discovered that uranium didn’t need visible light to produce the “X-rays” that made the image. So he tossed the problem to Marie Curie and let her work on it.

The photograph Becquerel “took” using uranium

The first thing that she did was invent a name for the process by which uranium produced X-rays; she called it radioactivity for “radiation activity”. Using sensitive instruments, she was able to show that the amount of radioactivity depended only on the amount of uranium. Based on that, she hypothesized that the uranium atoms themselves must be breaking down. At the time, that was an incredible idea as atoms had been thought of as indivisible for more than 2,000 years. Nevertheless, Marie Curie was soon shown to be right and was awarded the 1903 Nobel Prize in Physics, along with Henri Becquerel and Pierre Curie.

Marie at work with her husband. That lab coat she is wearing was her wedding gown.

She then began trying to isolate the part of the uranium ore that created the radioactivity. Working with her husband and her advisor, Marie Curie discovered radium (named for the radioactivity it generated) and polonium (named for Marie Curie’s beloved Poland). Indeed, it was the radioactivity of polonium and radium that allowed Marie Curie to isolate them from the uranium ore. The new elements were so radioactive that her laboratory notebooks are still too dangerous to handle even now. Because she had been the driving force behind the discoveries, the 1911 Nobel Prize in Chemistry was given to Marie Curie alone.

A page from Marie Curie’s notebooks

Unfortunately, Marie, Pierre, and Henri were awarded another prize as well – radiation poisoning. Because they didn’t understand the dangers associated with radioactivity, all three of them ended up with severe radiation burns; it is likely that their exposure to radioactive elements contributed to the deaths of Becquerel and Marie.

Their discovery of radioactivity led to a new theory of the universe, one which is being tested right now in “atom smashers” across the globe. If you would like to help physicists with their research by donating unused time on your home computer, then why not join LHC@home?
http://lhcathome.web.cern.ch/LHCathome/

March 24 – Nozone Layer

Today’s Factismal: The Vienna Convention for the Protection of the Ozone Layer was adopted in 1985; the ozone hole is expected to recover around 2115.

It is hard to love a molecule whose very name means “it stinks”. But ozone is a vital part of life on Earth. Though it is harmful when found in the lower atmosphere, due to the way it makes things oxidize even faster than normal, it is essential in the stratosphere where it gathers into a region known cleverly enough as the ozone layer. This part of the stratosphere has between two and eight parts per million of ozone. Though that may not sound like much, it is enough to reduce the UV that reaches the ground by a factor of 350 million. If the UV at the top of the atmosphere were represented by the US population, then only one person would make it past the ozone layer to reach the ground.
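The analogy in the paragraph above is just a division; a quick sketch (the population figure is an approximation):

```python
reduction_factor = 350_000_000  # UV attenuation by the ozone layer, per the text
us_population = 320_000_000     # approximate US population

# If each UV photon at the top of the atmosphere were one person,
# how many "people" reach the ground?
survivors = us_population / reduction_factor
print(round(survivors, 2))      # about 0.91: roughly one person gets through
```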

Ozone concentration in the atmosphere; the red line shows the “hole” near the South Pole (Image courtesy NOAA)

But in the 1970s and 1980s, scientists discovered that there was a problem with the ozone layer. For some reason, the amount of ozone in the layer was dropping; it had reached the point where a large hole had developed in the ozone layer over the South Pole. (It happened there first because of the way that the Earth’s atmosphere circulates.) If the trend continued, then the depletion of the ozone layer was expected to reach disastrous proportions within the century. But what was causing the depletion?

It turned out that the cause was good intentions. In the 1920s and 1930s, most refrigerators used an ammonia gas cycle to cool food. Unfortunately, ammonia is poisonous even in small quantities. As a result, though the refrigerators improved life, they also made it a little more hazardous. Scientists developed a new cycle based on chlorofluorocarbon gases (CFCs) that was both more efficient and less hazardous. What they didn’t know (because the chemistry that destroys ozone hadn’t been discovered yet) was that the CFCs would break down ozone into ordinary oxygen, which isn’t nearly as effective at reducing UV. So people started buying these new, safer refrigerators and found more uses for CFCs; one popular use was as the propellant for hairspray and other canned goop. That led to more CFCs being released into the atmosphere and more damage to the ozone layer. By the time that the damage was discovered, it was almost too late to fix it.

But there is a world of difference between “almost too late” and “too late”. In this case, the evidence was overwhelming enough that the international community took swift and decisive action. In 1985, twenty nations signed a treaty agreeing to protect the ozone layer, and the 1987 Montreal Protocol set limits on the amount of CFCs that could be used. In 1992, the treaty was amended to phase out CFCs entirely. Substitutes were found for use in air conditioning, refrigeration, and hair spray.

Unfortunately, it takes time for CFCs to work their way out of the atmosphere. Even though we’ve stopped adding them (or think that we have), the amount of CFCs has decreased by just 10%. Though that 10% reduction has prevented the ozone hole from getting any larger, the hole is only slowly getting smaller as more CFCs work their way out of the environment. It is expected that it will take another century before the ozone layer returns to normal.

This hole in the ozone layer and the subsequent treaty that fixed the problem couldn’t have happened unless scientists had been out there, measuring things. And ozone isn’t the only climate challenge that faces us. If you know a teen who would like to help measure key climate indicators such as rainfall and wind speed, then why not send them over to Tracking Climate In Your Backyard?
http://www.museumoftheearth.org/outreach.php?page=citizenscienceed/TCYIB