WTF Fun Fact 13028 – Lifespan of a Dollar Bill

The U.S. Federal Reserve estimates that the average lifespan of a dollar bill is just 6.6 years. Since larger bills get passed around less often, a $100 bill has an average lifespan of 22.9 years.

The lifespan of paper bills

According to the Federal Reserve website (cited below):

“When currency is deposited with a Federal Reserve Bank, the quality of each note is evaluated by sophisticated processing equipment. Notes that meet our strict quality criteria–that is, that are still in good condition–continue to circulate, while those that do not are taken out of circulation and destroyed. This process determines the lifespan of a Federal Reserve note.”

They continue:

“The lifespan of Federal Reserve notes varies by denomination and depends on a number of factors, including how the denomination is used by the public. For example, larger denominations such as $100 notes are often used as a store of value, which means they pass between users less frequently than lower-denominations such as $5 notes, which are more often used for transactions.”

Average currency lifespans and their ultimate fates

A U.S. $5 bill lasts roughly 4.7 years, while a $10 bill sticks around for about 5.3 years. Twenty-dollar bills typically stay in circulation for 7.8 years, and $50 bills last over a decade (12.2 years).

The Federal Reserve puts new currency into circulation each day and reclaims damaged money to destroy it. Banks typically hand over the worn-out cash.

Every year, around $200 billion of “unfit currency” gets taken out of circulation.

According to Yahoo Finance: “What makes money too unfit to use? According [to] the Fed, bills that have holes larger than 19 millimeters, or about the size of an aspirin, can no longer be used. Bills that are torn, dirty, or worn out are also removed. And 5-, 10- and 20-dollar bills produced before 1996 are removed automatically because of their age, regardless of condition.”

Source: “How long is the lifespan of U.S. paper money?” — U.S. Federal Reserve

WTF Fun Fact 13024 – The Original Kleenex Gas Mask Filter

What we now know as Kleenex tissues were originally designed to be gas mask filters. The original Kleenex gas mask filter wasn’t nearly as soft and gentle as our current tissues, though.

Kleenex’s original purpose for gas masks

According to the Kimberly-Clark website (cited below):

“The Kleenex® Brand’s story began during the First World War when Kimberly-Clark developed a crepe paper used as a filter within gas masks.”

However, the material wasn’t fully developed by the time the war ended, so the company pivoted to developing soft and smooth facial tissues. They were so popular that today the words “Kleenex” and “tissue” are often used interchangeably.

However, there is a chapter in between the gas masks and tissues, according to Kimberly-Clark:

“In the early 1920’s, that very crepe paper innovation was cleverly adapted into a consumer product called Kotex® Brand which helped women with their periods.”

In 1924, the tissues hit shelves in the US as a cold cream and makeup remover.

From gas mask to facial tissue

Then, in 1929, “Kimberly-Clark’s head researcher was suffering from hay fever and started using the tissues in place of his handkerchief.” A tweak to the marketing to encourage people to use them for their noses doubled sales in the first year.

Of course, it’s much more sustainable to use a handkerchief than a disposable tissue, but it would be many decades before the environmental toll of disposable products became clear to the public.

Today, Kleenex is still the most popular brand of tissue used in the U.S. by far. In 2020, 170.79 million Americans reported using Kleenex brand tissues.

The tissues are also available in 150 countries throughout the world. In addition to facial tissue, Kleenex also makes bathroom tissue, paper towels, tampons, and diapers.

Source: “The Tale of Kleenex®” — Kleenex UK official website

WTF Fun Fact 13021 – The Scully Effect

The way we see scientists portrayed in books, movies, and television shows shapes the way we think about science in general. It also affects whether we can relate to the idea of being a scientist. This is especially important for women, who are underrepresented in most science fields. That’s part of what makes the Scully Effect so remarkable.

What is the Scully Effect?

Dana Scully was a character on the iconic television show The X-Files. She was one of the first visible examples of a female scientist on a long-running television show. And it turns out that Gillian Anderson’s portrayal of the character changed the way viewers thought about the role of women in science.

According to a blog published by Westcoast Women in Engineering, Science, and Technology (WWEST) on Simon Fraser University’s website (cited below), “a phenomenon called ‘The Scully Effect’ has been anecdotally reported among fans of the TV show The X-Files and women in Science, Technology, Engineering, and Math (STEM) fields.”

In 2017, WWEST started a project called Media Depictions of Women in STEM “to evaluate how woman characters in STEM are depicted in popular media and how this might shape viewers’ ideas of women’s role in STEM (especially viewers in elementary and secondary school).”

They also reported that “a study by the Geena Davis Institute has examined just how much the character of Dana Scully has influenced girls and women to focus on STEM in their schooling and careers.”

What does the research show?

There were 2,021 participants in the study, in which:
– 63% of women who were familiar with Dana Scully said she increased their belief in the importance of STEM.
– 50% of those same women said Scully increased their interest in STEM.
– 43% of women who were medium to heavy viewers of The X-Files were influenced to consider working in STEM fields by Scully.
– 27% were convinced to actually study STEM.

Gillian Anderson said of “The Scully Effect”:

“At the time that Scully showed up [in 1993], we didn’t see that type of female represented very much at all out in the world of television, which is what we look to more and more as examples of who we are and to help make sense of us as human beings. And so, to suddenly have an appealing, intelligent, strong-minded female who was appreciated by her pretty cool male coworker was an awesome thing to behold, and I think that a lot of young women said, ‘That’s me. I’m interested in that. I want to do that. I want to be that.’”

Other research has shown that children tend to associate men with science around age seven due to cultural depictions of men in books, shows, and films. But when children also see women portrayed as scientists, girls are more likely to believe that science is a career path open to them as well.

Source: “The Scully Effect” — Simon Fraser University

WTF Fun Fact 13019 – Surfing Invented By Ancient Polynesians

When was surfing invented? Probably at least 800 years ago. The Polynesians invented surfing – that is, standing on a board while riding a wave – long before contact with European colonists. Only later did it come to Hawaii.

Who invented surfing?

Surfing goes back to at least the 12th century. Polynesians made the first surfboards from long pieces of wood. They likely used the activity as a means of transportation between the islands of Polynesia. But it was also a sacred activity and a way to train warriors.

Today, we typically think of Hawaii as the birthplace of surfing, but it likely originated on other Polynesian islands. The Polynesians brought it to Hawaii later.

According to the surf blog Errant Waves (cited below):

“Surfing was a central part of the power relationship on these islands. For example, the tribe with the highest rank had the best beaches and the best ‘boards.’ In addition, the chiefs of the tribes were the best surfers in the clan, who therefore were allowed to have the best boards made of the best wood. The ‘normal’ people were not allowed on the beaches of the tribal chiefs. They had to surf on their own, lesser beaches. Surfing was therefore literally a royal sport on these islands.”

While cave paintings tell us how old surfing may be, the earliest descriptions are in the notes of European colonists.

Colonists’ descriptions of surfing

The first written description of surfing comes from Joseph Banks, the botanist aboard Captain Cook’s HMS Endeavour, who observed surfers in Tahiti in 1769. Banks wrote:

“(…) the shore was covered with pebbles and large stones; yet, in the midst of these breakers, were ten or twelve Indians swimming for their amusement: whenever a surf broke near them, they dived under it, and, to all appearance with infinite facility, rose again on the other side...At this wonderful scene we stood gazing for more than half an hour, during which time none of the swimmers attempted to come on shore, but seemed to enjoy their sport in the highest degree; we then proceeded in our journey, and late in the evening got back to the fort.”

Less than a decade later, in 1777, Dr. William Anderson published another description of surfers in Tahiti:

“I could not help concluding that this man felt the most supreme pleasure while he was driven on so fast and so smoothly by the sea; especially as, though the tents and ships were so near, he did not seem in the least to envy or even to take any notice of the crowds of his countrymen collected to view them as objects which were rare and curious. During my stay, two or three of the natives came up, who seemed to share his felicity, and always called out when there was an appearance of a favorable swell, as he sometimes missed it by his back being turned, and looking about for it. By then I understood that this exercise… was frequent among them; and they have probably more amusements of this sort which afford them at least as much pleasure as skating, which is the only of ours with whose effects I could compare it.”

Source: “History of Surfing” — Collections of Waikiki

WTF Fun Fact 13014 – Movies Don’t Really Burn Calories

A popular claim that watching scary movies burns as many calories as a walk re-circulates each year during spooky season. But in reality, movies don’t really burn calories. The claim wasn’t the result of a rigorous study, and it was misleading. In fact, it was only made for publicity purposes.

What’s the claim about movies burning calories?

From clickbait sites to serious websites like The Guardian, it’s common to see the headline once a year that watching movies like The Shining burns calories because they get your heart racing. And while that’s not false, the claim that watching a scary movie is somehow equivalent or superior to exercise is untrue.

According to The Guardian’s piece the year the study came out:

“Those who watched a 90-minute horror film were likely to burn up to 113 calories – the same sort of figure as a half-hour walk. Some movies were more effective than others, however: of the 10 films studied, the top calorie-burners were the classic Stanley Kubrick chiller The Shining (184 calories), Jaws (161 calories) and The Exorcist (158 calories).”

For starters, sitting and doing nothing for 90 minutes can burn anywhere from 60 to 130 calories, depending on the person. You burn those calories just by existing. So go ahead and watch Steel Magnolias, because scaring yourself silly isn’t going to help you lose weight.

The “study” is not really a study

What’s even more problematic is that while there is an academic behind the claims (and metabolism measurements):

  1. He didn’t set out to perform a rigorous scientific study.
  2. The data was never published in a scientific journal (which matters because publication requires a study to be well constructed and subjects it to peer review).
  3. The results are unimpressive at best (and genuinely misleading at worst).

The source of the info is Dr. Richard Mackenzie, listed as “senior lecturer and specialist in cell metabolism and physiology at the University of Westminster in London” at the time. His comments came via a university press release, not a journal article.

While the scientists did measure heart rate, oxygen intake and carbon dioxide output, the study involved just 10 people and was commissioned by the movie rental firm Lovefilm (now owned by Amazon).

Mackenzie noted:

“As the pulse quickens and blood pumps around the body faster, the body experiences a surge in adrenaline. It is this release of fast-acting adrenaline, produced during short bursts of intense stress (or in this case, brought on by fear), which is known to lower the appetite, increase the basal metabolic rate and ultimately burn a higher level of calories.”

The top 5 movies he asked people to watch (with calories burned during viewing) were:

1. The Shining: 184 calories
2. Jaws: 161 calories
3. The Exorcist: 158 calories
4. Alien: 152 calories
5. Saw: 133 calories

No, movies don’t burn calories in any helpful way

When Snopes (cited below) checked up on the even more bombastic claim people had made after hearing about the study (that watching horror movies could help reduce obesity), they noted: “The study was neither peer-reviewed nor published (nor, apparently, meant to be taken seriously). No follow-up studies replicating its findings [exist], and people who wish to lose weight are probably better advised to get some exercise.”

Snopes then went on to point out the obscenely small sample size, the lack of replication (mandatory for any study that wants to be considered scientific), and the failure to follow up on subjects’ actual weight loss.

But the most important point is that even if everything had been done properly, the results aren’t impressive.

The average feature film runs around 90 minutes, during which the average person sitting on their butt and doing nothing burns 60 to 130 calories. If you stand, you might burn 100 to 200 calories, potentially more than the 184 that people watching The Shining burned. And the person watching The Texas Chainsaw Massacre in the “study” burned only 107 calories, so we’re pretty skeptical of all of these measurements at this point.
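To put those numbers in perspective, here’s a minimal back-of-the-envelope sketch in Python. It uses the standard MET formula for energy expenditure (kcal per minute = MET × 3.5 × body weight in kg ÷ 200); the 1.3 MET value for quiet sitting and the 70 kg viewer are illustrative assumptions on our part, not figures from the study.

```python
# Back-of-the-envelope check: resting calorie burn over a 90-minute movie,
# using the standard MET formula: kcal/min = MET * 3.5 * weight_kg / 200.
# The MET value and body weight below are illustrative assumptions.

def kcal_burned(met: float, weight_kg: float, minutes: float) -> float:
    """Estimate calories burned for an activity of a given intensity (MET)."""
    return met * 3.5 * weight_kg / 200 * minutes

MOVIE_MINUTES = 90
WEIGHT_KG = 70  # assumed average adult

sitting = kcal_burned(met=1.3, weight_kg=WEIGHT_KG, minutes=MOVIE_MINUTES)
print(f"Sitting quietly for 90 min: ~{sitting:.0f} kcal")  # ~143 kcal
print(f"Claimed for The Shining: 184 kcal (bonus: ~{184 - sitting:.0f} kcal)")
```

On those assumptions, the scary-movie “bonus” over simply sitting still works out to roughly 40 calories, less than half a banana.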

The best we can say is that maybe some people burn a couple of extra calories watching scary movies than they would just watching a blank wall. In other words, get your exercise if you want to burn calories in a meaningful way.

Source: “Does Watching Horror Movies Reduce Risk of Obesity?” — Snopes

WTF Fun Fact 13010 – The Invention of the Chocolate Chip Cookie

Fun fact: We have a woman named Ruth Wakefield to thank for the invention of the chocolate chip cookie in 1939. She ran the restaurant at the Toll House Inn in Whitman, Massachusetts. She assumed adding broken pieces of Nestlé Semi-Sweet chocolate to her cookies would make the chocolate melt into the batter. But the chocolate largely maintained its shape, and the cookies became so popular that she published the recipe in a Boston newspaper.

***

Even if chocolate chip cookies aren’t your favorite, it’s hard to claim they don’t hold an iconic place in American culinary history. According to the Nestlé website,

“It all started back in 1939. Ruth Wakefield, who ran the successful Toll House restaurant in Whitman, Massachusetts, was mixing a batch of cookies when she decided to add broken pieces of Nestlé Semi-Sweet chocolate into the recipe expecting the chocolate to melt. Instead, the semi-sweet bits held their shape and softened to a delicate creamy texture and the chocolate chip cookie was born. Ruth’s ‘Toll House Crunch Cookie’ recipe was published in a Boston newspaper and her invention of the chocolate chip cookie quickly became the most popular cookie of all-time.”

The original chocolate chip cookie recipe

Want to make the original chocolate chip cookie? Nestlé shared the recipe on their website:

The recipe that started it all

More than 80 years later, Nestlé Toll House’s Original Chocolate Chip Cookies are a true classic and a go-to recipe for all occasions.

Ingredients:

  • 2 1/4 cups all-purpose flour
  • 1 teaspoon baking soda
  • 1 teaspoon salt
  • 1 cup (2 sticks) butter, softened
  • 3/4 cup granulated sugar
  • 3/4 cup packed brown sugar
  • 1 teaspoon vanilla extract
  • 2 large eggs
  • 2 cups (12-oz. pkg.) Nestlé Toll House Semi-Sweet Chocolate Morsels
  • 1 cup chopped nuts (if omitting, add 1-2 tablespoons of all-purpose flour)

Instructions:

Step 1: Preheat oven to 375°F.
Step 2: Combine flour, baking soda, and salt in a small bowl. Beat butter, granulated sugar, brown sugar, and vanilla extract in a large mixer bowl until creamy. Add eggs, one at a time, beating well after each addition. Gradually beat in flour mixture. Stir in morsels and nuts. Drop by rounded tablespoon onto ungreased baking sheets.
Step 3: Bake for 9 to 11 minutes or until golden brown. Cool on baking sheets for 2 minutes; remove to wire racks to cool completely.

Source: “A timeless discovery: The chocolate chip cookie” — Nestlé

WTF Fun Fact 13006 – Brain Cells Learn To Play Pong

Fun Fact: Lab-grown human and mouse brain cells living in a petri dish became sentient enough to learn to play the video game Pong.
That’s right – scientists found that brain cells learn to play Pong, the 1970s tennis-type video game.

***

In news that we don’t find even remotely comforting, brain cells grown in a petri dish have been shown to become sentient enough to learn to play video games. And we’re not kidding when we say that their next plan is to get the brain cells drunk and see what happens.

Sentient brain cells living in a dish

To be clear, these are cells that are living in a petri dish – not a person. They are human cells derived from stem cells and mouse cells derived from embryonic cells. There are 800,000 cells in total involved in the experiment.

Not only have the cells learned to play the game Pong, but they keep improving. “They played longer rallies and were aced less often,” reported The Guardian (cited below). Of course, Pong is a very simple game, which is why the researchers chose it in the first place.

The study that revealed the experiment was just published in the journal Neuron.

The researchers hail from Cortical Labs, Monash University, the University of Melbourne, and University College London.

How can brain cells learn to play Pong?

According to The Guardian, the researchers put the cells on something called the “DishBrain,” “a multi-electrode array that can sense cell activity and stimulate the cells, then gave the cells feedback on whether the paddle was hitting the ball.”

Within five minutes the cells started to communicate using electrical activity to operate the game. It sounds like sci-fi, but it’s true.

The Guardian continues: “Now the researchers will see how the cells perform when they are drunk or given medicines. They hope to use the DishBrain to learn more about conditions such as epilepsy and dementia.”

“This is the new way to think about what a neuron is,” a researcher said.

Source: “Scientists teach brain cells to play video game Pong” — The Guardian

WTF Fun Fact 13004 – You Are More Likely To Die Around 11am

“Fun” fact: Because of a gene linked to our circadian rhythms, humans are more likely to die around 11am than at any other time of day.

***

In 2012, a study published in the Annals of Neurology reported on a gene variant that affects our circadian rhythms. And that variant, it seems, could also predict the time of day you will die. And that time is around 11am.

According to a write-up in The Atlantic (cited below), the study’s authors “realized through their research that there seems to be one DNA sequence that determines, essentially, how each of us relates to time itself. And data analysis — poring through 15 years’ worth of sleep and death patterns collected from subjects in an unrelated sleep study — helped them to make the realization.”

Why do we die around 11am?

A lot of this work has to do with the ways in which we are no longer beholden to a strict social schedule as we age.

Our external environment affects our inner circadian “clocks,” which regulate our bodies’ functions. This includes elements like work schedules and daylight exposure. But there’s also a genetic component to the time of day we’re most active and alert.

According to another Atlantic piece on the research that studied the phenomenon: “Researchers at Beth Israel Deaconess Medical Center in Boston borrowed data that had been collected 15 years ago from a sleep study at Rush University in Chicago…First, its subjects had worn a device, called an actigraph, that provided detailed information about their sleep-wake patterns. Second, the subjects were all over the age of 65 when the study was originally done. So by the time the Harvard researchers got to them, many had passed away. They had all also agreed to donate their brains to science. Because of this, the researchers knew their precise times of death. Finally, in the course of the many physical and psychological evaluations undergone by the subjects, they had also had their DNA sequenced.”

The researchers compared data from over 500 participants, looking at genetics and the time of death. They found a piece of DNA that was linked to sleeping and waking hours.

People with the more common gene variant tended to die in the late morning, around 11am.

But this doesn’t mean 11am is a more dangerous time of day. The study participants had largely died of natural causes. The point is that their circadian rhythms affected their bodies in such a way that late morning was the most common time for them to expire.

Source: “You Are Most Likely to Die at 11 a.m.” — The Atlantic

WTF Fun Fact 12997 – A Bristlecone Pine Is The Oldest Tree In The World

We’re not sure what kind of tree we expected to be the oldest in the world. A redwood or an olive tree, perhaps? But, in fact, a bristlecone pine is the oldest tree in the world (at least the oldest to be confirmed). Its name is Methuselah, and it’s likely over 4,800 years old.

Like the tallest and largest (by volume) trees in the world, the oldest is also located in the U.S. state of California. Luckily, it’s off the beaten path, which is no doubt one of the reasons it’s managed to survive this long.

Methuselah, the bristlecone pine

Researchers put Methuselah at an amazing 4,854 years old. It is a Great Basin bristlecone pine (Pinus longaeva), named after the biblical elder Methuselah, who was said to have lived 969 years and whose name is now often used for things of advanced age.

The tree Methuselah is located in the White Mountains in eastern California. It lies in “Methuselah Grove” of the Ancient Bristlecone Pine Forest tucked inside Inyo National Forest.

And while its precise location has remained a secret for many years, it (along with the location of other majestic California trees under protection) has been leaked to the public, putting it in danger.

What is the oldest tree in the world older than?

That a bristlecone pine is the oldest tree in the world is already remarkable. But the fact that Methuselah is older than the Egyptian pyramids and nearly as old as written language itself is pretty mind-blowing.

Of course, there are constant challenges from people claiming to find older trees. In fact, there may be another tree nearby that’s older (some claim there is). But right now, Methuselah is the confirmed “winner” (if that’s considered a win).

Even if another tree overtakes it, it hardly matters. In fact, that might only serve to protect a tree that’s older than most of the civilizations ancient historians study (it predates nearly anything those civilizations could have written about themselves). You’d have to go back to cave paintings to find older human artifacts.

The bristlecone pine is the oldest tree in the world

According to the NYT (cited below), “For decades, giant sequoias were believed to be the world’s oldest trees…” In fact, California is home to the tallest, largest (by volume), and oldest trees in the world: a redwood named Hyperion, a giant sequoia named General Sherman, and Methuselah.

Simply googling it will give you a better photo than we have permission to share.

Source: “In California, Where Trees Are King, One Hardy Pine Has Survived for 4,800 Years” — The New York Times