WTF Fun Fact 13706 – When was RSV Discovered?

In the wake of the COVID pandemic, more and more people are insisting that RSV is yet another new virus. But it isn’t. If you haven’t heard of it before, it’s simply because you were lucky enough never to get it.

Respiratory Syncytial Virus, commonly known as RSV, has a long history that underscores its impact on global health, particularly among children and the elderly. The virus is notorious for causing respiratory tract infections ranging from mild cold-like symptoms to severe respiratory distress.

RSV Discovery and Initial Research

The discovery of RSV dates back to 1956, when it was first isolated from chimpanzees with respiratory illness, hence its initial name, “chimpanzee coryza agent.”

Shortly after, similar viruses were isolated from children with respiratory infections, confirming the virus’s ability to infect humans and its role in pediatric respiratory diseases.

The Shift in Understanding

Initial research focused on RSV as a cause of illness in infants and young children, where it was identified as the leading cause of lower respiratory tract infections, such as bronchiolitis and pneumonia.

However, over the years, the scope of understanding expanded, recognizing it as a significant cause of respiratory illness in adults, especially the elderly and those with underlying health conditions. This highlighted the virus’s broad impact across age groups.

Vaccine Development Efforts

One of the most challenging aspects of RSV history involves vaccine development. In the 1960s, a formalin-inactivated RSV vaccine trial resulted in worsened outcomes upon natural infection, leading to severe disease and, tragically, fatalities in some vaccinated infants.

This setback significantly impacted future vaccine development strategies and underscored the need for a deeper understanding of RSV immunology.

Treatment for the disease has evolved, focusing on supportive care and, in some cases, the use of antiviral medications or monoclonal antibodies in high-risk groups.

Efforts to develop a safe and effective vaccine have continued, with several candidates now in late-stage clinical trials, promising hope for future prevention strategies.

The Ongoing Challenge of RSV

RSV remains a significant health challenge globally, with millions of children under five years old hospitalized each year due to related illnesses. The seasonal nature of the disease, with annual epidemics in colder months, underscores the ongoing need for effective prevention and treatment strategies.

Current research into RSV seeks not only to develop safe and effective vaccines but also to better understand the virus’s transmission dynamics, pathogenesis, and long-term impacts on health.

As science advances, the hope is to reduce the burden of RSV through improved prevention, early detection, and innovative treatments.

 WTF fun facts

Source: “Human Respiratory Syncytial Virus” — Encyclopedia of Microbiology

WTF Fun Fact 13694 – History of the Chainsaw

The history of the chainsaw, a tool linked with forestry and tree felling, has its roots in surgical practice. Specifically, it aided in childbirth.

Medical Origins of the Chainsaw

The initial conception of the chainsaw was far removed from the lumber yard. Invented by Scottish doctors John Aitken and James Jeffray, it was designed to address a specific challenge in childbirth. According to the 1785 edition of “Principles Of Midwifery, Or Puerperal Medicine,” this crude yet innovative device was intended for use in symphysiotomy procedures, which widen the pubic cartilage and remove obstructive bone to ease delivery when a baby becomes stuck in the birth canal.

This “flexible saw,” as it was described, allowed for the precise cutting away of flesh, cartilage, and bone. Despite its gruesome application, the invention was a medical breakthrough. It also offered a new solution to a life-threatening dilemma faced by mothers and babies.

The Chainsaw Through History

The chainsaw’s medical use continued into the 19th century with the development of the osteotome by German physician Bernhard Heine in 1830. This device further refined the concept of the chainsaw for surgical purposes. “The Lancet London” described it as comprising two plates containing a toothed wheel, operated by a handle, to cut through bone and tissue.

However, the narrative of the chainsaw took a significant turn at the start of the 20th century, moving beyond the confines of the operating room to the great outdoors.

Birth of the Modern Chainsaw

The transformation of the chainsaw into a tool for woodcutting began earnestly in the late 19th and early 20th centuries. Patents filed in 1883 for the Chain Sawing Machine and in 1906 for the Endless Chain Saw laid the groundwork for its application in producing wooden boards and felling giant redwoods. By 1918, Canadian James Shand patented the first portable chainsaw. This marked a new era for the chainsaw’s use in forestry.

Andreas Stihl subsequently developed and patented the electric chainsaw in 1926, followed by a gas-powered model in 1929, making the tool more accessible and efficient for logging. These early models were large and required two men to operate, but they set the stage for post-World War II advancements that made chainsaws lighter and more user-friendly, allowing single-person operation.


Source: “Why were chainsaws invented?” — BBC Science Focus

WTF Fun Fact 13689 – The Origin of the Word Robot

The word “robot” is a term we’ve made synonymous with machines capable of performing tasks autonomously. Surprisingly, the root of “robot” is less about silicon and circuits and more about human history and linguistics.

The Birth of the Word Robot

The word “robot” made its first appearance in the realm of literature, introduced by Czech playwright Karel Čapek in his 1920 play “R.U.R.” or “Rossum’s Universal Robots.” The term comes from the Czech word “robota,” meaning “forced labor” or “drudgery.” It describes artificially created beings designed to perform work for humans.

The etymology reflects a deep historical context, where “robota” was associated with the burdensome toil of serfs. Through Čapek’s narrative, this concept of labor was reimagined, giving birth to what we now understand as robots.

A Universal Term

From its dramatic debut, “robot” quickly became a universal term. It captured the imagination of the public and scientists alike. In doing so, it became the go-to descriptor for the burgeoning field of machines designed to mimic human actions. The transition from a word describing human labor to one embodying mechanical automatons is a testament to the term’s versatility and the evolution of technology.

What started as a fictional concept in Čapek’s play has exploded into a major field of study and development. Robots now roam factory floors, explore other planets, and even perform surgery. It’s far removed from “forced labor” but linked to the idea of performing tasks on behalf of humans.

The Legacy of “Robot”

The origin of “robot” is a reminder of how art and language can influence technology and society. Čapek’s play not only introduced a new word. It also prompted us to think about the ethical and practical implications of creating beings to serve human needs. The word “robot” now carries with it questions of autonomy, ethics, and the future of work and creativity.

The word “robot” is a linguistic snapshot of human innovation and our relationship with technology.


Source: “The Czech Play That Gave Us the Word ‘Robot’” — MIT Press Reader

WTF Fun Fact 13684 – Mark Zuckerberg Tried to Sell Facebook

Mark Zuckerberg, the brain behind Facebook, once tried to sell the platform. Yes, the social media giant that’s now a staple in over 2 billion people’s daily lives was almost handed over to another company before it could spread its wings. Let’s unpack this fascinating slice of history.

The Offer on the Table to Sell Facebook

Back in the early days of Facebook, or “TheFacebook” as it was originally called, Zuckerberg and his co-founders created a buzz on college campuses. It was this buzz that caught the attention of several investors and companies. Among them was Friendster, a once-popular social networking site, which actually made an offer to buy Facebook. The figure tossed around? A cool $10 million.

Reports from ZDNet reveal that in July 2004, Zuckerberg was indeed open to selling Facebook.

Zuckerberg’s Vision

What’s even more interesting is Zuckerberg’s decision to decline all offers. At the time, Facebook was just a fledgling site, far from the global platform it is today. Yet, Zuckerberg saw the potential for something much larger than a college network. He believed in the idea of connecting people in ways that hadn’t been done before.

Selling to Friendster, or any other suitor for that matter, didn’t align with his vision for what Facebook could become.

The Road Not Taken to Sell Facebook

Zuckerberg’s choice to keep Facebook independent was a pivotal moment in the company’s history. It set the stage for Facebook to grow, innovate, and eventually become the social media behemoth we know today. This decision wasn’t just about holding onto a company; it was about believing in the potential of an idea and the impact it could have on the world.

Looking back, it’s clear Zuckerberg’s gamble paid off. Facebook went on to redefine social interaction, media consumption, and digital marketing. It’s interesting to ponder what Facebook might have become had it merged with Friendster. Would it have faded into obscurity, or could it have still risen to the top under different stewardship?

Reflections on a Tech Titan’s Journey

Zuckerberg’s early decision to keep Facebook set a precedent in the tech world about the value of vision over immediate gain. It’s a reminder that in the fast-paced world of startups, sometimes the biggest risk is not taking one at all. Zuckerberg’s faith in his project’s potential is a testament to the power of innovation and persistence.


Source: “Mark Zuckerberg was planning to sell Facebook in July 2004” — ZDNet

WTF Fun Fact 13682 – Lighters Were Invented Before Matches

Lighters were invented before matches. It sounds like a historical hiccup, doesn’t it? After all, you’d think the simpler technology would precede the more complex one.

Yet, the path of innovation and invention doesn’t always follow a straight line. So, let’s flick through the pages of history and see how this came to be.

The Early Flame: How Were Lighters Invented Before Matches?

The first version of a lighter, known as the “Döbereiner’s lamp,” made its debut in the early 19th century, around 1823. This gadget relied on a chemical reaction to produce a flame: hydrogen gas, generated on the spot by a reaction between zinc and sulfuric acid, ignited as it passed over a platinum catalyst.

This contraption was both fascinating and slightly terrifying, considering the volatile substances involved. Despite its innovation, the Döbereiner’s lamp was far from the pocket lighters we’re familiar with today. It was bulky, somewhat dangerous, and definitely not something you’d want to carry around.

Striking Back: The Advent of Matches

Now, you might wonder, “If they had lighters, why invent matches?” The answer is convenience and safety, or at least an attempt at the latter. Matches made their first successful commercial appearance in 1826, thanks to John Walker, an English chemist. Walker’s friction matches, known as “Lucifers,” were a game-changer. They were portable, relatively easy to use, and didn’t require carrying around a mini chemical lab in your pocket. However, these early matches were far from perfect. They were notorious for their unpleasant odor and the potential to ignite unexpectedly, which posed quite the safety hazard.

Following Walker’s invention, matches underwent a series of transformations to become safer and more reliable. The “safety match” as we know it today was developed by the Swedish chemist Gustaf Erik Pasch and later improved by Johan Edvard Lundström. This mid-19th-century invention used the red phosphorus we now commonly find on the striking surfaces of matchboxes, significantly reducing the risk of accidental ignition and eliminating the noxious fumes produced by its predecessors.

Why Lighters Took the Back Seat to Matches

Given the initial complexity and danger of early lighters, it’s no wonder that matches caught on fire, metaphorically speaking. They were more accessible to the general public, easier to manufacture, and, once the safety match was developed, safer to use. Lighters required a level of mechanical and chemical know-how that wasn’t widely accessible until later technological advancements.

As technology progressed, so did the design and safety of lighters. The development of ferrocerium (“flint”) by Carl Auer von Welsbach in the early 20th century, used in many modern lighters for the spark mechanism, made lighters more reliable and easier to use. The invention of the butane lighter, with its refillable and controllable flame, eventually brought lighters back into the limelight, offering a convenience that matches couldn’t match.

Reflecting on the Flames of Innovation

The tale of lighters and matches is a fascinating narrative about human ingenuity, the evolution of technology, and the nonlinear path of invention. It’s a reminder that sometimes, necessity drives us to develop complex solutions before we find the simpler ones. Or perhaps, it speaks to the nature of innovation itself, where convenience and safety are constantly being reevaluated and redesigned to better serve our needs.

In the end, whether you’re striking a match or flicking a lighter, the ability to control fire remains one of humanity’s defining achievements. The story of how we got here, with lighters appearing on the scene before matches, is just one of many examples of how invention and innovation can take unexpected turns, illuminating the paths of progress in surprising ways.


Source: “The match and lighter war” — The Matches Museum

WTF Fun Fact 13669 – Iceland’s Comedian Mayor

Have you ever heard of Iceland’s comedian mayor, Jón Gnarr? He had an unexpected and captivating rise to political power when he became the Mayor of Reykjavik, Iceland.

From Laughter to Leader

Jón Gnarr wasn’t your typical mayoral candidate. Before venturing into the volatile waves of politics, Gnarr was best known for his work as a comedian and actor. His satirical radio shows and television sketches were beloved in Iceland, making him a household name. But it was in the wake of the 2008 financial crisis that Gnarr found a new stage for his talents.

Iceland was hit hard by the financial meltdown, leading to widespread public distrust in the political establishment. Sensing the public’s yearning for change and perhaps a bit of levity during tough times, Gnarr founded the Best Party in 2009.

It was a satirical political party that started almost as a joke but quickly gained serious momentum.

Gnarr’s campaign was anything but ordinary. Promising a polar bear for the Reykjavik Zoo, free towels at public swimming pools, and a drug-free Parliament by 2020, his platform was a mix of the absurd and the appealing.

The Best Party’s campaign video, set to Tina Turner’s “Simply the Best,” became a viral sensation, showcasing the party’s unique blend of humor and honesty.

What set Gnarr apart was not just his comedic background but his transparency and refusal to play by the unwritten rules of political campaigning. He openly admitted that some of his promises were not realistic. This honesty, oddly enough, resonated with a populace tired of the same old political rhetoric.

Becoming Iceland’s Comedian Mayor

To the shock of many, Jón Gnarr won the mayoral election in 2010. His victory was seen as a direct response to the public’s frustration with the traditional political class. But the big question loomed: Could a comedian effectively lead a city?

Gnarr’s tenure as mayor was as unconventional as his campaign. He often appeared at official events dressed in drag or as a Star Wars character, yet behind the humor was a serious commitment to change. He prioritized human rights, welfare, and culture, and while not all his policies were successful, he brought a fresh, more human face to Icelandic politics.

Challenges and Controversies

Leading a city was not all laughs for Gnarr. He faced criticism for his lack of political experience and some of his more unconventional approaches. Moreover, governing in coalition with the traditionally serious Independence Party posed its own set of challenges and compromises.

Yet, throughout his term, Gnarr maintained his unique style and approach, arguing that humor could be a powerful tool to address serious issues.

Jón Gnarr chose not to seek re-election after his term ended in 2014.


Source: “The joker: Jón Gnarr, the comedian who became mayor” — The Guardian

WTF Fun Fact 13665 – US Time Zones

In the early days of American history, the concept of time was not as unified as it is today. With over a hundred separate time zones, the United States’ approach to timekeeping was a complex and often confusing system. This fascinating period in the nation’s history reveals much about the evolution of time standardization and its impact on society and commerce.

The Era of Numerous Time Zones

Before the adoption of standardized time zones, the United States operated on a surprisingly intricate patchwork of some 144 separate local times. Each city or town was free to determine its own time, usually based on the position of the sun. Because local solar time shifts by about four minutes for each degree of longitude, when it was noon in one town, clocks could already read 12:15 in a city a couple hundred miles to the east.
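For the curious, the arithmetic behind those mismatched clocks is simple enough to sketch: the Earth turns 360 degrees in 24 hours, so solar noon drifts as you move east or west. The snippet below is a back-of-the-envelope illustration (the city longitudes are approximate, and it ignores latitude and terrain), not a historical timekeeping method.

```python
# Back-of-the-envelope sketch: how far apart were two towns' sun-based clocks?
# The Earth rotates 360 degrees in 24 hours, so solar noon shifts by
# 24 * 60 / 360 = 4 minutes for every degree of longitude.

MINUTES_PER_DEGREE = 24 * 60 / 360  # 4.0 minutes per degree

def solar_time_gap_minutes(longitude_a: float, longitude_b: float) -> float:
    """Difference in local solar time, in minutes, between two longitudes."""
    return abs(longitude_a - longitude_b) * MINUTES_PER_DEGREE

# Approximate longitudes: New York City ~74.0 W, Philadelphia ~75.2 W.
# Their sun-based clocks differed by only about five minutes ...
print(round(solar_time_gap_minutes(-74.0, -75.2)))  # -> 5

# ... while a 15-minute gap required roughly 3.75 degrees of longitude,
# around 200 miles at US latitudes.
print(solar_time_gap_minutes(0.0, 3.75))  # -> 15.0
```

The same arithmetic explains why the railroads, spanning dozens of degrees of longitude, found the patchwork unworkable.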

This system was manageable when communities were isolated, but as the country expanded and the railway system connected distant cities, the multitude of local times became problematic. Train schedules were particularly affected, as rail companies struggled to create timetables that made sense across various local times.

The Push for Standardization of Time Zones

The turning point came with the advent of the railroad industry. The need for standardized time became evident as train travel made the flaws of multiple local times apparent. Railroads operated on their own time systems, creating a confusing and sometimes dangerous situation for travelers and operators alike.

The solution emerged in the form of four main time zones proposed by the railroad companies. On November 18, 1883, known as “The Day of Two Noons,” railroads across the country synchronized their clocks to these new standard time zones. This was not an official law but rather a practice adopted by the railroads and the communities they served.

Government Intervention and the Standard Time Act

It wasn’t until March 19, 1918, that the United States government officially adopted the standard time zone system with the Standard Time Act. This act also established daylight saving time, a contentious and ongoing debate to this day. The act was a response to the confusion and inefficiency of having multiple time standards and was also influenced by the needs of World War I.

The transition was not immediate or smooth. People were accustomed to their local times and resisted change. However, over time, the benefits of a standardized system became clear, especially for scheduling trains, conducting business, and broadcasting.

The Impact of Standardization

The move to a standardized time system revolutionized many aspects of American life. It facilitated better communication and coordination across the country, essential for a growing nation. Economic activities, especially those related to transportation and communication, became more efficient and reliable.

Moreover, the concept of time zones influenced the world. Today, time zones are an integral part of global coordination, affecting everything from international flights to the stock market.


Source: “Snoozers Are, In Fact, Losers” — The New Yorker

WTF Fun Fact 13657 – Humanity’s Last Day Together

October 31, 2000, was the last day that all of humanity was together on Earth.

Since that day, there has always been at least one person in space, marking a continuous human presence off our planet.

The International Space Station: A New Era

The event that initiated this ongoing human presence in space was the launch of Expedition 1 to the International Space Station (ISS). The ISS has since been home to astronauts from around the world. It serves as a research laboratory where scientific studies are conducted in microgravity.

Expedition 1 crew members, William Shepherd (USA), Yuri Gidzenko (Russia), and Sergei Krikalev (Russia), were the pioneers of this new era. They launched aboard a Russian Soyuz rocket and began what has become over two decades of continuous human occupation of the ISS.

The Significance of October 31, 2000: Humanity’s Last Day

This date is more than just a historical milestone. It signifies humanity’s leap into a future where living and working in space is a reality.

The ISS has been instrumental in advancing our understanding of space and science. Research conducted there has led to breakthroughs in medicine, environmental science, and materials engineering. The microgravity environment provides unique conditions for experiments impossible to replicate on Earth.

Future Missions

Living aboard the ISS has provided vital information about the effects of long-duration spaceflight on the human body. This knowledge is crucial for planning future missions to the Moon, Mars, and beyond.

Understanding how to maintain physical and mental health in space is key to the success of these ambitious projects.

As we look to the future, the legacy of October 31, 2000, continues to influence space policy and aspirations.

With plans for lunar bases and Mars expeditions, the horizon of human space habitation is expanding. The ISS has laid the groundwork for these future endeavors, proving that humans can live and thrive in the harsh environment of space.


Source: “Celebrating the 20th Anniversary of the First International Space Station Module” — ISS National Laboratory

WTF Fun Fact 13655 – Ice Age Fire Art

Surviving the Ice Age required more than just hunting and gathering – there was fire art. OK, hear us out.

As Ice Age humans gathered around fires for warmth and safety, something more than physical comfort emerged. Firelight gave them a setting for an artistic pursuit that continues to fascinate us today.

The Paleolithic Animator and Ice Age Fire Art

In recent research published in PLOS ONE, a team led by archaeologist Andy Needham proposed an intriguing idea. They suggested that Ice Age artists used the flickering light of fire to bring their stone carvings to life.

These 15,000-year-old limestone plaquettes, adorned with animal figures, were not just static art. Instead, under the dynamic light of a fire, they appeared to move, animating the etched creatures. Fire art!

Needham’s team studied various limestone plaquettes found at the Montastruc rock shelter in southern France. These carvings, attributed to the Magdalenian culture, showcased a range of animals like horses, ibex, and reindeer.

Interestingly, these plaquettes showed signs of thermal damage, suggesting exposure to fire. But was this intentional?

Experimental Archaeology Sheds Light

To answer this, the researchers turned to experimental archaeology. They created replica plaquettes and subjected them to different fire scenarios. These experiments aimed to replicate the pinkish discoloration seen on the originals. The results? The patterns suggested that the artworks were deliberately placed near the hearth, likely as part of the creative process.

Further exploring this idea, the team used virtual reality to simulate firelight’s effect on the plaquettes. The results were fascinating. The irregular lighting from the fire brought an illusion of movement, making the animals seem like they were alive and moving across the stone surface.

The Role of Pareidolia in Ice Age Fire Art

This phenomenon can be partly explained by pareidolia, where the human brain perceives familiar patterns in random objects. In the flickering firelight, viewers would see incomplete forms on the plaquettes. Their brains would fill in the gaps, creating a dynamic viewing experience.

The Ice Age artists might have used this to their advantage. They could start with natural rock features to shape their animals, allowing the firelight to complete the picture. This interaction between the art, the rock’s natural form, and the dynamic firelight created a captivating experience, unique to the Paleolithic era.

Beyond survival, these artistic endeavors provided a social outlet. After a day of survival tasks, our ancestors likely gathered around the fire, not just for warmth but for a communal experience. Here, they could indulge in storytelling, companionship, and artistic expression.

The act of creating art by firelight was perhaps as important as the art itself. It wasn’t just about the final product but about the process of creation, the gathering of minds, and the sharing of ideas. This communal aspect of Ice Age art adds a deeply human dimension to our understanding of these ancient peoples.

Art as a Cultural Practice

Ice Age art wasn’t merely aesthetic; it was a cultural practice imbued with meaning. The process of drawing, the summoning of spirits, and even acts of destruction (like deliberate breakage or fire damage) could have had significant roles in their society.

These artistic sessions by the firelight might have served multiple purposes – from summoning spirits to strengthening community bonds. The plaquettes, once used, could have been discarded or intentionally destroyed, suggesting a transient nature to this art form.


Source: “Ice Age Artists May Have Used Firelight to Animate Carvings” — Smithsonian Magazine