10 Surprising Places With Ebola Virus Disease Cases

Have you heard of Ebola? It’s this disease from West Africa that only kills people there, right? Unfortunately, Ebola hemorrhagic fever, or Ebola virus disease (EVD), has been on the planet for decades and might be coming soon to a country near you (if it hasn’t been there already). Ebola is classed as a hemorrhagic fever, which means it can affect major organs, damage blood vessels, and cause severe illness in humans. The deadly hemorrhaging virus was responsible for over 11,000 reported deaths between 1976 and 2016.[1]

Recurrent in West Africa and with cases spread all over the world, it seems EVD is not leaving us anytime soon. Most reported cases of EVD outside of West Africa have been in health workers who worked in or were based in West Africa and were infected through exposure outside their home countries. The following ten places have had suspected or confirmed EVD cases over the past five years.

Featured image credit: EPA

10 Lagos, Nigeria

In the summer of 2014, a Liberian-American man flew from Liberia to the city of Lagos in Nigeria. On arrival at the airport, he became violently ill and unfortunately died five days later. Two leading infectious disease doctors who treated him at the hospital also died. This initial EVD case infected a total of 19 people, with seven of them consequently dying.[2]

The virus was eventually declared contained in October 2014 after 42 days with no new cases. In early 2018, the Nigerian Civil Aviation Authority urged Nigerian airports to be vigilant in detecting the virus and began thoroughly screening both passengers and crew arrivals from EVD-affected countries.

9 Gulu, Uganda

EVD cases were first reported in Uganda in 2000 and subsequently in 2012, 2014, and 2018. Due to their proximity, it is thought that the cases are linked to the EVD outbreak in the Democratic Republic of the Congo (DRC) and Sudan. Statistics show that there have been a total of 425 reported cases of EVD in Gulu, Northern Uganda, resulting in 224 deaths.[3]

Since early 2018, there has been an increase in suspected EVD cases in Uganda around the northern and eastern regions. These reports are increasing due to the return of EVD in the DRC and Sudan and a rise in refugees fleeing violence. Many cases have been identified as Marburg disease, a “sister” viral hemorrhagic disease of EVD which presents with similar symptoms, including internal bleeding and vomiting.

8 Mali

In 2014, an EVD-infected man from Guinea traveled to Mali and subsequently died. The infection spread to a further seven people, resulting in a total of six deaths.[4]

All the same, the response of the health care agencies and the Malian government has been championed. There is little need for visitors to worry about being infected: as of 2019, Mali has been classified by the UK’s Foreign and Commonwealth Office as a no-go area for citizens of other countries, with only essential travel advised in most of the country.

7 Glasgow, Scotland

While working in Sierra Leone in 2014, a health worker became infected with Ebola.[5] She has been noted as one of the most controversial EVD patients, as she went undetected on arrival at Heathrow Airport in London. The attending doctor checked her temperature and recorded it as normal. (It was actually high.) The patient soon deteriorated and became ill with the virus after arriving home in Glasgow, causing a nationwide panic. The attending doctor has since been suspended for falsifying the details of the patient’s examination.

After months of isolation, the patient recovered from the virus. She has, however, returned to the hospital for rechecks, as the Ebola virus has resurfaced in different parts of her body. Twice more, she came close to death but fully recovered. This case has gone down in history as one of the worst Ebola cases in the West.

6 Dallas, Texas, US

In 2014, a Liberian who had been visiting family in Dallas, Texas, became unwell with the Ebola virus and soon died in a hospital. It emerged that on arriving from Liberia, he had lied on his airport admission documents, concealing that he had been in close contact with EVD-infected people in West Africa. Subsequently, two nurses who attended to him also contracted Ebola. Fortunately, both nurses survived.

One of the nurses went on to sue the parent company of the hospital for a lack of personal protective equipment and health and safety measures. This subsequently resulted in a settlement.[6] It is unknown if she is still working as a nurse.

5 New York, New York, US


An emergency doctor who returned from volunteering with Medecins Sans Frontieres (MSF) in Guinea in October 2014 became ill just days after getting back to New York. He first went to meet friends and went bowling in the city before locking himself in his flat when he became suspicious of his high temperature. He was then transferred to Bellevue Hospital in New York and put into isolation. Three people who had been in close contact with him were also isolated as a precaution. He eventually recovered after weeks in the hospital.[7]

4 Sardinia


In 2015, a nurse returned to Sardinia after three months of humanitarian aid work in Sierra Leone with the charity organization Emergency. When he began to notice Ebola symptoms, he put himself into isolation and ultimately ended up under quarantine in a specialist hospital in Rome, Italy. The nurse was placed under the care of a doctor who had successful experience treating patients with Ebola and was eventually cured.[8]

3 Madrid, Spain

A Spanish nurse was infected with EVD while treating an infected patient who had been flown into Spain from West Africa. The EVD patient was a Spanish priest who had been working in Sierra Leone.[9]

The nurse survived. However, unfortunately, the priest later died. He was the second Spanish priest to die from EVD. The first had been working as a health worker in Liberia.

2 Cornwall, England


In 2014, a Nigerian security guard was tested for Ebola after visiting his family in Nigeria. He was placed in quarantine for three weeks, a fact which made headlines. The man said he felt victimized by the quarantine. Nigeria was declared Ebola-free only two days after his return.[10]

A Nigerian citizen staying at a navy base in Cornwall was also quarantined after becoming ill. However, test results identified a rare form of the monkeypox virus, and he was transferred to London for treatment.

1 Saudi Arabia


In 2014, a 40-year-old man returned to Saudi Arabia from a business trip to Sierra Leone. He soon became ill with the Ebola virus and was placed in isolation. He had returned to the country to make a pilgrimage to Jeddah and was stopped so that the disease could not be spread to hundreds of other pilgrims.[11] He is the only known Ebola victim to have traveled to Saudi Arabia.

World Health Organization experts state that Ebola can be passed through close and direct contact with infected people and through handling infected persons’ body fluids, such as blood and saliva. Health care staff are advised to follow strict precautions to reduce the risk of human-to-human transmission by following outbreak protocols. This includes using personal protective equipment when handling suspected or Ebola-positive patients and moving and disposing of the bodies of Ebola patients safely. Ebola is a deadly virus that can recur in different parts of the body months and years after initial infection and treatment.

So remember to wash your hands and watch out for any cuts if you’re traveling or working in any affected regions in Africa. Be sure to take precautions in countries such as Guinea, Sierra Leone, Liberia, Nigeria, Sudan, the DRC, and Uganda during your travels.

Caroline Alice is a freelance writer and English language teacher with an interest in health and infectious diseases. Twitter @carolinealiceb

10 Surprising Facts About The Spread Of Disease

For all our advances in medicine and science, disease outbreaks are still not uncommon. Many people suspect that a plague is one of the ways the world as we know it could end, and it’s a frightening thing to realize how scattered our knowledge of disease has been over the centuries.

10Xenophobia And Rudeness Combat Disease

Just because we know how germs work and how they spread doesn’t mean we’re not still coming up with new theories. Biologists from the University of New Mexico and the University of British Columbia suspect that we’ve come up with a rather ingenious way of preventing disease from spreading to our respective cultures over the years: We’re jerks.

In a nutshell, the theory states that we’ve developed cultures in which we view strangers and outsiders as people to be avoided and shunned rather than welcomed in order to make sure they keep their germs to themselves. The study examined a wide range of cultures and determined that those exhibiting lower instances of disease tended to be broken up into smaller, more independent groups with their own cultural identity and language rather than a unified culture.

The researchers have also argued that areas with lower rates of disease tend to be more standoffish and less affectionate toward strangers. These are places where gestures like hugging and kissing as a form of greeting are less acceptable. They also tend to form certain cultural taboos, especially concerning what’s acceptable to eat, that happen to keep people away from dangerous pathogens. The theory suggests that consciousness of disease has played a surprisingly large role in shaping cultures across the globe.

9The Five-Second Rule

We’ve all heard that if we drop some food on the ground and pick it up before five seconds have passed, it’s okay to eat it. Despite the shocking number of studies on the phenomenon, science can’t decide whether it’s true.

According to researchers from Clemson University, some bacteria like salmonella can live for up to a month on a typical kitchen floor and transfer instantly to dropped food. Another study by Aston University in the UK, however, tested E. coli and Staphylococcus transfer between food and different types of flooring and found that the longer food was in contact with the floor, the more bacteria were transferred.

It’s important to note that the two studies used different types of bacteria, suggesting that different bacteria act in different ways. It’s probably safer to just refrain from eating anything you drop.

8Sent By The Gods

In ancient Greece, not much was known about bacteria and the transfer of illness between people. They believed that disease was sent by the gods, going so far as to blame Zeus’s rage at the actions of a single person for plagues that swept through whole cities. Apollo and Artemis were often thought to inflict disease on the men and women who displeased them. Another story blames Pandora for releasing disease upon the world when she opened her box, although some texts—such as The Odyssey—suggest that both causes were true.

The spirits released by Pandora were called the Nosoi in Greece. In Rome, these personifications of disease and corruption were named Lues, Tabes, Macies, Morbus, and Pestis. The Nosoi were given characteristics like other gods: Morbus moved with a sort of world-weary exhaustion, Pestis was greedy, and they were all driven by Erinys, the personification of vengeance. According to Hesiod, when the Nosoi were created by Zeus, he took away their ability to speak so no man would hear them coming and no one would be able to escape them.

7The Work Of Robert Koch

Most of what we know today about bacteria was built on the foundation laid by Robert Koch. The German scientist who used newspapers to teach himself to read when he was five years old grew up to study at the University of Gottingen under a man named Jacob Henle, who was working on a theory that disease was caused by some sort of mysterious organism or parasite.

After serving in the Franco-Prussian War, Koch set up his own laboratory in his home. With the help of a microscope and some homemade lab equipment, he set about determining what these bacteria were that supposedly had something to do with anthrax. He was eventually credited with confirming that bacteria were responsible for causing disease and could be transferred from one individual to another through the blood.

He also figured out that bacteria survive in unfavorable conditions by the creation of spores that can hibernate and spawn new bacteria once conditions are better. He experimented with different ways of raising bacteria so they could be more easily studied and outlined the conditions that must be present for bacteria to spread. Perhaps most importantly, he also wrote guidelines for the control of contagious disease, including the importance of keeping the water supply clean.

6Miasma

Throughout the Middle Ages, one of the major theories about how disease was spread concerned the presence of miasma, a toxic gas that built up in the soil when plant and animal matter decayed. From China to Europe, this was the scientifically accepted explanation for disease for centuries.

In the early 1800s, a French chemist named Boussingault performed a series of experiments to confirm the existence of miasma and its responsibility for making people sick. His search for the hypothesized hydrogen compound in miasma that was thought to be the cause was a failure, but discussions with a fellow scientist, Justus von Liebig, led them a step closer to the real culprit. Liebig theorized that something in the miasma, not the miasma itself, entered the bloodstream to make people contagiously ill.

The idea that miasma was somehow responsible for illness became more and more plausible with the growth of cities, especially during times like the summer of 1858, when the Great Stink caused by improper waste removal procedures hit Victorian London. The subsequent outbreak of cholera seemingly supported the miasma theory. Even Florence Nightingale believed in it, stating that one of the biggest sources of infection was drains in houses, which allowed the bad air to come back up into the home and infect entire families.

5Spontaneous Generation

Today, the idea that people believed in spontaneous generation as late as 1859 seems impossible, but recipes for creating life from nothing, including one for generating mice from wheat husks and sweaty underwear placed in a jar, circulated well into the 19th century.

In 1745, a clergyman named John Needham boiled chicken broth until it was free of microbes before sealing it, reopening it later to show that more microbes had developed. This was supposedly proof that spontaneous generation was real.

Initially, the advent of germ theory only seemed to support the idea of spontaneous generation. It was thought that microbes were a byproduct of the disease rather than the cause of it, which fit right in with the idea that they were generated in the body from nothing. It wasn’t until Louis Pasteur published his work on the subject in 1859 that the theory was disproven.

4Lady Mary Wortley Montagu And Vaccinations

Lady Mary Wortley Montagu was a British noblewoman married to the British ambassador to Turkey. When the ambassador was posted to Turkey in 1716, he chose to take his wife with him. He didn’t know it at the time, but this simple act would be a major boon to Western knowledge of disease prevention.

Smallpox was an often deadly and disfiguring disease that ravaged England through the Elizabethan era. We now know that it was a plague as far back as ancient Egypt. While in Turkey, Lady Montagu saw several old women treating children by puncturing veins and exposing the blood to a small amount of matter taken from smallpox pustules. As a result, the children experienced a mild form of the illness before quickly recovering, now equipped with a lifelong immunity to the disease.

Lady Montagu was astounded that they took the dose of the disease “as they take the waters in other countries” and returned to England with the knowledge. She even ordered the procedure to be performed on her own children. However, many remained skeptical, so Lady Montagu and the wife of the Prince of Wales decided to prove it was safe by convincing a group of prisoners at Newgate Prison to get inoculated. The death row prisoners were granted their lives for their participation in “The Royal Experiment.”

3Ayurveda And The Humors

One of the most common ideas in history about what makes a person vulnerable to disease is the concept that there’s something wrong with the body’s internal balance. In ancient Greece, the idea was that the body contained four humors, each governing a different part of it. The humors—black bile, yellow bile, blood, and phlegm—all needed to be in balance for health, and many methods of treating disease focused on bringing the body back into balance.

The theory was formalized by Hippocrates as early as the fifth century B.C. and later systematized by Galen, but the concept is much older. In the ancient Indian system of Ayurveda, developed between 700 and 400 B.C., it is believed that an imbalance in the three doshas (pitta, vata, and kapha) is responsible for disease. The system used by Ayurvedic physicians to balance the three doshas is still popular today.

According to traditional Chinese medicine, which has been practiced for more than 2,000 years, disease can more easily spread to a body that is weakened by an imbalance of qi, or life forces. Many treatments, such as acupuncture, focus on returning this balance to the body.

2Super Spreaders Aren’t An Anomaly

Super spreaders are people who, for whatever reason, expose an extremely high number of people to an illness or a disease. One of the most well-known cases is Typhoid Mary, who spread typhoid to a large number of people without suffering from the disease herself.

When researchers study the spread of disease, they look at a variety of factors to determine the cause, including how many people in a population are vulnerable to a disease and how many people a single person exposes. For a long time, they concluded that super spreaders were anomalies. That turned out to be a false assumption, however. In fact, there’s a good chance that there’s a super spreader in your house right now.

Children are among the most prolific super spreaders. It’s been found that vaccinating 20 percent of children is more effective at stopping the spread of illnesses like the flu than vaccinating 90 percent of people over the age of 65. Due to their immature immune systems, children are contagious longer than adults, and they are typically in contact with more people because of things like school and extracurricular activities.

1The Contagion Theory

The contagion theory of disease was first proposed by the Greek physician and philosopher Galen, who also championed the theory of the four humors. What we now know as germs are what Galen called the “seeds of disease.” These seeds, he guessed, were present in a person’s body and explained why some people developed a disease while others were unaffected.

This theory went overlooked in favor of the theory of the four humors, most likely because there was no way to prove or disprove who had these seeds in their system, while the humors could supposedly be observed in everyone. It gained new life, however, when a 16th-century physician named Girolamo Fracastoro began writing about the spread of disease. Not only did he believe that these seeds dictated who would fall ill, he also suggested that they could spread from person to person.

Fracastoro’s theories led to the containment of illness in Italy by quarantine, but for all the good he did, he also got a number of things wrong. He theorized that these seeds spontaneously developed within the body and that certain seeds took root in certain humors. These disease-causing seeds would then need to be removed from the body by draining the corresponding humor. His theory fell out of favor because it couldn’t be proven, and after about 1650, it faded into obscurity.

Debra Kelly

After having a number of odd jobs from shed-painter to grave-digger, Debra loves writing about the things no history class will teach. She spends much of her time distracted by her two cattle dogs.


Top 10 Movies About Plague, Pestilence, And Deadly Disease

Real-life viruses and Hollywood viruses are different. Real-life viruses have unpleasant symptoms. Hollywood viruses have mutant-zombie-vampires with anger issues. Real-life viruses can be controlled with handwashing and staying indoors. Hollywood viruses are controlled by running around with guns, controlled explosions, and occasionally nuclear weapons. Real-life viruses are a bit boring. Hollywood viruses can be awesome. So, after you’ve washed your hands, why not sit down, relax, and see how the experts do it.

10 The Omega Man, 1971

Charlton Heston is the last man on earth, pretty much: one of the few survivors of a global pandemic caused by biological warfare. A lone research scientist, Heston injects himself with a vaccine of his own design, which seems to work.

However, the solitude of being the only survivor starts to drive Heston a bit mad, and he spends most of his time barricaded inside his apartment, which is stacked high with guns.
So far, so realistic.

However, when he is captured by these virus infected mutants, who he calls The Family, instead of attacking him they put him on trial, or at least a mock trial.
Now it is getting a little surreal.

The Family is run by the head mutant, a former TV anchor-man, played by Anthony Zerbe, who has a disturbing Manson vibe about him.

There’s a lot of other weird stuff too, including a lot of spear-throwing (unnecessary it would seem, given the abundance of weaponry in Heston’s apartment) and a crucifixion.

Most disturbing, however is the amount of time that Charlton Heston spends shirtless, for no good reason.

9 Blindness, 2008

Mark Ruffalo is a doctor who treats a man that has gone suddenly blind. The following day, Ruffalo, too, goes blind, and he realises that the blindness must have been caused by some kind of contagion.

The virus spreads, causing a whole city to become sightless overnight.

Except for Ruffalo’s wife, played by Julianne Moore, who retains her sight. In order to be able to stay with her husband, however, she pretends to be blind too.

Blindness is a film about what happens when we become completely dependent upon the kindness of strangers, and about how thin the veneer of decency can be when it’s every man for himself.

8 Outbreak, 1995

Outbreak, released in 1995, concerned an outbreak of an Ebola-like virus in Zaire, and it was an immediate success. Partly this was due to the performance of its all-star cast, and partly because, at the time of its release, Ebola was breaking out in, of all places, Zaire.

The virus is spread through a series of unfortunate, not to say unlikely, events, which include a military cover-up, a smuggled monkey released into the wild, and a broken vial of blood that releases the virus as effectively as Pandora and her box.

Starring Dustin Hoffman, Rene Russo and Morgan Freeman, with star turns by Kevin Spacey, Donald Sutherland and Cuba Gooding Jr, the film’s premise was a little bit ridiculous but the levels of denial among those who should know better is spot on.

7 I Am Legend

How do you cure cancer? Easy. Give it the measles.

Barmy? Possibly. For some reason, not adequately explored, someone must have skipped the usual drug trial protocols, because, next thing you know, the measles has wiped out most of the world’s population. Oops.

Not to worry. Will Smith, is a former soldier turned virologist. Which means that when the measles turns his neighbours into mutant-zombie-vampires, he is trained to both fight them and cure them, all while trying to make contact with other virus-free survivors.

Living alone, with only his dog and some shop mannequins for company, Smith starts to go a little bit mad. He is plagued by the question of whether he is the only person to survive the virus. Could there be other people out there too? I am Legend was well received both critically and popularly, with everyone praising the performance of Will Smith. And his dog.

The mannequins were a bit wooden.

6 The Andromeda Strain

The Andromeda Strain was based on a novel by Michael Crichton, who was a doctor before he was a writer, so he presumably knew a thing or two about viruses. When a satellite returns to earth, it brings with it a micro-organism that causes blood to clot in the veins. Those who don’t die instantly are driven to kill themselves.

Obviously, NASA has a protocol for dealing with alien micro-organisms. This protocol, codenamed Wildfire, requires sending a crack team of scientists to investigate, while the military prefer their own solution – let’s nuke it. Isn’t that always their solution?

The movie focusses on the disconnect between science and the military, and the dangers of devising rigid protocols to deal with unknown situations.

5 Contagion, 2011

This one is a little bit scary. Directed by Steven Soderbergh, Contagion is a movie about how viruses spread. It is about how difficult they are to contain, and how devastating the consequences can be when they’re not contained.

The film has been praised by scientists for its accurate portrayal of the difficulties of dealing with pandemics. Its all-star cast may help distract you from the impending breakdown of society. The film has everything, from politicians trying to downplay the seriousness of the epidemic, to charlatans trying to make a quick buck selling homeopathic cures, and heroic scientists who work round the clock to try to develop a vaccine.

Soderbergh said that he was trying to make an ‘ultra-realistic’ film about pandemics and their effect on social order. Job done.

4 28 Days Later, 2003

When Cillian Murphy awakes from a coma after four weeks, the world is a different place. He walks the streets of a deserted London, wondering what on earth has happened and looking for signs of life.

It turns out that an animal rights group has accidentally released a chimpanzee with a highly contagious virus which causes extreme rage and loss of control. During the 28 days he has been asleep, society has collapsed, and the world has all but ended.

28 Days Later is not a film about viruses, as such, but about what happens to society when the normal rules of life are suspended.
It’s not pretty.

3 Train to Busan, 2016

If you want a virus outbreak film that doesn’t take itself too seriously, you could go for Train to Busan. A South Korean action/horror film, it broke records in Korea for audience size.

Imagine you are on a busy train. A woman boards at the last minute, looking pretty ill. The train has barely pulled out of the station before the woman mutates into a zombie-figure, who then attacks the guard, who then also mutates.

Not only that, but, whilst trying to quarantine the infected passengers in one railway car, your train passes burning buildings, and other mutant-zombies, so there’s no point trying to get off. What do you do next?

Train to Busan has been described as ‘the best zombie film ever’, and did wonders for the popularity of South Korean cinema, although probably not quite so much for its train companies.

2 12 Monkeys

What do you do if a deadly virus has wiped out most of humanity? Obviously, you build a time-machine and send Bruce Willis back from a dystopian future to sort it out. 12 Monkeys is directed by Terry Gilliam, so you know it’s also going to be a little bit strange.

Brad Pitt is certainly strange. As are the other inmates of the lunatic asylum to which Bruce is very quickly confined. Pitt earned a well-deserved Oscar nomination for his performance as an anarchist eco-terrorist with daddy issues and a side serving of psychosis.

In truth the movie isn’t really about a virus. It’s Bruce Willis saving the world. Again.

And that is always fun to watch. But it is Terry Gilliam’s direction, with his trademark black humour and twisted endings and Brad Pitt’s crazy man performance that takes this film from fun to fantastic.

1 Death in Venice, 1971

Death In Venice stands apart from the others on this list as being not just entertainment, but art. Scene after scene we are met with some of the most beautifully filmed images of one of the world’s most beautiful places: Venice. The film follows Gustav von Aschenbach who is taking time to recuperate from a nervous breakdown in Venice, which—ironically—is beginning to feel the effects of a cholera epidemic.

In between lusting after an adolescent Polish boy staying at the same hotel and dealing with a mid-life crisis, Aschenbach has flashbacks to the death of his daughter and his career as a composer. The unravelling of the protagonist’s life through the film leads us to one of the most poignant and macabre endings ever. Director Luchino Visconti (featured on Top 10 Films About Economic Disaster You Really Need To Watch for The Damned) proved himself a true visionary in the production of this film.

The soundtrack, drawn from the music of Gustav Mahler, is eerie, beautiful, serene, and breathtaking.

Watch this film before any other on this list.

10 Bizarre Ways Our Ancestors Explained Disease

We all admire and respect medical experts for their knowledge and ability to help us overcome various sicknesses and diseases. We forget, however, that doctors are only human and as capable of mistakes as the rest of us. This was especially true in the past, when the diseases that afflicted the human race led doctors and medical experts to some truly bizarre theories and explanations.

10Spread Of Diseases Caused By Night Air


In the Middle Ages, the theory of miasma was born. According to this theory, “bad air,” which emanated from decaying organic matter, caused diseases such as cholera, chlamydia, and the Black Death. It seemed to worsen around swamps and during the night. Thus, most people avoided the night air by going indoors and keeping their windows tightly shut.

When John Adams and Benjamin Franklin, two prominent American figures, were traveling together in 1776, they were forced to share a room in a crowded inn. Adams later noted in his autobiography that “the window was open and I, who was an invalid and afraid of the Air in the night (blowing upon me) shut it close.” However, Franklin objected and convinced Adams to reopen the window. The fact that a highly educated man like Adams, who later went on to become president, believed that nighttime air was noxious, shows us that the miasma theory was widespread and not solely limited to the poorer, uneducated classes. Indeed, doctors and other highly educated men supported the miasma theory for over a century.

Though the reasoning was flawed, closed windows did have some good health effects. They helped prevent malaria—“the poison which produces autumnal fever”—and kept out the moisture that often chills the body.

In the second half of the 19th century, the miasma theory was replaced by the germ theory.

9 Epilepsy Caused By Divine Visitation


The early Greeks thought that epilepsy (a word which originated from the Greek verb epilambanein, meaning “to seize, possess, or afflict”) was caused by “divine” visitation. Epilepsy was also known as a “sacred disease,” and it went by more than just one name. Some other names for epilepsy in Ancient Greece were “seliniasmos,” “Herculian disease” (because it affected the demigod Hercules), and “demonism.”

Epilepsy was considered to be a miasma—pollution or noxious form of “bad air”—that was cast upon the human soul. Thus epilepsy was regarded as divine punishment for sinners and was connected with Selene, the goddess of the Moon, since it was believed that those who offended her were afflicted with the disease.

The Ancient Greeks attributed the disease to different deities depending on the symptoms that occurred during an epileptic fit. Thus, if the fit included teeth gnashing, epilepsy was ascribed to the goddess Cybele (goddess of nature). If the victim of epilepsy screamed like a horse, the disease was ascribed to the god Poseidon (god of the sea, earthquakes, and horses). The cure for epilepsy included a process of ritual purification as well as the recital of healing chants.

8 Leprosy Caused By Divine Retribution


In the Middle Ages, leprosy was thought to have been caused by divine retribution. Victims of leprosy were believed to be suffering from the disease as a result of their wickedness and personal sin. This explanation for the disease was especially popularized by several biblical accounts, in which leprosy is sent to sinners as a divine punishment. Leprosy was seen both as a disease of the body and a disease of the soul. Thus, lepers were seen as a threat to society not only because of their physical condition but also because of their moral decay, which the morally upright were terrified of catching.

As a result, lepers were treated horribly during the Middle Ages—they were shunned by society, were often forced to wear bells to warn people of their approach, and sometimes had to attend their own funeral mass during which they were declared officially dead to the community.

7 Colds Caused By Waste Matter


The ancient Greek doctor Hippocrates is often considered the father of medicine. He was the first person to dispel the myth that diseases were caused by angry gods, insisting instead that illnesses were caused by nothing more than natural, earthly factors. His teachings were so influential that physicians came to take the Hippocratic Oath, swearing to uphold specific ethical standards.

However, in a time when the most absurd explanations for diseases were born, Hippocrates was no exception and contributed some crazy theories of his own, such as his belief that colds were caused by a buildup of waste matter on the brain. According to Hippocrates, when this waste matter overflowed, it resulted in a runny nose. This is where the Greek word for the common cold, catarrh, originated. Catarrh means “flow” in Greek, and the word is in fact still used in English today.

6 Mental Illness Caused By Witchcraft


In the Middle Ages, people who suffered from mental disorders were thought to be either under the curse of witches or wizards or possessed by the devil. The most common medieval treatment of mental illness was exorcism. During the Renaissance, burning the body and saving the captive soul was the preferred method of “treating” the mentally ill.

During the Middle Ages and the Renaissance, all the tragedies of humanity were blamed on witchcraft and diabolical possession. Women were condemned as witches far more frequently than men because it was widely believed that women were more likely to be afflicted by demonic possession due to their weaker and more imperfect nature. It was thought that a woman’s reproductive system was the proof of this, with the uterus being the source of evil. Supposedly, during their menstrual cycle, women were full of venom that contaminated them and gave them the power to contaminate others.

It was also believed that through imagination one could produce physical changes in the body, and thus imagination was seen as another form of witchcraft. It was thought that the uterus received pathological images that could not be subdued, while the principal process of imagination originated in the spleen. Thus, because two organs—the uterus and the spleen—could produce pathological images, women had two sources of evil and were more powerful than men, since men could only practice evil through their spleen.

5 Hysteria Caused By A Wandering Womb


In Ancient Greece, women who suffered from any type of mental illness were considered to be victims of hysteria. And hysteria, according to the ancient Greek doctor Hippocrates, was caused by a wandering womb. According to the Ancient Greek physician Aretaeus, the womb could move upward and downward as well as left and right. So, for example, if the womb moved up, it caused sluggishness, lack of strength, and vertigo. If the womb moved down, it caused a sense of choking as well as a loss of speech and sensibility. The womb moving downward could also cause a sudden, incredible death.

To cure a wandering womb, physicians applied pleasant scents, such as honey, to the vagina, because the womb was believed to advance toward them. Alternatively, the womb could be driven away from the upper body to where it belonged through the application of foul scents. Other prescriptions for a wandering womb included constantly chewing on cloves of garlic, hot and cold baths, consistent sex, as well as frequent pregnancy to keep the bored womb occupied and less likely to migrate around the female body.

4 Porphyria Explained As Vampirism


Many myths surrounding vampirism emerged during the Middle Ages. However, it is now believed that a rare genetic disease called porphyria, and not just the easily excitable minds of medieval peasantry, may have actually started the bizarre tales concerning “creatures of the night.”

Scientific and medical knowledge was highly limited during the Middle Ages, and thus the effects of porphyria could easily have been misconstrued as something of a supernatural nature. Patients with porphyria are extremely sensitive to sunlight and thus may rarely go outside. If they do dare wander outside, the Sun may cause terrible disfigurements to the patient’s hands, feet, or face. In worst-case scenarios, their face may seem mutilated or distorted. Their noses, ears, or lips could recede or fall off, and excessive hair growth may occur, making them seem like a wolf or an animal (hence the werewolf myth, another popular tale during the Middle Ages).

Porphyria can also cause erythrodontia (the red discoloration of teeth) as well as receding gums that could have created the illusion of fangs. As for garlic (we all know those blood-suckers hate it), its consumption results in the worsening of porphyria symptoms and might actually inflict pain and cause the patient to become sick.

Today, porphyria is sometimes treated with the injection of a blood product called “heme.” Of course, treatment like that did not exist in the Middle Ages so if we get a little creative with our imaginations, victims might have been instinctively seeking heme by biting human victims and drinking their blood. Brothers and sisters could have unknowingly shared the defective gene that caused porphyria, so a victim of the disease biting their sibling for blood might have triggered an attack of the disease in the bitten sibling, creating a new “vampire” (hence the myth that a vampire’s bite resulted in the victim becoming a vampire as well).

3 Ulcers Caused By Stress


William Brinton was one of the first doctors to describe a stomach ulcer in 1857, but the lack of diagnostic tools made ulcer detection incredibly difficult. Moreover, no causative agent of ulcers could be found, and no single associated germ existed. Thus, doctors worldwide turned to the study of psychic and environmental factors to explain the appearance of ulcers. Eventually, it was agreed that poor diet, smoking, and stress caused high acid levels and so were the cause of ulcers. Doctors Arvey Rogers and Donna Hoel even wrote that “a peptic ulcer used to be a badge of success. Up-and-coming professionals were expected to earn one, and if they didn’t maybe they weren’t working and worrying hard enough.” The medical advice dispensed by doctors worldwide was to take antacids and modify your lifestyle.

However, patients with serious ulcer problems fell so ill that they had to have their stomachs removed and sometimes bled until they died. Appalled by this suffering, a physician named Barry Marshall and a pathologist named Robin Warren began working together in 1981, determined to get to the bottom of what really caused ulcers. Two years earlier, Warren had discovered that the gut could be overrun by bacteria called Helicobacter pylori. Through biopsying ulcer patients and culturing organisms in the lab, Marshall traced ulcers (and stomach cancer) to this gut infection. The cure was antibiotics.

The world stayed skeptical until Marshall (who was unable to make his study with mice and who was not allowed to experiment on people) drank the Helicobacter pylori himself. Within days, he developed gastritis, the precursor to an ulcer. He felt sick and exhausted and started to vomit. Back in the lab, he biopsied his own gut, culturing the Helicobacter pylori and proving to the whole world that it was not stress but bacteria that was the cause of ulcers.

2 Autism Caused By The Lack Of Maternal Warmth


The syndrome of autism was first identified by a child psychiatrist, Leo Kanner, in a 1943 paper. However, he went further than simply describing the schizophrenia-like features of children by focusing profoundly on their parents and their role in contributing to the syndrome.

Kanner had observed a small sampling of children from educated families and concluded that the parents of autistic children tended to be highly intelligent but at the same time coldhearted and formal. He claimed that autistic children were raised in isolation with no warmth emanating from their mothers or fathers. In fact, he went as far as to say that the parents of autistic children were “just happening to defrost enough to produce a child.” Kanner was not the only one to blame the parents. Numerous other psychoanalysts and child development specialists such as Bruno Bettelheim stressed the role of the parents in causing autism which gave rise to the “refrigerator mother” theory. Throughout the 1950s and 1960s, “refrigerator mothers” (and fathers) not only had to deal with their autistic children but also had to bear the guilt of turning them autistic in the first place.

In the early 1960s, however, the refrigerator theory came under fire as parents of autistic children began to fight back. Kanner eventually abandoned his original position, although other specialists such as Bruno Bettelheim continued to defend it. The bizarre refrigerator theory was mostly abandoned in the 1970s, but small numbers of its supporters are still scattered across Europe and places such as South Korea to this day.

1 Birth Defects Caused By Maternal Impressions


According to the theory of maternal impressions, any fears, desires, or strong emotions a woman experienced during her pregnancy could have a significant effect on her child’s physical appearance. This theory was extremely popular in the 18th century and was often used to explain birth defects. Thus, if a child was born deaf, for example, this was the result of the mother having been shocked by a loud sound during her pregnancy. As a consequence, pregnant women were advised to expose themselves solely to pleasant stimulation and to visit galleries and concerts to ensure that their child was cultured and healthy.

However, the theory of maternal impressions was not confined to the 18th century and in fact goes back centuries. The ancient Greek physician Galen believed that if a pregnant woman looked at an image of someone, her child could come to resemble that individual. So the practice of looking at statues the mother admired was encouraged to produce attractive children.

It was also believed that a pregnant woman’s mental state not only caused vascular birthmarks but also influenced their shape and location. Thus, if a woman craved or ate a lot of strawberries during her pregnancy, she could have a child with a birthmark that resembled a strawberry.

The maternal impressions theory thrived through the Middle Ages, the Renaissance, and into the 18th century. It was eventually challenged by the physician and anatomist William Hunter in the mid-18th century, but most people still believed that maternal impressions had an impact on infants, and thus this rather bizarre theory continued right into the 19th century. By the end of the 19th century, however, the maternal impressions theory was dismissed completely.

Laura is a student from Ireland in love with books, writing, coffee, and cats.

10 Weird Ways Disease Altered The World https://listorati.com/10-weird-ways-disease-altered-the-world/ https://listorati.com/10-weird-ways-disease-altered-the-world/#respond Fri, 12 Apr 2024 03:54:25 +0000 https://listorati.com/10-weird-ways-disease-altered-the-world/

Diseases leave obvious imprints on history. A decrease in population size and less genetic diversity are some examples of the impact you’d expect every epidemic to have. However, every once in a while, a disease has a truly remarkable and unusual effect on the world.

10 Flu Of 1918 And The Treaty Of Versailles


The Flu of 1918 devastated the world and infected one-third of the population. Additionally, it damaged brain cells, affecting the brain’s ability to function and even resulting in psychosis. In April 1919, Woodrow Wilson became infected with the flu. Wilson was president at the time and played an instrumental role in the negotiations of the Treaty of Versailles, particularly in standing up to France’s prime minister, Georges Clemenceau, who wanted to dismantle Germany.

As Wilson was recovering from the flu, many White House officials noted a change in his demeanor. Wilson was described as slow, tired, and focused on strange notions. After these odd reports, Wilson abandoned many of his ideas about the Treaty, which gave power to Clemenceau. Many argue that the harshness of the Treaty of Versailles spelled disaster for Germany, crippled the German economy, and played a role in Hitler’s ability to gain power. All of this could be the result of Woodrow Wilson’s bout of the flu.

9 Tuberculosis And Expansion Of Western Frontier


During the tuberculosis outbreak of the 1900s, many believed in miasma theory, the belief that sickness is caused by bad air and pollution. The idea was promoted by Edward Trudeau, a doctor from New York who was infected with tuberculosis and, after moving to the Adirondacks, noticed an improvement in his condition. He began spreading the news that fresh air and nature were a cure.

Upon hearing this, thousands of Americans moved west in search of better health, and many campaigns for western expansion were targeted toward “health seekers.” People infected with tuberculosis migrated in large numbers with pioneers and explorers.

8 Cholera And The Rise Of Epidemiology


In 1854, John Snow removed the handle of a water pump and created an entire branch of medicine.

Snow, a physician during the cholera epidemic in London, was suspicious of the way the disease was spreading. He rejected the miasma theory and observed how clusters of disease were popping up among people who used certain water pumps.

His intervention of removing the infected pump handle helped decrease the rates of infection during the epidemic. Additionally, he was the first to use epidemiological methods to control the spread of disease.

7 Hookworm And Economic Development In The South


Hookworm is a parasite that lives in the human intestine, feeds on human nutrients, and can be transmitted through fecal matter. Hookworm can cause a rash and diarrhea, but hookworm disease can lead to more chronic symptoms. In the South during the early 1900s, hookworm disease slowly rose to epidemic proportions and resulted in lethargy, iron deficiency, and stunted growth.

Over time, symptoms of hookworm helped create stereotypes about Southerners being drawling, unindustrious, or lazy. After the epidemic was identified and efforts were made to prevent infection, the South saw more children enrolling in school, better crop prices, and a rise in income.

6 Tuberculosis’s Effect On Fashion


In the late 1800s, tuberculosis, an infectious disease of the lungs, had become an epidemic in the US and Europe. Since the disease was around for so long and killed very slowly, its qualities started to be romanticized in the Victorian era. Fashions characterized by being pale and slim became popular, and the disease itself became trendy.

When scientists learned more about the illness in the 1900s, they sparked some of the first major public health campaigns in the US. Hemlines for women’s dresses and skirts became shorter to prevent them from picking up tuberculosis on the street. Beards and mustaches were exchanged for a clean shave because of the possibility that bacteria could be living in facial hair.

5 Bubonic Plague And The Catholic Church

Photo credit: Henri Segur

Top 10 Crazy Ways Of Detecting Disease https://listorati.com/top-10-crazy-ways-of-detecting-disease/ https://listorati.com/top-10-crazy-ways-of-detecting-disease/#respond Sun, 03 Mar 2024 00:17:22 +0000 https://listorati.com/top-10-crazy-ways-of-detecting-disease/

Medicine is not usually like a House episode. There is no lone genius who pulls a diagnosis out of a hat having used just his wits and a rubber band to solve a case. Most medical work is a steady slog through a patient’s symptoms until the cause of a problem is found. That’s a good thing, as it’s the way that most people can be saved. But sometimes, the solution to a medical mystery can be solved in a most unusual way. Here are ten weird ways, both historical and modern, that diseases have been detected.

10 Dogs Sniff Out Disease

Dogs are man’s best friend, but we’re now learning that they’re also a doctor’s best helper. The sensitivity of dogs’ noses has found them roles in the armed forces, police, and hunting. Able to recognize trace scents at low concentrations, dogs can be trained to alert us nasally blind humans to the presence of a great many things. With bombs or drugs, this ability has long been used, but now hospitals and researchers are employing canine assistants to help with medical problems.

Many diseases cause metabolic changes in the body. This changes the levels of certain compounds or introduces new chemicals to the mix. Doctors rely on these changes when they run blood tests, but said tests often take some time to get results. The ideal test would be one with instant answers. Dogs may be able to supply them. Many of the compounds produced by a diseased body are volatile and can be detected by a nose—more specifically a dog’s nose.[1] Dogs have been trained to pick up the scent of various cancers, low blood sugar, and even the approach of seizures.

While dogs can smell out illness, don’t expect Dr. Fido at your next appointment. They may be sensitive, but training a dog to respond accurately is a time-intensive and expensive process. The canines can also grow bored and stressed if they do not detect a disease in any samples, much like medical students.

9 Tasting Urine


The tongue is a versatile organ, and before the discovery of accurate chemical analyses, it was one of a doctor’s most useful tools. It might strike us as a bit disgusting, but tasting a patient’s urine gave their medic an insight into what may have been wrong with them.

Around the sixth century BC, the Indian physician Sushruta wrote a description of “Honey Urine,” a disease marked by unusually sweet urine. He noted that ants would crowd around it and sampled it for himself.[2] Today, we know that the urine of untreated diabetics contains high levels of sugar. In the 17th century, an English doctor documented the same thing, describing a disease known as “the pissing evil,” which caused urine that was “wonderfully sweet as if it were imbued with honey or sugar.”

The sweetness of a diabetic’s urine could also be detected without resorting to tasting it. In one case, it was spotted when sugar crystals were noticed on the black shoes of a patient. They had formed as splashes of urine dried.

8 Rabbit And Frog Pregnancy Tests


For most of human history, women have generally had to wait many months to discover if they were pregnant. Only with the telling bump of their belly could they be certain. In the early 20th century, that changed with the development of rabbit and frog pregnancy tests.

It was found that when a woman is pregnant, her urine would contain a hormone called human chorionic gonadotropin (hCG). Testing directly for this would have been too difficult and costly to be a useful procedure at the time. However, when hCG is injected into a female rabbit, it causes telltale swelling and color changes in the rabbit’s ovaries.[3] So a woman’s urine would be injected into a rabbit to see if she was pregnant. At first, this required the death of the rabbit to allow the ovaries to be examined, though later, nonlethal methods were found.

The rabbit test was soon replaced with the frog test. The African clawed frog would have urine injected into it, too. If the frog produced eggs the next day, then the pregnancy test was positive, as hCG causes the frogs to ovulate. This test had the benefit of being cheaper than the rabbit test and had easier-to-recognize results. It was the standard pregnancy test into the 1950s.

7 Diagnosis By TV

In an episode of House, the medical marvel kidnaps the star of his favorite TV soap opera when he spots a symptom he feels needs investigating. Such a wacky event could never happen in real life, surely? Well, it did—minus the kidnapping.

Tarek El Moussa, host of home renovation show Flip or Flop, was diagnosed with cancer when a viewer wrote in. The viewer had been watching a marathon of the show and noticed that El Moussa had a lump on his throat.[4] Most viewers, like El Moussa himself, probably wrote it off as nothing more than one of those lumps and bumps we all have. But Ryan Reade was a nurse, and her concerns were justified.

El Moussa’s doctors discovered that the lump was thyroid cancer that had spread to his lymph nodes. El Moussa had the tumor removed and underwent chemotherapy. He later got to thank nurse Ryan on TV, crediting her with probably saving his life.

6 Ear Folds Can Reveal Heart Disease

In 1973, Dr. Sanders T. Frank wrote a letter to The New England Journal of Medicine, in which he drew attention to the link between patients with angina and a diagonal fold in their earlobes. This symptom is now known as Frank’s sign and has been linked to heart disease and strokes.[5] A recent study found that in people who have had a stroke, Frank’s sign was present in over 75 percent of cases.

There is no medical consensus as to why problems of the heart and blood flow should cause this diagonal fold. Some suggest that it is caused by problems in the arteries present in that area. Others contend that it is related to increased cell aging.

While Frank’s sign has only recently been recognized, historical cases have been found. Busts of the Roman emperor Hadrian show his ears with pronounced diagonal folds. How did Hadrian die? Ancient sources describe him as having symptoms that would today be linked to congestive heart failure.

5 Dementia Can Be Diagnosed From Writing

Iris Murdoch, the brilliant philosopher and novelist, succumbed to Alzheimer’s Disease, which left her confused and unable to remember anything. This cruel disease would eventually prevent her from writing any more, but it may have left clues which will allow others with the disease to be diagnosed earlier.

By studying Murdoch’s 26 novels, scientists were able to compare her use of language over time. While her style and construction of sentences remained the same, her vocabulary dwindled toward the end of her career.[6] Another author whose works have been studied, Agatha Christie, showed a 20-percent decrease in her vocabulary over the course of her career. It may be no coincidence that her late novel Elephants Can Remember is centered around a female novelist struggling with a failing memory.

4 Licking Cystic Fibrosis Patients’ Skin


“The child will soon die whose brow tastes salty when kissed.” So runs a rather morbid line from a 19th-century children’s almanac.[7] The book didn’t say why this should be so, but we now know that the cause of a salty-tasting child was most likely cystic fibrosis.

Cystic fibrosis is a fairly common genetic disease which affects the lungs, pancreas, and digestive tract. A mutation alters the way that many secretions are made in the body, creating a whole range of medical complications. One of the secretions which is changed is sweat. In people with cystic fibrosis, it is unusually salty. Many parents who kiss their child may have detected this difference.

In the 17th and 18th centuries, writers of folk wisdom declared that salty skin was a symptom of bewitching. They may have gotten the cause wrong, but the results were much the same. Before modern medicine, the lives of those with cystic fibrosis were universally short.

3 Sniffing Out Parkinson’s

We’ve already seen how capable dogs are at smelling diseases, but it turns out that the human nose may be just as useful in detecting medical problems. Joy Milne’s husband was a sufferer of Parkinson’s disease. This degenerative illness of the nervous system leaves people unable to control their movements. But six years before her husband was diagnosed with Parkinson’s, Joy knew something was different. She could smell it.[8]

Occasionally, she would smell something she described as a “musty” aroma from her husband, but she didn’t really think anything of it. It was only after joining a charity for Parkinson’s disease and finding that many others with the condition smelled the same to her that she mentioned it to scientists. Intrigued, they tested her. They gave her 12 T-shirts, six that had been worn by people with Parkinson’s and six worn by those without. The initial result was 11 out of 12. Joy identified all of the Parkinson’s patients but also swore that she could detect it in one member of the control group. Mere months later, that person was diagnosed with Parkinson’s disease.

The scientists are now trying to identify what the compound is that Joy Milne is able to smell in hopes of making a clinical test which will allow early detection of the disease.

2 Red Eyes In Photos

Many digital cameras these days come equipped with the ability to remove “red eyes” from photos. Red eyes are caused by the camera flash being reflected off the blood-rich retina at the back of the eye. It can be annoying, but it is, in fact, a sign of good health.

Tara Taylor posted a photo of her young daughter on Facebook, which showed her with one red eye and the other glowing yellow. Concerned friends pushed her to take her daughter to a doctor. The doctors discovered that the child had Coats’ disease, a buildup of cholesterol in the blood vessels of the eye that can cause blindness.[9] Thanks to catching it early, doctors were able to save her sight.

Retinoblastoma is a cancer of the retina which mostly affects children and can also be detected by looking at how the eyes react to a camera flash. While Coats’ disease looks yellow, retinoblastomas look white. So next time a picture of your child comes out with two red eyes, consider it a blessing.

1 Pregnancy Tests—For Men

We saw how rabbits and frogs were used to detect the hormone hCG to tell if women were pregnant. Modern pregnancy tests look for the same hormone. Given how rarely men get pregnant, you can understand how shocked one man was when he used a pregnancy test, and it came back positive.

It’s not known why the man used a pregnancy test he found in his cabinet, but when he got the surprising result, he told a friend. That friend, known only as “CappnPoopdeck,” decided to make her friend’s experience into a comic and posted it to Reddit. Along with the usual jokes, one commenter suggested that the man go to see a doctor. Men producing hCG may have testicular cancer. Doctors did indeed find a small tumor, and the man was treated while it was still at an early stage.

Not all testicular cancers produce hCG, but giving men pregnancy tests can be a useful way of detecting tumors at an early stage. A British teenager with an unknown cancer was advised to use a pregnancy test, which revealed his testicular cancer in 2015.[10] Now there is even more reason for men to fear a positive pregnancy test.

 

10 Unnerving Facts About The ‘Suicide Disease’ https://listorati.com/10-unnerving-facts-about-the-suicide-disease/ https://listorati.com/10-unnerving-facts-about-the-suicide-disease/#respond Fri, 16 Feb 2024 23:47:41 +0000 https://listorati.com/10-unnerving-facts-about-the-suicide-disease/

Trigeminal neuralgia (TN, aka the “suicide disease”) is considered to be the worst pain known to man and medicine. People with this rare condition experience excruciating pain along their fifth cranial nerve.

Some treatment options are available, but the condition is progressive and incurable. Although it is not a widely known disease, even non-sufferers will find certain facts about TN interesting and unsettling.

10 A Long History

Discussions about facial torture (tortura oris) can be traced back to the ancient Greek physicians Galen and Aretaeus of Cappadocia in the first century, and to the Persian polymath Avicenna in the 11th century. Even Hippocrates found versions of this facial pain a notable mystery in his writings.

The next reports of the condition appear in Somerset, England, in the 13th century. At the tomb of Bishop Button, one can see wall carvings of people depicted as suffering severe facial agony.

Historians suggest that this was a reference to what we now know as trigeminal neuralgia and not just toothaches. Button’s skeleton was later exhumed, and it possessed a set of nearly perfect teeth. Nevertheless, Button had been canonized and travelers came to give offerings at his grave in the hopes that the saint would relieve their toothaches.

Trigeminal neuralgia entered the mainstream medical world when the physician and philosopher John Locke described it in 1677. Nicolas Andre gave it its first medical name, tic douloureux, in 1756. Shortly after Andre’s study, John Fothergill wrote the first comprehensive description of the condition, which was then dubbed “Fothergill’s disease.”[1]

Fothergill identified it as a neurological condition rather than pains caused by the teeth, mouth, or tongue. Modern neurology now classifies it as trigeminal neuralgia, a reference to neuropathy of the fifth cranial nerve (the trigeminal nerve).

9 A Disease Of Many Names

One can’t deny the shock value of trigeminal neuralgia’s most famous and chilling colloquial name—the “suicide disease.” Many sufferers are dismayed to learn the term when they are newly diagnosed, although they are likely to relate to the gravity and seriousness of it.

Trigeminal neuralgia produces excruciating pain for the sufferer and is known as the worst pain a person can experience. Effective treatments have only recently been discovered.[2]

With no way to escape such unimaginable pain, more than 50 percent of sufferers were once believed to have committed suicide. Sufferers (and their loved ones) will be comforted to know, however, that there is no substantial evidence or statistic to support this claim.

Less worrisome monikers for trigeminal neuralgia include tic douloureux, Fothergill’s disease, prosopalgia, and trifacial neuralgia. The most common term used by sufferers and those who treat trigeminal neuralgia is simply “TN.”

8 Trigger-Happy Pain

So what triggers the facial spasms and electric-like bolts in a TN sufferer’s face? A better question might be: What doesn’t?

Those who suffer from this condition report a wide array of triggers. Commonly reported culprits include smiling, touching the face, brushing the teeth, brushing the hair, wind, eating and drinking, changes in temperature, shaving, putting on makeup, certain foods, loud noises, and kissing.

The list goes on, and sufferers often report that just the fear of triggering an episode can cause them to withdraw from their daily activities.[3]

7 The Dental Connection

“Yank ’em all out!”

This is likely the battle cry of most new TN sufferers before proper diagnosis. Like the pilgrims at the shrine of Bishop Button, TN patients appear to be suffering from dreadful toothaches. Their first course of action is to see a dentist to remove the offending teeth, not knowing that the real culprit is facial nerve pain radiating to the nerve endings in the jaw.

Many TN patients have teeth removed unnecessarily. They think the teeth are the cause of their pain, and dentists may acquiesce to their demands if they aren’t familiar with the condition or a proper diagnosis hasn’t been made.

This can be a sad, frustrating process for both patient and doctor as they learn that the pain persists despite the onward march to a full set of dentures before age 50. It is a common cautionary tale among veteran TNers when counseling new patients.[4]

The dental connection is not totally misguided, however, as research has shown that dental problems can often be the cause (rather than the effect) of TN facial pain.

Accidental or iatrogenic dental trauma is found to be the cause of nearly 40 percent of all cases of trigeminal neuralgia. When combined with trigger-inducing standard dental work and the ongoing perception of tooth pain, this can give new meaning to having a “dental phobia.”

6 Treating The Incurable

Given the long history of trigeminal neuralgia, it is surprising that viable treatment options for the condition have only been discovered within the last century. Once the disorder was linked to neurological origins, doctors were able to more effectively develop medical interventions.

Since it is a nerve condition, traditional painkillers like NSAIDs and opioids do little to touch the pain. Anticonvulsant (antiseizure) medications like gabapentin and Trileptal are the first line of defense.

They work in about 80 percent of the cases. However, patients report difficult side effects and often have to increase their doses over time to maintain the efficacy of the drugs. Other pharmaceutical options include Lamictal and Baclofen to augment the effects of the anticonvulsants.

When the medication doesn’t work on its own (and it often doesn’t), TN patients have a number of surgical options. The most common is microvascular decompression (MVD) surgery. The procedure grew out of Walter Dandy’s work in the 1920s and has since become the most popular of the TN surgical options.[5]

MVD is a brain surgery that involves separating the root of the trigeminal nerve from a compressing artery that is aggravating the nerve. This procedure is often quite successful for patients with Type 1 TN whose MRI images show that compression is the main culprit.

Long-term results of MVD are varied, with some patients reporting full relief but only for a limited time. This can mean multiple surgeries and the risk of debilitating surgical side effects such as anesthesia dolorosa.

For those patients with Type 2 TN or another condition called atypical face pain, MVD may not be the prescribed intervention. Other surgical options exist, such as rhizotomy, glycerol injections, and balloon compressions.

So where do all these options leave us?

At the same place we started. TN is an enigmatic and fierce foe to the neurological sciences. Some treatments work—but only for some and only for so long. Managing a treatment plan for TN is a full-time job. It takes a lot of trial and error, perseverance, and commitment from both doctor and patient.

The research is happening. So with some good luck (and a few good scientists), a cure or at least a long-term, reliable treatment may be just around the corner.

5 Triple Whammy

Trigeminal neuralgia is known to coexist with a few other dastardly houseguests. As if the feeling of your face being pierced by ice picks and electrocuted by lightning while your teeth feel weighted down by 10-ton dumbbells isn’t enough, now you’ve got to throw in the extra perks of a down-and-out trigeminal nerve.

The most commonly known comorbidity with TN is multiple sclerosis.[6] Scientists have yet to determine the nature of the connection or which comes first—the proverbial chicken or the egg. (However, it is noteworthy that 1–2 percent of MS patients have TN as their first symptom.)

Researchers have identified a chilling rate: 18 percent of women with TN also have a diagnosis of MS, and 2 percent of all MS patients have TN. To make matters worse, 5 percent of all TN–MS patients suffer from bilateral facial pain, a rare presentation, as TN usually makes its home on only one side of the face.

Another problem for TN sufferers is the co-occurrence of migraines and cluster headaches. The three conditions tend to travel together, although one is not thought to cause the others. Some doctors theorize that this may be due to the proximity of the structures involved in migraines and cluster headaches to the trigeminal nerve itself, resulting in an interconnected web of aggravation, pain, and secondary symptoms.

It is also important to note that a common side effect of TN medications is headache. Sometimes, the TN anticonvulsant therapies can help prevent migraines and other headaches, but many times, the migraines and headaches must be treated with their own therapies and interventions.

It wouldn’t be far-fetched to assume that a person’s mental health and well-being would be gravely affected by the ravages of this condition. Depression is a frequent companion of TN patients, and their doctors and caregivers should give attention to and support for this unfortunate outcome.

A person’s life changes drastically with trigeminal neuralgia: fear during the process of diagnosis (not knowing what is happening), adjustment to life with chronic pain, loss of the ability to do things once enjoyed, and an overall loss of hope.

Support groups (online and in person) are available to help sufferers adjust and to offer understanding and sympathy for their changing life circumstances.

4 Differential Diagnosis

The right diagnosis at the right time can make all the difference in a person’s ability to understand, treat, and cope with trigeminal neuralgia. To avoid the long, frustrating road of misdiagnosis, experts recommend these steps.

First, if you have tooth pain, see a dentist. Ask if he is familiar with trigeminal neuralgia. If he is, encourage a differential diagnosis before getting any teeth extracted. If he isn’t, leave.

Second, make an appointment with a neurologist as soon as possible. These are the folks who understand the condition and how to help you. It may take some time to see one in your area, but you can work on a temporary plan with your primary care physician in the meantime.

Third, if you have to go to the emergency room due to the severity of the pain, be prepared to have your pain questioned. ER docs aren’t TN specialists, so offer what you know about the condition. (Have it on a piece of paper because you’ll likely be in too much pain to talk.)

Also ask if they have a neurological consultant on hand. Remember, typical painkillers and opiates will only mildly help the pain. The first form of emergency intervention is an IV infusion of fosphenytoin, a prodrug of phenytoin (Dilantin). Ask the doctors about it. (Again, having the information on hand will really help you out when you’re in extreme pain.)

After you get a full workup from your neurologist, you can ask him to write an emergency care plan to bring to the ER with you to avoid any questions or confusion about what you need.

Finally, you may be able to diagnose yourself. A leading international researcher and neurologist on trigeminal neuralgia has created a self-diagnosis tool. He believes that it can guide anyone through a reasonably accurate diagnosis of face pain.[7] The tool is available online.

Once you’ve come up with your likely diagnosis, take it with you to doctor appointments. With any luck, it will help the diagnosis journey go more smoothly and you will feel empowered.

3 Famous Faces

Trigeminal neuralgia is indeed rare, but it has still affected some current celebrities and historical figures.

Social activist and writer Gloria Steinem has spoken out about her challenges with trigeminal neuralgia. Steinem describes the condition as excruciating. When it flares, it renders her speechless and unable to walk.

HR, the singer of the famed Rasta-punk band Bad Brains, underwent his own “bad brain” surgery to deal with his TN. Using GoFundMe, HR and his family raised $16,000 for the surgery, and he is reported to be doing well now.[8]

In 2011, Bollywood actor Salman Khan announced that he had trigeminal neuralgia and had flown to the United States to receive treatment. Khan used his position as a celebrity to spread awareness about the disease. He stated, “If there was a choice to give this pain to my worst enemy, I would not give it. They wouldn’t be able to take it.”

In 2015, Member of Parliament Andrea Jenkyns was chided by the media for being unable to finish her sentences in a public speech. Jenkyns spoke out against the reports by citing trigeminal neuralgia as the cause of her difficulty. She described the disorder as “excruciating” and “sporadic.”

2 Gaining Recognition

The facial pain and TN societies have fought long and hard for advanced research into and increased awareness of trigeminal neuralgia. On October 5, 2017, the United States House of Representatives passed House Resolution 558, which recognizes October 7 as National Trigeminal Neuralgia Awareness Day and expresses the government’s commitment and support toward finding an end to the disease.

The facial pain network is also seeking international recognition of October 7 as International Trigeminal Neuralgia Awareness Day through the World Health Organization. This proposal was submitted on July 1, 2017.[9]

Additionally, TN is gradually gaining more coverage in news stories, TV shows, and short films. As the condition enters the spotlight, sufferers will learn that they are not alone, researchers and scientists will be encouraged to search for a cure, those who live or work with TN sufferers will gain a greater understanding of the disorder, and the medical community will become better informed, reducing future misdiagnoses.

1 The Good (Forget The Bad And The Ugly)

It’s hard to believe that there is any good news after all the unsettling facts discussed above. But there is.

If you suffer from TN or know someone who does, there are many resources and “best practices” you can follow to make the journey a bit more bearable and hopeful. Check these out . . . experts say they are well worth it.

Connection: Get involved in the Facial Pain Association,[10] TNnME, and social media support groups.

Self-Care: Now more than ever, you have to be your No. 1. Know your triggers, and have a plan for dealing with them. Make a plan with your family as to how you will work as a group to deal with episodes. Find ways to relax and self-soothe, and commit to them.

Pain Management: Consider seeing a doctor who specializes in pain management. These specialists have alternative approaches to managing and coping with pain that many TN patients have found quite helpful.

Be Your Own Advocate: Know your condition from top to bottom—your exact diagnoses, your test results, your medications, your ER visits, and your doctors. Keep a file of all relevant medical information, and take it with you to all doctor and ER visits. Be the master of your medical records, and advocate for yourself!

Sometimes, Michael likes to write.

10 Disease Theories That Were Spectacularly Wrong https://listorati.com/10-disease-theories-that-were-spectacularly-wrong/ https://listorati.com/10-disease-theories-that-were-spectacularly-wrong/#respond Sun, 19 Nov 2023 16:54:25 +0000 https://listorati.com/10-disease-theories-that-were-spectacularly-wrong/

It was John Dewey who once said, “Every great advance in science has issued from a new audacity of imagination.” It is this “audacity of imagination” that led to the Moon landing, yielded antibiotics to combat deadly diseases, and put a computer in every home.

Modern medicine has advanced considerably in recent times, and our understanding of pathology has never been better. However, history shows that mistakes are all too often made in the pursuit of scientific achievement.

When it came to disease, even some of the most revered thinkers got it spectacularly wrong. As you might expect, such theories led to some rather baffling treatments. From lobotomies to bloodletting, scientists throughout the ages have offered some fairly crackpot therapies.

The more you know about the history of medicine, the more you question the legitimacy of current theories. What else have we gotten wrong? What more is there to discover? Only time will tell.

10 Female Hysteria

Scientists once used pseudoscience as a means of correcting hysteria in women. The theory dates back to ancient Egypt. Many great thinkers imagined that hysteria was brought about by the position of the uterus (aka “the wandering womb”).

The word “hysteria” derives from the Greek hystera (“womb”), via the Latin hystericus (“of the womb”). Smelly substances were often placed near the vagina to correct the problem. The ancient Greek physician Aretaeus thought the womb was repelled by and attracted to different fragrances. The scent of the substance used depended on whether the uterus had wandered high or low.

The medical fraternity’s understanding of hysteria turned stranger still. According to Greek mythology, the priest Melampus was said to have rid the virgins of Argos of their strange behavior. The daughters of King Proetus had gone mad and hallucinated that they were wandering cows. Melampus cured the women with roots of the hellebore flower and instructed them to make love to virile men.

And so the notion of a “melancholy uterus” came to pass. Prominent thinkers, like Plato and Hippocrates, believed the female uterus had its own moods. Lack of sex and reproduction were thought to make the uterus sad. An unhappy uterus, argued Hippocrates, was ultimately caused by a buildup of poisonous humors. These humors then migrated to other parts of the body and caused disease. Similar theories persisted from ancient Rome onward.[1]

According to US scholar Rachel Maines, theories surrounding hysteria led to the invention of the vibrator. In the 19th century, doctors were tasked with pleasuring women into a state of normality. It is said that doctors, bored with giving manual hand jobs, passed the responsibility on to midwives. (Other scholars disagree with Maines’s hypothesis.)

The electromechanical vibrator was originally invented in the late 1800s to massage muscles. Medical doctors decided it would be quicker to use the device to give women “hysterical paroxysms” (i.e., orgasms). Treatment times were slashed from around an hour to just 10 minutes.

9 Trepanning And Evil Spirits

Today, the practice of drilling a hole in one’s head to treat mental health problems is a tough sell. But that was not always the case. From Neolithic times to ancient Greece, numerous civilizations used a procedure called trepanation to combat disease. Trepanation involves making a hole in the human skull to remedy some perceived ailment.

During Paleolithic times, primitive tribes used trepanation to expel evil spirits from the body. In reality, the symptoms witnessed probably stemmed from mental illness. Skull fragments from the operation were highly sought after. Shamans would fashion amulets from the fragments in the hopes of fending off demonic possession.

The warring tribes of South America put the procedure to slightly better use. They used trepanation to treat traumatic head injuries. Today, modern surgeons use a refined form of trepanation to alleviate intracranial pressure. So perhaps there was some method to their madness.

Even now, a few brave souls use trepanning techniques to alter the flow of blood and cerebrospinal fluid in their heads. (N.B.: Do not try this at home unless you enjoyed the ending to One Flew Over the Cuckoo’s Nest.)

Amanda Feilding, founder of the Beckley Foundation, performed self-trepanation in the early 1970s. She believes that “stagnant pools” of toxins contribute to diseases like Alzheimer’s. Feilding ran for parliament in the UK twice on a platform of providing “Trepanation for the National Health.” She received few votes.[2]

8 The Elixir Of Life

They say two things are certain in life: death and taxes. But it seems that the elites of ancient China were obsessed with avoiding the former. In a bid to find the elusive “elixir of life,” they put their faith in alchemists. Over 2,000 years ago, the very first emperor of unified China, Qin Shi Huang, ordered his men to find a potion that would make him immortal.[3]

In what can only be described as an epic miscalculation, alchemists gave the emperor his elixir: mercury. As we now know, mercury only serves to bring about the recipient’s speedy demise. Historians believe the emperor was poisoned after consuming an unhealthy dose of mercury sulfide. He died at the not-so-immortal age of 49. Despite this obvious failure, alchemists continued their work. Many of them died toiling over their elixirs.

Before his passing, Qin Shi Huang ordered the creation of his Terracotta Army. These inanimate warriors were placed in the emperor’s enormous burial chamber to protect him in the afterlife. Ironically, archaeologists think Qin Shi Huang’s tomb is surrounded by a river of mercury.

Qin Shi Huang was not the only emperor to succumb to the temptation of quicksilver. Emperor Xuanzong of Tang was given an elixir derived from a mercury ore (cinnabar). He developed classic symptoms of mercury poisoning, including itching, muscle weakness, and paranoia.

The alchemists argued that these symptoms were a mere blip on the road to immortality. Of course, the emperor died shortly after. A number of Xuanzong’s predecessors died taking similar elixirs, including emperors Muzong and Wuzong. Emperor Muzong suspected something was up, so he made his alchemists consume their own poisonous concoctions. Muzong’s wisdom did not last long. He, too, became obsessed with elixirs and poisoned himself.

7 Miasma Theory

The miasma theory was proposed to explain the spread of disease. Before the germ theory came to pass, scientists thought that atmospheric impurities (“miasmata”) were the primary cause of disease. Plague doctors were illustrative of this theory in action. These frightening characters wore beak-shaped masks that were designed to keep foul-smelling miasmata away. The masks were packed with aromatic herbs to stop doctors from inhaling “bad air.”

In Victorian England, Edwin Chadwick put forward the miasma theory to explain London’s cholera epidemics. Meanwhile, Florence Nightingale argued that outbreaks of measles, smallpox, and scarlet fever were caused by building houses too close to smelly drains.

An anesthetist called John Snow refuted the miasma theory. Snow said that cholera was transmitted via polluted water, not bad air. This was a controversial hypothesis for the time.[4]

Snow observed that certain parts of London were more likely to experience cholera outbreaks than others. He realized that some of the local water companies filtered and purified their water, while others did not. All the companies took their water from the Thames—a swirling cesspit of refuse, effluent, and general despair. (Some things never change.)

Regions with high levels of cholera often received unpurified water from especially dirty parts of the Thames. Snow also discovered a link between the spread of waterborne diseases and the city’s inadequate sewage system. One major outbreak was caused by a cholera-riddled diaper that had been dumped in a leaky cesspit. The disease took hold when water from the cesspit contaminated a nearby water pump.

Louis Pasteur’s germ theory, developed in the 1860s, vindicated Snow. The identification of the bacterium Vibrio cholerae was the final piece of the puzzle. The miasma theory, which dated back to the time of Hippocrates, was finally put out to pasture.

6 Tooth-worm

Dental caries are no joke. That was especially true in Babylonian times, when the Legend of the Tooth-worm first appeared. Thereafter, a number of ancient civilizations thought that wriggly worms were responsible for cavity-related pain.

The theory goes that a nasty worm would bury itself in the tooth. Its wild movements inflicted great pain on the sufferer. Only once the worm tired and ceased its thrashing would the pain subside. Some civilizations thought that the creature was actually a demon taking on the guise of a worm.

Fumigations and extractions were popular treatments for tooth-worm. Scribonius Largus, the physician to the Roman emperor Claudius, performed fumigations with henbane seeds. It was said that the resultant fumes would repulse the pest. During the 17th century, a number of charlatans conned patients into thinking they had tooth-worm. The practitioners would only pretend to extract worms. In reality, they were simply removing pieces of lute string.[5]

Roman philosopher Pliny the Elder is worth a brief mention. Pliny’s cure for toothache involved capturing a frog by moonlight, spitting in its mouth, and saying, “Frog, go, and take my toothache with thee!”

In 1728, Pierre Fauchard published a two-volume book, Le Chirurgien Dentiste (The Surgeon Dentist). Described as the “father of modern dentistry,” Fauchard debunked the theory of the tooth-worm and recommended that patients reduce their sugar intake.

5 Ulcers And Stress

Until recently, practitioners and researchers were united in their belief that ulcers were caused by stress and excess stomach acid. Scientists who were skeptical of this entrenched theory were the subject of ridicule.

So, in 1984, Barry Marshall set out to make a point. The Australian gastroenterologist was convinced that ulcers were the result of a bacterium called Helicobacter pylori. He was so convinced that he started experimenting on himself.

His colleague cooked up a delicious broth of H. pylori, which Marshall then drank. Now a miserable vomit-sprinkler, Marshall was diagnosed with acute gastritis. He cured himself with a simple course of antibiotics. The theory behind stress-induced ulcers was beginning to crumble.[6]

However, Marshall and his colleagues faced considerable pushback from the medical-industrial complex. A number of big drug companies were worried that antibiotics would make their products redundant. “Because the makers of H2 blockers funded much of the ulcer research at the time, all they had to do was ignore the Helicobacter discovery,” explained Marshall.

For the longest time, the idea that bacteria could survive in such an acidic environment was laughable. But scientists soon discovered that Helicobacter could effectively neutralize the acid around it.

Researchers now think that 80 percent of gastric ulcers are caused by the bacterium. Barry Marshall and colleague Robin Warren won the 2005 Nobel Prize in Physiology or Medicine for proving their peers totally wrong.

4 Corpse Medicine

Corpse medicine was the practice of using human remains to treat illness. The part of the body consumed depended on the ailment. “Like cures like,” its proponents argued. Therefore, nosebleeds and epilepsy were often treated with bits of skull, while superficial wounds were wrapped in fat-soaked bandages.

Europe’s rich and famous were pigging out on human bodies during the 16th and 17th centuries. The continent was rife with cannibalistic gravediggers looking for a quick buck. Egyptian tombs were looted of their mummified inhabitants and used to treat bruises and bleeds.

Even royalty was at it. England’s King Charles II was partial to a little alcohol and ground skull (aka “The King’s Drops”). The king would tootle off to his own laboratory and brew up a batch himself.[7]

Another form of corpse medicine was seen in 19th-century Denmark. Public executions were attended by blood-lusting spectators, many of whom brought their own cups.

In 1823, Hans Christian Andersen described witnessing a man feed the blood of an executed felon to a child. The blood was used as a treatment for epilepsy. Blood was referred to as the “elixir of life” throughout the Middle Ages (marginally better than mercury), and virgin blood was used to cure leprosy.

This “medical vampirism” dates back to ancient Rome. Numerous civilizations thought that human blood carried the soul. Drinking blood, they theorized, could stave off illness and afford new strength. It was this mystical belief that compelled the Romans to drink the blood of gladiators killed in the arena.

3 The Four Humors

Knowledge of anatomy and medicine soared under the physicians of ancient Greece. Dissections and vivisections provided doctors with fresh insight into the body’s inner workings.

Galen found that the brain controlled movement via nerves. Herophilus distinguished between veins and arteries. A number of prominent philosophers drew a connection between disease and the environment. And a biological trigger of disease replaced the supernatural. However, one deeply flawed theory went uncontested: the four humors.

Ancient Greek medicine was heavily influenced by Hippocrates. His theory on humoralism supposed that the body was made up of four fluids: blood, phlegm, black bile, and yellow bile. An imbalance in these fluids, or humors, would lead to disease. The four humors were also associated with an individual’s mental state. For example, a patient was melancholic if he had too much black bile.[8]

But where did the idea of these humors come from?

Well, the ancient Greeks were likely pouring blood samples into glass containers and leaving them to coagulate. After some time, this sample would separate into four distinct layers: red, white, black, and yellow. This is perhaps what they thought of as humors.

However, the Greeks may have taken inspiration from the four elements: earth, air, fire, and water. It was also widely accepted that these humors were somehow connected to the four seasons and planetary alignment.

Changes to diet and lifestyle were often recommended to redress the balance. The Greek physician Galen was a proponent of bloodletting to get rid of excess blood—what he considered to be the dominant humor.

Bloodletting continued under the barber-surgeons of medieval Europe who thought the practice could cure smallpox and epilepsy. Humoralism persisted throughout the West for thousands of years. Historians suspect that George Washington’s faith in bloodletting may have contributed to his demise in 1799.

2 Urine Therapy

Simply put, urine therapy involves using urine to combat disease. Those in support of the practice extol its apparent healing qualities. Books about urine therapy wax lyrical about the “elixir of life,” “the golden fountain,” and “liquid gold.” While most qualified doctors view urine as a waste product, urine connoisseurs claim that the liquid is a distilled product of the blood (aka “gold of the blood”).

Urine has been used throughout history with alarming frequency. Thomas Vicary, Henry VIII’s surgeon, advised cleaning battle wounds with urine. The 17th-century chemist Robert Boyle instructed patients to drink “a moderate draught of their own urine” in the morning. On the recommendation of George Thomson, urine was used to combat the deadly bacterium responsible for the Great Plague.[9]

A quick perusal of the Internet reveals that urine therapy is something people still do today. Hundreds of thousands of people in China are said to drink urine. A surprising number of athletes have also resorted to guzzling down their own juices, including MMA fighter Luke Cummo and boxer Juan Manuel Marquez.

Madonna famously told David Letterman that urine was a cure for athlete’s foot. Some desperate teens have taken to slapping urine on their pustulous faces, while others are brewing up their own urine-based teeth whiteners.

For obvious reasons, there remains little research on many types of urine therapy. But doctors are adamant that drinking pee is a bad idea. The practice has no health benefits and can lead to dehydration. Cleaning your wounds with urine is also a bad idea. New research shows that urine is not sterile, as was once thought to be the case.

1 Powder Of Sympathy

Sir Kenelm Digby was a man of science, philosophy, and reason. But, like many of his 17th-century contemporaries, Digby had a keen interest in alchemy and astrology. The Englishman championed the strange notion that applying a treatment to the weapon that caused an injury would heal the wound itself.

This miracle cure was called the “powder of sympathy.” Digby’s theory was delivered to top academics at the University of Montpellier. The speech lasted two hours and boasted of endorsements from King James.

Digby’s belief in the treatment came after experimenting on his friend James Howell. The writer was wounded while trying to stop a duel in England. In this instance, the powder of sympathy was tested on Howell’s blood-soaked bandage.

The bandage was then removed and kept separate from the wound. The treatment reportedly gave Howell “a pleasing sense of freshnesse” and a new lease on life. However, today’s scientists know better. His recovery was likely the result of good fortune and the placebo effect.

According to Digby, a Carmelite monk taught him the weapon salve. The potion was supposed to work on the basis of “sympathetic magic.” Proponents argued that a weapon would form some kind of connection to the human body after drawing blood. Digby and his colleagues believed that atoms of the lotion were attracted to the wound via some form of magnetism.

The powder of sympathy garnered considerable attention. There were 29 editions of Digby’s book, A Late Discourse . . . Touching the Cure of Wounds by the Powder of Sympathy. The potion was sold in many apothecaries throughout 17th-century Europe. Even the likes of John Locke and Thomas Sydenham lauded the bizarre treatment.

Digby’s love for the supernatural did not end there. He also had a keen interest in palingenesis, a form of “biological rebirth.” He hoped that the technique would resurrect life from the crystallized ashes of plants and animals.

Some scholars suggested that Digby’s attempts at resurrection were related to an obsession he had with his dead wife, Venetia. Rumor circulated that Digby had accidentally killed Venetia by giving her large quantities of “viper wine.”[10]

]]>
https://listorati.com/10-disease-theories-that-were-spectacularly-wrong/feed/ 0 8576
10 Horrifying Facts About The Human Form Of Mad Cow Disease https://listorati.com/10-horrifying-facts-about-the-human-form-of-mad-cow-disease/ https://listorati.com/10-horrifying-facts-about-the-human-form-of-mad-cow-disease/#respond Mon, 11 Sep 2023 07:06:00 +0000 https://listorati.com/10-horrifying-facts-about-the-human-form-of-mad-cow-disease/

Known as the human form of mad cow disease, variant Creutzfeldt-Jakob disease (vCJD) is an incredibly rare yet terrifying illness. Scientists currently do not know as much as they would like to about this condition.

But what they have discovered is fascinating and has led to increased health regulations for cattle meat production. For example, the US Department of Agriculture requires that all materials from the spinal cord and brain be eliminated from high-risk cattle. These products are not permitted in the US food supply.

Nevertheless, the facts about the human version of mad cow disease are truly horrifying.

10 It Is A Prion Disease

Prion diseases (aka transmissible spongiform encephalopathies) are rare brain disorders that alter the structure of a patient’s brain and can be transmitted from one organism to another. Prion diseases occur in both humans and animals, and vCJD is one of the deadliest human forms.

All humans have prion proteins in their brains, but the normal function of these proteins is not yet fully understood by scientists. In vCJD, these normal prion proteins are corrupted by a diseased prion that is transmitted into the victim.

The infected prion will attach itself to the victim’s prion proteins and bend them into an abnormal shape. These newly infected prions will destroy brain cells and create gaps in brain tissue that cause the brain to resemble a sponge. As more prion proteins become infected, the victim will experience brain damage and other psychiatric symptoms.[1]

9 It Is Transferred Directly From Cows

Humans almost always acquire the abnormal prion that causes vCJD by eating the meat of a cow with bovine spongiform encephalopathy (aka mad cow disease or BSE). Mad cow disease is similar to vCJD in that BSE is also caused by an abnormal prion transferred to the cattle.

It is not clear how mad cow disease began, but it is believed to have started with a cow consuming cattle feed that contained the meat of a sheep infected with scrapie (another prion disease). Scientists agree that mad cow disease originally spread in the UK when the meat of infected cattle was fed to young calves. Researchers have been confident of the relationship between the two diseases ever since the first cases of vCJD were reported in the UK, the same place where mad cow disease began.

UK citizens were most likely to be exposed to infected meat between 1984 and 1986, and the first reported cases of vCJD appeared between 1994 and 1996. This timing makes sense because the average incubation period of the diseased prion in both cows and humans is about 10 years.[2]

In rare cases, a person has contracted vCJD from a blood transfusion from someone who has the disease. But it is far more often caused by the meat of cattle infected with mad cow disease.

8 It Is A Form Of A Larger Disease

vCJD is only one form of the more general disease known as Creutzfeldt-Jakob disease (CJD), which is defined as a disease caused by an infectious, transferable prion. CJD is divided into four specific forms, which are identical except for the way the prion is passed to the victim. vCJD is one of these forms.

Sporadic Creutzfeldt-Jakob disease (sCJD) is the most common form of CJD, but it is also the most mysterious. In sCJD, one of the normal prion proteins in the victim’s brain changes spontaneously into an infected one. Scientists do not fully understand the cause of sCJD, but they believe it is more common in people with a specific version of the prion protein gene.

The third form is familial CJD, which is caused when a child inherits a gene abnormality that produces misshapen proteins. The symptoms of this disease usually do not begin until age 50, so parents with the mutated gene may not know they have it when they choose to start a family.

The last form of CJD is called iatrogenic CJD, and it is incredibly rare. In this form, the diseased prion is transferred when a medical instrument used on someone with CJD is not properly cleaned and is then used on another person.[3]

With all four of these terrifying diseases under the umbrella of CJD, many scientists have called Creutzfeldt-Jakob one of the most devastating brain disorders in existence.

7 It Created Another Disease

Kuru is another terrifying prion disease that spread through the tribal areas of Papua New Guinea. The disease was found in tribes that practiced a form of cannibalism where they ate the brains of their dead relatives as a mark of respect.

The tribes stopped this practice in the early 1960s, but cases of kuru continued because the disease has a long incubation period (at least 10 years). Interestingly, what happens in the brain of a person with kuru is similar to what happens with vCJD: a similarly misshapen prion begins infecting the brain’s proteins.

Scientists conducted research to determine why cannibalism caused a prion abnormality and came to a shocking hypothesis: Kuru originated from tribe members eating the brain of a person with Creutzfeldt-Jakob disease. Scientists confirmed this theory by comparing kuru prions with prions in all four forms of CJD and concluded that kuru prions have transmission properties very similar to those in CJD.

This means that a deceased member of a tribe in Papua New Guinea had CJD and nobody knew it. So they ate his brain and created a new form of prion disease. Scientists have not yet determined which form of CJD the diseased tribe member had, though sporadic CJD is the most likely candidate, as kuru emerged decades before vCJD was first identified.[4]

6 There Are No Viable Treatment Options

Unfortunately, there is no known cure for vCJD. A victim of this disease survives, on average, only 13 months from the onset of symptoms. Many drugs, such as amantadine and pentosan polysulphate, have been tested on vCJD victims, but none have proven to help consistently.

However, these medications have helped individual victims of this disease. In one reported case, a 22-year-old vCJD patient was treated with pentosan polysulphate (a drug meant to slow the destruction of brain cells) and survived for 51 months after his first symptoms.

This is much longer than the 13-month average survival period, which suggests the treatment could work for more victims with further research. Currently, though, treatment is focused on keeping the patient comfortable and alleviating the symptoms of vCJD.[5]

5 It Causes Awful Psychiatric Symptoms

The symptoms of vCJD are mostly the same as those of the other forms of CJD. This is because the abnormal prion is the same in every form; the only difference among them is how they are transmitted.

As this diseased vCJD prion destroys the cells of a patient’s brain, it causes many symptoms related to the nervous system. First, a patient will experience problems with muscle coordination, such as involuntary muscle jerking and spasms.

As the brain deteriorates, the victim can experience numerous symptoms, such as hallucinations, forgetfulness, and, in some cases, blindness. Within a year after the first symptoms begin, the patient will lose the ability to talk or move and will enter a coma. Many patients contract pneumonia or other infections while in this coma, and that is usually what kills them.

Although these symptoms are similar to the other forms of CJD, vCJD is different because it affects younger people. The average age of a vCJD victim is 28 years, while sporadic and inherited CJD affect mainly middle-aged and elderly people.[6]

4 A Diagnosis Can Only Be Confirmed After Death

As vCJD is such a rare disease, it can be challenging for doctors to officially diagnose a person with it. If a doctor suspects that a patient has a form of CJD, the first step is to run standard tests to make sure that the patient does not have a more common mental disorder. These tests include spinal taps and MRI scans, which may detect the type of abnormality that is common in CJD.

However, the only way to confirm that a patient has a form of CJD is a brain biopsy or autopsy. Although a biopsy is conducted while the patient is still alive, it is not recommended for CJD victims because it can be dangerous and does not always sample tissue from the affected region of the brain.

Unless a person with CJD wants to take the risk of getting a biopsy, doctors can only confirm a CJD diagnosis by examining the patient’s whole brain in an autopsy after his death. Luckily, scientists are trying to change this by creating new tests that could help diagnose CJD.

For example, the National Institute of Neurological Disorders and Stroke has developed a test that looks for a specific protein in the brain that causes neural degeneration. They do this by examining the spinal fluid of a patient. With further research, this test could lead to a way to confirm a CJD diagnosis before death.[7]

3 It Is Extremely Rare

Worldwide, CJD affects around three people out of every million. This already classifies it as an extremely rare disease, but that figure is for CJD as a whole. vCJD makes up only a small fraction of CJD cases, and there have been only 231 cases of vCJD reported since it was first identified in 1996.

Out of the 231 known cases, 178 are from the UK, which makes sense because that is where mad cow disease emerged in cattle. In the US, there have been only four documented cases of vCJD, and in two of them, the mad cow disease prion was likely acquired while the patients were traveling in the UK.

In three out of the 231 reported cases, the vCJD-infected prion was transmitted through a blood transfusion. This means that 1.3 percent of all known vCJD cases were not caused by eating diseased cattle but by transfusion from a person who already had vCJD. Scientists consider this a rare overlap of vCJD and iatrogenic CJD and now urge people with any form of CJD to avoid donating blood.[8]

2 It Has Spread Worldwide

Despite being such an uncommon disease, vCJD has spread beyond the UK over the past few decades. We already know that 178 of the confirmed vCJD cases come from the UK and four come from the US. But the remaining 49 come from 10 different countries.

In order from most cases to fewest (as of 2017), there have been 27 cases from France, five from Spain, four from Ireland, three each from the Netherlands and Italy, two each from Portugal and Canada, and one each from Japan, Saudi Arabia, and Taiwan.

Although the UK has the vast majority of cases, this may not be true for long. vCJD cases in the UK peaked in 1999 and have been declining since then. But the opposite seems to be happening in other countries.

Although the UK has far more reported cases of vCJD overall, Portugal has recently had a higher rate of vCJD cases relative to its number of mad cow–infected cattle. This means a larger share of Portugal’s diseased cattle entered the food supply and infected humans, so if the UK’s infection rate continues to decline, Portugal could eventually overtake it in new confirmed cases.[9]

1 Awareness Is Increasing

Luckily, many steps have been taken in the US and UK to prevent vCJD from spreading. For example, strict guidelines have been put in place to stop meat from cattle with mad cow disease from entering the human food supply. These guidelines include banning certain animal feeds and requiring BSE testing for all cattle over 30 months old. (Mad cow disease is rare in younger cattle.)

In the US, the Centers for Disease Control and Prevention (CDC) has taken many steps to monitor the spread of this disease. They review all CJD cases in the US and analyze them to try to find patterns that could help scientists create new treatments.

In 1997, the CDC created the National Prion Disease Pathology Surveillance Center, which conducts advanced tests, including brain autopsies, for prion diseases. Although vCJD and the other forms of CJD have not been eradicated, these steps by scientists and doctors have helped to limit their effects.[10]

J.J. Grover enjoys writing lists, playing mahjong, and eating waffles.

]]>
https://listorati.com/10-horrifying-facts-about-the-human-form-of-mad-cow-disease/feed/ 0 7541
10 Awesome Skills People Got From Injury And Disease https://listorati.com/10-awesome-skills-people-got-from-injury-and-disease/ https://listorati.com/10-awesome-skills-people-got-from-injury-and-disease/#respond Thu, 07 Sep 2023 06:48:49 +0000 https://listorati.com/10-awesome-skills-people-got-from-injury-and-disease/

For most people who develop medical conditions due to unforeseen diseases or accidents, life tends to get more difficult. No matter how severe the condition, they have to give up part of their former lives to accommodate it.

In some cases, though, the exact opposite happens. Throughout history, a few people have acquired extraordinary skills because of their medical conditions, and science still doesn’t quite understand how it works. Even though it’s a bad idea to go out and try to replicate these cases to improve your own life—it only happens in rare instances—they provide us with valuable insight into how the human brain really works.

10 Franco Magnani

Born in 1934, Franco Magnani grew up in a small village in the Tuscany region of Italy called Pontito. Like many places in Europe at that time, it was destroyed by the Nazis in World War II, and it still hasn’t recovered. After working there as a woodworker, Magnani eventually moved to San Francisco in his thirties.

Soon after his relocation, he was stricken by a mysterious illness that gave him powerful, delirious hallucinations about his hometown. When he recovered, he found that he had developed an exceptional ability to draw anything from memory, especially scenes from his early life in Pontito.

He eventually got so good at it that he’s now known as the “memory artist,” and we’re not using the word “memory” as an exaggeration. Magnani produced his best paintings without ever going back to the village after he moved to San Francisco, and they still look quite realistic.[1]

9 Derek Amato

Learning a musical instrument is no easy feat as anyone who has tried it just to look cool would tell you. Depending on the instrument, it may take months—even years—to achieve proficiency at it and even more time to master it. For Derek Amato from Denver, though, all it took was a bad fall.

It was around 12 years ago that he accidentally fell headfirst into the shallow end of a swimming pool, directly injuring his head. For most of us, that would mean a few days off from work at best and a debilitating lifelong medical condition at worst. Derek Amato, however, came out of it with a newfound ability to play the piano.

Apparently, the injury made him see black and white squares in his head, which he is somehow able to translate into piano notes. That’s the only type of notes he can understand as he is unable to read traditional notations like other musicians.[2]

8 Ken Walters

Ken Walters was a successful, happily settled engineer in 1986 when his life took a turn for the worse. While he was working on a farm, a forklift truck driven by a 12-year-old accidentally pinned him to a wall. The accident caused massive spinal and internal damage and left him confined to a wheelchair for 19 years. However, this is not the injury we’re talking about.

As if things weren’t bad enough, he suffered a stroke in 2005 and was taken to the hospital. At first, he couldn’t even speak properly and had to communicate with the hospital staff through notes. While writing one of those notes, he realized that he could now doodle, which was particularly surprising to him as he had never been good at any type of art before the stroke.

The stroke fundamentally changed something in his brain, and he soon started making art for a living. In addition to Walters being contacted by companies like IBM, EA, and Java for his artwork, his pictures have been featured in magazines and art galleries around the world.[3]

7 Leigh Erceg

Some medical conditions are so rare that they only affect one person in the entire world—as far as we know. Apparently, Leigh Erceg is the only one in the world ever diagnosed with acquired savant syndrome and synesthesia (both are rare conditions on their own) following a brain injury.

In 2009, she fell down a ravine while working on her family farm and badly injured her spinal cord and brain. Due to the severity of the accident, she has no memory of anything before the injury and even has trouble remembering world events like the fall of the Berlin Wall. The injury also erased her ability to feel emotions, a condition doctors call “flat affect.”

On the other hand, she gained an extraordinary flair for art and physics. Her home is now filled with art done with Sharpies as well as boards full of complex mathematical formulas that most of us would have trouble reading, let alone understanding.[4]

6 Eadweard Muybridge

Perhaps the most famous person on this list, Eadweard Muybridge has multiple accomplishments that you may have heard of. Apart from pioneering some of the earliest techniques in photography, he was also the first person to make a motion picture.

What may not be common knowledge, however, is how he got those talents. According to later research by a psychologist at UC Berkeley, it was because of an injury from a serious stagecoach accident in 1860. Muybridge went into a coma for a few days, then suffered intense visions and the loss of hearing, taste, and smell for three months. Following his recovery, he moved to England and started his long and illustrious career as a photographer.

Muybridge may be one of the earliest-known examples of someone with acquired savant syndrome—a condition in which the patient acquires extraordinary abilities following a disease or brain injury.[5]

5 Jim Carollo

Despite high school’s best efforts, most people are bad at mathematics. There’s just something about numbers and calculations that doesn’t come naturally to everyone, and those who are good at it usually end up having hugely successful careers in specialized fields. For Jim Carollo, though, it was only a matter of going through a traumatic brain injury.

At 14 years old, he was involved in a severe auto accident and spent some days in a coma. The injury was so serious that the doctors thought he wouldn’t make it. Not only did he survive and fully recover within a few months, but he also developed a skill for math.

He scored a perfect 100 on his next geometry test without studying, which was surprising because he had never been particularly good at math before the accident. He became especially proficient at remembering numbers and can now recite phone numbers, credit card numbers, old locker combinations, and the first 200 digits of pi from memory.[6]

4 Lachlan Connors

Not everyone is born with a talent for music. However, Lachlan Connors was so bad at it that he couldn’t even remember nursery rhymes like “Twinkle, Twinkle, Little Star.” He actually wanted to make a career in lacrosse, which may be because he was a kid at the time and didn’t know that there is no real career in lacrosse.

It all changed after he suffered a few injuries to his head while playing. What were just concussions and presumably swollen bumps the first few times soon turned into serious epileptic seizures and hallucinations. Due to the severity of his condition, his doctor advised him against playing anymore.[7]

The seizures and hallucinations eventually subsided, though they were puzzlingly replaced by something he never had before—the ability to play musical instruments. We aren’t just talking about the guitar, either. He could now effortlessly play a wide range of instruments like piano, ukulele, mandolin, harmonica, and bagpipes.

3 Pip Taylor

Drawing is something we all love to do, even though most of us aren’t naturally gifted at it. It’s one thing to make a variety of smiley faces when you can’t concentrate in class, but it’s a wholly different thing to be good enough at it to convince someone to pay you for it.

It’s a difference that Pip Taylor—a middle-aged woman from Liverpool, England—knew only too well. She grew up with a passion for drawing but was so bad at it that even her teacher advised her against taking it up as a career.[8]

Then she fell down a flight of stairs and cracked her head in 2012. She was amazed to discover that she could now draw realistic copies of almost anything. The doctors were stumped and couldn’t really explain it. However, they admitted that brain injuries can sometimes rewire the brain into developing extraordinary skills.

2 Sabine

Due to advancements in medicine in the past few decades, we take some diseases for granted. What may be curable with just a doctor’s visit and a few medicines today used to be a definite death sentence at one point. Typhoid was one such disease and wreaked havoc on anyone who was unlucky enough to contract it before we could find a cure for it.

Sabine—a six-year-old in 1910—was left blind and mute by the disease. Although she regained some of her speech abilities in the following months, her brain stopped developing like a normal child’s. At the time, some presumably insensitive doctors even called her an “imbecile.”[9]

All those problems aside, though, the disease also left her with a sort of superpower—the ability to do calculations with ridiculously large numbers like it was nothing. She became especially good at squaring any number they threw at her in a matter of seconds.

1 Ric Owens

One of the biggest misconceptions about head injuries is that they’re supposed to hurt a lot and, in cases of severe damage, you should black out. Even if those things do happen quite often, the reality is that serious injuries affect everyone in a different way.

In 2011, Ric Owens was a professional chef with a successful career. When his car was hit by a big rig on the highway, he didn’t think much of it as he didn’t have any immediate symptoms. Within a week, though, he started getting migraines and his speech began to slur. He was soon diagnosed with post-concussive syndrome and found that he was no longer interested in making food. Instead, he was now into abstract geometric art.[10]

Although neither he nor the doctors could explain how it happened, he could suddenly make art with anything he found around the house. He now has about 100 pieces lying around, made with everything from ceiling tiles and pallets to lamps and glass.

You can check out Himanshu’s stuff at Cracked and Screen Rant, get in touch with him for writing gigs, or just say hello to him on Twitter.


]]>
https://listorati.com/10-awesome-skills-people-got-from-injury-and-disease/feed/ 0 7467