Misconceptions – Listorati

10 Common Misconceptions About Food Origins
Sun, 23 Feb 2025

There are thousands of great dishes in the world, so it is not surprising that many people have mistaken impressions of where certain cuisines come from or where they are popular. Many dishes that we think come from one country either do not originate there or are served there in a completely different manner. Some foods that we consider extremely popular in certain countries, or even believe to be national dishes, may loosely originate in those countries but are rarely eaten there. Below are ten common fallacies regarding the origins of foods.


Misconception: French Fries originate in France.

French fries are incredibly popular all over the world, but their origin really isn't French. While the French have tried to claim them in the past, the truth is that they were invented by the Belgians. They remain quite popular in Belgium, though some of the folklore around them is a little fantastic. The stories say that some Belgians would often fry very small fish, and when they had no fish, they would cut potatoes into the shape of small fish and fry those instead. The tale may sound far-fetched, but all good folk tales do.


Misconception: Chimichangas are a Mexican food.

Many people would name chimichangas if asked to list Mexican foods, but they aren't really a Mexican food at all. The truth is that chimichangas fit more into the category of "Tex-Mex." However, this deep-fried burrito does not originate from Texas either. It is likely that the state of Arizona invented the chimichanga, and lawmakers there have considered making it the official state food.


Misconception: Egg rolls are a Chinese food.

The egg roll eaten in many parts of the Western world isn't really much like its closest Chinese counterpart. The egg rolls Westerners know were created by Chinese immigrants to America, who used what they had on hand to make something that would still seem familiar to them but was meant to appeal to Americans. The thick, hard-shelled egg roll is nothing like the actual spring rolls you will find in Hong Kong or elsewhere in China. Spring rolls are much more delicate and light, while Western egg rolls are more like deep-fried dough stuffed with lettuce and tiny shrimp.


Misconception: Nachos are a Mexican food.

While nachos were originally invented by a Mexican, they were made to satisfy the appetites of visitors from the USA, out of spare ingredients that were lying around. The man who invented them was known as "Ignacio," which is where the name "nacho" comes from. A man named Frank Liberto eventually started selling them at stadiums, and the rest is history. They have since become a wildly popular concession food. The nachos often seen in Western stadiums today are made with a strange cheese concoction invented by Liberto that would keep longer and didn't need heat to stay melted. Basically, Liberto invented mutant cheese sauce.


Misconception: Sushi rolls are commonly eaten in Japan.

Most Westerners think of sushi rolls whenever asked about Japanese food; however, the truth is that sushi is not nearly as popular in Japan as it is among those trying to emulate Japanese culture. Many Japanese people, for starters, feel a little intimidated when they head down to the sushi bar. More importantly, sushi is eaten in Japan far less often than you might think: fewer than a quarter of Japanese people surveyed ate sushi even a couple of times a month, and it is usually a meal reserved for an important event. Japanese people would also rarely eat the kind of rolls Westerners are used to. What they usually eat is called nigiri, which consists of a slice of fish pressed onto a small ball of rice, sometimes held in place with a strip of seaweed.

5 Spaghetti and Meatballs

Misconception: Spaghetti and Meatballs are from Italy.

When asked about foods that come from Italy, many people would think of spaghetti and meatballs before anything else, but it is not a dish of Italian origin. While it was invented by Italians, they were immigrants in North America. In Italy, spaghetti and meatballs simply isn't on the menu, and it isn't really something they serve. On the rare occasions that pasta and meatballs appear in the same meal, they are definitely not served together.


Misconception: Croissants originate in France.

Many people are under the impression that croissants were invented in France; some Westerners even say the word with a faux French accent in an attempt to sound sophisticated (in English it is pronounced "cruh-sont"). However, the croissant is believed to have descended from a pastry called the kipferl. While there are many conflicting stories about how the croissant came to be, most credit an Austrian man. Legend says he was an artillery officer from Austria who decided to open up shop in neighboring France, and that after he introduced the kipferl, the croissant eventually followed.


Misconception: Crab Rangoon is a dish of Chinese origin.

Crab Rangoon, for those who don't know, is a deep-fried wonton filled with cream cheese and crab meat. It has enjoyed considerable popularity in the Western world but, as you might have guessed, isn't really Chinese at all. Rather, it fits into that dubious category of "Chinese-American cuisine." The main reason this dish doesn't make sense as Chinese food is that cheese in general, and cream cheese especially, is seldom eaten in China. Also, while the dish may sound good, keep in mind that the crab meat inside is rarely actual crab meat. Much crab Rangoon contains only imitation crab, and some recipes specifically call for it.


Misconception: Pizza is Italian.

While Italians do have their own version of pizza, it is nothing like what many in America would expect. The legends of who invented pizza, and where and how, are very murky. However, it is important to note the differences between the two styles. In America, pizza is generally slathered in seasoned tomato sauce, topped with a ton of cheese, and then heaped with toppings. In Italy, pizza is a much more exquisite creation that might disappoint American cravings. It doesn't always involve tomato sauce—some versions opt instead for fresh tomatoes—and it frequently features fresh herbs and vegetables, some mozzarella, and olive oil. While both dishes share the same name, they are strikingly different.

1 Corned Beef and Cabbage

Misconception: Corned Beef and Cabbage is the national dish of Ireland.

With St. Patrick’s Day recently behind us, millions around the world may have eaten corned beef and cabbage in honor of the Patron Saint of Ireland. However, corned beef and cabbage really isn’t all that popular amongst the Irish, and it certainly isn’t the national dish. While Ireland doesn’t really have an official national dish, some believe that the most popular would be a bacon joint, likely served with potatoes and maybe vegetables. There are, however, many different great Irish foods. The key point is that there really isn’t any one national food of Ireland, and Corned Beef isn’t even all that Irish.

You can follow Gregory Myers on Twitter.

10 Common Misconceptions About Prehistory
Tue, 28 Jan 2025

Without written records to give us an idea of what life before written history was like, we are left to decipher the clues left behind and put the pieces together for ourselves. Imagining a world before the written word is a little mind-blowing, and as we learn more and more about life thousands of years ago, we’re also finding that a number of popularly held beliefs about the prehistoric world are absolutely false.

10 Food Was Dull and Bland


Historians at the University of York recently analyzed several pottery shards found along the Baltic Sea. The pottery, which was in use about 6,000 years ago, contained traces of lipid deposits from fish, shellfish, and deer. After comparing other trace residues to more than 120 different types of plants, the researchers found that prehistoric chefs were using garlic mustard to flavor their dishes.

Garlic mustard seeds are tiny and have a hot flavor similar to a peppery wasabi. What they don’t have is any real nutritional value, leading the researchers to conclude that the only reason they were included in cook pots was to add some spice.

Other European sites, dating back to between 4,000 and 5,000 years ago, have yielded other cooking pots and vessels that still contain traces of spices like turmeric, capers, and coriander.

9 Industry Was A Foreign Concept


Archaeologists have recently uncovered sites recognizable as workshops that date back to around 60,000 years ago, but Blombos Cave in South Africa has yielded something even older. Researchers call it a prehistoric paint factory, and the cave held everything that would have been needed to assemble paint kits for ancient cave paintings: containers made from abalone shells, bone spatulas for grinding and mixing components, and pigments used in the creation of red and yellow paints.

In 2008, 70,000-year-old ocher pigments were uncovered, and the finds have suggested that the cave was used as a manufacturing facility for tens of thousands of years. The colored paints would have been used not only in cave paintings but on leather objects, pottery, or even as body paint. Red paint dating back at least 160,000 years has previously been found, but the findings in Blombos Cave show an unheard-of level of chemical knowledge, preparation, and the ability to mass produce and store products.

8 Prehistoric Creatures Were Dinosaurs


The term "dinosaur" actually has a very specific definition, and the creatures that fall into that category occupy only a few branches of a whole family tree of prehistoric critters. To be considered a dinosaur, a skeleton needs a few specific features. The most apparent is in the hip: dinosaur hip bones consist of three separate but joined bones with a central hole for the head of the femur. That construction is what gives dinosaurs their stance, and not all ancient creatures have that particular bone structure. From there, dinosaurs are further classified as "bird-hipped" and "lizard-hipped," a distinction made in 1888.

So, what creatures are most commonly misidentified as dinosaurs? Pterosaurs, the iconic prehistoric flying creatures, technically belong to a neighboring branch of the avemetatarsalian family tree: close relatives of dinosaurs, but not dinosaurs themselves.

7 There Is A Missing Link


There are few paleontological terms that are tossed around more than “missing link,” but the popular impression that one single creature is the missing link is deeply flawed.

In 1863, a Scottish physician named John Crawfurd first used the term to refer to the idea of a species that existed between modern man and our primate ancestors. It was later applied to the discoveries of Homo erectus and Australopithecus africanus, and it has been a media misconception ever since. Technically, every single species and every single fossil is a missing link, because anatomy changes slowly and every form is transitional.

6 Prehistoric Humans Ate A Paleo Diet

The idea of a singular Paleo diet first showed up in the 1960s, and today, it is still a way of life for a certain percentage of the population.

The modern Paleo diet is heavily meat-based, with no processed grains, legumes, or sugars. Supporters of the Paleo lifestyle argue that this is fine because we haven’t changed too much since the time we were hunter-gatherers, so we should—in theory—be healthier this way.

The idea that it was only when we got away from healthy living that we developed diseases like diabetes is what one researcher calls a “Paleofantasy.” It is completely false, just as it isn’t true that we remain unchanged from our prehistoric ancestors.

And finally, there is no such thing as a single, historical Paleo diet, anyway. While ancient Inuit people had a diet that was heavy in meat and fish (with not much of anything else), the !Kung of southern Africa were eating mostly nuts and seeds.

5 Agriculture Started The Development Of Cities


For decades, the standard explanation of how we went from prehistoric society to our modern one was the development of agriculture. Once we figured out how to farm, we no longer needed to move with migrating herds of animals. We could build permanent homes and villages, and we could turn our attention to things like writing and culture.

Discoveries of stone tools and animal bones at Gobekli Tepe suggest that we have that all completely backwards.

At the heart of Gobekli Tepe are a series of carved stone megaliths dated to around 11,000 years ago. The stones were carved and installed while the civilization was still relying on its hunting and gathering ways. It was only about 500 years later that they established a nearby village where they domesticated sheep, pigs, and cattle and started to farm the world’s oldest strains of wheat.

The need to build a massive complex, carve sacred images into stone, and to create a sociological center forced mankind to develop farming and herding as a way to feed the builders and stoneworkers. Farming provided the fuel necessary to allow our prehistoric ancestors to make their vision a reality.

4 Neanderthals Didn’t Honor Their Dead


Several major discoveries have shown that not only did Neanderthals bury their dead, but they mourned them in complex rituals as well. We know that they were certainly capable of forming attachments and feeling grief—precursors to the need to mourn. For example, discoveries of the remains of Neanderthals who were elderly or infirm show us that they would go out of their way to provide extra care for an aging individual rather than abandon them.

In addition to burials, archaeologists have also excavated Neanderthal remains that show signs of processing, much like modern bodies that are processed for burial. In some remains, knife marks show where bone marrow was removed, where soft tissues were cut away, and where joints were purposely separated. It has been suggested that these particular cuts can be associated with cannibalism, but also that it might have been done as part of a spiritual ritual.

Furthermore, at least one formal prehistoric cemetery has been found in Irkutsk, Russia. The cemetery contains more than 100 bodies belonging to the members of a hunter-gatherer tribe that lived in the area between 7,000 and 8,000 years ago.

3 Neanderthals Lived Short Lives


The last Neanderthal died around 40,000 years ago, and science has been trying to figure out just why Homo sapiens were the ones to survive. One theory is that Homo sapiens simply had a longer lifespan than our Neanderthal cousins.

An analysis of fossil records refutes that idea. Neanderthals and early humans had similar life expectancies. The two species coexisted for about 150,000 years and about 25 percent of individuals from both species survived past the age of 40. There were also about equal percentages of people that made it past the age of 20.

2 Prehistoric Art Was Simple


A 2012 study analyzed artistic depictions of movement in four-legged animals, from prehistoric cave paintings all the way through the modern era, and found that prehistoric artists were better at accurately depicting animal motion than modern ones. The analysis looked at 1,000 modern works of art and found an error rate of around 57.9 percent in their depictions, while the prehistoric artwork studied had an error rate of only around 46.2 percent. That makes our ancient ancestors considerably more accurate in their art than modern masters.

Prehistoric people weren’t just making art on cave walls, either. Countless bog bodies and mummified remains have been found with extensive tattooing, and the discovery of 3,000-year-old artifacts from the Solomon Islands has afforded valuable insight into the practice of early tattooing. The volcanic glass tools are among the only prehistoric tattooing instruments ever found.

1 Prehistoric Living Was Clean

As it turns out, prehistoric man needed a little escapism, too—by getting high.

Traces of the hallucinogenic San Pedro cactus dating to around 10,000 years ago have been found in caves in the Andes Mountains of Northern Peru, and documented evidence of the use of magic mushrooms is even more plentiful.

There is also evidence of opium use and humans chewing coca leaves at least 8,000 years ago, beginning in the area around the Mediterranean and spreading to the rest of Europe.

And alcohol, the favorite modern drug, dates back to at least 7000 BC in the form of a fermented rice, honey, and fruit beverage discovered on pottery shards from the Henan Province.



Debra Kelly

After having a number of odd jobs from shed-painter to grave-digger, Debra loves writing about the things no history class will teach. She spends much of her time distracted by her two cattle dogs.



10 Misconceptions You Believed Thanks To Looney Tunes
Sat, 23 Nov 2024

If you are of a certain age, a large part of your Saturday morning routine was watching "Looney Tunes." The adventures of Bugs Bunny, Daffy Duck, Porky Pig, and their friends have been entertaining kids and adults since 1930. The series has not only become one of the most beloved examples of American animation in the world, but each of its characters has become a global cultural icon.


Fans of the show have come to accept that the cartoon is sometimes screwy in its perceptions of the animals the characters are based on. We are not here to point out the problems with anthropomorphic animals, loose interpretations of the laws of physics, or why a rabbit would appear in drag. Instead, we will be looking at how "Looney Tunes" created misconceptions about how real-world animals act.

10 Rabbits Love Carrots

The most identifiable image of Bugs Bunny shows the rabbit munching on a carrot, asking “What’s up, doc?” The problem with this is that rabbits and hares in the wild would avoid the carrot. Carrots and other root vegetables are full of sugar, which rabbits are unequipped to digest. Rabbits typically eat grass, hay and dark leafy greens.

While a domesticated rabbit can eat a carrot, it is a bad idea to feed carrots to your pet bunny regularly. It is akin to you being force-fed candy daily; you may appreciate the treat, but it is doing nothing positive for your health.[1]

9 A Roadrunner Can Outrace a Coyote

A running gag on the show is Wile E. Coyote chasing the Roadrunner, only for the Roadrunner to leave the hapless coyote in the dust. In real life, the chase would go differently. The greater roadrunner—the species of roadrunner likely to live in the American Southwest—is a long-legged member of the cuckoo family. The bird tends to weigh about a half-pound as an adult and measures 20 to 24 inches in length. Greater roadrunners typically run 20 miles per hour but have been reported to reach 26 mph.

A coyote averages between 32 and 37 inches long—not counting its tail—and weighs between 20 and 50 pounds as an adult. A coyote prefers to stalk but can run as fast as 43 miles per hour if needed.

So, unless the roadrunner had a large head start or can somehow trick the coyote into slamming headfirst into a tunnel the coyote itself painted on a canyon wall, a coyote can easily catch a roadrunner.[2]
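Even granting the bird its reported top speed, a little back-of-the-envelope arithmetic shows how quickly the gap would close. This is a rough sketch using the speeds quoted above; the 100-foot head start is an assumed figure for illustration only:

```python
# Back-of-the-envelope pursuit math using the speeds quoted above.
# The 100 ft head start is an arbitrary, illustrative assumption.
MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

roadrunner_speed = 26 * MPH_TO_FPS  # top reported roadrunner speed (~38 ft/s)
coyote_speed = 43 * MPH_TO_FPS      # top coyote speed (~63 ft/s)

closing_speed = coyote_speed - roadrunner_speed  # how fast the gap shrinks (~25 ft/s)
head_start_ft = 100
catch_time = head_start_ft / closing_speed

print(f"Coyote closes a {head_start_ft} ft head start in about {catch_time:.1f} seconds")
```

Even with a full football-field-length head start, the coyote catches up in a matter of seconds, which is why the chase only works as a gag.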

8 Cats ‘n’ Skunks

This myth may be due more to animation design laziness than anything zoological. Some of the most memorable "Looney Tunes" episodes show Penelope the Cat (or some other unfortunate feline) having a white stripe painted down her back. The cat is then mistaken for a female skunk by the Casanova skunk Pepe Le Pew.

In real life, a cat cannot be confused with a skunk. Striped skunks are related to badgers and weasels. While about the size of a cat, striped skunks have pointed snouts, rounded ears, and wide, flat tails.

Cats have flat faces, pointed ears, and narrow tails. Cats also have stout but flexible bodies, compared to a striped skunk's elongated torso. So, a cat with a white stripe painted down its back would look like a painted cat, not a skunk.[3]

7 Dogs Hate Cats

Evolutionary zoologists have found that there is nothing about a cat that makes a dog hate it. On the contrary, if the two animals are raised together, cats and dogs can see each other as part of their pride/pack. A dog may chase a cat as part of its hunting instinct, but this is not animosity. Cats are fast, and dogs like to chase fast things.

If a cat or a dog feels that its territory is being intruded on, it may attack. This is not limited to its so-called mortal enemy, though. Squirrels, mice, fellow cats or dogs, and even humans are likely to get this treatment if the animal feels threatened. While cats and dogs communicate differently, peace is as likely between cats and dogs as war.[4]

6 Cats Kill Birds and Mice for Food

Speaking of cats, the cat-and-mouse and cat-and-bird archetypes are also vast simplifications. Let's be clear before we start, though: cats do kill mice and birds. House cats are responsible for the extinction of 63 species of birds, small mammals, and reptiles. It is estimated that the American cat population kills between 1.3 and 4 billion birds and between 6 and 22.3 billion mammals every year.

The problem is that house cats do not kill just for food. While we can empathize with a hungry Sylvester chasing Tweety, it is just as likely that Sylvester would go after Tweety regardless of his food situation. A cat's predatory instinct is so strong that even a well-fed cat will ambush and kill a smaller animal. This has led the domesticated house cat to be classified as one of the 100 World's Worst Invasive Alien Species.[5]

5 A Tasmanian Devil Eats Anything

This is another oversimplification. Given the opportunity, a Tasmanian devil would eat anything. As scary as it may seem, “Looney Tunes” Taz is a spot-on depiction of the Tasmanian devil. Ornery to the point of being psychotic (by human standards), the Tasmanian devil was so named after early settlers of Tasmania saw the marsupial’s behavior.

With jaws strong enough to deliver a bite said to be able to cut steel cable, a Tasmanian devil will kill or eat almost anything it comes across. Tasmanian devils have been known to kill animals many times their size, including sheep.

That said, Tasmanian devils usually prefer to eat carrion, or animals that are already dead. Tasmanian devils are roughly the size of a medium-sized dog: an adult can reach lengths of 30 inches and weigh up to 26 pounds. They are also slow, capable of reaching speeds of only about eight miles per hour. Tasmanian devils will also eat insects, fish, small birds, and snakes.[6]

4 There is No Such a Thing as a Chicken Hawk

While Henery the Chickenhawk's endless fight to "get a chicken" is a funny theme in the series, it is inaccurate. From a zoological point of view, there is officially no such thing as a chickenhawk or chicken hawk. The term has been used in some parts of the United States to describe three different birds: the red-tailed hawk, the sharp-shinned hawk, and Cooper's hawk. Only the red-tailed hawk is brownish, like Henery.

These birds do not generally eat chicken. While a red-tailed hawk may go after a free-range or wild chicken, the hawk usually eats rabbits (sorry, Bugs) and rodents. Cooper's hawks and sharp-shinned hawks are bird eaters but would not go after a chicken, which may be the same size as the hawks or larger.

3 Ducks Must Fly South for Winter

Ducks, like most migratory birds, do not always travel south for winter. They simply move from one habitat to another that may offer them better odds of survival. While this may mean that most ducks do travel south, ducks with ready access to food may opt to winter where they are. Other ducks may travel west or even north to known feeding spots. Usually, where ducks migrate is based on their ancestral nesting grounds. So, Daffy going on strike from flying south is not so strange.

Of note, domesticated ducks don’t migrate during the winter. Domesticated ducks usually cannot fly—some have clipped wings, while others are too fat.[7]

2 Rabbits are Fast Burrowers

A motif in many Bugs Bunny cartoons was the grey rabbit burrowing quickly in from offscreen, only to emerge and be confused that he was not where he intended. "I should have made a left turn at Albuquerque" is one of the most memorable quotes in animation history. Like many things in "Looney Tunes," this is only a half-truth.

Rabbits are prolific burrowers. The typical rabbit spends most of its non-foraging life digging. Rabbits have powerful front legs and long, sharp claws. Rabbits will burrow to build a den or nest underground (although cottontail rabbits build nests above ground). These dens are connected to neighboring dens, forming a network called a warren.

Even though digging is instinctual for rabbits, it is a slow, time-consuming process. Rabbits will spend days—if not weeks—digging a single hole. While rabbits expend large amounts of energy digging, their small size means that only a small amount of dirt is moved at a time. This doesn't dissuade them, though.

There is something about rabbit holes that “Looney Tunes” did get right. Usually, when Bugs needs to make a quick escape, he will dive into a nearby rabbit hole, only to appear somewhere else. This is typical rabbit behavior. To avoid predators, rabbits will dig deep escape routes in their foraging or warren zones. These rabbit holes are connected to the warren or to other rabbit holes. A frightened rabbit would head to the nearest rabbit hole, dive in, travel through the underground connections, pop out a different hole and escape, confusing the predator.[8]

1 Finger In The Barrel

Common sense probably makes this one a touch obvious, but just in case . . . In 2012, the Discovery Channel show "Mythbusters" tested the Bugs Bunny trope of sticking a finger in Elmer Fudd's shotgun to make it backfire and explode. The logic behind this is that by plugging the barrel with a finger, the air in front of the projectile cannot be vacated, leaving the expanding gas behind it with nowhere to go. This would turn the shotgun into a bomb.

"Mythbusters" found that a finger stuck in a shotgun barrel would simply disintegrate in the path of the slug, along with most of the attached arm. If Bugs were to plug a shotgun barrel with his finger, he would have died or been horribly maimed. Fortunately, "Looney Tunes" is just a cartoon, and the laws of physics there are but a suggestion.[9]

About The Author: Frederick Reese is a politics, financial, and emergent technologies reporter. Based in Upstate New York, Frederick has written for Yahoo!, CoinDesk, Bleacher Report and the Huffington Post. You can follow Frederick on Twitter.

10 Major Medical Misconceptions – Toptenz.net
Sat, 11 May 2024

The internet is a great resource, but one thing it doesn't come with is a manual on which information you should trust and which you shouldn't. The internet, especially since the days of COVID-19, has been rife with medical misinformation, spread far and wide through social media. Unfortunately, some of these misconceptions about medicine can be dangerous and are better cleared up for the safety of all.

10. The Higher Percentage Of Rubbing Alcohol, The Better The Sanitizing Power 

When the pandemic hit, stores soon ran out of cleaning products and disinfectants of all kinds. Among other things, rubbing alcohol became extremely hard to find, as many people wanted it to disinfect surfaces against COVID-19. While it's good that people were taking antiviral precautions, some observers noticed that whatever remained in stock was usually the 70% rubbing alcohol, with the higher percentages mostly sold out. You'd think this makes sense, since it sounds like a higher percentage of alcohol should kill more germs, but this is a common misconception.

The truth is that for disinfecting regular surfaces you want 70% alcohol, and not the 90% or higher stuff. The reason for this is that rubbing alcohol still needs some water to spread around the surface area and have some time to kill germs before it evaporates completely. However, that doesn’t mean the higher percentages don’t still have their uses. They are often recommended for electronics, where you would rather sacrifice cleaning power, to use as little moisture as possible. 

9. Once You Get A Transplanted Organ, Your Problems Are Mostly Solved 

In movies, TV, and every other form of media on the planet, it's a common storyline to have someone who needs a lifesaving organ transplant. They may be low on the waiting list or have some other issue holding them up. The drama of the story is usually built around getting them the organ, after which we're led to believe they live happily ever after. Unfortunately, while it would be great if this were true (and it may be one day), it's currently not the case.

The unfortunate reality is that no matter how close the match, a transplanted organ will require you to take drugs that suppress your immune system for the rest of your life, because your body will otherwise attack the new organ, thinking it's an invader. A "match" makes it possible for drugs to hold this rejection back, but it does not stop you from needing them. The drugs you have to take can also cause a form of diabetes known as post-transplant diabetes mellitus.

8. You Can Get The Flu From The Flu Vaccine

It's a matter of fact that far fewer people get the flu shot every year than get their regular booster shots and other vaccines. Some people say it's mostly for older people, younger kids, or people with weakened immune systems. They'll say they can handle a case of the flu just fine, and that the shot could even give them a mild case. They don't want that risk, so they avoid it. Some even argue the shot is mostly ineffective and doesn't usually work.

However, most people’s excuses are complete bunk. For starters, you cannot get the flu, even a small case of it, from the flu vaccine. The parts of the flu virus in the shot are dead, and cannot suddenly come back to life. You can indeed get a few mild symptoms often associated with the flu, which may last a few hours or days; that’s where the confusion comes from. 

As for the rest of the misconceptions, we’ll take them one at a time. Everyone six months and up should get the flu shot; the lost productivity from the flu is far worse than a few vaccine side effects; and while the vaccine isn’t always a perfect match, each year’s formula is targeted at the flu strains expected to be the biggest problem that season. 

7. You Can Suck Venom Out Of A Wound 

A popular trope in fiction is the old venomous wound, wherein someone tries to suck the venom out to save the afflicted person. This goes back to, for example, the stories of Sherlock Holmes, where it’s used for a plot with a mistaken vampire. It also probably goes back much, much further. It’s an incredibly common trope, is what we’re saying.

It’s become so prevalent in popular culture that a lot of people consider it gross and scary, but real and grimly necessary. There are even extractor devices designed to pull venom from a wound, and most people would be surprised to learn that such devices are still being sold even though they don’t actually work. 

However, the truth is that venom from most animals doesn’t work as fast as movies would have you believe, and surprisingly few people die from venomous animals each year. What you really need to do is get the victim to a hospital as quickly as possible for anti-venom. Sucking venom from a wound will not remove it faster than the body pumps it through the blood, and you could poison yourself if you have a cut in your mouth. As for those extractor pumps, scientific studies have shown that they are not effective. 

6. Stabbing Adrenaline In Someone’s Heart Is Great For An Opioid Overdose

Pulp Fiction is a classic movie, and anyone who has seen it well remembers the scene where John Travolta races to his drug dealer with an overdosed Uma Thurman, then dramatically stabs her through the heart with an adrenaline needle. On top of stabbing her through the heart, he doesn’t even push down the plunger, presumably because jamming it through the heart gets it all done that much faster. 

Unfortunately, while the scene is fun, it’s pretty much all wrong. While sometimes an injection in the heart, known as an intracardiac injection, is necessary, there’s no reason to think it would be needed in this case. It’s also done slowly and carefully through the ribcage by a trained medical professional. If you were using an adrenaline needle, you would actually usually use the thigh. However, in this case, you would not want to use adrenaline at all, as Uma’s character was overdosing on opioids. What you would want to use is Narcan, which often comes in a nasal spray today, making delivery much easier. If you want to speed things up, you would follow up the Narcan with CPR. 

5. Glasses Are Magic Goggles That Fix All Eye Problems 

Vision problems are common around the world, and basically everyone either wears or knows someone who wears specs. For the most part, people understand what glasses can and can’t do; however, that doesn’t necessarily mean everyone understands eye problems. Some people will get confused when they meet someone who isn’t fully blind but is not wearing glasses to improve their vision. There’s a common misconception that glasses can fix all (or most) eye problems outside of blindness, and a general misunderstanding of what it means to be “legally” blind. 

Unfortunately, the truth is that a lot of people with extremely low vision have problems that glasses simply cannot do anything for, and for the most part, science can’t do much to alleviate those issues. Among these are things like age-related macular degeneration, diabetic retinopathy, sun damage, nerve damage of any kind, and many more. Scientists are trying to find ways to help these people, but most of it is very experimental. 

As for being legally blind, it’s easiest to compare it to a normal, sighted person. Someone who has normal vision can see an object that’s 200 feet away from, well… 200 feet away. Seems obvious enough, right? Well, to see that same object, a legally blind person would need to be 20 feet away.
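The 20/200 threshold is just a ratio of viewing distances, so it can be expressed as simple arithmetic. Here is a rough sketch (the function name and numbers are illustrative only, not a clinical tool):

```python
# Rough illustration of Snellen notation: 20/200 means you must stand at
# 20 feet to see detail a normally sighted person resolves at 200 feet.
def equivalent_distance(acuity_num, acuity_den, object_distance_ft):
    """Distance (feet) at which this person sees the detail a 20/20
    viewer resolves at object_distance_ft."""
    return object_distance_ft * (acuity_num / acuity_den)

# US legal-blindness threshold is 20/200, so for an object 200 feet away:
print(equivalent_distance(20, 200, 200))  # 20.0
```

In other words, the second number in the fraction tells you how much closer the person must be to match normal sight.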

4. Feed A Cold, Starve A Fever 

This is an old saying that can be traced back to a 1574 dictionary by John Withals. Like many folksy sayings, it has become a part of culture in some areas, where it was passed down from parent to child. Some people still take the saying seriously to this day, and most imagine, even if they don’t know medical science, that it must have some kind of reasoning behind it. After all, a cold seems like a minor bug while the flu is something more serious, so it makes sense to treat them differently, right? 

Well, perhaps in some respects, but not in this particular case. The truth is that there is absolutely no scientific rationale for starving someone who has a fever, or a virus of any kind. Whether it’s a cold, fever, broken leg, or any medical problem, keeping someone well-fed, rested, and hydrated is crucial. So if you are trying to remember how this phrase should go, make it “feed a cold, feed a fever.” 

3. Going To The Hospital In An Ambulance Will Get You Seen Faster 

Some people, in the hopes of getting seen at the hospital quicker, will call the ambulance when they have other ways of getting there. While this may sound selfish, most people are not exactly thinking straight when they’re in a crisis and worried for their lives, or the lives of their loved ones. They’re just trying to get medical care in a bad situation. However, that doesn’t mean it doesn’t cause problems or isn’t a huge pet peeve with medical personnel. 

The thing is, this popular idea is completely untrue and a total misconception of how the medical system works. The purpose of an ambulance is to bring in someone who has no other way to get there, or who needs some kind of care to keep them alive on the way to the hospital. It does not matter how you arrive; you will be seen based on need. This is called triage. Calling an ambulance unnecessarily can also tie up important resources that someone else may desperately need. The bottom line is that if you can get to the hospital safely another way, and you know you can make it in time, you don’t need the ambulance. 

2. Defibrillators Are Magic Heart Restarting Paddles 


Medical dramas probably reuse this trope more than any other in their playbook. We all know the scene so well we could recite it from memory. An individual is flatlining and there’s only one thing to do: break out the defibrillator paddles. Someone yells “Clear,” we hear the sound of rushing electricity, and the paddles are pressed against the person’s chest. This is repeated until they wake up. The doctor may throw in a line like “Damn you, you aren’t dying on me!” for added dramatic effect. 

Now, while these scenes make for great, dramatic television, they aren’t exactly based on reality. A 2014 study of dozens of resuscitation scenes from movies and television found that there were a lot of issues across the board. In general, they found that it all presented a missed opportunity to educate the public. Defibrillator paddles are commonly misunderstood for this reason, but they are not magic heart-restarting paddles. If someone’s heart rhythm is wrong they can shock it back to normal, but they do not restart a truly stopped heart. 

1. You Should Stick Stuff In The Mouth Of A Seizing Person So They Don’t Bite Their Tongue Off

Seizures are a fairly common medical trope as well and also make for great drama in your favorite doctor shows. Looking at you, E.R. and House. Sometimes people will even dramatically try to get something in between a seizing person’s teeth without getting their fingers bitten off, to save the afflicted person from potentially biting off their tongue. This is a common misconception, and it’s important to know that it is not how you should do things, in case you ever do end up in a situation with a seizing person. 

If someone has a seizure, one of the most important things is to keep their airways clear. This means you should avoid doing anything that might get in the way of their breathing, such as sticking stuff in their mouth. Also, you cannot bite off your own tongue, although you can painfully bite it and cause a nasty sore, which is where the confusion comes from. What you should actually focus on, apart from clear airways, is making sure they don’t have anything they can hurt themselves on, timing the seizure, and calling for medical help if necessary.

10 Stubborn Misconceptions About World War II
https://listorati.com/10-stubborn-misconceptions-about-world-war-ii/ (Sat, 30 Mar 2024)

World War II is the largest-scale war in history, at least in any modern sense, and it has filled our imaginations to the bursting point. Countless documentaries, movies, video games, books, and more have been made about the war, and it has become truly larger than life. As with most larger-than-life subjects, though, legend and fact often end up conflated, and the truth can get muddled. In this list, we will be exploring 10 stubborn myths about World War II that just won’t go away. 

10. The Attack On Pearl Harbor Caused The USA To Enter The War 


One of the most enduring myths about World War II is that the United States sat out the war until, peaceful and kind, it was caught with its pants down at Pearl Harbor and decided to join up to help stop the Axis Powers. Now, while it is true that Pearl Harbor was the catalyst that gave the United States Congress the resolve to officially declare war, it’s not like the Americans were sitting on their hands before that. 

To start with, Japan did not just attack the United States out of nowhere like a man getting mugged in a meadow — even if that’s how history books make it sound. The fact is the United States was engaged in a very ugly diplomatic war with Japan over its aggressive expansion and war was almost inevitable. As for America’s war against the other Axis powers, America had actually been helping the British for years and had ramped up that help over the previous year. This meant an attack from someone in the Axis was almost certainly coming soon, and with it, the official declarations of war. 

9. The Germans Were On The Cusp Of Making Nuclear Weapons 

Many “shocking” World War II documentaries and other specials go to great lengths to build up the Nazis, turning them into some kind of almost mythical beast. However, the truth is that they were not only just human beings, but human beings who lost. We can theorize until the end of days about what would have happened if they had won, but they did not, and were almost certainly never going to win. They overextended themselves from the very beginning, and the idea that they were close to developing a nuclear weapon is simply not true. 

They had made some heavy water, but not nearly enough for what they would have needed, and heavy water is an inefficient route to nuclear technology in the first place. It shows their nuclear program was in its infancy compared to the United States’. Some people point to all the Nazi scientists the USA grabbed up after the war, who were especially good at rocket technology, but they were needed for different things. The US didn’t need help making atomic bombs; it had already slaughtered hundreds of thousands with just two of them. 

8. The Annihilation Of Hiroshima And Nagasaki Caused Japan To Surrender 

Many people, when talking about the atomic bomb, will mention the bombings of Hiroshima and Nagasaki as a necessary evil. They will say that destroying two populated cities, home to hundreds of thousands of people, saved Japan from total destruction. The Japanese are talked about in this weird fashion as if they were some kind of alien hive-mind that would have unreasonably fought to the last man. This makes it easier for the world’s biggest military power to sleep peacefully at night without being haunted by its past. 

However, this quaint justification is nothing more than hot air dreamed up by a country that doesn’t want to feel guilty for annihilating hundreds of thousands of people when it wasn’t needed at all. The fact of the matter is that the US had already bombed the living hell out of Japan, so the idea that bombing two more cities would force a surrender is just silly. America firebombed Tokyo, and bombed civilian cities all over Japan, as part of a campaign to wear down the nation’s spirit, and the Japanese kept fighting. Many scholars now believe the real reason Japan ended the war is simple: Stalin was about to enter the fray, and Japan neither wanted the Soviets as its conquerors nor felt it could keep up a war on two fronts. 

7. The United States Was United Against The Dreaded Nazis 

Plenty of Americans like to think of World War II as the time the country truly united, and the whole world came together to fight off evil. World War II has certainly inspired countless fictional franchises, along with more documentaries than you can count. These stories tend to play up the idea that everyone was volunteering to go fight the bad guys, and unlike when politics usually divides people, everyone was completely in this thing together. However, the sad thing is that even the greatest world war of all time didn’t actually bring America truly together. 

There were organizations in the United States that supported the Nazis, and while they were investigated for their un-American activities, the USA has strong protections for freedom of speech, so people could get away with a lot as long as it stayed peaceful. For this reason, during World War II the USA did see some marches in support of the Nazis, right out in broad daylight. As for the US being the heroes who saved everyone, well… they almost weren’t at all. The isolationists tried extremely hard to keep the country out of the war and disapproved greatly of President Franklin Roosevelt’s shadow war against the Axis powers. The isolationists in Congress thought the country could just sit things out, and they were not easy to convince. 

6. Hitler’s Famed Blitzkrieg Is Vastly Overrated And Poorly Understood 

Some talk about the Blitzkrieg, or Lightning War, in awed voices, as if it were some special advancement in warfare or a genius trick used by Hitler and the Nazis. This misconception has built up a mystique, and some still think that with more equipment and the right bit of this and that, Nazi “genius” could have taken over the whole world. However, the truth is there wasn’t anything particularly clever about the Blitzkrieg.

This war strategy was not exactly new, nor was it even invented by the Nazis. In fact, all it really means is that it is the opposite of trench warfare, in that you are trying to get a knockout blow on the enemy. The Blitzkrieg is just the war equivalent of a sucker punch, which is just a dirty trick and not anything clever at all. The Soviets found that the strategy was beaten easily enough by putting down multiple defensive lines to slow down the enemy. By doing this, they could take advantage of the fact that the Blitzkrieg caused the Nazis to temporarily separate some of their lighter and heavier forces, leaving their flanks vulnerable. 

5. German Citizens Sat By And Did Nothing As Their Jewish Neighbors Were Taken Away 

One of the favorite myths about World War II is that regular German citizens sat by and did nothing, and that if we simply speak up in our own countries, we can make sure nothing like that ever happens again. While it’s a comforting notion that responsible citizens could have stopped it, things aren’t always that easy. The Nazis started with propaganda and then followed it up by slowly ramping up terrible and abusive laws. People who spoke out to protect their neighbors were taken to camps themselves, and everyone was afraid. 

And we aren’t saying that the German people gave up once Hitler had solidified power, either. Many German citizens did try to help those under threat from the Nazis get to safety. Some German officials stuck their necks out, knowing the punishment for someone in their position would be severe, to get people to safety. Now, we aren’t saying there weren’t some bad citizens as well, since some did report on their neighbors, but there were also many good people who did as much as they could. 

4. The Nazis May Have Dealt With Rebels, But Apart From That Successfully Took Over France 

It’s arguable just how much control Germany had over France throughout the war, but some people are confused about how that control actually began. The Germans first occupied France about halfway through the year 1940. At first, they just had the Northern and Western portions of the country, and many felt that the Vichy government was a puppet state run by the Nazis. The French resistance began almost immediately, and it took the Nazis two more years to control the rest of France. 

And even when they had this, it wasn’t like they had successfully conquered anything. You see, the French were a really hard people to take over, and they didn’t take kindly to invaders on their land. The French resistance was so fierce that to say the Germans had the country would be a gigantic overstatement of their accomplishment. Without an incredibly expensive sustained occupation, it is unlikely Germany could have ever held the territory long-term. With how strong the insurgency was, the Germans had about as much control over France as the United States had over Iraq. 

3. “Keep Calm And Carry On” Did Not End Up Needing To Be Used

“Keep Calm and Carry On” has become, in many people’s minds, the very perfect celebration of British stiff upper-lipped-ness. It perfectly defines their people and shows how they managed to stay calm during the worst of World War II, by remembering to stay cool, and go about their business no matter what. Or at least, it would have been, if they had ever actually needed to use this slogan during the war. The truth is that the slogan wasn’t dredged up until decades later, and it has some interesting history behind it. 

Back then, things were very, very frightening and no one really knew what was going to happen (it was World War II, after all). The Nazis invading Britain and at least temporarily occupying it was a very real possibility and something the government had to be prepared for, even if it was unthinkable. The slogans and posters were printed up as part of a just-in-case shadow campaign to keep people’s spirits up if the country were actually invaded. Of course, as we know, the British won the Battle of Britain, keeping any invasion at bay, and the campaign was never needed. But it serves as a stark reminder of just how close things came to very different and much darker outcomes. 

2. Adolf Hitler May Have Survived Somehow And Had A Double In The Fuhrerbunker

One of the favorite subjects of World War II documentaries is the fate of the Nazis, especially those who were not captured and put on trial at Nuremberg. They like to wax on about how some of them may have escaped to South America, and of course, the infamous Dr. Josef Mengele is always mentioned. However, the favorite is Hitler, and theories about him surviving somehow persist. Still, while the theories may stubbornly refuse to die, Hitler himself most definitely did.

For starters, Allied intelligence told us he had gone to hide in his Fuhrerbunker, where he was found along with Eva Braun, his longtime companion, whom he had married just before their deaths. The well-stocked bunker was almost certainly his personal refuge, and it also held what was definitively Hitler’s dead body. Now, some may argue that it could have been a body double, but the dental records were confirmed to match Hitler’s. At the time, there was no better way of identifying a body, and there was zero reason to believe it was anyone but him in that bunker. 

1. The Battle Of Normandy Succeeded Due Entirely To Its Incredible Planning And Heroism 

Now, we want to be clear that we are not trying to downplay the incredible heroism involved, or the incredible amount of planning required to pull off the operation on D-Day. It was a miracle of planning, subterfuge, and so much heroism that will never be forgotten. However, despite all of that, the operation came closer to disaster than most people realize. The truth is that Normandy was quite a defensible position, and a huge part of the Allies’ victory relied on tricking Hitler. That deception took months of careful work to convince him that Normandy was not the main attack, and to keep him unsure of the day and time of the “real” one. 

While losing the battle and not taking the beach would not necessarily have lost the war, it would have been a giant setback for the Allies, and if Hitler hadn’t been successfully fooled, the operation might not have succeeded at all. Several of Hitler’s best commanders were not available on the day of the attack, and the intelligence operation the Allies had put together made him think a bigger attack was still coming elsewhere. He resisted calls from his commanders to bring in more heavy artillery and tanks, which might have stopped the Allies from storming the beach.

Top Common Misconceptions About Volcano Eruptions
https://listorati.com/top-common-misconceptions-about-volcano-eruptions/ (Sat, 11 Nov 2023)

Everyone loves watching volcanoes from a safe distance, and sometimes even when they threaten us. That fiery power is fascinating.

We’re curious, too. How do eruptions happen? Is it safe to climb a volcano? What are supervolcanoes?

Expert answers often come with jargon that can put you to sleep. News stories help, but misinformation tends to creep in now and then, and journalists can’t be expected to sort through scientific minutiae during a crisis.

It’s up to us to figure out the truth. And as you read through this list, you might be surprised at what you only think you know!


10 Misconception: Molten rock powers eruptions

About 25-30 kilometers (15-20 miles) below our feet, there is intense heat and pressure. Some kinds of rock melt under these conditions, forming magma.[1]

That red-hot liquid rock has enough buoyancy to rise towards the surface through cracks in the overlying crust. It might not get there but if it does, there will be an eruption.

There’s a long way to go, though. Surely the weight of the overlying rock helps squeeze the magma along?

Nope. It’s running on gas: not fossil fuel, but the gases dissolved in the molten rock. These include water vapor, carbon dioxide, and sulfur dioxide.

Gases bubble out of the magma as it rises, since the pressure decreases as the amount of overlying rock shrinks. You can safely get the same effect by opening a carbonated beverage bottle; the contents are under pressure, but carbon dioxide bubbles appear when the seal is released.

But our magma conduit is still capped, so the expanding gases have nowhere to go. They accelerate the rising melt, and a chain reaction starts that can lead to an eruption, provided the gases don’t run out and the magma doesn’t encounter a barrier of some sort that it can’t pass.[2]

If it does stop, then the magma will begin to cool, eventually freezing in place as a dike, sill, or pluton. Erosion sometimes uncovers these structures, revealing their beauty in places like Devil’s Tower in Wyoming.[3]

9 Misconception: Lava always erupts in flows

What’s missing here is the other half of the picture: lava can also explode onto the scene. We’re too busy running away or watching it in awe from a safe distance to realize that this is lava, too.

It all depends on the gases mentioned earlier, as well as the rising magma’s temperature and viscosity.

Temperature is a no-brainer. Molten rock reacts much like molasses does to temperature changes, slowing down as it cools off.

Viscosity isn’t complicated, either. Water has very low viscosity; asphalt is quite viscous.

Gases bubble away in water; asphalt won’t let them pass. This matters because rising magma is driven by gases.

Magma also contains silica, which tends to form long chains of molecules, increasing viscosity.[4]

The end result of all this geochemistry?

Low-silica magma lets most gas bubble away before the eruption, keeping just enough to power a gorgeous Hawaiian-style lava flow or, if there’s a fair amount of silica present, something like the clip above.[5]

High-viscosity, high-silica magma, on the other hand, hangs onto more of its gases, building up more internal pressure. A similar pressure increase happens when you shake a champagne bottle.

Either way, the bubbles have nowhere to go until you uncork that bottle or the volcano’s rock gives way. Then, BOOM![6]

You now have a champagne-splattered sports team and a countryside covered in shattered lava, respectively.

8 Misconception: Volcanoes erupt smoke

Actually, the solid material in that impressive plume is rock and natural glass.

Don’t believe it? Ask Mexico’s Popocatepetl Volcano.[7]

During the terrible wildfire season this year in both hemispheres, we have seen plumes above major fires that certainly looked volcanic. But there is a big difference in their contents.[8]

Volcanic ash isn’t fluffy like bushfire ash. It’s all tiny bits of jagged stone and natural glass that get into your lungs and behind your contact lenses.

It’s heavy, too.

According to the US Geological Survey, just 4 inches of it weighs 120-200 pounds per square yard. Even small accumulations of volcanic ash can overload and collapse a roof.
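To put that USGS figure in perspective, here’s a back-of-the-envelope sketch of the total load on a roof. The roof size is a made-up example, and the function is purely illustrative:

```python
# Back-of-the-envelope ash load on a roof, using the USGS figure that
# a 4-inch ash layer weighs roughly 120-200 pounds per square yard.
def ash_load_lbs(roof_sq_yards, low=120, high=200):
    """Return the (low, high) total weight range in pounds for a
    4-inch ash layer covering roof_sq_yards square yards."""
    return roof_sq_yards * low, roof_sq_yards * high

# A hypothetical 150-square-yard roof (about 1,350 square feet):
low, high = ash_load_lbs(150)
print(low, high)  # 18000 30000
```

That’s nine to fifteen tons sitting on an ordinary house, which is why even modest ashfall can bring a roof down.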

It will also do a number on your home electronics and cause power outages, since volcanic ash, especially when wet, conducts electricity.[9][10]

Volcanic ash damages cars and other machinery. You will never see smoke affecting jet engines like this!

This is why there are Volcanic Ash Advisory Centers (VAAC) around the world,[11] constantly on the lookout for ash clouds in global flight lanes.

7 Misconception: Lava and pyroclastic flows are the deadliest hazard at volcanoes

Pyroclastic flows are indeed lethal. Lava flows, not so much.

Lava usually moves slowly enough that you can get out of its way.[12] There are exceptions, so don’t get cocky next time you visit an active volcano. Nevertheless, a study that looked at volcanic fatalities between 1500 AD and 2017 found that fewer than 700 people had been killed by lava.

Pyroclastic flows, which travel farther and much faster than lava, have a body count of almost 60,000 people. What could possibly match that?

Would you believe mud?[13]

What happens is that water mixes with volcanic ash, forming a heavy mixture like wet concrete that races down river drainages faster than people can run.

Volcanologists call these mudflows by their Indonesian name: lahar. You don’t want to get caught in one.

Lahars can also happen without an eruption. All that’s needed is water and loose ash. Heavy rainfall and tropical systems are notorious for causing secondary lahars.

In the 2007 video above, the water came from a crater lake at New Zealand’s Mount Ruapehu:

All told, volcanic mudflows of various types have murdered almost 60,000 people over the last five centuries. They’re as hazardous as pyroclastic flows!

6 Misconception: Lakes don’t erupt

We all know that volcanic eruptions at sea are awesome.[14] Volcanoes near water can also cause tsunamis, like the one at Anak Krakatau in Indonesia in 2018 that killed four hundred people.[15]

Most of us, though, would never expect water to take the initiative and erupt on its own. But lake eruptions do happen, although there is usually a volcano lurking somewhere in the background.

That volcano’s gases seep through the lake bed into the water. Under the right circumstances (described below), they build up to the point where the lake explodes, unleashing all that gas in one deadly event.[16]

Lake Kivu, which is described in the video above, sits at the feet of Mount Nyiragongo in the Democratic Republic of the Congo.

Researchers are worried that future volcanic eruptions could start right under the city of Goma, located on the shores of Lake Kivu. That would be bad enough, but it likely would also set off a lake eruption, leading to a double tragedy.

But why would lava erupt in town, 14 kilometers (9 miles) from the volcano?[17]


5 Misconception: Eruptions start at the top of a volcano

Tell it to Mount St. Helens, which first exploded sideways in 1980 after a landslide removed rocks overlying the magma inside the mountain!

Active volcanoes do usually have at least one summit crater, but molten rock will exploit any zone of weakness it meets.

In 1977, for example, and again in 2002, Mount Nyiragongo’s famous lava lake drained out through large cracks in the surrounding rocky walls. The unexpected flank eruptions happened far down the volcano’s slopes, where there were villages, and many people died.

Volcanic fissures usually come in a network and can be quite extensive. One such system extends from Nyiragongo’s summit all the way to Goma, the provincial capital that sits on the shores of potentially deadly Lake Kivu.[18]

While no activity has been seen there in recorded history, it’s entirely possible that those fissures could host a future eruption.[19] And as we’ve seen, that could be a double catastrophe if it causes the lake to erupt, too.

On a happier note, beautiful Iceland is riddled with fissures – it sits on an area where two tectonic plates are slowly moving apart.[20]

Many fissure eruptions occur here. The last one, in 2014 at Bardarbunga Volcano, didn’t disrupt North Atlantic air traffic very much. It was also a spectacle.

In this video of drone footage, it sometimes looks as though the lava is coming from a crater. That’s actually a spatter rampart, where cooled globs from the lava fountains collect.[21]

4 Misconception: Exclusion zones are suggestions, not rules

Scientists go up there, why not us?

Because what you don’t know about a volcano can get you killed. This applies to experts, too.

In 1993, for example, a team of international volcanologists entered the crater of Colombia’s Galeras Volcano, which had been quiet for five months, to collect data on it for the Decade Volcano program.

They knew the risks and were ready to run whenever other scientists, who were monitoring the volcano’s real-time “vital signs” from a nearby observatory, spotted the least sign of impending trouble.[22]

Galeras gave no warning and exploded anyway, killing six volcanologists and three local residents who had come up to watch the fieldwork.[23][24]

Sudden changes like this are bound to happen when you have a mountain-sized pressure cooker sitting on a hot burner — even in “safe” places.[25][26]

Exclusion zones are common-sense rules, based on experience gained at the cost of human lives: almost seventy scientists and hundreds of thousands of other people.[27]

Even the tragedy at White Island/Whakaari Volcano in December 2019 spurred advances in volcanology.[28]

Sometimes a scientist takes chances, hoping to save many lives in the future.

But what about you? Are selfies, boasting rights, and the view worth entering a no-go zone and possibly ending up as just a data point on someone else’s research graph?

3 Misconception: All big eruptions are Plinian

For most of us, any explosive eruption is awesome. They’re all Plinian, right, with the only other type being the pretty Hawaiian-style flows and fountains?

Actually, the Volcanic Explosivity Index (VEI) starts off with Hawaiian eruptions at VEI 0 and works its way up to VEI 8, a supereruption.

Along the way are a few intermediate grades and four different Plinian types.

Pictures are worth a thousand words, so let’s look at two explosive eruptions.

The clip above is Taal Volcano in January 2020, which certainly was a big and scary event.

It doesn’t yet have a VEI number, but Philippine volcanologists describe this eruption as steam related (“phreatomagmatic”).[29]

In terms of intensity, that’s not even close to Plinian.

The second eruption, by contrast, certainly meets Pliny the Younger’s description of the Pompeii-destroying Mount Vesuvius eruption in AD 79,[30] but scientists can be fussy over details.

Some call this “classic Plinian”,[31] while others describe it as subplinian.[32]

The VEI scale gives geoscientists a common starting point for subtle disagreements like this. For example:[33]

• VEI 4 is “Pelean/Plinian”; think Eyjafjallajökull, the Icelandic volcano that messed up air traffic in 2010.
• VEI 5 is “Plinian”; Mount St. Helens, 1980.
• VEI 6 is “Plinian/Ultraplinian”; Pinatubo, 1991.
• VEI 7 is “Ultraplinian”; Tambora’s huge nineteenth-century eruption.[34]

Luckily, humanity hasn’t experienced a VEI 8 during recorded history. What would that be like?

2 Misconception: Supereruptions are like nuclear blasts

Which is easier: trying to absorb the fact that a supereruption blasts out at least 1,000 km³ (240 cu mi) of material[35] or just letting filmmakers and game designers imagine it for us?

Of course we go with simulations. Since the scale of such real-life events is beyond our experience, special effects artists often go nuclear.

Sometimes they try to stay close to science. Unfortunately, very little is definitely known about supereruptions, other than that at least two other giant calderas in the US besides Yellowstone have erupted during the last two million years, along with others in New Zealand, Indonesia, Japan, and South America.

Of these, Taupo in New Zealand hosted the world’s most recent supereruption about 23,000 years ago, while Indonesia’s Toba supereruption 74,000 years ago was the largest.

Currently, scientific explanations for how supereruptions happen include:

1. “Unzipping”: Multiple Plinian eruptions happen along giant ring fractures in the ground, emptying out the magma chamber below, which in turn collapses to form a caldera. This may have happened at California’s Long Valley Caldera.[36]

2. “Boiling over”: Some supervolcanoes, like Cerro Galan in Argentina, show no evidence of Plinian eruptions. Perhaps very dense clouds of crystalline magma poured out of the developing caldera here but were too heavy to rise very high (clip above).

We’ll never really know how supereruptions happen until we experience one. Let’s hope we survive it!

1 Misconception: Any supervolcano eruption means the end of the world

As little as we know about supereruptions, it’s still clear that the global consequences of one would be terrible.

But this doesn’t mean that we should all panic if the Yellowstone Volcano Observatory,[37] or its equivalent at any other supervolcano, ever posts a warning.

Small eruptions happen, too. In fact, experts note that Yellowstone’s most likely next round will either be a hydrothermal blast (a steam explosion large enough to toss rocks around) or a lava flow.[38]

Relatively small hydrothermal explosions are common at Yellowstone, happening every two or three years. Larger ones, those that form craters several hundred meters wide, occur every few thousand years.

And Yellowstone has had eighty known lava flows since its last VEI 8 some 630,000 years ago; the last one took place about 70,000 years ago. Such nonexplosive activity wouldn’t have much impact outside Yellowstone Park.[39][40]

Could scientists monitoring Yellowstone tell whether an impending eruption was going to be small or supersized? Yes, probably.

Because of the huge volume of magma on the move, a supereruption is likely to have enormous precursors — those changes in seismicity, ground deformation, chemistry, and other measurements that signal upcoming changes in any volcano’s behavior.[41]

Even if a supereruption took place, it might not be an extinction-level event. Something too big to fit on the VEI scale, now, *that* would be a problem.

+ Misconception: Supervolcanoes are the biggest eruptions known

Don’t panic, but every twenty million years on average, the ground opens up and millions of cubic kilometers of molten rock either flow or explode across the land for 1-5 million years at a stretch.[42]

The word “millions” is why you shouldn’t panic.

The last time it happened was around 15 million years ago in the Pacific Northwest. Geologists call this large igneous province (LIP) the Columbia River Basalts because its eerie rocks now form the beautiful walls of the Columbia River Gorge.[43]

So we’ve got at least 5 million more years to go.

LIPs involve the sort of cutting-edge research that goes way beyond the scope of this article.

It’s fun to know, though, that geoscientists maintain a LIP of the Month web page.[44] That’s actually a pretty cool feature, even if (like most of us) you don’t have a PhD.

And it’s fascinating to think that volcanologists have found connections between some of these ancient events and a few modern volcanoes – relating Yellowstone to the Columbia River Basalts LIP, for instance, and Australia’s rare volcanic beauties, Heard and McDonald islands, to the mostly submerged Kerguelen Plateau LIP.[45][46]

Don’t worry. These places aren’t suddenly going to come to life again. They’re just the last flickering flames of a once terrifying fire upon the land.

Yellowstone still needs watching, but Heard Island is already wrapped in a thick blanket of snow and ice.


Top 10 Misconceptions About Nature’s Efficient Killers: Sharks https://listorati.com/top-10-misconceptions-about-natures-efficient-killers-sharks/ https://listorati.com/top-10-misconceptions-about-natures-efficient-killers-sharks/#respond Wed, 30 Aug 2023 05:58:45 +0000 https://listorati.com/top-10-misconceptions-about-natures-efficient-killers-sharks/

Ever since Jaws hit theaters, folks have been afraid to step into the water for fear of becoming a giant fish’s snack. The book and films paint sharks as evil killing machines, which couldn’t be further from the truth.

Sharks rarely harm people, and they don’t ride around in tornadoes as much as people think. Still, there are tons of misconceptions about nature’s most efficient killing machines, and these ten are easily the most widespread.


10 Sharks Are Nature’s Most Efficient Killing Machines

Don’t let the title fool you; sharks aren’t nature’s best killers. That honor doesn’t even belong to the vertebrates! Viral organisms have held that particular trophy on their collective mantels for as long as they’ve been hitching rides in animals’ bodies only to return the favor by killing them.

To be fair, many sharks are efficient predators, but that doesn’t make them nature’s most efficient killers. Many sharks are intelligent and calculating — they stalk their prey in the same way a pride of lions might plan a hunt.

They wait patiently for the perfect moment to strike, which is why you often see sharks swimming peacefully around their potential “food.” Sharks that hunt their prey often rely on surprise and will abort their attempt if that advantage is lost.

Of course, this only applies to some shark species, and there are probably more than you might know. More than 500 species of shark have been identified, and some are more efficient at hunting than others. Many species are far less discerning and will attempt to eat anything they come across.

9 Sharks Are Maneaters

Sharks are scary because they look terrifying. Add to that the fact that they are often unseen when they attack people, and you have nothing short of terror on everyone’s minds when one is sighted. Because of this and movies that capitalize on fear, sharks have been labeled as “maneaters.”

The vast majority of sharks spend their entire lives without ever seeing a human. Think about it for a second — Earth’s oceans are enormous, and most people only venture into the waters along coastlines, leaving a massive space for sharks to swim without seeing people.

The vast majority of shark species are opportunistic feeders that primarily eat small fish and invertebrates. Only around 12 species have been involved in attacks on people. When it does happen, it’s usually due to the shark mistaking a human for something else.

Granted, when a shark bite does occur, it’s serious. Just a little chomp is enough to cause massive tissue damage, the mangling of limbs, and death if not treated right away. The seriousness of these incidents helps give rise to the “maneater” myth, but in the end, that’s all it is.

8 Sharks Are At The Top Of The Food Chain

Most people think of sharks as nature’s most efficient killing machines, but we’ve already dispelled that myth. Still, the myth prevails, leading many to assume that sharks are at the top of their respective food chain.

It makes sense when you think about it… what could possibly pose a threat to a shark? Technically, humans kill far more sharks than anything in the ocean, but we aren’t exactly members of their food chain. As it happens, while sharks don’t have to live in constant fear of many predators, they are hunted in their environment.

When most people picture the deadliest shark, they probably think of the Great White. After all, they can grow up to 23 feet (7 meters) long and weigh 2.5 tons or more. They can become prey to larger great whites, but the creature that hunts and eats them is the aptly named Killer Whale.

This behavior was first documented in 1997 when two Orcas attacked a Great White shark to consume its liver. Since that time, more attacks have been witnessed, proving there’s a predator the largest predatory sharks fear.

7 Sharks Can’t Get Cancer

Because cancer has long been one of humanity’s deadliest enemies, people have looked to other animals to try and find a means of fighting it. Over the years, this has manifested in many ways. The most perplexing is the belief that sharks are so evolved that they are somehow immune to cancer.

This is entirely false, as malignant tumors have been found in sharks since the first was identified in the late 19th century. It likely became popular because, while sharks can (and do) get cancer, it’s somewhat rare when compared to other animals. As a result, people have long believed that grinding up shark cartilage and consuming it will prevent/kill cancer (it doesn’t).

Another culprit for this myth is the 1992 bestselling book, Sharks Don’t Get Cancer: How Shark Cartilage Could Save Your Life. In the book, I. William Lane and Linda Comac explain that sharks rarely get cancer. The book promotes using their cartilage for health benefits.

It also endorsed a product from one of the authors’ sons, eroding its credibility as legitimate scientific research. Regardless, the myth prevails despite the large body of scientific evidence proving otherwise.

6 Sharks Will Die If They Stop Swimming

There has long been a belief that sharks need to swim to live, which is associated with how sharks breathe. Like most fish, sharks breathe via gills, which extract oxygen from the water as it passes over them. Unlike most species of fish, sharks do this in a variety of ways.

Some species use ram ventilation, which works when they swim fast with their mouths open, forcing water to flow through their gills. Typically, when a shark is breathing in this manner, they swim faster than usual and are always moving.

Some sharks use buccal pumping, which works by drawing water into the mouth and over the gills, which they can do while remaining completely still. This has been observed in Bullhead and Nurse Sharks.

The Tiger shark has been known to switch between buccal pumping and ram ventilation, depending on its needs. Some species have lost the ability to buccal pump, including Great Whites and Mako Sharks. These “obligate ram ventilators” will indeed stop breathing if they don’t swim, so the myth doesn’t hold for most shark species but does apply to a few.

5 Sharks Can Detect A Single Drop Of Blood From Miles Away

You’ve probably heard this at some point in your life, and ever since, you’ve taken precautions to avoid the water with even a tiny cut. Many sharks are hunters, and they do have an acute sense of smell and a sensitive olfactory system, but it’s not a supernatural ability.

Sharks use their nostrils entirely for smelling since they can’t breathe through them. They are lined with incredibly sensitive cells capable of picking apart various chemicals interpreted as smells by the brain. While this translates into an accurate sense of smell, it doesn’t extend for miles in any direction.

Some sharks have been known to detect a low concentration of something at a few hundred meters, which is hardly a mile or more. Some species of sharks can detect specific compounds at 1 part per billion, which sounds impressive — and it is — but it amounts to about 1 drop in an average-size swimming pool.

Sharks can definitely pick apart a scent in a large body of water, but it’s limited far more than the myth allows. This ability isn’t restricted to predation either, as it comes in handy when detecting pheromones emitted during mating.
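The drop-in-a-pool comparison above holds up to a quick back-of-the-envelope check. This is only a sketch: the pool volume (roughly 75,000 liters, a modest backyard pool) and drop size (0.05 ml) are illustrative assumptions, not figures from the article’s sources.

```python
# Sanity check: is 1 part per billion really about one drop in a pool?
# Both constants below are assumed, illustrative values.

DROP_ML = 0.05           # a typical water drop is ~0.05 ml
POOL_LITERS = 75_000     # a modest backyard pool, not Olympic-size

pool_ml = POOL_LITERS * 1_000                  # liters -> milliliters
concentration_ppb = DROP_ML / pool_ml * 1e9    # parts per billion

print(f"One drop in a {POOL_LITERS:,} L pool is about {concentration_ppb:.2f} ppb")
```

Under these assumptions the result works out to roughly 0.7 ppb, so one drop per average-size pool really is on the order of 1 part per billion.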

4 Sharks Can Swim Backward

For the vast majority of fish species, swimming backward is as simple as flicking the pectoral fins in the right direction. This makes it possible to quickly escape from danger, and most fish can do it, though not as well as they can swim forwards.

Because this isn’t much of a problem for most fish, it stands to reason that people would believe that sharks can swim backward. Oddly enough, sharks are among the minority of fish species that cannot swim backward, and it has to do with several aspects of their anatomy.

Sharks swim forward by moving their tails to push water around their fins, which they use to stabilize and steer their bodies. Unlike most fish, their pectoral fins don’t curve upwards, limiting their swimming to forward movement only. When a shark moves backward, it does so by stopping movement and letting gravity do the rest.

Moving backward can actually be harmful to sharks due to their need to move water over their gills to breathe. Backward movement can lead to suffocation in some species, so doing anything but swim forward isn’t an option.

3 Sharks Are Only Found In Saltwater

Ask just about anyone in the world what kind of water sharks call home, and they’ll describe oceans. This makes sense, seeing as most sharks are found in the world’s oceans, but there are some exceptions.

There are six species of River Sharks found in Southeast Asia, South Asia, New Guinea, and Australia. These sharks all fall under the genus Glyphis, and they all reside in freshwater for their entire lives. Unfortunately, most species of River Sharks are largely unknown to science.

This is largely because of severe population decline driven by habitat degradation, which has made them some of the rarest sharks in the world. Little is known about their lifecycle and populations as a result of their diminishing numbers. Often, River Sharks are confused with another freshwater-loving species, the Bull Shark.

Bull Sharks spend most of their lives in freshwater, but they return to the ocean to mate. They travel the world’s rivers and have been found as far inland up the Mississippi River as Alton, Illinois, which is about 1,750 miles from the Gulf of Mexico. In 1972, one was found 2,500 miles up the Amazon River, so they do just fine in freshwater.

2 All Sharks Are Deadly To Humans

Many people around the world wrongly assume that sharks are maneating predators capable of causing severe harm or death. While it’s true that a shark bite is almost always a serious injury that can lead to death, the belief that all species of sharks are capable of causing harm to humans is simply false.

Only about a dozen species of sharks have ever been known to bite humans, and seeing as there are around 500 species known to science, that’s only about 2.4% of all sharks. That leaves hundreds of different species that are of no threat to people, and the variety of these animals is pretty amazing.

Some sharks, like the Caribbean Reef Shark, are dangerous to humans if they attack, but they rarely do. Still, you wouldn’t want to get bitten by one. Nurse sharks are well known for having no interest in people whatsoever, and while attacks are incredibly rare, disturbing them might cause a bite.

The largest shark and largest fish in the world, the Whale Shark, is of no danger to humans. It is a filter feeder, so it has no interest in people at all. If you found yourself in its mouth, it would be almost as upset as you, but seeing as its teeth are tiny and useless for biting and it would spit you out, you’d walk away with the best fish story ever told.

1 Sharks Can “Go Rogue” And Hunt Only Humans

The term “rogue shark” is often used to describe a shark that stops hunting its usual prey and instead seeks out and subsists entirely on people. This is, of course, unnatural for all the reasons previously mentioned.

So-called “rogue sharks” are blamed for shark attacks on people, but the concept of a “rogue shark” is entirely without merit. Sharks don’t suddenly gain a taste for humans and seek them out. Typically, when a shark bites someone, it realizes the person isn’t its typical prey and moves on.

They rarely take a second bite, though it can sometimes happen. Ultimately, it’s easy to see that rogue sharks aren’t a real threat if you look at the number of annual shark attacks. Globally, 57 unprovoked shark attacks were recorded in 2020. Ten of those resulted in fatalities, and 33 occurred in the U.S.A.

That may seem like a lot, but the odds of a person being bitten by a shark are incredibly low. Your odds of being attacked by a shark are 1 in 3,748,067. You are more likely to be struck by lightning, injured in a train accident, or killed by fireworks than to die in a shark attack.


10 Misconceptions about Mental Health We Need to Unlearn https://listorati.com/10-misconceptions-about-mental-health-we-need-to-unlearn/ https://listorati.com/10-misconceptions-about-mental-health-we-need-to-unlearn/#respond Sat, 19 Aug 2023 04:56:39 +0000 https://listorati.com/10-misconceptions-about-mental-health-we-need-to-unlearn/

Exploring the truth about mental health is a journey of understanding and compassion. As we unravel the complexities of mental health, it’s time to say goodbye to old misconceptions, challenge preconceived notions, and embrace fresh perspectives, fostering a friendly dialogue that nurtures empathy and knowledge. Let’s embark on a transformative quest to unlearn, relearn, and uplift together. Because when it comes to mental health, a little understanding goes a long way.

Related: Top 10 Culture-Specific Illnesses And Mental Disorders

10 Misconception: Mental Illness Is a Sign of Weakness

One misconception that deserves immediate attention is that mental illness signifies weakness. Just as a physical ailment doesn’t indicate personal weakness, neither does a mental health challenge imply any lack of character or strength.

Mental illnesses are complex conditions that arise from a combination of genetic, biological, environmental, and psychological factors. They can affect anyone, regardless of background, personality, or resilience. In fact, seeking help and facing these challenges head-on takes tremendous courage and strength.

Admitting to and addressing mental health concerns is an act of bravery demonstrating a deep understanding of one’s well-being. It’s akin to acknowledging a physical ailment and seeking medical attention. Just as we rally behind someone battling a physical illness, offering unwavering support to those grappling with mental health issues is crucial.

Erase the misconception that mental illness is indicative of weakness. By doing so, we can foster an environment of empathy and understanding, encouraging individuals to seek help without fear of judgment.

9 Misconception: Mental Illness Is an Adult Problem

To promote better mental health awareness, one misconception that needs to be debunked is the notion that mental illness is exclusively an adult problem. This belief couldn’t be further from the truth. Children and adolescents can also struggle with mental health issues, and it’s crucial to recognize and address these challenges early on.

Just as physical health can affect anyone at any age, so can mental health issues. Young minds are particularly vulnerable to the pressures of academics, peer relationships, and societal expectations. From anxiety and depression to attention-deficit disorders, mental health conditions can manifest during childhood or adolescence.

Recognizing signs of distress and providing timely support is paramount. Parents, teachers, and caregivers play pivotal roles in creating safe spaces for young individuals to express their emotions and seek help without judgment. Open conversations and education about mental well-being should be integral to growing up, promoting resilience and emotional intelligence from a young age.

8 Misconception: Therapy Is a Waste of Time and Money

One prevalent misconception is that therapy is a futile drain on your precious time and hard-earned money. In reality, seeking therapy can be invaluable to your mental and emotional well-being. Therapy is not a one-size-fits-all solution but a personalized journey toward self-discovery and growth.

Trained mental health professionals provide a safe space for you to explore your thoughts, feelings, and concerns without judgment. This process can help you untangle the knots of your mind, gain clarity, and develop effective coping strategies.

Moreover, therapy isn’t solely reserved for severe issues. It’s a proactive approach to prevent future struggles and cultivate a more fulfilling life. It’s a commitment to self-improvement, self-compassion, and lasting happiness. Your well-being matters; seeking support is a sign of strength and self-care.

7 Misconception: People with a Mental Illness Can “Just Get Over It”

The misconception that individuals with a mental illness can simply “get over it” oversimplifies the complex nature of mental health and adds an unnecessary burden to those already struggling. Mental health conditions are not a matter of willpower or a choice; they are legitimate medical conditions that require understanding, support, and often professional treatment.

Comparing mental health to a fleeting mood overlooks conditions like depression, anxiety, and bipolar disorder; these are influenced by a combination of biological, genetic, environmental, and psychological factors. Just as we wouldn’t expect someone to “get over” a physical illness like diabetes or heart disease without proper care, the same should hold true for mental health.

Support and empathy are essential when helping someone with a mental illness. Instead of telling them to “just get over it,” let’s encourage open conversations, offer a lending ear, and show patience. By understanding that mental health challenges are not a simple matter of willpower, we can create a more compassionate and inclusive society where individuals feel empowered to seek the help they deserve.

6 Misconception: Mental Health Is the Same as Mental Illness

A misconception we need to unlearn is the confusion between mental health and mental illness. These terms might sound similar, but they refer to distinct aspects of our psychological well-being. It’s like comparing physical fitness with a medical condition—they’re not the same thing.

Mental health encompasses the overall state of your mind, emotions, and psychological well-being. It’s about nurturing your mental resilience, managing stress, and maintaining a positive outlook on life. Just as we prioritize our physical health by exercising and eating well, we should actively cultivate our mental health through mindfulness, self-care, and seeking healthy social connections.

On the other hand, mental illness refers to specific conditions that affect a person’s thoughts, feelings, or behavior. Conditions like depression, anxiety, or bipolar disorder fall under this category. Recognizing mental illness is crucial, as it requires appropriate diagnosis and treatment by mental health professionals.

By understanding the distinction between mental health and mental illness, we can better support ourselves and others. Just as we care for our bodies, we must also prioritize our mental well-being, practicing emotional hygiene and promoting open conversations about mental health.

5 Misconception: It’s Obvious When Someone Has a Mental Illness

In our journey toward understanding mental health, it’s crucial to debunk the misconception that a person with a mental illness can be easily identified at a glance. Contrary to popular belief, mental health struggles often don’t come with a visible sign or a one-size-fits-all appearance.

Mental health challenges manifest in numerous ways, some of which might not be immediately apparent. While some individuals might exhibit visible symptoms, like changes in behavior or mood, others may be adept at concealing their struggles behind a façade of normalcy. Many people with mental illnesses become skilled at adapting and masking their feelings, often due to stigma or fear of judgment.

Jumping to conclusions based on appearances can exacerbate the stigma around mental health. Instead, foster an environment of compassion and open communication. Encouraging meaningful conversations without making assumptions allows us to support one another without judgment.

4 Misconception: People with Mental Illness Are Often Violent & Dangerous

One of the most harmful misconceptions surrounding mental health is the unfounded belief that individuals dealing with mental illness are inherently violent and dangerous. This stereotype creates a sense of fear and isolation for those struggling with a mental illness.

In reality, most people with mental health conditions are not violent or dangerous. Mental illness encompasses many issues, from anxiety and depression to bipolar disorder and schizophrenia. These conditions affect individuals differently and do not determine their capacity for violence.

People with mental health challenges are more likely to be victims of violence than perpetrators. The causes of violent behavior are complex, involving factors such as substance abuse, a history of violence, and socio-economic conditions. Blaming mental illness as the sole cause oversimplifies the issue and unfairly paints everyone who struggles with these conditions with the same brush.

We must challenge this misconception and replace it with empathy and understanding to foster a compassionate society. By acknowledging that mental health conditions do not equate to violence, we can create an environment where individuals are more comfortable seeking help and support without the added burden of judgment.

3 Misconception: Mental Illness Is Caused by Bad Parenting

The notion that mental illness is solely caused by bad parenting can unfairly burden parents and perpetuate stigma around mental health issues. In reality, the causes of mental illness are multifaceted and complex; blaming parents for their child’s mental health struggles oversimplifies a deeply intricate topic.

While upbringing and family dynamics shape a person’s emotional well-being, they are just one piece of the puzzle. Genetic predisposition, brain chemistry, traumatic experiences, and societal influences all contribute significantly to the development of mental health disorders. Labeling parents as the sole cause can discourage open conversations about mental health and create barriers to seeking help.

It’s essential to recognize that parents, like all individuals, are doing their best with the resources and knowledge they have. Placing blame on them can hinder the healing process for everyone involved. To promote understanding and empathy, we must shift away from this misconception and focus on fostering supportive environments that encourage open discussions about mental health.

2 Misconception: Mental Illness Is a Choice

Just as we don’t choose to develop physical ailments like diabetes or asthma, individuals don’t choose to have mental health struggles. Mental illnesses are complex conditions that arise from genetic, environmental, and neurological factors. They are not a matter of personal preference or character flaws. This misconception can prevent individuals from seeking the help they desperately need.

Blaming individuals for their conditions only adds to their burden and hinders their path to recovery. We need to replace judgment with empathy and educate ourselves about the true nature of mental health. By dispelling the misconception that mental illness is a choice, we can create a more supportive and compassionate society where those in need feel safe to seek help without fear of judgment or blame.

1 Misconception: People with Mental Illness Can’t Handle Relationships

People with mental health challenges are just as capable of nurturing and maintaining meaningful connections as anyone else. Mental illness doesn’t define a person’s ability to love, care, and communicate effectively. In fact, many individuals living with mental health conditions have developed unique strengths and coping mechanisms that can enhance their relationships. These individuals often possess heightened empathy, resilience, and an increased capacity for understanding others’ struggles.

While it’s true that mental health issues might introduce certain challenges into relationships, it’s important to remember that every relationship faces its own set of hurdles. What truly matters is the willingness to seek support and work together to overcome difficulties. Just like physical health problems, mental health conditions can be managed and treated with the right care, therapy, and medication.

Encouraging open conversations about mental health within relationships can lead to deeper connections and greater emotional intimacy. Remember, a person’s mental health journey doesn’t determine their ability to love and connect. Individuals with mental illnesses can build strong and fulfilling relationships with empathy and support, proving this misconception utterly false.

]]>
https://listorati.com/10-misconceptions-about-mental-health-we-need-to-unlearn/feed/ 0 7185
Top 10 Misconceptions About London https://listorati.com/top-10-misconceptions-about-london/ https://listorati.com/top-10-misconceptions-about-london/#respond Sun, 06 Aug 2023 01:47:14 +0000 https://listorati.com/top-10-misconceptions-about-london/

Who hasn’t dreamed of a city break in London at some point? It is, after all, one of the world’s most visited cities—attracting over 19 million international visitors in 2016. The home of some major global attractions, like Big Ben, the Natural History Museum and the London Eye, London is one of the world’s greatest cities. It’s just a shame that it’s full of miserable people, British food and bad coffee. And doesn’t it rain there all the time?

Despite London’s global fame, far too many people still believe some pretty big misconceptions about the UK’s capital. Some of them were true decades ago, some are simply misunderstandings and others just seem to be total rumors, as false now as they ever were. We’d hate for you to be put off going to London based on a simple misconception, so today we’re setting the record straight and telling you the truth about ten of the things most people get wrong about London!

See Also: Top 10 Most Gruesome Things Hidden Under The Streets Of London

10 Coffee and Tea


Let’s start with the one that puts a lot of American visitors off: the myth that good coffee is impossible to find in London. This one may have been true a couple of decades ago, but the London coffee scene is much more impressive now than it was in 2000. Major coffee chains like Starbucks, Caffè Nero and Costa Coffee can be found on almost every street corner, just like in any other major global city. In fact, both Europe’s largest coffee chain, Caffè Nero, and the world’s second biggest chain, Costa, have their headquarters in London.

Until recent times, those looking for a more specialist coffee experience would have to make a detour to the Continent, but London’s independent and mini-chain coffee scene has grown enormously in the last decade. There are now hundreds of specialist stores dotted across the capital offering thousands of unique coffee concoctions (and providing jobs to many barista millennials), enough to satisfy even the most discerning connoisseur. So no, you won’t be starved of coffee if you find yourself in London![1]

9 Bad Food


Britain is infamous for having bland and boring food, so many people think eating in London must be no different. It’s not hard to see why: one of London’s traditional dishes, for instance, is the London Particular. What is it? Well, it’s basically a really thick pea and ham soup—thick enough to stand a spoon in. Yeah.

Fortunately for us, modern-day London is a thoroughly global city. There are thousands of restaurants, from cheap take-outs to Michelin-star restaurants, covering practically every culinary tradition you can think of. But to find London’s best food, you’ll have to take to the street.

London has a centuries-long history of innovative street food that dates all the way back to Medieval times. Back then, most of the city’s labourers would go to a bakery on their break, where they would find everything from doughnuts to meat pies. Today, London’s street food is some of the best in the world—heavily influenced by the city’s diverse population. In modern London you’re just as likely to find a local stopping to grab a pot of curry as you are a chicken sandwich![2]

8 Unfriendly People


Ask people around the world what British people are like, and you’ll get a pretty similar response: they’re quiet, they’re reserved, they’re unfriendly, and so on. And Londoners in particular have a bad reputation—even other Brits will say that people in London are rude!

In reality, Londoners aren’t any more unfriendly than anyone else in Britain—or most people in the world, for that matter. A test performed on London’s streets showed that, when faced with an old lady who needed help with her bag, or a woman who had her groceries knocked out of her hands, nearby people were quick to intervene and offer assistance. So no, it’s not that Londoners are unfriendly, but they are in a rush most of the time. London is a fast-paced, global city, and most people who are out in the day are going somewhere in a hurry. This might mean they forget to say sorry as they push past you in the station, but meet them after work hours and they’ll be totally different![3]

7 Cramped and Dirty


In 1952, the Great Smog descended on London, killing thousands. It was the worst air pollution event in British history, and news of it spread across the world. It led to new government regulations and a plan to cut the city’s pollution levels, but the city’s reputation was permanently damaged. To the rest of the world, London was known as a dirty, old-fashioned Victorian city, struggling to meet modern standards of cleanliness.

This reputation persists today, though it is entirely untrue. While studies have shown that London’s air still affects people’s health, it is relatively clean by modern standards. In fact, London doesn’t even make it into the World Health Organization’s list of the top 500 most polluted cities. In short, visitors to the city have nothing to worry about when it comes to air quality!

Perhaps more shockingly, London is one of the greenest cities on Earth by area. 47% of the city is green space, so almost half is covered by parks, woodland and other open areas—enough for some people to say that the Greater London Area should be turned into a national park.[4]

6 It Never Sleeps


When someone refers to “the city that never sleeps”, they’re usually referring to New York. In recent years, though, the phrase has also been applied to the British capital, leading to the idea that London is a busy place with a boisterous nightlife—and not the place to go if you have kids with you!

There is some truth to this. Like any other city, London can be boisterous—especially at the end of the week, when the city runs all-night bus services. The famous London black taxis work all hours, and many London clubs don’t close until 3am or 4am. But even in central London, the Tube and regular bus services end at midnight and don’t start up again until the early morning. Most pubs close before midnight and at that point, the vast majority of Londoners will be heading to bed. In London’s quieter outer boroughs, you could even forget you’re in a city at all![5]

5 It’s Expensive


London is expensive. It is actually one of the most expensive cities in the world… if you want to live there. Rent for an average three-bedroom apartment in central London stretches above £5,000 ($6,450), way outside the pay range of most British adults. The rumours of London’s high prices put many people off, but it is possible to enjoy London on a small budget. You don’t even have to look too hard!

You can have a day out in London and hardly spend any money: Most of London’s world-leading museums are free to enter, as are its fantastic art galleries and parks. And the places which do charge admission aren’t too expensive: a ticket to visit Kew Gardens, the world-leading botanic garden, is just £12 ($15.50). There are thousands of pubs and restaurants in the capital which aren’t any more expensive than they are in the rest of Britain. If you’re willing to take a risk, you can find food for mega-cheap. Ask around; the locals will quickly tell you the best places. Just don’t expect to find much budget food in Kensington or Chelsea.[6]

4 No-Go Zones


Unfortunately, one of the more recent misconceptions about London is that some areas of the city are ‘no-go zones’, places where so-called ordinary Londoners—even members of the police—never go. In extreme cases, some conspiracies say these are places where British law is ignored and locals are forced to follow Sharia Law by Muslim councils.

While all cities have areas that are best avoided at night, the idea that parts of the capital are too dangerous for the Metropolitan Police to visit is false, despite the ramblings of some people on the internet. This misconception became widespread in 2016, when Donald Trump claimed there were no-go zones in London. This was rejected by then-mayor Boris Johnson, who said “London has a proud history of tolerance and diversity and to suggest there are areas where police officers cannot go because of radicalisation is simply ridiculous. Crime has been falling steadily in both London and New York—the only reason I wouldn’t go to some parts of New York is the real risk of meeting Donald Trump.”

Just to prove it, a British man accepted a Reddit challenge to walk through one of these rumoured areas while drinking a bottle of wine. Unsurprisingly, nothing happened to him.[7]

3 The Underground Is The Only Way To Travel


The London Underground (also known as the Tube) is one of London’s most famous landmarks. It was first opened in the 1860s, making it the oldest underground passenger railway in the world. And with millions of people using it every single day, it is also one of the world’s busiest.

On the other hand, London is—quite famously—not built on a grid, unlike many more modern cities. When the city burned down in 1666, it was actually rebuilt following the old road system, meaning that some streets in the city are hundreds of years old. This means the city is not optimised for traffic: driving through it by car can be torture.

With this in mind, it’s easy to see why many people think the Tube is the only way to travel around London, but there are a wide range of other options. Take London’s famous red buses, for instance. You’ve probably seen these in old films, but they still exist: they carry millions of people across the city every day—more than any other bus network in Europe. There’s also the famous black cab service, whose drivers have to take a tough training course called The Knowledge, which teaches them the layout of London. If you prefer to get around on your own, though, your best bet is the Santander Cycles. Affectionately known as Boris Bikes, these can be ridden from any of the hundreds of stations across the city to any other station for just £3 ($3.87).[8]

2 It’s Always Wet and Cold


Britain is well-known for its grey weather and rainy days. So most people assume that London, as the British capital, must be no different!

Now, no-one’s going to say the UK is a warm country. Winter in Scotland and northern England can be very harsh, and summer is often dominated by cloudy days. But London is in the south, and the temperatures there are much warmer. London is slightly colder than New York on average, but it still remains way above freezing through most of the winter months.

And contrary to popular belief, London is nowhere near as rainy as you might imagine. It averages around 23 inches of rain a year, almost half as much as New York! Precipitation is low through the winter, too: snow is so rare in London that a single snowfall on Christmas Day in the capital is enough for the UK to call it a ‘white Christmas’.

We’re not saying you should pack just a t-shirt and shorts for your London visit, but nor will it be as dreary and wet as you expect. In the summer, you might even see a blue sky![9]

1 The City of London?


If someone told us they were going to visit the City of London, we wouldn’t bat an eyelid. But true Londoners know there’s a problem with this statement: do they mean Greater London—the hulking metropolis we’re all familiar with—or the tiny, independent local authority in the heart of London, which has a population of just a few thousand?

The City of London, barely a dot on the map, covers what was Roman London. It is almost wholly independent from Greater London, having its own police force, government and mayor. So is this the true heart of Britain, the place where the big decisions are made? No, that’s the other city—the City of Westminster, which is home to the UK’s Houses of Parliament.

Does that mean the City of London is the home of London’s government, then? No. The mayor of Greater London—an entirely different position to the mayor of the City of London—governs from City Hall, which is located across the river in Southwark. It’s easy to see how even Londoners get confused by London’s inner workings!

Oh, and just to make things even more confusing: Greater London isn’t even a city. It’s a county.[10]

10 Misconceptions About The ‘Most Dangerous’ Travel Destinations – https://listorati.com/10-misconceptions-about-the-most-dangerous-travel-destinations/ – Wed, 26 Jul 2023

Traveling is one of the best ways to broaden your perspective as well as the scope of things that can kill you. Taking a trip may involve quite a few unpleasant experiences—from civil wars to bad sex—along with awesome ones. That’s why travelers usually avoid some cities and countries altogether as quite a few regions in the world are experiencing one problem or another.

See Also: 5 Awesome Radioactive Tourism Spots That’ll Leave You Glowing

But some locations have extremely bad reputations that may be unwarranted. If you visit these places, you may soon realize that they’re probably better than many cities in your own country. The reasons for these misconceptions can range from a history of conflict to poor economic conditions.

As anyone who has journeyed to any of the following places will tell you, these destinations are safer and far more hospitable than some popular travel spots around the world—you know, the ones we don’t think twice about visiting.

10 Tehran, Iran

If you grew up in the US—and maybe during some time periods in the UK—chances are that you view Iran as the last place you’d want to go. Most people visualize it as a desert nation with regular terrorist attacks and general conflict. They don’t realize that they’re probably picturing parts of Afghanistan or the Arabian Peninsula.

Although Iran does have deserts, they barely cover 25 percent of the country (although sources vary on the exact percentage). In comparison, China’s deserts make up about 30 percent of its territory. Moreover, Iran’s deserts are very different from what you’re imagining. They have a hilly look and distinct geographical features not found anywhere else in the world.

If that’s surprising, then you’re still thinking of places like the Arabian Peninsula or the Sahara. Most of Iran is hilly, with quite a few alpine regions that you might want to explore on Google Images.

As far as safety is concerned, Iran is far from the center of religious extremism that we imagine it to be. At an altitude of about 1,200 meters (4,000 ft) and against a backdrop of snow-capped mountains, Tehran may well be one of the most picturesque cities you’ll ever visit. Iranians are also known for their hospitality, something you’ll realize the moment you land.[1]

Of course, it’s still a country ruled by an extremist government and many nations advise tourists against indulging in anything illegal when they’re there. Other than in Britain and the US, though, that’s pretty much the only advice that state departments give regarding Iran.

9 Antarctica

When we think of Antarctica, we think of a frozen wasteland with variations of snow-covered, barren terrain. It also sounds extremely dangerous—as any remote place without a steady stream of supplies tends to be. If you do the research, though, you’ll find that Antarctica is gradually becoming one of the best places for adventure lovers.

In Antarctica, hardly anyone dies from extreme weather or a lack of supplies. In fact, most deaths occur at the research stations there, in the course of scientific work.

Of course, that doesn’t mean that you can head to Antarctica like you’re going on a hike for the weekend. It’s still a remote place unsuited to the casual traveler because medical facilities and emergency resources are scarce.

Typically, the research stations are only involved with science, so it’s not a good idea to rely on them for help. If you want to visit, you’ll need to join a planned expedition that will take care of all supplies and other camping needs.

Once there, you’ll enjoy some of the most pristine views in the world. With the snow gradually melting to reveal a unique, almost alien-like landscape, Antarctica is slowly becoming one of the best destinations for modern-day explorers.[2]

8 Detroit, Michigan, USA

The United States is hardly uniform when it comes to tourist safety. Although certain US cities are recognized as among the safest and most hospitable in the world, others—like Gary, Indiana—have become Internet jokes over how dangerous they are. Detroit also has a bad reputation. In fact, its decayed urban setting served as a visual backdrop for quite a few dystopian cyberpunk movies.

That reputation isn’t groundless. Due to many factors, the last few decades saw Detroit become the poster child for the numerous tales of urban decay and rising poverty hiding behind the famous American Dream. Thanks to some recent efforts by local authorities, though, the situation may not be that bad anymore.

Make no mistake: Detroit still has high crime rates. But you can probably spend a weekend or two there without taking a tremendous risk—if you watch where you go.[3] The crime rate has drastically declined in the past few years, and median income is rising. Quite a few development projects are aimed at restoring the city to its former glory.

Travelers who’ve spent time there refer to it as a vibrant spot with a burgeoning local culture. Of course, you’ll need to avoid going to the bad areas, just as you would in popular cities like New York or London.

7 Kiev, Ukraine

On first look, Ukraine doesn’t seem to be the kind of country where you’d want to spend a relaxing week. Its ongoing war makes it an active conflict zone, giving it a reputation for continuous danger.

That’s absolutely justified because some parts of Ukraine are still embroiled in a battle with Russia. Far away from the conflict, however, Kiev remains one of Europe’s least expensive and most lively destinations.

Although news channels paint pictures of dropping bombs and militias regularly invading government buildings, travelers tell stories of quaint cafes and an old-world, ex-Soviet charm, perhaps only matched by its sister cities in Russia.

Thanks to the diverse cultures that have influenced the city, Kiev features many beautiful churches and two UNESCO World Heritage Sites. You can explore them with little to no risk of being caught up in a skirmish as the war is quite far away.[4]

6 Republic Of Kosovo

Kosovo, a tiny country in the Balkans, may not figure often in our news cycle these days, but it still bears the scars of one of the most brutal wars of the 20th century. Fought between the rapidly disintegrating Yugoslavian troops and Albanian rebels, the war was part of the larger phenomenon of the Balkanization of Yugoslavia.

After witnessing everything from attempted genocides to unprecedentedly brutal sieges on civilian populations, the ex-communist state was divided into several modern countries in Eastern Europe.

Although Kosovo still carries much of that violent reputation, it’s largely unfounded because the war is long over. Today, Pristina, the country’s charming capital, is one of the safest cities in Eastern Europe. Visitors from English-speaking nations will be glad to know that almost everyone speaks the language in Kosovo, even though the republic is as foreign and distinct as it can be.

Kosovo lives up to the reputation of unimaginable hospitality, which is shared by most countries in the region. It is also more affordable to spend time there than in its Western European counterparts.[5]

5 Istanbul, Turkey

Contrary to popular belief, Istanbul is not the capital of Turkey. However, it is still considered to be the best city to visit in the country—at least by some people.

Even so, Istanbul has endured quite a few cases of unrest in the recent past. Turkey’s alleged involvement in Middle Eastern wars has only added to that perilous reputation. Even though some individuals regard Istanbul as a risky travel destination, many past visitors will tell you that it’s still better than other locations with safer reputations.

Although Istanbul has witnessed some violence in recent years, it coincided with the war in Syria, which is now in its final stages. Incidents are few and far between, and there haven’t been any major ones for a long time.

In many parts of the city, things haven’t changed a bit since the war started. As a major center of arts and culture in the region, Istanbul contains an important UNESCO World Heritage Site. Of course, it always makes sense to check travel warnings from your country. But for anyone developing an itinerary for a trip to that part of the world, you may want to explore Istanbul.[6]

4 Zimbabwe

Many conversations in Zimbabwe inevitably turn to its ravaged economy and what happens when you let your leaders do whatever they want for one second. Zimbabwe is a great example of how unchecked inflation can destroy a country. Everyday goods now cost exorbitant amounts of money. Needless to say, it’s not the best spot to have a fancy bespoke wedding.

Then again, you may want to reconsider. Despite being in a state of economic ruin, Zimbabwe remains one of the most beautiful and geographically diverse countries to visit in Africa.[7]

Many tourists have said that the country’s economy didn’t have any effect on them as Zimbabwe is still one of the more popular spots for travelers in the region. The gorgeous waterfalls, flora, and fauna make Zimbabwe a noteworthy destination.

3 Saint Petersburg, Russia

The reputation of Russia as a dangerous travel destination probably originated during the Putin era—and rightfully so. If you’re a journalist working to uncover the shady connections between private Russian industries and Putin, we’d highly recommend against going to Russia to do it. Other than that, the country contains many UNESCO World Heritage Sites and some of the most distinctive artistic styles anywhere.

In contrast to the depictions in 1990s Hollywood shows, traveling to a big city in Russia isn’t all murder houses and dilapidated Soviet structures. If you do visit, make sure to see Saint Petersburg.[8]

Although it’s safer to avoid any unknown parts of the city, Saint Petersburg is a cosmopolitan vacation spot that offers a variety of activities. In fact, many people from across Europe and the rest of the world call the city “home.” It sports some spectacular buildings and corners, so make sure to take your best camera.

2 China

China is often portrayed as a rogue state in the global media—and for good reasons. First, the country has the most internal surveillance of any nation. In fact, its citizens lack many freedoms that we take for granted. China is also becoming more militaristic and flexing its muscles in the region. As a result, most of us think that going to China will probably lead to our arrests—or worse.

In part, stark cultural differences between China and Western nations shape our beliefs that the Chinese are oppressed. But a visit to any major city in China will reveal that the country is one of the most technologically advanced in the world. It may soon become the first truly cashless society, if it isn’t already. In general, the Chinese don’t see the trade-off between freedom and economic prosperity as a bad thing, something you can only know when you go there.

Due to Western beliefs, we’ve isolated ourselves from a unique travel destination. With its futuristic cities, various landscapes, and more, China should be on the top of nearly everyone’s bucket list.[9] (Of course, you’ll want to wait until the current coronavirus outbreak has run its course.)

1 Medellin, Colombia

When Pablo Escobar, the ruthless “king of cocaine,” was alive, Medellin was a particularly violent city. With one of the worst rates of gang-related murder in the world, Medellin was hardly anyone’s top pick as a travel spot. Although we can’t say that all violence has disappeared, we’ll still argue that the situation has improved substantially.

As for murder statistics, they have drastically declined in recent years. In 2018, for example, Medellin recorded about 24.75 murders per 100,000 residents, down from 375 per 100,000 in 1991 and 94.2 per 100,000 in 2009.

Of course, you still have to be careful in Medellin. But the city probably sounds scarier than it is due to the Netflix series Narcos. Some people even consider Medellin to be a “hipster holiday destination.” The city is imbued with a young entrepreneurial vibe that led to its designation as “the most innovative city in the world” in 2013.[10]

About The Author: You can check out Himanshu’s stuff at Cracked and Screen Rant or get in touch with him for writing gigs.

Himanshu Sharma

Himanshu has written for sites like Cracked, Screen Rant, The Gamer and Forbes. He could be found shouting obscenities at strangers on Twitter, or trying his hand at amateur art on Instagram.


