The ancient Greeks had a thirst for mythological monsters. This obsession spread throughout the world and continues to the present day. However, many of the creatures were inspired not by the imagination but by science and nature.
It turns out that the sites of ancient myths were often places where large numbers of fossils had been unearthed. In trying to make sense of what they were seeing, the ancients gave birth to many myths. Here, we look at 10 mythological creatures from ancient Greece and around the world which may have had their origins in fact.
In Greek mythology, the Cyclopes (plural of Cyclops) were gigantic creatures with a single eye in the center of each of their heads. They were known chiefly for their barbarity, afraid neither of men nor gods.
The most famous Cyclops was Polyphemus, who attacked Odysseus in a cave and ate half of his men. Odysseus blinded the Cyclops by running a wooden stake through its single eye. Then Odysseus and his men escaped by tying themselves to the undersides of sheep.
This might sound implausible. But for a time, there appeared to be some fairly solid proof of the existence of Cyclopes. Many skulls were found with a single eye socket in the center of the head.
It turns out that the skulls belonged to dwarf elephants. The “eye socket” was the central nasal cavity and the opening for the elephant’s trunk. Many dwarf elephant skulls have been found in Cyprus, especially in caves where the Cyclopes were supposed to have lived. Therefore, it is perhaps natural that an elephant skull would have been taken as evidence of a race of giant, man-eating creatures with one eye and terrible table manners.[1]
Release the kraken! Originating in Nordic folklore, the kraken was said to be powerful enough to drag a ship to the depths by wrapping its gigantic tentacles around the vessel or by swimming in circles around it to create a maelstrom that would drag the ship down.
The first written account of the kraken dates back to 1180, and there were many accounts of a gigantic tentacled sea monster dragging ships to their doom. The kraken was said to be able to devour an entire ship’s crew in a single mouthful.
The kraken myth is likely to have arisen after sightings of the giant squid (Architeuthis dux), which can grow to around 18 meters (59 ft) long, or possibly the colossal squid (Mesonychoteuthis hamiltoni), which is significantly larger than the giant squid and whose maximum size remains unknown.[2]
Very few colossal squid have ever been found intact as they live in the deep waters of the Antarctic. For this reason, it has proved very difficult to find evidence of how squid attack their prey. Some recent research does show that they encircle prey with their tentacles before pulling it to them and eating it. So, you never know.
Although it’s a more recent story than some of the others, the duck-billed platypus was once considered a mythological animal. But it is completely real, if a little weird-looking.
First discovered in the 18th century, it was considered by many to be a ridiculous hoax, and not without reason. This was an age when naturalists were creating all sorts of strange creatures with the help of taxidermy and creative imaginations.
For example, Albertus Seba had a whole cabinet of curiosities. Some were real, and some were not. His seven-headed hydra, for instance, turned out to be a sackful of snakes sewn onto the body of a weasel.
The platypus looks just as implausible. In 1799, English zoologist George Shaw wrote that it resembled “the beak of a duck engrafted on the head of a quadruped.”
The platypus is remarkable for many reasons, not just its peculiar appearance. Naturalists could not determine whether the creature was a mammal. Did it lay eggs or give birth to live babies? It took another 100 years for scientists to discover the answer to that. The platypus is one of the very few mammal species to lay eggs.[3]
There have been legends of mermaids almost as long as people have sailed the seas. One of the earliest recorded tales of mermaids was that of Thessalonike, who was said to be the half-sister of Alexander the Great. After Alexander went on a danger-filled quest to discover the Fountain of Youth, he rinsed his sister’s hair in the immortal water.
When Alexander died, his sister (who may also have been his lover) tried to drown herself in the sea. But she couldn’t die, so she became a mermaid instead. Legend had it that she would call out to sailors, “Is Alexander the king alive?” If they replied, “He lives and reigns and conquers the world,” she would allow them to sail away. But if they said he was dead, she would transform into a monster and drag the ship to the bottom of the ocean.
One possible explanation for the persistence of mermaid sightings is that sailors were mistaking the manatee (aka the sea cow) for a mermaid, a fabulous creature with the body of a fish but the head and torso of a beautiful woman. It is fair to say that the manatee is not the most attractive-looking creature on Earth. So how could the sailors have made such a mistake?[4]
Well, manatees can hold their heads out of the water and turn them from side to side in the same way that a human can. And seen from behind, the rough skin might look like long hair. It is also known that sailors at sea for prolonged periods can experience hallucinations. So perhaps, at a distance or in poor light, they might have mistaken a manatee for a mermaid. Or perhaps it was the rum.
The modern view of the vampire began with Bram Stoker’s novel Dracula (1897) and has changed very little since—a pale, thin stranger with an improbable accent who sleeps in a coffin and is more or less immortal.
It is well-known that Stoker based his character on the historical accounts of Vlad the Impaler. It is also possible that Stoker was inspired by the many rumors and superstitions surrounding death and burial at that time as well as ignorance about how the body decays.
After death, the skin of the corpse shrinks. So its teeth and fingernails become prominent and can appear to have grown. Also, as internal organs break down, “purge fluid” can leak out of the nose and mouth, leaving a dark stain. People may have interpreted this as the corpse drinking blood from the living.[5]
There was also the evidence from the coffin itself. Sometimes, scratch marks were found on the insides of the caskets, which was taken as evidence that the dead had become undead and arisen from their coffins.
Unfortunately, it is more likely that the undead became dead and that people who had fallen into a coma, for example, were buried in the mistaken belief that they had passed on. After recovering consciousness, they may have tried to free themselves.
It is believed that the philosopher and monk John Duns Scotus perished in such a way. His body was said to have been found in a crypt outside his coffin with his hands bloodied and bruised from trying to escape.
Giants have been part of folklore for thousands of years. In Greek mythology, we have the Gigantes, a tribe of 100 giants who were born of the goddess Gaia after she was impregnated from blood collected during the castration of Uranus. Yuck.
In Norse mythology, Aurgelmir was created from drops of water that formed when the land of ice (Niflheim) met the land of heat and fire (Muspelheim). He must have been quite big. After he was killed by the gods, the Earth was made from his flesh, the seas from his blood, the mountains from his bones, the stones from his teeth, the sky from his skull, and the clouds from his brain. His eyebrows even became the fence surrounding Midgard, which is the Viking way of saying Earth.
Hereditary gigantism might explain some of the beliefs in giants (though not the more outlandish ones). Scientists believe that they have isolated a gene that can lead to familial gigantism. According to researchers, people with gigantism can also have a tumor in their pituitary gland which can stimulate growth.
The biblical giant Goliath was said to have been over 274 centimeters (9’0″) tall. There is no modern definition of what height makes a person a giant, as different societies have different average heights, which can vary by as much as 30 centimeters (11.8 in).
A study published in the Ulster Medical Journal suggested that Goliath, famously killed by David with a sling, had “an identifiable family tree suggestive of autosomal dominant inheritance.” The pebble slung by David hit Goliath in the forehead. If Goliath had been suffering from a pituitary tumor pressing on his optic chiasm, he would have had visual disturbances which would have made it difficult for him to see the stone.[6]
In Irish folklore, a banshee (meaning “woman of the fairies” in Gaelic) was a beautiful young woman with flowing white hair and eyes red from crying, whose “keening” warned those who heard it that someone in their family would soon die. Rather than being meant as a threat, it was supposed to give people time to say their goodbyes to their loved ones.
It is unclear when the legend first arose. There were reports of the banshee in Cathreim Thoirdhealbhaigh, an Irish history written around 1350, and accounts were still being told in the mid-19th century.
Keening was a traditional way for women to express their grief. They would gather together at the graveside and wail for their loss. This practice gradually died out during the 19th century after it became something of a tourist attraction to watch the keeners at a “real Irish funeral.”[7]
It is easy to see, however, why the romantic Irish, who were always ready to believe in the supernatural, would take the idea of a fairy woman and mix it with the sadness of women keening over their dead to create a beautiful banshee to call the family home to say their last goodbyes.
In Greek mythology, the hydra was a gigantic sea serpent with nine heads, one of which was immortal. Each time a head was cut off, two more would grow out of the fresh wound.
Killing the hydra was one of the 12 labors of Hercules. To achieve this, he enlisted the help of his nephew, who cauterized the wounds as Hercules cut off the heads until only the immortal head remained. Hercules cut off that one, too, and buried it under a heavy rock.
The myth of the hydra may have been inspired by nature. There have been many documented cases of snakes with multiple heads (although nine would be pushing it). The incidence of polycephaly appears to be higher in reptiles than in any other group of animals.
It has even been possible for scientists studying conjoined twins to create polycephalic animals. In the early 20th century, Hans Spemann constricted young salamander embryos with strands of human baby hair to produce larvae with two heads.[8]
These days, dire wolves are best known for their association with the Stark children on Game of Thrones. However, the dire wolf is not a figment of the GoT creators’ imaginations.
Much bigger than a modern wolf, the dire wolf lived in the Americas until its extinction about 10,000 years ago. Over 4,000 fossilized remains of dire wolves have been discovered at the La Brea Tar Pits in Los Angeles. It is believed that they may have become trapped while feeding on the carcasses of other ensnared animals.[9]
The dire wolf had a huge skull but a smaller brain than the modern wolf. Perhaps if the brains of dire wolves had been bigger, they would have realized that those animals were trapped in the tar pits for a reason. There is no evidence that an albino dire wolf ever existed, although albino cubs have been born in the modern wolf population.
According to Greek myth and Harry Potter, a basilisk (aka a cockatrice) was a serpent with a lethal gaze and terrible breath. It was said to have been born from an egg laid by a cockerel and hatched by a serpent.[10] Supposedly, it feared only the crowing of the cock and the weasel, which is immune to its venom (or Harry Potter’s sword). In Greek myth, the basilisk was of normal size, though it had grown to mammoth proportions by the time it got to Hogwarts.
While it is unlikely that a cockerel would ever lay an egg or that a serpent would choose to incubate it, the idea of a basilisk seems to have some basis in fact. It is probable that the basilisk of myth was actually an Egyptian cobra, a particularly dangerous snake which hisses continually and spits venom to a distance of 2.4 meters (8 ft) while aiming for the eyes of its enemy.
This may account for the myth that the basilisk killed those who looked it in the eye. The cobra’s greatest predator is the mongoose, which bears a strong resemblance to a weasel.
Alexander the Great was famously said to have used a mirror to defeat a basilisk. When the snake looked upon its own image, it died instantly. J.K. Rowling used a version of this story in her novels, too.
Ward Hazell is a writer who travels and an occasional travel writer.
Many researchers, mainstream and otherwise, believe that we are not the first advanced civilization to have existed on Earth. Furthermore, they postulate that in prehistory, before any history was recorded, one or more advanced civilizations just might have existed, thrived, declined, and perished before us.
While this is an outlandish notion for most people, when broken down, it isn’t as crazy as it first seems, not least when you consider what might become of us should a sudden end announce itself without warning. Chances are, should life begin again, nobody would remember that we had even existed. If that’s the case, who’s to say that advanced civilizations didn’t exist thousands of years before our own recorded history?
Let’s say that something happened to wipe out the vast majority of human life on planet Earth. Be it a sudden super-contagious virus, a meteorite, a solar flare, a nuclear war, or even (you know it’s coming) an alien invasion, if it were to happen, life would disappear with alarming speed.
Seriously, though, many of us don’t realize just how precarious our existence is here on Earth. Let’s say the vast majority of human life is wiped out by any of the hypothetical situations mentioned above; surely there’d be survivors, right? The thing is, where would the power come from? With no one to run them, the power stations, and with them, the world’s electricity supply, would shut down relatively quickly. In fact, many of them would switch into safety mode to avoid any disasters.
However, eventually, with nobody to oversee these procedures, nuclear power plants, their cooling waters having boiled off, would go into meltdown. Chernobyl-type scenarios would unfold all around the planet.[1] In short, you really wouldn’t want to be around, and if you were, you would want to be completely out of the way somewhere. We’ll talk more about the survivors in entry number five.
Most man-made objects, whether made of wood, plastic, metal, or anything other than stone (which we will look at later also), will simply disappear, even down to the roads and streets, which will be completely overgrown with vegetation within a few decades at the very most.[2] To take that a stage further, within “only” a few centuries, the metal frameworks of the buildings and the bridges around the world will simply rust, break down, and collapse. All that will be left will be the crumpled and piled ruins.
Within only 10,000 years, which is but an instant in geological time, just about all that remains will be the stone. And even then, only that which was built purely from stone will survive in any recognizable form (and even that might be buried). As mentioned, bridges and buildings will have collapsed due to their rusted and decaying metal parts and will lie in ruins. Over the course of time, much like our ancient sites today, these ruins will be lucky to be pieced together again, if ever.
Is it any wonder that the structures we have left of the ancient world are the buildings, monuments, and statues carved from stone? And make no mistake, there was much, much more than just stonework at one time.
As we have mentioned above, only true stone structures will survive any type of annihilation of humanity for any significant amount of time,[3] and even then, the remains would be subject to future civilizations and explorers stripping such monuments of what they see as valuable and leaving the rest, much as was done with the Egyptian pyramids (and, who knows, maybe the Sphinx) over the course of history.
With that in mind, then, how many of our modern structures might survive thousands, or even hundreds of thousands of years, into the future? Ironically, it would mostly be the buildings from antiquity that would still survive. And again, with that in mind, how long have such structures really been there, and who did they once belong to?
Although they are certainly not the same as solid stone monuments by any stretch of the imagination, should we perhaps pay more attention to certain types of myths and legends that persist across many cultures over thousands of years?[4] For example, was there really a great flood, even if only in the form of several episodes of localized but substantial flooding that just might have wiped out entire communities? Such calamities would have surely seemed like the end of the world to those civilizations that experienced them.
And what about the tales of “the gods”—higher beings with advanced technology that ruled over mankind? Are these really just legends? Or might there be some truth to such stories? Shortly, we will look at the possibility of survivors from such an “end of the world” situation. What if “the gods” of the past were the survivors of an even older advanced civilization? Might that explain the powers (or advanced technology) of the gods? Or the advanced knowledge of the gods? It is certainly an interesting notion.
While we have looked in our previous points at what might happen if our civilization were to face a sudden, life-ending disaster, in order to show, at least in theory, that other civilizations very much could have existed before our recorded history, it is also worth looking back at known ancient civilizations. If we look at the ancient Egyptians, for example, it is perfectly obvious, and even accepted by some mainstream scholars, that they appear to have begun their civilization already at the height of their power and then gone into permanent decline.[5]
To some researchers, who are very much shunned by most mainstream experts, this suggests that the Egyptians “took over” the remnants of an ancient, “lost” civilization. From these types of theories generally spring the further claims that such ancient structures as the Pyramids of Giza are more likely monuments and buildings of an Atlantean-type society as opposed to the work of the Egyptians themselves.
Now, let’s say that some people have survived our hypothetical modern-world-ending disaster. What would become of them, realistically? For a start, they would very likely not be concerned with searching out technology or things of that nature. Chances are, once the system has gone down and the power is off, they will be concerned with their survival more than anything else.[6] They will no longer be at the top of the food chain. Without the aid of our modern plethora of technological gadgets and advanced buildings, many remaining humans will be easy pickings for hungry wild animals.
Any survivors would be preoccupied with hunting and gathering whatever food they could and finding some kind of shelter. As the generations went on, humanity’s connection to the “old” world would fade. By the third or fourth generation, chances are that all that would be remembered of pre-disaster Earth would be no different from what myths and legends are to us now. In short, life would be starting again, from scratch.
As a further point of interest, and perhaps evidence, there have been numerous discoveries of ancient objects, apparently the result of intelligent design, dating to far before such objects should have existed. And what’s more, they’ve been found all over the planet. For example, in 1912, in the small town of Wilburton, Oklahoma, two employees at the Municipal Electrical Plant reportedly discovered a particularly oversized piece of coal that they couldn’t fit into the furnace, which they were stocking to keep the plant ticking over.
They proceeded to smash the coal into smaller pieces so that they could toss the remains into the flames. When they did, though, a perfectly formed and recognizable iron pot fell to the floor. It was allegedly examined and found to be authentic. Why was it there, inside a piece of coal that was millions of years old?
Even more bizarre are the strange spheres, made of some very hard substance, purportedly brought up from the mines of South Africa on multiple occasions by miners.[7] These spheres have bizarre grooves in them and are of obvious purposeful design. What they might be and, more importantly, why they are there is open to debate.
As well as mysterious objects that may or may not have a use, many very purposeful and obvious tools have been discovered in pieces of rock that, if we accept what science tells us, are millions of years old.
One particularly intriguing case occurred in London, Texas, in 1936, when the head of a hammer was discovered in a piece of rock believed by some to be as old as 400 million years. (Others say only 700 years.) In 1944, a ten-year-old boy, Newton Anderson, found a handmade bell in a piece of coal. The lump of coal was reportedly 300 million years old.[8]
Numerous other purported discoveries of strange, seemingly ancient, objects are on record, many of them from the 1800s and before. The book Forbidden Archaeology lists example after example.
Many mainstream historians simply do not accept the notion that many of the ancient civilizations, including the ancient Egyptians and Sumerians, had, at one point in the distant past, advanced technology.[9] However, proponents put forward many reasons for the idea, not least the apparently advanced knowledge of the cosmos and the workings of the universe that so many ancient civilizations possessed. Even the placement of many of their famous structures mirrors the arrangement of the stars and the planets with such accuracy that such knowledge cannot be denied.
We could also look to such devices as the “Baghdad Battery” or the traces of acids in the passageways of the Giza Pyramid that suggest some kind of generation of electricity. And what of the many sites around the world that reportedly show signs of nuclear explosions in the distant past? A prominent example is Mohenjo-Daro, which some researchers, most notably David Davenport in his book Atomic Destruction 2000 BC, have postulated was the site of a deliberately deployed nuclear weapon long ago. This, of course, would suggest, as Davenport argued, that a highly advanced civilization once existed.
At the end of the day, no matter the interesting, valid, and, to varying degrees, legitimate views and claims on either side of the argument, the sad fact is that for many mainstream historians, much like mainstream scientists, archaeologists, and most other specialists whose titles end in “ist,” the view is generally, in the words of Graham Hancock, “very myopic.”[10]
The reasons for this are numerous. Firstly, amid the constant jockeying for limited funding, nobody wishes to put their head above the parapet. So, as a result, the “status quo” opinion is maintained. Those who do discover things of interest that go against the established paradigm and then, more to the point, attempt to tell the world about them face a sudden cutting of their funding and, even worse, the wrath of their contemporaries. Perhaps a good example is the case of Dr. Virginia Steen-McIntyre, who, after discovering ruins in Mexico that suggested civilization in the Americas going back 250,000 years—using accepted, tested methods no less—was suddenly and universally shut out by the “accepted” scientific and archaeological communities.
Social etiquette is more than just a set of polite behaviors—it embodies the values, hierarchies, and unspoken rules that define a culture. In many ancient civilizations, these customs were sacred, with breaches leading to humiliation or even severe punishment. While many of these etiquette practices have faded with time, their echoes can still be felt today, subtly influencing modern social norms and cultural behaviors.
Exploring these ancient customs offers us a window into the complexities of human interaction and the profound importance once placed on seemingly small actions. Here are ten fascinating social etiquette rules from ancient civilizations that time has forgotten.
In ancient Rome, the emperor was not just a political leader but was often considered a living deity embodying the state’s power and divine favor. Citizens and even nobles were expected to show the utmost respect in his presence, which included avoiding direct eye contact. Looking directly at the emperor was more than impolite—it was a symbolic affront to his elevated status, akin to challenging his authority or questioning his supremacy.
This etiquette was strictly enforced during public appearances, ceremonies, and court proceedings. When citizens approached the emperor to plead cases or seek favors, they would do so with bowed heads and eyes cast downward, demonstrating submission and respect. Even high-ranking officials and soldiers adhered to this practice, acknowledging the vast gulf between the ruler and the ruled.
Failure to observe this rule could result in social ostracism or severe consequences, underscoring the rigid social structures of ancient Rome. The practice reinforced the social hierarchy and maintained the emperor’s near-divine status in the eyes of the people.[1]
In medieval Europe, religion permeated every aspect of life, including greetings. The two-finger salute, widely used across the continent, was a symbol of religious devotion. By raising two fingers—the index and middle fingers—individuals affirmed their belief in the Holy Trinity: the Father, the Son, and the Holy Spirit. This simple gesture served as both a greeting and a silent affirmation of one’s orthodox beliefs.
During times of religious strife, such as the Crusades or the Inquisition, failing to use this gesture appropriately could lead to suspicion, ostracization, or even accusations of heresy. The two-finger salute was not just a polite custom but a crucial tool for social survival in an era when religious conformity was enforced by severe penalties.
Merchants and travelers also used the salute to indicate their faith and good intentions when entering new towns or engaging in trade. It facilitated trust among strangers in a fragmented landscape of feudal territories and varying local customs.[2]
In ancient India, the distinction between the left and right hand was deeply embedded in social norms and religious practices. The left hand was reserved for tasks considered unclean, such as personal hygiene. Meals were more than just eating—they were communal rituals that reinforced social bonds and religious observances. Using the right hand to eat honored the sanctity of the food and the occasion, reflecting broader concepts of purity central to Hinduism.
Using the left hand during meals was considered impure and disrespectful, not only to the food but also to fellow diners and the divine. This rule extended beyond the home into social and religious gatherings, where adherence to this etiquette was a sign of respect to hosts and guests alike.
Even today, in many parts of South Asia and the Middle East, this custom persists, highlighting the lasting impact of ancient practices on modern cultural norms.[3]
In the Ottoman Empire, social etiquette required men to keep their heads covered indoors, especially in places of religious or social importance like mosques or private homes. This custom was rooted in notions of humility before God and respect within the social hierarchy. The head covering, often a fez or turban, was a symbol of one’s faith and societal status.
Removing one’s head covering in inappropriate contexts was more than a breach of etiquette; it was a potential act of defiance or disrespect. The strict enforcement of this custom reflected the empire’s emphasis on maintaining social cohesion and visual markers of identity.
The practice extended beyond religious settings to formal meetings and social gatherings, reinforcing the importance of modesty and respect in daily interactions. While the Ottoman Empire no longer exists, the legacy of head-covering customs continues to influence cultural practices in the region.[4]
In feudal Japan, social interactions were governed by strict codes reflecting one’s status and role. Central to these interactions was the act of bowing, or “ojigi,” which conveyed respect, gratitude, apology, and other sentiments without words. When addressing a superior, especially a daimyo (feudal lord), one was expected to bow deeply before speaking.
Failing to perform the proper bow was a grave breach of etiquette, implying arrogance or disrespect. For the samurai class, who lived by the strict code of Bushido, adherence to proper bowing was a matter of honor and discipline. Neglecting this could lead to serious consequences, including loss of status or even duels.
Bowing before speaking maintained the social hierarchy and reinforced mutual respect, essential components of Japanese society at the time. The practice highlighted the importance placed on non-verbal communication and the subtle nuances of social interaction.[5]
In ancient Egypt, the threshold of a home was more than just a physical boundary—it was a spiritual one. Stepping directly on the threshold when entering someone’s house was considered disrespectful to both the host and the protective deities believed to guard the home. Egyptians believed that the gods watched over families from the entrance, and stepping on the threshold could anger these protective spirits.
Guests were expected to step over the threshold, acknowledging the sacredness of the entrance and showing respect for the household’s divine guardians. This practice emphasized the importance Egyptians placed on hospitality, spirituality, and the sanctity of the home.
Such customs reinforced social bonds and religious beliefs, integrating everyday actions with spiritual significance. While the specific practice may have faded, it reflects the profound connection between daily life and the divine in ancient Egyptian culture.[6]
In Imperial China, Confucian principles shaped societal norms, including etiquette surrounding speech. One of the most important virtues was self-restraint, and in the presence of elders or superiors, it was expected to remain silent unless spoken to. Speaking out of turn or at length was considered a sign of arrogance and disrespect.
Silence maintained harmony by acknowledging the proper social order and showing deference to those of higher status. This etiquette was especially important in familial settings and official courts, where hierarchy was strictly observed.
Failure to adhere to this rule could lead to loss of face, a concept deeply ingrained in Chinese culture that pertains to one’s honor and reputation. The emphasis on measured speech and respect contributed to social cohesion and reflected the value placed on harmony and order.[7]
In ancient Mesopotamia, the feet were considered the dirtiest part of the body due to constant contact with the ground. Showing someone the sole of your foot, even unintentionally, was seen as a grave insult. The sole was associated with filth both physically and symbolically, and displaying it to someone was akin to calling them unclean.
This etiquette influenced how people sat and interacted, ensuring that the soles of their feet were not exposed to others. It extended to formal settings, where individuals were mindful of their posture to avoid offending others.
The practice underscores the importance placed on cleanliness and respect in social interactions. Variations of this custom persist in some cultures today, highlighting the lasting impact of ancient social norms.[8]
In ancient Greece, seating arrangements at social gatherings were significant, particularly during symposiums or banquets. The seat to the left of the host was reserved for the guest of honor, considered the most prestigious position. This placement symbolized the host’s trust and affection, as the left side was associated with the heart.
Hosts carefully arranged their guests to ensure everyone was seated according to their status, reflecting the importance of hospitality and social order. Misplacing someone in the hierarchy could lead to social tension or offense.
This etiquette emphasized the Greek values of xenia (hospitality) and respect for social hierarchies. Proper seating was a tangible expression of these virtues, reinforcing relationships and societal norms.[9]
In Victorian England, etiquette rules were enforced rigidly, especially regarding women’s behavior. Women were expected to cover their mouths while laughing or smiling broadly. Modesty was a prized virtue, and excessive displays of emotion were considered unladylike and could damage a woman’s reputation.
This small gesture allowed women to demonstrate decorum and self-restraint, key virtues in Victorian society. It reflected the era’s strict ideas about femininity and proper conduct, where maintaining an air of modesty and reserve was paramount.
While men had more leeway in their expressions, women were held to stringent standards that dictated their behavior in public and private spheres. The practice highlights the gender norms and social expectations of the time.[10]
A job is something all of us have to do to get by. Those at the top tell other people what to do. Everyone else has to obey in order to take care of pesky things like rent and food. Not everyone is cut out for all the positions out there, though most of us try to do our best with the one we have.
Then there are the weird jobs that you’ve never heard of. In fact, you probably wouldn’t know where to begin if you got one of them. Nevertheless, all of them exist and are consistently looking for applicants—if you know where to look. These aren’t run-of-the-mill, nine-to-five affairs. Instead, they require specific sets of skills that you can only pick up on the job.
“I wish I could be paid for sleeping” is something most of us have thought at some point in our lives, mostly right after we have to wake up to go to work. It makes sense, too, as sleeping is amazing. But what if we were to tell you that you could actually get paid to sleep? In some cases, you’ll earn a lot (depending on where you sleep).
A professional sleeper is a serious occupation and provides services that are required in many modern industries. Although it’s not your typical full-time job, you could earn a sizable income if you land the right gigs.
From researchers (who regularly need sleep testers for various experiments) to mattress brands that need sleepers to test their products and artists who require sleeping subjects, professional sleeping could be a lucrative career if you advertise yourself well enough.[1]
When you train to be an astronaut, one of the first things you have to learn is how to maneuver yourself in a gravity-free environment. Although it may not sound like a big deal, you’d be surprised how difficult it is.
Our bodies are inherently adapted to work only in gravity. It takes a long time just to get used to a gravity-free environment, let alone to be able to carry out day-to-day functions in it.
The people who impart that training are known as parabolic experts, practitioners of one of the most coveted and highly skilled occupations in the going-into-space industry. In fact, only nine people in the world are qualified to do it.[2]
First, they must become excellent at free-falling in an aircraft (aka a vomit comet in casual NASA lingo). This is the only way to simulate a microgravity environment on Earth for training potential astronauts to go on missions. It’s not their job to actually go into space, even though they’re a crucial part of nearly every space mission.
The death of a loved one can be difficult to deal with, and everyone copes in his own way. Some people mourn for days before getting their lives back to normal. Others shut themselves away until they can be around people again.
Still others take it one step further and get professional mourners to do their grieving. Although it may sound weird to the rest of us, these mourners are dedicated professionals in quite a few parts of the world.[3]
Professional mourning has been a thing for thousands of years in many regions, including Africa, China, and ancient Egypt. However, it is mostly in China that it is still big business.
The job consists of showing up to the funeral and staging a believable session of mourning—complete with physically breaking down and wailing. This may sound alien to the rest of us, but it’s completely normal in Chinese culture. These pros can also earn quite a bit depending on how good they are.
With almost everything from traffic to government databases to supply line inventories of big corporations now operating through the Internet, hacking is an increasingly serious threat for almost everyone. This is especially true of hackers in countries where the Internet isn’t regulated much.
As a result, there’s an equally high demand for good hackers who know the basics of getting into networks without being detected. They expose vulnerabilities in high-value targets like government databases.
Known as white hat hackers (as opposed to the regular black hat hackers), these individuals are hired by corporations and governments around the world to do their best to get into the employers’ networks. If you’re really good at it, you may earn a fortune for doing a contract job, with a chance to be hired by your client at the end of it.
The best part?
Absolutely no one will ask how you got those skills in the first place as long as you’ve never used them to do actual harm. To qualify, you simply need to get really good at hacking. Just make sure you don’t practice on live targets, as that is illegal.[4]
Potable water is already on the brink of running out for the whole world. We may not hear much about it because the crisis isn’t as dire in the loudest parts of the world yet.
Many countries are now looking to Earth’s natural ice reserves for help. They are devising plans to haul entire icebergs from the South Pole to their shores to harvest water. The job is given to iceberg moving companies, where you can apply to work if you’re up for it.
Iceberg moving may sound impossible—like mountain relocation or beach stealing. However, many Middle Eastern countries are already grappling with serious water scarcity issues, so they’re willing to give it a shot.[5]
Some iceberg towing companies are already on their way to get the first icebergs to some of those countries. Middle Eastern governments are also hoping that huge floating icebergs off the shores of their countries may serve as tourist attractions, too.
Many current jobs have never existed before, including that of a futurist. If it sounds like someone who sits around all day and makes predictions like a fortune-teller, it absolutely is—except that the predictions have to be backed by data.
It is one of the more accessible jobs on this list (though not in scientific fields). In fact, anyone can apply to be a futurist, as many new-age firms are starting to see the importance of the position.[6]
A futurist’s role is to study current data and make reasonable predictions about future trends based on that information. A government may employ a futurist to predict social changes in order to estimate taxes in the future. An advertising firm may try to find out what kind of messaging consumers are likely to respond to 20 years from now.
Depending on your field, a job as a futurist can be as awesome as it sounds or just entail data crunching all day like your old job.
Over the years, food advertising has come a long way. If you’ve ever seen a food ad and thought “wow, that looks good,” it’s not because that’s how it’s supposed to look. Instead, a highly specialized type of professional called a food stylist made the food look good for the camera.
If “food stylist” sounds like a joke, we assure you that it’s absolutely not. You can go to your local employment listings website and find a posting for one of these right now.[7]
The job is not easy to do, either. Apparently, styling food is an art form that you master after years of persistent practice, and it also involves know-how in photography and videography.
If you do get your foot in the door and become a food stylist for a local firm, you may work your way up and someday get to style food for the big brands. Needless to say, that pays quite well.
LEGO bricks transcend age, sex, borders, and race. No matter who you are or what you do, a LEGO set is all you need to push aside everything else and spend the next four hours trying to build something that you’ll give up on anyway. And that’s okay. A big part of the company’s success is being able to engage the inherent builder in all of us, even if we’re not that good at the job.
That doesn’t mean that everyone’s bad at it. We all had that one kid in class who could build intricate structures out of LEGO sets and simply mesmerize us. If you were one of those kids and have always wanted to do this for a living, you can get paid tons of money for it.
Known as LEGO master builders, these guys are employed by the LEGO Group to create those huge structures built of LEGO bricks that occasionally go viral on the Internet. If playing with LEGO pieces has been your dream all along, you could definitely give this a shot. However, you’d have to be really good at it to even stand a chance.[8]
Most people visiting a shark tank (not the TV show) have a great time, maybe have a snack or two, and then return home. However, the curious ones look at the whole setup and think, “I wonder who cleans that?”
There’s no way to keep the artificially created ecosystem healthy without someone actually going into the tank and manually cleaning it. In fact, there are shark tank cleaners tasked with doing exactly that.
The job is dangerous because the sharks are always in the aquarium. You can’t just make an expensive secondary tank to house the animals until the cleaning is done.
However, the cleaning isn’t what makes this a badass job. It’s getting used to being around sharks that are already angry at all humans for keeping them in captivity. The cleaners have to learn shark behavior and be good at diving—as well as knowing when to get out—to even be considered for the role. They spend around 30–40 hours in the tank per week as a part of their job.[9]
Smell is an important sense, which is apparent by how many products rely on it to make sales. From soaps to deodorants to candles, how a product smells is as much a part of its commercial success as its appearance. So, who is in charge of making sure that things smell the way they should?
You guessed it (presumably from the title of this entry). Odor judges determine the best smell for a particular product. Obviously, they have to go through plenty of bad odors, like those from armpits, to come up with a deodorant to counter the smell.
It may sound like a horrible job, but it’s also a crucial one. They also deal with issues like whether a particular type of seafood is contaminated by a nearby oil spill. Of course, you must have a keen sense of smell to qualify for this job, and it may end up paying quite well if you work at a large organization.[10]
You can check out Himanshu’s stuff at Cracked and Screen Rant, get in touch with him for writing gigs at [email protected], or just say hello to him on Twitter.
There is something a bit weird about company mascots, folk figures, and cartoon characters. You get to know them, care for them, and, if you have too much time and an overactive childhood-reclamation gland, develop a feverish need to write fan fiction about them. This list includes some unreal characters that people swear blind actually exist. Spoiler alert—they don’t…probably.
Plenty of people reading this will be thinking “Of course, she’s a mascot, who thought she was real? I mean everyone knows Betty Crocker isn’t real, right?” Well, that would be the people who are reading and thinking “Whaaaaat?! Noooooo!”
Mavis Beacon, the lady who adorns the packaging for the popular typing software, is another fabrication from the corporate world with a racially charged past (like ‘Uncle Ben’ and ‘Aunt Jemima’). Because the mascot was modeled by a Haitian-American lady named Renée L’Espérance (who had been scouted from the perfume kiosk at Saks Fifth Avenue in New York), many retailers were reluctant to stock the product. The product has been used to teach millions of school children to type since 1987, in the same way as the careful tutelage of Professor Rosetta Stone taught me to speak Spanish…[1]
Very little is ‘known’ about the avant-garde music collective ‘The Residents’. Plenty of lore, theories and recordings exist of course, but none of this answers key questions people may have; questions like—”Who are The Residents?” Well, we know that Texan musician Hardy Fox was a founding member. That’s about it.
One notable collaborator is Bavarian composer and theorist N. Senada, originator of the ‘Theory of Obscurity’ and the ‘Theory of Phonetic Organization’. So far, so impressive. But, much like anything in the extended universe of ‘The Residents’, this man, who should be a world-famous composer with reams of articles about him, an in-depth bio, and maybe even a documentary or two charting his rise, doesn’t seem to exist at all. Nope. Some have joined up the floating, elusive dots and surmised that N. Senada was probably Don Van Vliet (better known as Captain Beefheart), due to his having lived on Ensenada Drive, California, in the late ’60s, as well as The Residents’ early demo tape being sent to the same record label executive who had signed Captain Beefheart and his Magic Band.[2]
‘The Un-Credited Inventor’ has become a part of modern folklore. Sometimes there are genuine cases of unheralded inventors/originators/contributors to great ideas (Douglas Prasher’s work on green fluorescent protein and Bret Weinstein’s work on extended telomeres in lab mice come to mind). ‘The Redemption of the Un-Credited Inventor’ is a fairly old trope that is only now entering our culture—as seen with the mania surrounding Nikola Tesla’s work, due to YouTubers ironically making videos (without permission, might I add) based on lists from ten years ago! Nothing exemplifies these phenomena as they pertain to modern folklore better than the case of nurse Lupe Hernandez, (supposed) inventor of alcohol hand gel, ‘Saint of the Coronavirus Outbreak’, ‘Saviour of Mankind’…yeah, this person probably didn’t exist.
The story dates back to an article in The Guardian based on, well, not much except some overly florid journalistic flourishes (note our incredible restraint in not using the term “fake news”!). Certainly not reality. Nobody knows who this lady (or man, we aren’t sure) was, or how to verify the claim…it’s all a bit of a mess. That, of course, hasn’t stopped the twitterati from demanding Hernandez be recognised, venerated, and taught about in schools as an example of ingenuity from beyond the white, male hegemony. Try Chien-Shiung Wu for a perfect example of a genius non-white woman; you don’t have to invent one. (She helped invent “the” bomb, by the way, and was robbed of a Nobel prize.)[3]
This story from sixteenth-century Britain was a real zeitgeist-grabbing tale of a wild Scottish bandit named ‘Sawney’ Bean and his cave-dwelling, inbred family of man-eaters.
Mr. Bean and his wife lived in an isolated cave in Ayrshire, Scotland. To support his wife, Bean ambushed and robbed travellers in the highlands, murdering them and returning to his wife with their money and their corpses, ready for the pot. Over the years, his family grew, his kids inbreeding with one another until his clan had ballooned to 48 hungry Beans. Eventually, due to the very conspicuous nature of an army of 48 cannibals murdering scores of people, the King himself was compelled to intervene. James I and 400 soldiers hunted the murderous cannibals and put them to death in an appropriately gory fashion. Or did they? Probably not, because no actual evidence exists save some wonderfully dramatic, propagandistic anti-Scottish pamphlets. Being as it is such a great horror story, many works of art have taken influence from it—The Texas Chainsaw Massacre, The Hills Have Eyes, and the popular anime series Attack on Titan all took cues from this tale.[4]
The worlds of marketing, wartime propaganda, literature and folklore have seldom met so completely as in the case of Greek hero Konstantinos Koukidis, the man who died wrapped in his nation’s honour.
When the Nazis had successfully completed their invasion of Greece, they decided to mark their victory by raising the swastika-emblazoned banner over the Acropolis in Athens, the architectural embodiment of the Greek nation. Koukidis, the soldier charged with guarding the flagpole, refused to abandon his duty and, after removing the Greek flag from its lofty place above the capital, wrapped it about himself and jumped from the high Acropolis rock to his glorious death. No evidence of the event, or indeed of a soldier named Konstantinos Koukidis, exists. The tale did, however, prove to be a real morale boost for a nation severely in need under the brutal yoke of the German occupiers.[5]
Few directors in Hollywood have amassed as long a list of credits as Alan Smithee. Real stinkers like Ghost Fever (1987), The Shrimp on the Barbie (1990), and The Birds II: Land’s End (1994) are but a few of the movies, TV shows, ads, and music videos that form this prolific auteur’s output, a feat far beyond the abilities of most human beings. So who the heck is Alan Smithee? Or Alan Smithee Jr? Or Alan Von Smithee?
Alan Smithee is the pseudonym directors use when they don’t want to be credited for a film they’ve made, for the most part because it sucks. Ironically, An Alan Smithee Film: Burn Hollywood, Burn, a comedy mockumentary starring Eric Idle as a director whose name is, unfortunately, Alan Smithee, turned out to be awful, grabbing three Razzie awards at the Golden Raspberries in 2000. The film was so terrible that, you guessed it, director Arthur Hiller opted to use the infamous moniker instead of his own name. The Smithee credit was retired in 2000 by the Directors Guild of America.[6]
Australian actor Travis Fimmel’s portrayal of Ragnar Lothbrok in the binge-worthy hit series Vikings has been described as ‘engaging’. Indeed it is, as all good fiction is.
Much as I imagine I’m breaking the hearts of droves of Vikings-loving fanboys and, let’s face it, Ragnar-supporting fangirls, this legendary Norse hero is very unlikely to have existed. His kids did, but were probably sired by various other Norsemen.[7]
A murderous family in a rural setting takes unwitting guests into their inn, lulls them into a false sense of security, and kills them. So goes the story of ‘The Bloody Benders’. It is not the story of Agnus McVee, her husband Jim, and brother-in-law Al, although it is quite similar save for a few key differences—this tale occurs in Canada and not in Kansas, there is no whiff of incest, and it never happened.
Agnus, the story goes, had decided to sell a 17-year-old girl she’d kidnapped to a local miner. Jim, Agnus’ husband, followed the miner the next morning and killed him, stealing his money. Agnus then poisoned her husband in retaliation (why?). The girl who was about to be sold into slavery managed to escape, and the jig, as they say, was up. The police came to the McVees’ hotel and found eight young girls cowering in the basement. Good story; thank goodness it never happened.[8]
Strap in for this story, folks. A medical examiner is tasked with the autopsy of one Ronald Opus. Opus had left a suicide note, suggesting that this was the probable cause of death. He had indicated that he was going to jump from the top of his apartment building, and so he did. The problem was, he had a gunshot wound to the head, which the examiner identified as the real cause of death. Little did the deceased know, but a safety net had been placed below the windows of the eighth floor; thus, Opus’s suicide attempt would have been foiled had he not ended up shot. Maybe it was murder? Further evidence showed that the shot was fired from a ninth-floor apartment. Investigators found an elderly couple there and discovered that a shot had indeed been discharged in anger on the day of Opus’s death. The husband had threatened his wife with a shotgun, discharging the shot that killed Opus.
If you mean to kill one person but miss and end up killing another accidentally, it is still murder. However, the elderly man swore, with corroboration from his wife, that he had made it a habit to threaten her with the unloaded shotgun when they had arguments. Accidental death, then? But the verdict swung back to murder when a witness testified that they had seen the elderly couple’s son loading the gun a few weeks before the fatal accident. The investigation found that the mother had cut off her son’s financial support, which spoke to motive as to why the son loaded the shotgun—he hoped she’d end up getting killed the next time she argued with his father. The son…was Ronald Opus. In a fit of guilt at dooming his mother to death, he threw himself from the top of the building, thus getting hit by the slug he had loaded into his dad’s shotgun. Verdict: suicide. Wow! What a story.
This case was originally recounted by Don Harper Mills at a banquet of the American Academy of Forensic Sciences, an organisation of which he was the president in 1987. Seems like an episode of Law & Order or CSI, right? The story did, in fact, inspire a whole slew of storylines, including in the two aforementioned shows, as well as the 1999 film Magnolia (touted many times here as one of the best films ever made and featured on our great leader’s list of Top 10 Favorite Films of JFrater), amongst others.[9]
Tech is changing the world. Neo-Luddites are scared out of their wits at the prospect of automation, AI proliferation and commercially available space holidays. Most of us welcome the new world, excited at the prospect of living in a tech-assisted world where privation and want will be distant memories…when they invent the Star Trek replicator.
When Japanese pop group AKB48 introduced their new ‘member’, Aimi Eguchi, it turned out that ‘she’ was a composite of different features from other band members (read more about her on this previous list). On thispersondoesnotexist.com, however, I start to feel the same panic as those scared truck drivers and oil field workers. The AI, developed by Uber software engineer Phillip Wang, generates seamlessly realistic photos of people who do not exist. They aren’t composites like Aimi Eguchi; they are completely unique. It’s easy to spend a solid hour refreshing the site’s front page, generating headshot after headshot of what amounts to, figuratively, ghosts. Maybe dream people? Whatever or whoever they are, it gets creepy after a while.[10]
About The Author: CJ Phillips is an actor and writer living in rural West Wales. He is a little obsessed with lists.
How good would you say your reflexes are? And how many reflexes do you even have? Strictly speaking, a reflex is an action your body performs in response to some kind of stimulation but without any conscious effort on your part. Most of us have been tested at the doctor’s office, where they tap our knee with that little hammer to see if we kick out, but that’s just one of many reflexes the human body makes use of. Some reflexes are much more unusual and exist for some unpredictable reasons.
Humans certainly can’t claim to have sole dominion over reflex action as most living things possess a variety of reflexes for one reason or another. In some animals they’re quite specialized as well, as can be witnessed in the hippopotamus.
Because a hippo spends so much of its life in water but cannot breathe water, it needs to be careful not to drown. Hippos will even sleep in the water, which makes breathing even more difficult if you think of it as a totally conscious effort. Luckily for the hippos, they have a reflex that takes care of that.
A hippopotamus can fall asleep underwater, where it can hold its breath for about five minutes. The animal will surface, breathe, and then go back under the water again, all while asleep, thanks to reflex action.
Many reflex actions exist to provide us with a kind of security. They do things for us that need to be done. Things that, if they were left up to our conscious mind to control, we would probably not be able to get done fast enough or often enough to make use of them. Arguably, if none of your reflex actions worked, you probably wouldn’t live very comfortably. Case in point, the rectoanal inhibitory reflex.
There aren’t a lot of delicate ways to put this, so let’s not beat around the bush. This reflex is what allows you to judge whether you’re about to pass gas or pass something else. You can imagine how valuable this reflex is when you’re asleep. In so many words, the muscles in your rectum are able to determine when you need to go to the washroom or if you can safely work the problem out with a simple fart. It’s a necessary function for the maintenance of continence and without it you’d have to be taking precautionary trips to the washroom far more often.
There are a number of causes for coughing that can range from viral or bacterial infections to inhalation of irritants to allergies and so on. Figuring out what is causing someone to have a chronic cough can be much harder than you might expect as a result of the numerous potential causes. One cause that you can look to when all other potential reasons are ruled out is the Arnold reflex.
The Arnold reflex causes you to cough when the vagus nerve passing through your auditory canal is stimulated. That means if you had something stuffed in your ear, or something was impacted, it could push against this nerve and produce symptoms of a chronic cough that can last weeks as a result. You may have experienced this yourself if you ever tried to clean your own ear too aggressively, and found yourself coughing once or twice as a result of pushing a little too deep. Something as simple as irrigating the ear to clean it out can fix the problem.
Here is a weird one that you can test on your own without any negative consequences. The glabellar reflex can be demonstrated by using one finger to tap a person lightly between the eyebrows, just above the nose. The person being tapped should blink every time your finger taps them, but only for a short period. The blinking is involuntary, but as you continue to tap, the reflex action will cease and the person will be able to keep their eyes open while you’re tapping.
So what is the point of this reflex, and why would anyone care? It’s actually a useful diagnostic tool. Habituation to the tapping, meaning the ability to get used to it, should occur within a short period. If an otherwise healthy adult cannot habituate to the stimulus and continues to blink, that can be an early sign of various conditions that affect the brain, including dementia and Alzheimer’s.
Have you ever been in a swimming pool or the bathtub for so long that your hands and feet got a little pruney? And did you think the reason was just that they were too wet? Like perhaps they had absorbed too much water? Turns out that’s not true at all, and science has been aware of this for about a hundred years now.
If you have a certain kind of nerve damage in your hands, you can’t get prune fingers. What that means is that the effect has nothing to do with the water itself. Instead, it’s actually a function of your autonomic nervous system. Blood vessels constrict below your skin and cause the wrinkly, pruney effect to occur. It’s a reflex action. But what’s the point?
It had been theorized for some time that the pattern of wrinkles on your skin served a purpose. The wrinkles form tiny channels that allow water to drain away and improve your grip when your hands are wet.
Research conducted to help prove this theory showed that subjects who had prune hands were better able to grip wet objects than people with dry hands. There was no benefit to gripping dry objects with pruney hands, which seems to indicate the purpose of prune fingers is to help you grip wet things in a wet environment.
Using a little hammer to tap somebody’s knee is a pretty non-invasive way to test a reflex action. But a lot of our reflex actions are more difficult to gauge than that one. Testing them requires some techniques that are quite unexpected and may put you off the idea of getting tested at all. One such reflex is the bulbocavernosus reflex, which is used to test for spinal cord injuries. So it sounds like it’s important, but it takes a bit of work to do.
With certain kinds of spinal cord injuries, the bulbocavernosus muscle will not contract when stimulated, thus offering an injury diagnosis or prognosis. But testing this muscle requires two hands in uncomfortable places. The muscle itself can be reached through a rectal examination, which means a doctor needs to use their finger to locate it. Then, stimulation of the muscle requires them to pinch the patient’s sexual organs with the other hand to see if the muscle contracts.
Goosebumps is the name we give to a physical reaction that typically occurs when you get cold or, sometimes, when you get scared. One of the most common signs is the hair on a person’s arm standing up, along with the actual little bumps, which we call goosebumps, becoming visible on the skin. It looks like the actual skin of a goose, if you’ve ever seen one without its feathers, and that’s where the name comes from.
Goosebumps are considered a vestigial reflex action that dates back to our primitive ancestors. Each little hair on your arm has an arrector pili muscle beneath it and nerve signals can constrict those muscles, raising the hairs.
Many mammals have a similar reaction, and it serves one of two purposes. The hairs either stand up to create an insulating layer against the elements, or an animal like a cat will puff up when in danger to look more threatening to potential predators. Since this reaction happens in humans most often when they’re either cold or scared, there is a clear correlation between the two, even if the results don’t accomplish much in humans anymore.
Have you ever stopped to think about how often you sneeze in a day? Probably not, so in the interests of clarity you should know that most people sneeze about four times a day. That’s a good deal of sneezing going on. Now, can you ever remember sneezing while you were asleep? Presumably if you did such a thing you would wake up. Or maybe you saw somebody else sneeze themselves awake. Odds are you’ve never experienced that, and it’s thanks to another reflex action.
REM atonia is the reason most people can’t sneeze while they sleep. It’s also the reason you don’t act out your dreams while you’re sleeping. You can imagine the chaos that would be caused if everything that was going on in your head played out with your body while you weren’t even conscious to realize it.
REM atonia is meant to prevent your brain from sending signals to your body when you’re not conscious. It’s a kind of self-inflicted paralysis that holds you still and doesn’t allow you to put yourself in danger or do things that you don’t need to do while you’re not awake.
That’s not to say it’s impossible to sneeze when you’re sleeping. There are two ways this can happen. One is that the REM atonia effect isn’t working; you’ve probably heard of people sleepwalking or even sleep driving before, and the effect doesn’t work on them. The other is that something physically forces you to sneeze, in which case your brain wakes you up first, and then you sneeze.
If you know anything about wilderness survival, then you may know that if you kill a venomous snake in the wild, you need to be cautious of its head, even if you cut the head off. Reflex action in the snake’s head can cause it to bite and inject venom, even if the head isn’t attached to the body anymore.
A snake head could potentially still bite you after several hours. In fact, it’s not uncommon in some parts of the world for people to assume they’re not in danger and get bitten by a severed head.
There is no reflex response more suited for a horror movie than the Lazarus reflex. The name itself gives some small indication of what’s in store. This reflex action has convinced people in the past that victims of serious brain trauma resulting in brain death were perhaps not as badly injured as they seemed.
The way the Lazarus reflex works is that a person who has suffered brain death may still raise their arms and then cross them over their chest in that classic “vampire in a coffin” fashion. You can imagine how creepy it would be to walk in on a supposedly braindead patient and see that their arms have moved and been crossed over their chest as if they had been laid out for burial. And it would be even creepier to be in the room and watch it happen yourself.
The Lazarus sign is extremely rare, which also makes it more unsettling because it’s likely not many people have heard of it. In many cases of brain death the patient will only survive for a short time afterwards. However, in two of the only recorded cases of Lazarus sign, the patients had survived over 100 days after being diagnosed as braindead.
All evidence indicates this is just nerve activity and that the brain is not involved in the function whatsoever. After patients who demonstrate Lazarus sign have completed the action, the arms tend to flop by their sides again.
Musicals are the popular culture equivalent of Vegemite or olives. You either love them with a passion or hate them with fervor. So, when you sit down to watch a movie or book a theatre performance, are you the person who loves it when people burst into song at the drop of a hat? Or are you the kind of person who grits their teeth in frustration?
Samuel Taylor Coleridge coined the term “suspension of disbelief” in 1817, using it to describe our willingness to set aside critical thinking and logic for the purpose of entertainment. Nowhere has this been pushed to the limit more than in science fiction, and even more so when a musical gets added to the mix. Below, we give ten science fiction and fantasy musicals that pushed the boundaries of possibility, for good or bad.
Related: 10 Surprising Musical Moments From Popular Shows
The selling power of toys should never be underestimated. In the late ’80s and early ’90s, the Teenage Mutant Ninja Turtles craze had reached a fever pitch. Based on a comic by creators Eastman and Laird, it was later turned into a cartoon, and with it came a tie-in toy line plus a glut of merchandise. To promote it, anything was being considered, including a musical.
Most musical forays by the Turtles are fondly remembered. They had a number-one hit that tied into the release of their first movie, and their second outing even featured flavor-of-the-month Vanilla Ice. However, their musical stage show, the Coming Out of Their Shells tour, is often consigned to the dustbin of history.
The plot was as flimsy as they come. The Turtles head out on a musical tour, determined to meet their fans across the world. While performing on stage, the tour gets interrupted by their enemy Shredder and his accomplice Baxter Stockman. The Turtles must then form a plan to defeat their enemy.
Highlights are hard to find. “April’s Theme” is a sickly ballad by their reporter sidekick, while “Skipping Stones” is performed by Splinter, their talking rat mentor. Sponsored by Pizza Hut, it was placed on pay-per-view television and released on VHS.[1]
The ’70s were a pretty strange time for science fiction. The moon landings had just taken place, but the technology burst of later decades was yet to happen. This led people to some pretty wild theories about what the future would hold. For some, that involved ping pong balls, trampolines, aluminum foil, and ballads.
Via Galactica was by Christopher Gore and Judith Ross, with music by Galt MacDermot. MacDermot had enjoyed success with the musical Hair, which had produced three chart hits. Yet he was not the only heavyweight involved in Via Galactica. Hollywood legend Raul Julia was in the cast, along with Fame actress Irene Cara. Yet even they could not save the convoluted plot and unworkable set.
The concept was to create a futuristic musical about society’s outcasts living on an asteroid. After running for just seven nights, it was canceled due to its terrible plot. The scenery and actors would sink into the trampoline surface of the set during performances. At one point, radio mics intercepted emergency service bands and broadcast fire and police radio to the audience. Cara would get stuck in the rigging, and Raul Julia was once locked in a spaceship suspended above the audience.
However, the lack of thought behind the production was easy to see in its initial title. Originally, it was to be named “Up” and performed at the Uris Theatre. Once someone pointed out what “Up Uris” would sound like on the marquee, the name was quickly changed.[2]
Take one outstanding director who had masterminded the million-dollar adaptation of Disney’s Lion King for the stage. Add to that pop music royalty in the form of rock band U2. Finish this off with the most iconic superhero of all time. How could it fail?
The concept of a Spider-Man musical had been floated when the first Spider-Man movie proved a roaring success. However, problems began to appear when the producer, Tony Adams, had a stroke and passed away. A global financial crisis followed, in which many investors left the project. As well as facing a huge budget deficit, the musical also had numerous technical difficulties.
One of these involved the lead actor web-swinging above the audience but becoming stuck. This meant a crew member had to poke him down with a stick while he hung above the front two rows like a piñata.
The sophisticated equipment used for web-swinging across the theatre not only cost a lot to build but tended to injure performers. Concussions, broken wrists, and broken toes were all reported.
Even the music was lackluster. Rumor was that U2 were so unfamiliar with musicals that a CD compiling the best bits of 60 years of Broadway was burned for them. Imagine B-sides from a mash-up of U2’s The Joshua Tree and the Les Misérables soundtrack, and you may have some idea of what was in store.[3]
At its core, Carrie is a horror story that deals with a female coming of age and menstruation. How anyone thought these themes would transfer to a musical format is unknown. Based on the novel by Stephen King, it lasted a mere five performances and is widely regarded as one of the biggest failures in the history of musicals.
The book from which it came had a very successful cinematic adaptation. The screenwriter, Lawrence D. Cohen, and the composer, Michael Gore, decided to set about creating musical material. Gore had previous experience with the hit Fame, showing he should have known better.
Carrie debuted in the UK in 1988 and was besieged by technical problems from the outset. One actress quit on the first night after a close call in which a piece of the set almost decapitated her. The most famous scene in the book and the movie, in which Carrie gets covered in pig’s blood, kept shorting out the lead actress’s microphone.
When the show moved to the States, it was already dead in the water. The press was as cruel as Carrie’s tormentors in the actual story. Yet, oddly enough, in life mimicking art, despite loud boos from the audience, the show sold out every night. It was as if people enjoyed wallowing in the misery of a terrible production.[4]
Whale hunting and teenage girls as objects of sexual desire are concepts rightly consigned to the past. Imagine, then, a musical that combines both of these into one politically incorrect and uncomfortable stage play.
The musical was created by Robert Longden and Hereward Kaye. Originally, it was a silly, music hall-style tale in which a girls’ school decides to put on a stage play of Moby Dick. Complete with a drag-wearing headmistress and laden with innuendo-based gags, it toured universities like an early version of RuPaul’s Drag Race.
After a string of sold-out shows, it was decided that the show needed a larger audience. It took up residency at the Piccadilly Theatre in London’s West End but faced terrible reviews and, after four months, was canceled. Although it did transfer to the States, it was toned down, and many of its contentious topics were removed.[5]
For this musical, we take a break from the stage and head to the big screen. If this movie were simply Repo!, it would have a pretty good premise. In 2056, organ failure is plaguing the planet. GeneCo is a mega-corporation that provides replacements on a payment plan. Repo men are then hired to hunt down anyone who misses a payment and take the organs back for the company. It all sounds great… until the part where you turn this dark, dystopian story into an opera. Then cast Paris Hilton in it.
The movie has its genesis in a 2002 musical by Darren Smith and Terrance Zdunich. Smith had taken the inspiration from a friend’s bankruptcy, envisioning a future where body parts were viewed like property. It was a huge success, attracting gothic movie lovers in a similar vein to The Rocky Horror Picture Show. This led to the creation of a ten-minute trailer used for pitching to movie studios.
Most of the movie’s promotion came not from Lionsgate, the film’s backers, but from the cast and writers, who did a road tour of the musical. It did little to buoy a plot that failed to deliver and some pretty standard musical numbers. However, it did gain Paris Hilton the award for Worst Supporting Actress at the Golden Raspberry Awards, second only to her win for Worst Actress at the same event.[6]
Before her first and last musical outing, Raggedy Ann had a decent career. A series of successful books by Johnny Gruelle led to a 1977 animated feature film featuring the character with her sidekick Raggedy Andy. However, for some unknown reason, it was decided that her musical outing would take a dark turn.
The story is about a dying child from a broken home. Her dolls come to life and take her on a mission to meet the Doll Doctor, who may have the ability to save her. While it does have a heartwarming ending where she reunites with her father, themes touch on everything from genocide to sex, none of which are suitable for children.
Only lasting three days, the musical fell off the radar after its cancellation. Bootleg recordings have kept the show alive, and attempts have even been made to revive it, with little success.[7]
For anyone who knows the original Toxic Avenger movie and character, a musical makes a lot of sense. Created by cult movie studio Troma, the story tells the tale of a mild-mannered janitor who falls into a vat of toxic waste. He then becomes a crime fighter, overthrowing a corrupt mayor and ending up as the hero of the town. After starting as a flop, the movie developed a cult following with three sequels, video games, and, inexplicably, a children’s cartoon.
The tongue-in-cheek approach of the movie and character lends itself to a musical format, and as such, reviews were quite favorable, with a fair few awards given to it. Starting life at the New Brunswick Theatre in New Jersey, it then went on to tour the U.S. and perform in Australia, the UK, and several high-profile festivals across the world.[8]
Despite not being a huge commercial hit, Starmites has a longevity most musicals would envy. It ran for two months on Broadway and now even has a version available for children to perform. Since premiering in 1980, it has returned sporadically for numerous different productions.
The story is about comic book-loving Eleanor, a shy teenager who often drifts into a fantasy world where she is the hero. It is one of these dreams in which the musical takes place, as the Starmites, Guardians of Inner Space, become involved in a battle with the Shak Graa. While it never set the world on fire, it is a good example of how to do a sci-fi musical without taking it so far it becomes laughable.[9]
Everything is getting a musical as audiences clamor to find someplace to spend their dollars after venturing back into society. While many of them are lacking in quality, this one is actually good. Based on the cult Evil Dead movie series, the story follows a group of teenagers who unleash the undead and demonic entities while holidaying in the woods.
Part of its success is that, like Toxic Avenger, it carries the dry humor of its movie counterpart. It has one-liners, and the musical numbers written for the production are both great tunes and funny. It has now been performed over three hundred times around the world, though be warned if you go to see it: the audience does get covered in gore and guts, albeit fake.[10]
Believe it or not, humanity could disappear in a matter of years. It would not be the first time that a civilization has vanished from the face of the Earth. Wars, climate change, disease, invasions, and eruptions have all been blamed, though most of the time the causes are little more than educated guesses. Here are 10 mysterious lost civilizations that existed thousands of years ago.
Time: 11500 BC.
Location: North America
We do not know much about the Clovis culture, a prehistoric Native American culture believed to have existed in North America. Its name comes from the Clovis site, an archaeological site located near the town of Clovis, New Mexico. Artifacts found at this site in the 1920s consist of stone blades and bone tools.
It is believed that these people came from Siberia to Alaska across the Bering Strait by the end of the last ice age. Whether or not it was the first culture in North America, no one knows. The Clovis culture vanished rather abruptly. Is it because they hunted their own food supply to destruction? Or was it climate change, disease, or predators? Or did the members of this culture simply disperse and join other Native American tribes? Was the fall of a meteorite the cause of their disappearance?
Time: Between 5500 and 2750 BC.
Location: Ukraine and Romania
The largest communities of Neolithic Europe were built by the Cucuteni-Trypillians in what are now Ukraine, Romania, and Moldova. The Cucuteni-Trypillia civilization built settlements housing nearly 15,000 people, an enormous figure for the time, yet it mysteriously disappeared from the surface of the Earth.
The culture of Cucuteni-Trypillia stands out for its pottery. They had the odd habit of burning their own villages every 60-80 years before building new ones on the ashes of the old. To date, around 3,000 archaeological sites have been identified from this matriarchal society centered on a mother goddess. Their disappearance might have been caused by drastic climate change that led to some of the worst droughts in European history. Other theories suggest the people dispersed into various tribes.
Time: 3300-1300 BC.
Location: Pakistan
The Indus Valley civilization is one of those huge lost civilizations that spread over an area that is now Pakistan and western India. It is one of the most important ancient civilizations, but little is known about it, mostly because nobody has ever deciphered its language. We know that the people built over a hundred towns and villages, including the cities of Harappa and Mohenjo-Daro, each with its own sewage system and indoor sanitation. It seems that this civilization, without classes and without an army, excelled in astronomy and agriculture. It was also the first civilization to manufacture cotton clothes.
This civilization disappeared 4,500 years ago, and no one knew about it until the ruins were unearthed in the 1920s. Several theories attempt to explain the disappearance, such as environmental changes: the drying of the Ghaggar-Hakra river and colder, drier temperatures. Another theory is that the Aryans invaded the region around 1500 BC.
Time: 3000-630 BC.
Location: Crete
The Minoan civilization was not discovered until the early 20th century, but many clues have since been unearthed about this fascinating civilization, which endured for well over two millennia and reached its peak around 1600 BC. Over the course of time, its cities and palaces were built and rebuilt, becoming increasingly complex. One of these was the palace of Knossos, the labyrinth associated with the legend of King Minos (from whom the civilization gets its name). It is now an important archaeological center.
The first Minoans wrote in a script called Linear A, later replaced by Linear B, both based on pictograms; only Linear B has since been deciphered. It is believed that the Minoans were wiped out by a volcanic eruption on the island of Thera (today’s Santorini). There is evidence that they might have survived, had the eruption not killed all plant life, causing a famine, and damaged their ships, which started an economic decline. Another hypothesis is that they were invaded by the Mycenaeans. The Minoan civilization is one of the greatest lost civilizations that once existed.
Time: 2600 BC-AD 1520.
Location: Central America
The Mayan civilization is a classic example of a mysteriously vanished civilization. Its greatest monuments, towns, and roads were swallowed by the jungle of Central America, and its population dispersed into small villages. The languages and traditions of the Mayan people still survive today, but the civilization’s climax occurred during the first millennium AD, when its greatest architectural monuments were built and its territory covered a wide area stretching from what is today southern Mexico to Guatemala and Belize.
One of the greatest Mesoamerican lost civilizations, the Maya used writing and mathematics, developed a calendar, and applied sophisticated engineering to build their pyramids and terraced farms. The reason for the disappearance of this very advanced civilization is one of the great archaeological debates. It is assumed that infighting, combined with climate change in the Yucatan around the year 900, weakened crops and created a famine leading to the abandonment of the cities.
Time: 1600-1100 BC.
Location: Greece
Unlike the Minoans, the Mycenaeans flourished not only through trade but also through conquest, to the point where their empire spanned nearly all of Greece. The Mycenaean civilization enjoyed five centuries of dominance before disappearing around 1100 BC. Several Greek myths center on this civilization, like that of the legendary King Agamemnon, who led the Greek army during the Trojan War. The Mycenaean civilization was rich culturally and economically, and it left behind many artifacts. It is not clear why they disappeared: earthquakes, invasions, or maybe peasant revolts.
Time: 1400 BC.
Location: Mexico
In what is now Mexico, there once flourished the great pre-Columbian civilization of the Olmecs. The first traces of the civilization date back to 1400 BC. The city of San Lorenzo was one of the three main Olmec centers, along with Tenochtitlán and Potrero Nuevo.
The Olmecs were master builders. Large monuments of giant stone heads have been found at these sites. This civilization laid the foundation for all the Mesoamerican cultures that followed. It is believed that the Olmecs were the first to develop a writing system, and they probably also invented the compass and the Mesoamerican calendar. They practiced bloodletting and human sacrifice and invented the concept of the number zero. The civilization was not discovered by historians until the mid-19th century. Its decline may have been due to climate changes caused by volcanic eruptions, earthquakes, or perhaps harmful agricultural practices.
Time: 600 BC.
Location: Jordan
The Nabataean civilization flourished in southern Jordan, the Canaan region, and northern Arabia from the 6th century BC. This Semitic people built the breathtaking city of Petra, carved into the sandstone cliffs of the mountains of Jordan. We also know about their talent for hydraulics and their complex system of dams, canals, and reservoirs, which enabled them to thrive in a desert region.
No written record has reached us, and we know close to nothing of their culture. This was nonetheless a thriving civilization, thanks to a geographical position that allowed them to create a network for the exchange of ivory, silk, spices, precious metals, precious stones, incense, sugar, perfume, and medicines. Unlike other civilizations of the time, the Nabataeans did not practice slavery, and everyone contributed to the tasks of the city.
Around the 4th century AD, the Nabataeans left Petra, and no one knows why. Archaeological evidence shows that their departure was not rushed, which suggests that they did not flee from invaders. It is believed that they migrated north in search of better prospects.
Time: AD 100.
Location: Ethiopia
The Axum Empire began establishing itself in the first century AD in the area where Ethiopia is now located. Legend has it that this is the birthplace of the Queen of Sheba. Aksum was an important trading center, exporting ivory, agricultural resources, and gold to the Roman Empire and to India. It was a rich society and the first African culture to issue its own currency, which at the time was a sign of great power.
Its most distinctive monuments are the stelae of Axum, gigantic carved obelisks that served as funerary markers for kings and nobles. The first Aksumites worshiped many gods, the main one being Astar. Then, in 324, King Ezana II converted to Christianity, and Axum thereafter became a fiercely Christian culture. According to local legend, a Jewish queen named Yodit defeated the Aksumite Empire and burned its churches and books. Others believe that a pagan queen, Bani al-Hamwijah, caused the decline of the empire. Other theories attribute the decline to climate change and overuse of the soil leading to famine. Either way, Aksum ranks among the greatest lost civilizations to vanish into thin air.
Time: 1000-1400 AD.
Location: Cambodia
The Khmer Empire, one of the most powerful empires in Southeast Asia and one of the greatest lost civilizations, occupied what is now Cambodia along with parts of Laos, Thailand, Vietnam, Myanmar, and Malaysia. Angkor, the capital, became one of the most famous archaeological sites in Cambodia. This empire, which included up to a million people, was thriving at the start of the second millennium. The Khmer practiced Hinduism and Buddhism and built temples, towers, and other sophisticated structures like Angkor Wat, a temple dedicated to the god Vishnu. The decline of the Khmer Empire was due to a combination of several factors, though most believe that invaders reached it along the very roads the Khmers had built to move goods and troops across the empire.
Did we miss any intriguing stuff, or do you want to add anything to this list of lost civilizations? Tell us in the comment section below. Written by: BEAOUI Yusuf
To most people, a black market is an illicit way of selling or trading something that may or may not be legal and is scarce or hard to come by. During Prohibition there was a massive black market for alcohol, for instance. Any place that criminalizes drugs is going to have a black market in narcotics. And then there are those things that inexplicably develop a black market even though, at first glance, it makes no sense.
The Hormel company introduced Spam back in 1937. It made use of pork shoulder, a cut of meat that no one really liked, and the result has proven to be an enduring and oddly popular product ever since. These days, Hormel sells 141 million cans of Spam every year. You’d figure with such wide availability that there would be no illicit trade. But you’d be wrong.
It’s been common knowledge for some time that Spam is especially popular in Hawaii. They take down more cans than any other state, a tradition that dates back to the Second World War when Hormel used to supply the United States military with the canned meat. Because it doesn’t require refrigeration and has a long shelf-life, it was ideal for shipping to GIs no matter where they were stationed.
Because they inundated the islands with Spam, it became something of a local delicacy. In 2017, the demand for Spam in Hawaii got so intense that stores had to endure a rash of thefts, and a black market for the meat sprang up.
Thieves were taking it by the caseload from stores, so shopkeepers started locking it in plastic display cases the same way they do with things like cell phones or razors. Apparently it was being resold out of the backs of cars to turn a quick profit.
There are several laws and regulations in place that govern the sale of food and drink. However, there are no actual regulations relating to the sale of human breast milk, which kind of falls outside the realm of what is considered food in a commercial sense. Because of that, it’s not really illegal to sell breast milk, and for that reason there actually is a market for it.
If it sounds weird at first, consider that there could be a genuine need for this kind of market. Any mothers who cannot produce their own breast milk but still want their baby to be fed naturally might be interested.
It’s for that reason that the underground breast milk market exists, including several websites dedicated to selling only breast milk. Women who have extra milk to sell will advertise it the way you might advertise anything; they’ll boast if it’s organic, vegan, gluten- and dairy-free, and so on. On the site Onlythebreast.com there are several thousand classifieds up at any given time from buyers and sellers.
Lest you think the entire marketplace is wholesome and interested in making sure babies are receiving essential nutrients, there is a slightly more sinister side to the whole thing. There’s always the possibility that the people interested in buying breast milk are just doing it to fulfill a fetish of some kind. And apparently bodybuilders also buy it, claiming that it’s a superior supplement to the store-bought kinds for helping to build muscle.
You could make a good case that cheese is one of the best foods mankind has ever created. It goes with just about everything, and there are literally thousands of different kinds. Some people take their love of cheese a bit too far, though, and that is abundantly clear in Russia, where there is a massive black market for dairy products.
Thanks to Covid-19 and the restrictions it put on traveling around Europe, a Finnish cheese called Oltermanni is the number one illegally traded item in Russia. The cheese costs five euros per kilo in Finland, but in Russia it’s going to cost you four times as much.
Customs officials in Finland have been trying to monitor trucks from Russia that have been smuggling the cheese across the border. Some supermarkets have tried to crack down by limiting cheese sales to a perfectly reasonable 11 kilograms. That’s over 20 pounds of cheese, incidentally.
Why exactly the Russians have chosen this particular cheese is open to speculation. Russia is not known for producing noteworthy cheese, and the country doesn’t even make its own cheddar.
If you’ve ever gone to a store like Walmart to buy razors for shaving, you may have wondered why they keep them under lock and key all the time. Razors are relatively small and expensive for their size, which makes them a prime target for theft. So stores typically keep them secure to prevent loss.
Cities around the country have had issues with thieves making off with large quantities of razors, which are very easy to hide and walk out of a store with. Over $1,000 worth of razors can easily be stashed in a fairly small bag.
Reselling razors is remarkably easy because they are always an in-demand product. They’re not something that you can trace, and people could sell them easily on websites like Craigslist or even eBay.
Research suggests that the razor market will be worth about $22.5 billion by the year 2030, which gives you all the explanation you need for why a black market exists.
More often than not when a black market exists it’s for a physical product. But there are rare occasions when a black market exists not for a product but a service. Black market Disney guides are one such example.
If you have never been to Disney World, you may not be aware that the lines to access attractions can be atrociously long. It’s not unheard of for people to wait hours to get access to a popular ride. But not everyone in the park has to wait that long all the time. If you have a disability, there is a second line available to help you gain access to an attraction sooner. It’s a courtesy that Disney offers customers who are differently abled and may not be able to handle waiting for such a long time in the line.
Now, the thing about these lines that allow a disabled person to jump to the front is that the people accompanying them are allowed to skip the line as well. And that is the source of the black market service. Those who can afford it, at rates around $130 an hour, will hire disabled guides for their Disney World visits so that their families can skip lines.
After word of this scam broke, Disney released a statement saying that they were going to take action, so it’s very possible that this particular black market has been squashed.
School cafeterias have had a bad reputation for about as long as schools have had cafeterias. While some school districts really put in some effort to ensure that kids receive food that is both nutritious and delicious, other school boards are still back in the days of Salisbury steak and mystery meat. You might think the first option would be the more popular one, but that’s not always the case.
In 2010 Barack Obama introduced the Healthy, Hunger-Free Kids Act, regulating how public schools served food to students. Part of these regulations included limits on certain unhealthy ingredients, like sodium.
Kids at one school in Indiana really liked their salt and did not take kindly to Obama’s nutrition regulations. After the sodium limits took effect, students began selling salt for $1 a shake in their school cafeteria.
The black market salt shakes were short-lived, and the school’s principal shut the operation down as soon as she figured out what was going on. Although the budding entrepreneurs may not have made too much cash, at least they brought to light the fact that there was little scientific evidence supporting the government’s sodium restrictions. They also showed that, with the salt restrictions in place, no one was eating the school lunches because they tasted bland, which was arguably even worse than eating nutritionally suspect ones.
Some things make more sense being part of a black market than others. The black market for alcohol and drugs is something that, even if you don’t agree with it, you can at least understand. The black market for illegal butt lifts, however, is another matter altogether.
If you’re not familiar, a butt lift is a plastic surgery procedure that involves removing excess skin and fat from around the buttocks and then repositioning what’s left over to make your butt look more toned. That’s if it’s done properly by a real, licensed plastic surgeon.
Black market butt lifts are done by unlicensed people, sometimes people with no medical knowledge or experience at all. Victims are typically referred to the pseudo-doctor by a friend and, upon meeting them at their home, have unknown substances injected into their buttocks. No prep work, no forms to fill out, just an unlabeled injection.
The substance being injected into people is generally not medical-grade silicone, but silicone jelly like you’d get from hardware stores. In real plastic surgery procedures, silicone is in a sealed implant. The illegal ones inject it directly into tissue where it can harden, travel around, or even leak right back out again. More than one person has died as a result of these procedures, and many others have endured serious pain and disfigurement as a result.
You may not have noticed, but Krispy Kreme doesn’t actually advertise anywhere. The company’s reputation is pretty much entirely word of mouth and unsolicited media. Anytime you see a Krispy Kreme donut in a movie or a TV show, they didn’t pay to put it there; the producers wanted to use it. They’re really popular donuts.
Any time a new Krispy Kreme location opens, you can count on people literally camping outside the store, waiting to be the first to experience a brand-new donut. And anytime you see the lit sign on the store declaring that donuts are being made, you can pop in and get a free one.
If you don’t have a Krispy Kreme location near you, you may just lament that you don’t get to experience what all the fuss is about. But at least one family in Juarez, Mexico, decided that they could bring Krispy Kreme to the masses with a black market supply.
The Garcia family traveled from Juarez to El Paso every single day back in 2017 to pick up dozens of donuts and then sell them to locals at a 60% markup. What they were doing wasn’t illegal by any means; they weren’t smuggling the donuts or stealing them. They were simply reselling for an impressive profit and proving that people will pay just about anything to get their hands on a Krispy Kreme donut.
Cheating in the academic world is certainly not unheard of. If you’ve been a student in the past 20 years, you know that most schools require you to submit any written essays to a service that scans them for plagiarism. About 60% of high school students admit to plagiarism, and about one-third of college students have copped to it. Russia, however, has taken plagiarism to a new level that goes well beyond school kids being stupid.
Plagiarized dissertations have become commonplace throughout Russia. These aren’t high school essays; these are the documents on which someone gets a doctoral degree. Vladimir Putin’s Chief of Staff was discovered to have bought a plagiarized dissertation back in 2016.
They sell these dissertations on the black market, written by ghostwriters who clearly don’t have a lot of integrity when it comes to original writing. They plagiarize portions of dissertations, repackage them, and then sell them to someone else who is looking to take the easy road to get a degree.
Obviously the motivation for using a plagiarized dissertation is that you can parlay that into getting a doctorate, which increases your reputation, and likely the salary at your future job. And in Russia, the motivation to do things honestly often isn’t really there. For instance, that Chief of Staff didn’t actually suffer any kind of repercussions when it was discovered he had simply bought his way into a degree. He made a semi-apologetic statement on his own behalf, and then business continued as usual.
Over 1,000 high-profile Russian officials were accused of the same kind of plagiarism. They range from politicians to judges to heads of universities and lawyers. A website called Dissernet, which is run by volunteers looking to expose fraud, found 5,600 cases.
Because there seem to be no actual consequences for their actions, the plagiarists barely even try to hide their dishonesty. At least one case discovered by Dissernet was a plagiarized dissertation on the Russian chocolate industry. The hijacked version was about the beef industry, and they literally just copied and pasted the word “beef” over “chocolate” in the original.
There’s nothing like a pandemic to make people come up with novel ways to preserve their own health and well-being. We’ve already seen people resorting to drinking household cleaners to prevent coronavirus, but a new industry has been cropping up as well.
Possibly inspired by the Ebola outbreak of 2014 when those afraid of the virus were purchasing plasma from survivors of the disease, a black market for Coronavirus survivor plasma has sprung up in the Middle East.
Plasma has been successful in helping treat the disease, so those who have a ready supply in their own blood have been selling it for as much as $1,500 to $2,000 per liter.
Finding out with absolute certainty what happened hundreds or even thousands of years ago is an almost impossible task. Scholars try their best, but they are limited from the outset by the historical evidence at their disposal. Today we look at 10 well-known figures from the past whose place in the historical record is uncertain enough that some people have doubted their very existence.
Even by ancient history standards, the existence of Lycurgus is poorly attested, even though he was, arguably, the most important Spartan in history and the one mainly responsible for transforming the city-state into the most feared and powerful nation of the Greek world.
This is because the Spartans simply weren’t interested in keeping written records. We found out about Lycurgus from external sources, mainly Greek philosophers and historians such as Herodotus, Xenophon, and Plato. But by their time, hundreds of years had already passed, so whatever historical fact may have existed got mixed in with legend and myth.
If Lycurgus did exist, he would have lived around 800 BC. After traveling around the Greek world, Lycurgus decided that it was time for a change in Sparta, so he returned home and enacted the set of laws known as the Great Rhetra which transformed Sparta into the fierce military society that gained everlasting notoriety.
The story of Jenny Geddes is a perfect example of the butterfly effect: a seemingly insignificant incident that ultimately leads to a much greater and more momentous event. In this case, we start with an angry woman throwing a stool and end with the execution of Charles I, King of England, Scotland, and Ireland.
The story begins in 1637, at St. Giles Cathedral in Edinburgh. Tensions were already pretty high between the English and the Scots, and they were exacerbated by the fact that King Charles I seemed determined to force Scotland to abide by the religious rites of the Church of England, whether it wanted to or not. On the day in question, the Dean of Edinburgh began reading for the first time from the newly printed Scottish version of the Book of Common Prayer, to the dismay of his congregation.
One woman named Jenny Geddes decided she wasn’t going to sit there and take it. “Devil cause you colic, false thief: dare you say the Mass in my ear?” she shouted, as she got up and threw her folding stool at his head. A full-scale riot ensued, which spread to all of Edinburgh, and then all of Scotland. That act is now considered a catalyst for the War of the Three Kingdoms, which ended with Charles’ execution.
All of this was well and good, but historical evidence for the existence of Jenny Geddes has proven hard to come by. Maybe she was never real, or maybe hers was just a random name given to the unknown woman who actually threw the stool. Some have even speculated that Jenny and her friends may actually have been apprentices dressed in women’s clothing, who attended church specifically to cause a riot.
A hundred years before the infamous Jack the Ripper, London was terrorized by the crime spree of another maniac, one who liked to follow women around at night, cut up their dresses, and prick them in the buttocks using a knife or a needle. He became known as the London Monster and may have been responsible for over 50 attacks over a two-year period but, at the same time, he may have never existed at all and simply been a product of mass hysteria.
But what exactly caused it in the first place? While a few attacks on women certainly happened, there was nothing to suggest they were caused by the same man. The newspapers certainly sowed the seeds of panic, as did criminals who often used cries of “Monster!” to create a frenzied mob and escape in the ensuing confusion. Then there were also women who claimed to be victims of the Monster for the attention.
The drama came to a head in 1790, when a 23-year-old Welshman named Rhynwick Williams was charged with the crimes, but his trial was a complete farce. Some of the women called to testify admitted to making the whole thing up, while others positively asserted that Williams was not the man who attacked them. Ultimately, he was still convicted on three charges and sentenced to six years in prison. This seemed to help the hysteria die down, and the London Monster was never heard from again.
There is no denying that William Tell is an important folk hero and a pivotal figure in the history of Switzerland, a man who stood up to foreign tyranny. But the question is: was he real?
His story takes us back to 1307, to the town of Altdorf, at a time when Switzerland was under the dominion of the Austrian House of Habsburg. As a sign of his authority, the tyrannical local bailiff Albrecht Gessler placed his hat atop a pole in the market square and demanded that all who passed bow before it. One day, a peasant named William Tell walked by the hat with his son and refused to bow. As punishment, Gessler demanded that Tell shoot an apple off his son’s head in one try from 120 paces or face execution. William Tell succeeded, so his life was spared, even though he was still sent to prison. On the way there, he escaped and eventually assassinated Gessler, starting a rebellion that led to the liberation of Switzerland.
It was a crucial moment, but scholars remain unable to trace these actions to a historical figure named William Tell. His story wasn’t even written down until hundreds of years later and, tellingly, the apple-shot scene seems to be based on an older Danish tale involving a Viking named Toko and King Harald Bluetooth.
At first, the story of Yang Kyoungjong seems perfect as one of those random facts you can whip out during a conversation: he was the only soldier to have fought on three sides during World War II.
As a native Korean, Yang was first conscripted into the Imperial Japanese Army at the start of the war, since Korea was then part of the Empire of Japan. During battle, he was captured by the Soviets and sent to a gulag, but he was later pressed into the Red Army to fight Nazi Germany on Europe’s eastern front. There, he was once again captured, this time becoming a POW of the Wehrmacht, and was sent to occupied France, where he was forced to defend Normandy during D-Day. There, he was taken prisoner one last time by US paratroopers, which is when the famous photograph was taken, purportedly showing Yang in a German Wehrmacht uniform.
His story was often shared without too much scrutiny, at least until 2005 when a South Korean company wanted to film a documentary about Yang Kyoungjong. They quickly discovered that there were no records to attest that he fought for Japan, the USSR, and Nazi Germany. Furthermore, they also found that nobody really knew for sure who the Asian man in the famous photograph was, as the Americans could not communicate with him and simply labeled the picture as “Japanese man.”
https://www.youtube.com/watch?v=bH7-_I4RWVE
Gather around the fire, everyone, to hear the tale of Agnus McVee, a ruthless and bloodthirsty Scottish woman who operated a “murder hotel” along Canada’s Cariboo Wagon Road during the Cariboo Gold Rush of the late 19th century. It is said that Agnus, alongside her husband Jim McVee and her son-in-law Al Riley, kidnapped young women and sold them as sex slaves, and also killed miners who stayed at their hotel, the 108 Mile House, in order to steal their gold.
Allegedly, Agnus and her accomplices were responsible for over 50 killings and were only discovered after one of the girls they kidnapped managed to escape and tell the authorities about the gruesome goings-on at the 108 Mile House. It is said that Agnus buried all of the gold she took off her victims and every now and then, people still find some of these lost treasures, as a reminder of the trail of blood left by Agnus McVee.
It sure makes for a good story, but it seems like the legend of Agnus McVee may have originated during the 1970s, in a book about buried treasure in British Columbia. You would think that a killing spree that left over 50 people dead might be newsworthy, but historians couldn’t find mentions of it in newspapers of the day.
Including Egyptian pharaohs almost feels like cheating a little bit since there are many of them who are only known as names inscribed on a wall. But we aren’t going with just any pharaoh – Menes, if he existed, would have been the first pharaoh, the one who united Upper and Lower Egypt into a single empire.
The reason we are uncertain of his existence is that we have two different men who are both credited with unifying Egypt. One of them is Menes, and the other is called Narmer. We have more evidence to suggest that Narmer was legit, including the Narmer Palette, a contemporary tablet that not only depicts the unification but also shows Narmer wearing both the crowns of Upper and Lower Egypt, an important symbol signifying the pharaoh’s authority over the two lands.
So, then, where does that leave Menes? It is possible that he was a mythical figure, or maybe a different pharaoh who came after Narmer, but there is also a strong belief among Egyptologists that he and Narmer might have been one and the same person. Unless we find more evidence, this remains just speculation.
The legend of John Henry is one story that most American kids will hear at least once in their lives. During the mid-to-late 19th century, John Henry was a big, strong Black man who worked on the railroads as a steel driver. His job involved swinging a giant sledgehammer to drill holes into the rock to make room for dynamite. It was arduous, backbreaking labor, but one that John Henry did not want to give up. Therefore, when the railroad company brought in a steam-powered drill to speed up the process, Henry insisted that he was still the better option.
And so a challenge was set: John Henry versus the drill, man versus machine. Henry picked up two large hammers and started pounding away at the rock, drilling a 14-foot hole where the machine only managed to penetrate nine feet. John Henry had won the contest, but then he collapsed from exhaustion and died with the “hammer in his hand.”
Since then, the tale of John Henry has been retold countless times in the form of stories, ballads, and novels. But whether or not there ever was a real John Henry remains a mystery. For most of that time, he has been regarded simply as a folk hero, but in recent decades some historians believe they may have found the real John Henry: a former Union soldier who was imprisoned for theft in Richmond, Virginia, and then sent to work on the railroad.
We arrive now at a conspiracy that has been going strong for over 150 years. Is it possible that the greatest writer in the English language was himself a work of fiction? Most people would say, “No. Of course not! Don’t be silly,” but there are many others (including actors, authors, and scholars) who believe that somebody else wrote all the plays we credit to the man known as William Shakespeare.
So who was the true author, then? Dozens of candidates have been put forward, but the most popular choices include Sir Francis Bacon, Christopher Marlowe, himself a successful playwright, and Edward de Vere, Earl of Oxford. But even within this conspiracy cabal, there are many disagreements. Some believe that Shakespeare got his hands on the plays using subterfuge, or even with the real writer’s permission. At the very least, they don’t deny the existence of William Shakespeare, the man.
But others are convinced that Shakespeare was simply the pseudonym of the true author, who did not want to be associated with writing, for various reasons, and that the guy we’ve seen in paintings was just someone hired to act as a front. Some think it is possible that the writer may have been a woman, or maybe not even a single person, but rather an entire group of authors.
We end with one of the most famous kings of all time, one whose story has been the subject of countless books, movies, works of art, and even musicals: King Arthur of Camelot.
According to medieval histories, Arthur was a king who reigned during the late 5th – early 6th centuries and defended his people against the Saxon invaders. But the only surviving contemporary account of those battles, written by a monk named Gildas, makes no mention of the king. It wasn’t until hundreds of years later that Arthur was named in the writings of a Welsh historian named Nennius.
During the early 12th century, Geoffrey of Monmouth wrote his famed chronicle Historia Regum Britanniae, in which he presented the lives of all the kings of Britain, starting with the legendary Brutus of Troy. Geoffrey was the first to present many of the elements associated with Arthur, such as the sword Excalibur and Merlin the wizard; Lancelot, the loyal knight, was added later by French romance writers.
Nowadays, many historians consider his work almost pure fantasy, although some of them still believe in the existence of a real Arthur or, at the very least, consider it a possibility that has not been debunked yet. Either way, the legend of King Arthur is still going strong.