10 Bizarre Things Our Ancestors Did For Fun
Listorati – February 12, 2025

Our ancestors did some strange things out of boredom that we today would have trouble getting our heads around. Once upon a time, people burned cats for fun and thought competitive walking was the height of entertainment. No matter how hard we try, we may never be as crazy as our grandparents.

10 Cat Burning

In today’s world, the killing of some animals is met with immediate outrage. In 17th-century France, cat burning was a form of entertainment.

Every year, Parisians gathered for the midsummer bonfire in the Place de Grève to play, dance, and sing. To make the festivities more interesting, the crowd stuffed live cats into sacks, hung them over the fire from a mast, and watched them die slowly. The cats were chosen for their supposed link to the devil and witches. Sometimes, a fox was thrown into the fire as well. While the poor animals shrieked and cried, the people partied.

French kings and other important dignitaries were also known to take part in this sick fun. Sometimes, they got the honor of lighting the bonfire. Similar midsummer bonfire rituals took place in other parts of France and Europe. After the burning, people took the ashes home because they believed the ashes brought good luck.

9 Incubated Baby Fairs

Before the 20th century, premature babies had very little chance of survival. That was set to change when Dr. Martin Couney invented his incubator, but few people trusted the machine. Hospitals rejected it, and investors were not forthcoming. To save his invention and convince the skeptics, Couney came up with a very strange solution: he built exhibits in which premature babies would be put on display at fairs and parks. The first exhibit, or “child hatchery,” opened in Berlin in 1896. He soon moved to the US, where he opened an exhibit on Coney Island.

Couney’s exhibit looked like a normal hospital. He placed babies in wards and employed doctors and nurses to look after them. The only difference was that one side of the ward was glass, and people watched through it.

The exhibit was very successful. Parents brought their premature babies to Couney and did not have to pay for medical care. Spectators were charged up to 25 cents for the show, and the money collected paid for all expenses. Most of the premature children on display survived. By the time incubated baby fairs ended four decades later, Couney had managed to convince everyone that his incubators were safe.

8 Blackened Teeth

Today, we brush, floss, and even go the extra mile of whitening our teeth to give us that extra confidence when smiling. Back in 16th- and 17th-century England, however, a fad was born from the lack of dental hygiene.

At that time, sugar was very expensive because it was imported into the country. Therefore, only upper-class Elizabethans could afford it. Excessive use of the commodity rotted the teeth. Elizabeth I lost many of her teeth because of her love for sweets, and people could hardly understand her when she spoke. The few remaining teeth in her mouth were black and decayed.

People began seeing black teeth as a status symbol. People whose teeth were not black enough applied cosmetics and used coal to blacken them.

Incidentally, despite the lack of care for dental hygiene, Elizabethans did everything they could to prevent bad breath. This was perhaps because people believed the plague could be contracted from the bad breath of those around them. They used vinegar, anise seeds, and other similar herbs to clear their mouths of bad odor.

7 Pedestrianism

In the 19th century, competitive walking was the most popular spectator sport in America. The sport, known as pedestrianism, was born of the boredom of people who had migrated to the cities after the Civil War and needed a new form of entertainment. In packed arenas, competitors walked around tracks almost nonstop from Mondays to Saturdays in front of spectators. (Sundays were excluded because public amusement was not allowed on that day.) The competitors walked to see who could cover the most distance during the race’s duration. Some of these competitors would reportedly walk up to 160 kilometers (100 mi) in 24 hours.

The sport was so popular that celebrities were known to visit arenas during the event. Future president Chester Arthur was a regular spectator. As in modern sports, there were rivalries among the stars, who earned a fortune from prize money and sponsorship deals. There were scandals over fixed races and drug use.

Pedestrianism was eventually replaced by competitive cycling after the invention of the safety bicycle by John Starley in 1885.

6 The Alexandra Limp

From the moment she married the Prince of Wales, Princess Alexandra of Denmark was beloved by the British public. She was the 19th-century version of Princess Diana. Aside from her good looks and lively nature, Alexandra’s love for charity also endeared her to the public. She was so adored that people copied her every move and style.

After the birth of her third child in 1867, Alexandra contracted rheumatic fever, which left her with a limp. This gave birth to the Alexandra Limp, a fad in which women in London and Edinburgh started limping intentionally. To make the limp more realistic, women bought mismatched shoes. Subsequently, shoemakers began making uneven shoes on purpose. The fad got to the point where walking canes became one of the most sought-after accessories among able-bodied women.

The Alexandra Limp was criticized by prominent newspapers of the day, many of which saw it as an act of mockery against the princess. Thankfully, it did not take too long before the fad faded.

5 Headless Portraits

A lot of bizarre trends emerged after the invention of photography. The most famous of these trends was post-mortem photography, which showed us how death-obsessed the Victorians were due to the high mortality rate back then. However, that wasn’t the only morbid photography trend that was popular at that time. In 1853, a prominent photographer named Oscar Rejlander started an equally disturbing trend known as “Headless Portraits.”

Rejlander, sometimes considered the father of art photography, combined negatives to form pictures of living people posed beside their own decapitated heads. Soon, there was a huge demand for these photographs, and several photographers adopted the technique.

People took pictures in which their heads were everywhere but on their necks. Some held their decapitated heads tucked under their arms, some placed theirs on a platter, and others even dangled theirs from one hand while holding a knife in the other. No one was left out of this bizarre fun; even children were known to take headless portraits.

4 Fasting Girls

Anorexia nervosa is an emotional disorder involving the desire to lose weight by refusing to eat. The disorder is most common among teenage girls and young women. Though it is often thought of as a recent problem, it has been around for centuries. The most famous cases of anorexia nervosa can be found in the late 19th century, when a group of girls known as “fasting girls” claimed to be able to survive without eating anything over long periods of time.

These girls were said to have anorexia mirabilis, a miraculous lack of appetite. Their ability to survive without food was seen as a miracle, and they became celebrities. People from different parts of the world came to them bearing gifts and offerings, hoping to find the favor of God.

3 Tear Catching

No one knows how and when tear catching began, but the first documented reference to the practice can be found in the Bible. The practice was also quite popular during the Roman period. Back then, mourners filled small glass bottles with their tears, and these bottles, known as lachrymatory bottles or tear catchers, were left in the burial tomb as a symbol of respect. In some cases, women were paid to cry into the bottles during the funeral procession. This was done to fill as many bottles as possible, since a person’s worth was measured by the tears shed at their funeral.

The practice was revived during the Victorian era. Mourners cried into vials equipped with special stoppers that aided evaporation. Once the tears evaporated, the mourning period was over. Meanwhile, in Civil War America, women wept into tear vials and waited until their husbands came back from the war to show them how much they had been missed. The more tears in the vial, the more she had missed her husband.

2 Ant Farms

Moving now to the more recent past: the ant farm was a popular toy created by Milton Levine in the 1950s. The idea came from an army of ants he saw at a picnic during a Fourth of July celebration. Milton found ants captivating and thought that if he put them in transparent plastic designed to look like a farm scene, people would enjoy observing them. He named the product “Uncle Milton’s Ant Farm.” After he advertised the $1.98 product in newspapers, the ant farm became an instant hit, and Milton received thousands of orders from people around the country.

The ants Milton used in his product were gathered by ant rustlers, who were paid a penny apiece. A customer first bought the farm itself and then ordered the ants, which would be delivered in vials within a day. An instruction manual, special sand, and a liquid dropper were also included in the product.

In 2011, Milton Levine died at the age of 97, having sold more than 20 million ant farms during his lifetime.

1 Uranium Sitting

In 1953, a bizarre fad known as uranium sitting was born on a Texas dairy farm. The fad began after the owner, Jesse Reese, claimed his wounded leg had healed because he buried it in the soil of his farm, where a group of scientists had recently found traces of uranium. Believing the radioactive soil had healing properties, people with all sorts of ailments ranging from the common cold to cancer came to him for help. He soon sold his cows and turned his farm into a “uranium dirt house.” To get better, customers buried their sick body parts in the soil.

Several uranium dirt houses cropped up in other parts of the state. They charged up to $20 for the service and added trailer camps and landing strips to their places of business to accommodate the growing crowds. Although they did not know the effects of uranium sitting, medical experts spoke against it and refused to endorse the practice.

In 1955, R.E. Hight and his business partner, Walter Miller, opened a uranium dirt house in Corydon, having leased 1.5 tons of radioactive soil from Reese. Despite the promises of healing, they made customers sign a waiver stating that no miracle should be expected from the sitting. Barely months after their business took off, a newspaper published an article claiming that the sand used in Corydon was not radioactive at all. Hight and Miller hired a geological engineer to examine the soil, and he found that it contained hardly any radioactive material. Consequently, people began to doubt the healing power of the sand, and the fad died.

10 Reasons To Believe We Have Aquatic Ape Ancestors
Listorati – December 16, 2024

Why do modern-day humans appear so drastically different from apes when ape species don’t look all that different from one another?

Fifty years ago, the mainstream scientific consensus said that our ancestors went from living in trees to hunting on the savanna. Then fossil evidence challenged what we thought we knew. In the Great Rift Valley where early hominids thrived, paleontologists discovered that the accompanying microfauna, pollen, and vegetation from that period weren’t savanna species at all.

The image of early hunters chasing red meat through a golden savanna sure painted beautiful illustrations for biology textbooks. But the truth is, we became bipedal before the savanna existed. The former depiction also didn’t account for other puzzle pieces, like the development of our big, complex brains.

Although it began as a lunatic fringe theory, the idea that humans evolved alongside the water and generally had a more aquatic existence has gained considerable steam in the scientific community. Even esteemed natural historian Sir David Attenborough has said, “It isn’t yet the hypothesis that most students are taught, but perhaps its time has come.”

Originally known as the “aquatic ape theory,” it has since been dubbed the “waterside model,” presumably because it sounds a little less silly. Nobody is saying that our ancestors were mermaids swimming in the deep blue alongside whales and talking crabs. Rather, as Elaine Morgan, a proponent of the theory, suggests, “The difference between man and apes has something to do with water.”

Here are just a few reasons to think that the aquatic ape theory might not be so crazy after all.

10 Bigger Brain

Human brain anatomy is markedly different from that of an ape, especially where the cerebral cortex is concerned. Ours is much larger (even though it doesn’t always seem that way). The qualities that define us—like our abilities to use language, make tools, and have fine motor skills—are the result of this key advantage. The question is: How did our brains evolve for these unique purposes?

The enlargement of the human brain can be compared to that seen in seals and dolphins. Marine diets seem to be the only foods capable of fueling brain development, as they contain “brain-specific” polyunsaturated fats such as docosahexaenoic acid (DHA), an omega-3 fatty acid.

The land-based food of savanna hunters simply doesn’t provide the nutrients needed for this change to occur. No primate can develop a large brain with only a land-based diet. As animals evolve bigger bodies on land, their brains actually shrink. Think of the horse with a walnut-sized brain.

In the sea, however, the opposite is true. For example, dolphins have a brain that weighs 1.8 kilograms (4 lb) because seafood provides the nutrients that stimulate a boost in brain growth. This is why a dolphin has a bigger brain than a zebra despite their similar body sizes. Interestingly, the sperm whale has the largest brain in the world, weighing in at 7–8 kilograms (15–18 lb).[1]

9 Large Sinuses

Our noses are among the most bizarre in the animal kingdom. These oddly shaped snouts are not shared with any other member of the ape family or, for that matter, any other land mammal. We have strikingly large sinuses, which are those empty spaces in the skull between the cheek, nose, and forehead. They seem to serve no significant purpose until one considers the possibility of aquatic adaptation.[2]

If one imagines an aquatic lineage, it seems that our design has a very useful function indeed. These vacant air cavities may be understood as buoyancy aids that could keep our heads above water. They also protect the upper airway tract in a watery world.

Have you ever wondered why our nostrils peculiarly tilt at a downward angle? It does seem to keep water out of the nose when swimming, does it not?

Our sense of smell, however, is notoriously poor. Keep in mind that one doesn’t need a sharp sniffer underwater. For example, diving mammals typically have a reduced sense of smell. It’s only a breadcrumb on the trail, but you may be getting a whiff.

8 Bipedal Shift

Humankind has been walking on two legs for about two million years. (Some sources say four million years, while others say six million. Suffice it to say, it’s been a while.)

Once upon a time, it was believed that the transition from life in the trees to surviving in open grassland was the reason for this change. However, when baboons ventured from the jungle onto the savanna, they stayed resolutely on four legs.

Why? If walking upright for long periods of time is advantageous, why didn’t more species adopt this trait?

Walking on four legs is clearly a superior means of travel when it comes to balance and speed. Of course, now we know that the savanna wasn’t even in existence at that time anyway. Baboons, it turns out, do stand upright from time to time but only for a very specific purpose—to wade through water for food!

David Attenborough explained that during his lengthy career, he saw many species of primate wading bipedally in the water, crossing shallow rivers and pools. But as soon as they hit land, they always dropped onto four legs again. Wading through water is the only circumstance when these primates will walk on two legs.

Humans only walk somewhat efficiently because of our extended legs and vertical hips. But even so, it’s a little awkward. It’s like we are falling forward with style. So how did we learn to do this?

Studies comparing humans walking in water versus on land suggest that this may be how our early ancestors learned to walk. Water has a natural buoyancy that makes walking upright easier. The evolution of our gangly legs and our silly walk was facilitated by a strong need to develop these traits.[3]

7 A Subcutaneous Fat Layer

When human babies are born, they look like little cherubs with chubby cheeks and adorable little rolls of fat. This is because their bodies are naturally wrapped in a comparatively thick layer of fat. Other primates are not born so plump. They usually appear to be wrinkly, malnourished balls of skin. So, what gives?

Underneath our skin is a layer of fat that covers almost our entire body. It’s one of the main reasons that humans can become morbidly obese in a way that’s impossible for other primates.[4] No other primates have these subcutaneous fat layers. Lucky us.

However, this fat layer is seen in sea mammals, including whales, seals, walruses, and manatees. This blubber provides buoyancy and insulates the body, maintaining body heat in cold water. Also, the fat streamlines the body and allows for more effective swimming. It creates a serious advantage in aquatic environments because heat loss happens more quickly in water than air. It’s not just cozy, it’s necessary.

6 Curiosities From Birth

When human babies are submerged in water, they instinctively know to hold their breath and open their eyes. This reflex is called the bradycardic response. Don’t try this at home, but if they are dipped in cold water, their little hearts will instinctively slow as the flow of blood shifts from the peripheral muscles to conserve oxygen.[5]

This reflex safeguards the oxygen for the brain and heart. It’s not rocket science that a freakishly fat baby who instinctively holds his breath underwater didn’t evolve on an African savanna.

When infants first pop out, they are covered in a mucous layer that looks like cheese. This slick coating that keeps out the cold is called vernix caseosa. It used to be believed that it was unique to humans until a research team from Cornell discovered that newborn seals, or pups rather, are also born in this disgusting greasy ball of vernix caseosa.

Now it’s hypothesized that this physiological phenomenon may extend to all marine mammals.

5 Sweat And Tears

Living near salt water requires the occasional release of salt from the body. While sweating is an effective cooldown ability on its own, it’s not really needed when you’re right next to a body of water and can just take a dunk. That’s where tears come in handy.[6]

Crying is useful for shedding excess salt. Humans not only sweat more than any other mammal, but they are also the only mammal that sheds tears. Other mammals cry, but no tears fall. As it turns out, humans exude greater quantities of salt water than any other mammal.

4 Breath Control

The reason gorillas can’t speak has nothing to do with their teeth, tongues, lungs, or vocal cords. We only speak because we have mastered conscious control of the breath, and that’s the key.

All diving mammals hold their breath to keep water out of the lungs and regulate the pressures in the respiratory tract as they submerge, feed underwater, and surface. This refined mastery of the airway entrances was preadaptive for the evolution of speech. Living in a seascape could explain why we urgently evolved the ability to control our breath.

Our respiratory valves also differ from those of other primates in that the soft palate can lift to close off the nasopharynx. This is a critical feature for aquatic mammals, keeping water clear of the respiratory passage.[7]

Humans also have a uniquely descended larynx, which just means that it’s closer to the lungs. As the human baby grows up, the larynx sinks down lower. Again, animals with this design are seals, sea lions, walruses, and dugongs. The one and only primate that has it is us. It allows for gulping large amounts of air easily.

Some researchers argue that our upright posture naturally slid the larynx down, but others suggest that it may actually have been selected for. Next time you hold your breath for a swim, imagine how this seemingly inconsequential feature may have been the biggest game changer in our evolution.

3 Fossils And Observation Of Behavior

The fossils of Lucy, an early upright-walking hominin, and other well-known ancestors were discovered near the shorelines of massive lakes, where the surrounding area was subject to periodic flooding. An analysis of 20 hominid fossil sites in East and South Africa found evidence suggesting that our early ancestors were living either lakeside or in flooded grasslands.

How did they handle this deluge?

Researchers have a clue based on observing baboons at Botswana’s Okavango Delta during the summer months when it is flooded. When fruit becomes rare, the baboons eat water lily roots instead.[8]

Fossil evidence demonstrates that early hominids also resorted to aquatic plants like water lily nuts. The nuts from this thorny plant are a pain in the butt to collect as they require diving 5–7 meters (16–23 ft) deep. Then they can be roasted over a fire where they pop open like popcorn. Even today, people still collect water lily nuts and eat them in the same way.

We also know that our ancestors were chowing down on seafood, in general, about two million years ago. For example, we’ve discovered fossilized bones of a catfish 2 meters (7 ft) long that had been cut with stone tools. It may be no small coincidence that early humans followed the coastlines around the world before heading inland. The water is what they knew best.

2 Pruney Fingers

Have you ever wondered why our fingers get all wrinkly when we’re in the water for a while? This active process is controlled by our autonomic nervous system. Evolutionary neurobiologist Mark Changizi believes it served a valuable function in our distant past.

He thinks that the pruney pattern may improve our underwater grip on objects. An independent team from Newcastle University found some support for this hypothesis in a study that demonstrated how people can pick up wet marbles more quickly with pruney fingers than dry ones.

This advantage only applied to the wet marbles. When the marbles were dry, both wet and dry fingers had the same ability. Evolutionary biologist Tom Smulders said, “We have shown that wrinkled fingers give a better grip in wet conditions—it could be working like treads on your car tires, which allow more of the tire to be in contact with the road and [give] you a better grip.”[9]

He explains how this advantage could have helped our ancestors gather food from bodies of water.

1 Nakedness

We are the only smooth-skinned primates. Nakedness is an advantage underwater because it allows the body to glide gracefully through the water with ease. Why, then, do we still have hair on our heads?

It is hypothesized that cranial hair remained to protect us from solar radiation. Hair shields the head, and the shoulders and upper arms also tend to have more hair. The hair that did stick around is arranged diagonally, pointing inward toward the midline of the body. This pattern provides the least resistance while swimming.

There’s a close connection between nakedness and water. The mammals that have lost their body hair are the aquatic ones, like the hippo, dolphin, and manatee. The elephant may be pointed to as an example of a nonaquatic animal that lost its hair, but hold on: it turns out elephants have an aquatic ancestor as well. In fact, all naked pachyderms do, even the rhino.

It seems that every naked mammal was conditioned by water at some point.[10] Why not us?

10 Mundane Jobs That Horrified Our Ancestors
Listorati – July 23, 2024

Jobs. There are a million of them out there, and most have one thing in common: boredom. Turning human beings into mindless cogs in the machine, the soul-crushing tedium of modern occupations can be scary for sure.

But that’s nothing compared to the horrors endured by the workforces of yesteryear. Thumb through a history book, and suddenly, even the most run-of-the-mill job springs to truly terrifying life. Keep these vicious vocations in mind the next time you find yourself praying for five o’clock.

10 Waiting Tables

Waiting tables has long been the domain of struggling actors and those working on their screenplays; it’s sort of a holding-pattern profession, not something you aspire to. But the ancient world somehow found a way to make this most humble of professions even less profitable and even more degrading.

The wealthy of ancient Rome were fond of a good feast. They would attend lavish banquets and gorge themselves on wine and various delicacies, all served by slaves, until they simply couldn’t eat another bite. But what was a Roman aristocrat to do when a full belly came a little too early in the evening? They made some room.

Excusing themselves from the party, diners would occasionally force themselves to vomit in order to rejoin the feast (in a not-too-dissimilar manner to some of our own size 0 models and actresses). The downtrodden waitstaff—slaves—would then mop up the last course before returning to serve up the next. And they didn’t even get a tip. Incidentally, contrary to popular belief, the Romans did not purge themselves in rooms called vomitoria or vomitoriums—those were simply passages in an amphitheater.

9 Cutting Hair

Between sweeping up other people’s hair and forcing boring small talk, the duties of a modern barber aren’t exactly glamorous. But luckily for those aspiring stylists out there, the last few centuries have done a great job filtering the unbridled horror out of a job that once left our ancestors scarred in more ways than one.

In addition to trimming hair, the barbers of medieval Europe held a host of other job titles. They dabbled in dentistry by extracting the rotten teeth of their clients. They played doctor by selling various primitive medicines, performing bloodletting, and even giving enemas. Most shocking, though, were the duties of the notorious barber-surgeons.

As the terrifying title suggests, these barbers made a living hacking open their customers. Barely trained and almost never literate, these maniacs practiced something that was little more than butchery.

It was common for bloodstained rags to be seen hanging from the walls of the barbershop, inspiring the iconic red-and-white poles we still see today. Luckily, in 1745, King George II forbade barbers from doing anything but cutting hair.

8 Bartending

The gravest dangers facing barkeeps today are bad tips and the occasional drunken brawl. Other than that, it’s pretty cut-and-dried. Even if a mistake is made, the worst that can be expected is a demanded refund. But that wasn’t the case in Tudor England.

During the Tudor era, it was common for brewers to sell their products directly to the alcohol-crazed masses. The ale went bad in a matter of days, so alehouses—or taverns—brewed their ale on-site to serve it as quickly as possible. This was a pretty efficient system, but the fact that nonprofessionals were handling the brewing often led to bad batches. People didn’t like bad batches.

Punishments for inferior ale were swift and bizarrely severe. In addition to fines, the offending brewer, who was traditionally a woman, would have her entire stock confiscated and distributed for free to the poor.

But strangest of all was the use of the “ducking stool.” The “alewife” in question would be tied to a chair on the end of a long pole and submerged in dirty water. This primitive waterboarding was used on countless women whose only crime was making a few bad drinks.

7 Making Musical Instruments

Modern instrument manufacturing is typically carried out like any other kind of modern manufacturing—on a cold, monotonous assembly line. Workers are essentially soulless living machinery, but the experience still beats the methods used in ages past.

Violin strings were, and occasionally still are, made of only the finest sheep intestines. Violin manufacturers would often set up shop right next door to the local slaughterhouse to get their hands on the grisly guts the moment they were cut from the sheep.

Then the manufacturers would cart their haul back to the factory and set about scraping out the feces, blood, fat, and slime. This would all be done by hand as the intestines were too delicate for machinery to handle.

After cleaning, the guts were wound up and dried to produce the violin strings. Ironically, this gruesome process was said to result in the most beautiful-sounding strings. If they were cleaned properly, that is. If not, they were known to begin rotting on the violin.

6 Hairdressing

Modern hairdressing may be looked down upon by some, but the stylists of the ancient world were the targets of almost universal disgust. Far from their chatty, hairspray-blasted modern counterparts, hairdressers in ancient Rome were slaves who reeked of several less pleasant substances.

Called ornatrixes, these pitiable professionals spent their lives catering to the whims of the ultra-vain elite. The pressure was intense as a mistake meant a brutal whipping, but that still wasn’t the worst part of the job.

There were no hair products back in the day, forcing the dedicated ornatrix to improvise. Bile, cuttlefish ink, and even decomposed leeches were mixed to produce dark hair dye, but bleaching was even worse. Pigeon droppings and ash were slathered onto the scalp and then rinsed out with human urine.

However, the ornatrix’s worst days came from dandruff sufferers as the Romans believed that a flaky scalp could be cured with human feces.

5 Washing Clothes

Aside from dry cleaners, you would be hard-pressed to find a laundry washing professional in modern society. Washing machines and detergents have made the task so easy that there really isn’t a need for a dedicated laundry person. But there used to be, and his job was truly disgusting.

Again, ancient Rome is to blame for the foulness of what should be a squeaky-clean profession. Large vats were a common sight on Roman streets, which acted as primitive public restrooms. Citizens would wander by, urinate into them, and go about their business. When the vats were full, they were hauled off to the local fullonica.

This building was the ancient equivalent of a laundromat. Workers would pour the massive jugs of strangers’ urine into large tubs with the dirty laundry. But that was only step one.

Next, they would stand knee-deep in the urine-filled tubs and stomp around to agitate the clothes. Ironically, the ammonia in urine is great for breaking down dirt and grease, making this a surprisingly effective process.

4 Party Planning

4-roman-orgy

Whether it’s a graduation party, wedding reception, or just a weekend house party, a lot goes into crafting the perfect get-together—so much that many people choose to make their living coordinating such events. But odds are that none of them have ever been asked to plan a night of group sex.

As you may have guessed by now, this extremely dirty job comes to us from ancient Rome. Emperors had their own personal orgy planners committed to throwing the largest and filthiest sex parties imaginable. Often lasting multiple days, Rome’s elite would meet at these carnal carnivals to indulge in acts so legendarily lurid that they would be painted on public walls for all to enjoy.

While this may sound like a dream job to some, it comes with a catch. Humiliated family members of partygoers sometimes “vented their frustrations” on the orgy planner or his employer. That’s a diplomatic way of saying that the family brutally tortured and murdered the orgy planner.

3 Working In A Carnival

3-carnival-geek-eating-snake

Working in a carnival is by no means a pleasant experience. Sitting outside, listening to screaming kids, and huffing fumes from the Tilt-A-Whirl isn’t exactly paradise. Luckily, today’s carnival goer is a bit more squeamish than his early-20th-century counterpart, or it would be so much worse.

The word “geek” is usually used to describe the socially awkward, but it began as the title for a carnival performer. This performer did only one thing: He bit the heads off things, including snakes and rats but usually live chickens. Playing the role of a savage “wild man,” the carny shocked crowds with his gruesome and bloody displays.

But it gets worse. Obviously, very few would volunteer for this position, so carnival owners were notorious for finding homeless drug addicts for the part. The owners would simply offer the addicts their fix in exchange for a performance.

The addict was given a razor blade to sneakily cut the neck of the animal, making his job easier—at first. Once the “performer” was completely dependent on the owner, the razor was taken away, leaving the carnival with a brand-new geek.

2 Making Hats

2-hatmaker-carroted

Like so many professions, the job of hatmaking has been simplified to the point of being phased out. Machines have replaced most of the workers, making modern hatters little more than glorified factory equipment. But that may not be such a bad thing.

The 17th century gave us one of the worst manufacturing innovations in history. “Carroting” was a hatmaking shortcut that allowed hatters to work their stiff materials into complex shapes more easily. By simply washing the fabric with mercury nitrate—which temporarily turned it orange, hence the name—the fabric was much more workable, cutting down production time. It seemed like a miracle—until hatters started losing their minds.

As it turns out, holding a mercury-soaked wad of cloth inches from your face for years isn’t the healthiest pastime. Breathing mercury fumes allows the deadly metal to build up in the body and attack the nervous system as well as the teeth and gums.

This led to a rash of “mad hatters.” Their poisoning led them to drool, lose teeth, shake uncontrollably, and eventually suffer permanent brain damage. This is actually where we get the phrase “mad as a hatter.”

1 Making Matchsticks

1-phossy-jaw

No one would argue that matches are dangerous. But barring a freak fire, how could making the tiny, innocuous sticks possibly be harmful? Just dip a few pieces of wood into some incendiary sludge, and call it a day. Sure, it would be tedious, but it’s easy money. Right?

Well, no. It turns out that one of the most gruesome workplace epidemics of the 19th and 20th centuries was suffered by workers producing “strike anywhere” matches. Yellow phosphorus—which we now call white phosphorus—was needed to produce these matches, and factory workers spent 10–15 hours a day handling the dangerous substance. However, its danger came not from the potential for burns but from the fumes it produced.

In 1838, the first case of “phossy jaw” was recorded. After breathing poisonous phosphorus fumes in a matchstick factory, workers began to experience intense pain and swelling in their lower faces. They started to lose teeth, and large, open sores appeared along their jawlines.

Both skin and bone rotted and fell away, leaving the hapless employee permanently disfigured. The only course of action was a complete removal of the jaw. Luckily, the early 20th century saw strict regulations, if not outright bans, placed on phosphorus match production.

Ian is a struggling writer who suddenly doesn’t feel so bad about that.

]]>
https://listorati.com/10-mundane-jobs-that-horrified-our-ancestors/feed/ 0 13829
10 Spectacular Cosmic Events Witnessed By Your Ancestors https://listorati.com/10-spectacular-cosmic-events-witnessed-by-your-ancestors/ https://listorati.com/10-spectacular-cosmic-events-witnessed-by-your-ancestors/#respond Wed, 17 Jul 2024 06:43:03 +0000 https://listorati.com/10-spectacular-cosmic-events-witnessed-by-your-ancestors/

With a couple of recent exceptions, cosmic phenomena (often hyped up in the media) tend to be underwhelming. Which, to be fair, is probably a good thing. But history has recorded plenty of genuinely spectacular events in the centuries and millennia before modern astronomy.

10. The Julian Star

Caesar’s Comet, aka the Julian Star, appeared after Julius Caesar was stabbed to death in the Senate. It was visible after sunset for seven days during the Ludi Victoriae Caesaris (games held in the ruler’s honor). Naturally, it became an object of worship. From the Roman writer Pliny the Elder, we also learn that Augustus—Caesar’s heir and the first Roman emperor—saw the comet as a sign that his rule had begun.

In fact, the comet’s appearance was pivotal for Augustus and, by extension, the world. As Caesar’s grand-nephew, Augustus had a contested claim to power—disputed not least by Caesar’s general Mark Antony, who accused the boy of having sex with his grand-uncle to secure his place in the will. For Augustus, the comet was a gift from above. Taking advantage of Roman gullibility, he declared the “new star” the soul of Caesar on its way to join the gods—which, neatly, affirmed his own divine status in the process.

It was so spectacularly timely, in fact, that some wonder if Augustus made it up. They point to unusual discrepancies, like the 26-year gap between the comet’s alleged appearance and its depiction on coins. However, the ancient Roman sources are corroborated by Chinese records. Also, Romans saw comets as bad omens; Augustus was cunning enough to spin this one as auspicious, but had he invented an omen from scratch, he would hardly have chosen one his countrymen considered sinister.

9. The Supernova of 1054

In 1054, a supernova bright enough to be visible in daylight was recorded by astronomers worldwide. Ancient Chinese astronomers called it a “guest star” and compared it to Venus, the “morning star,” since both were best seen before dawn. Unlike Venus, though, “it had pointed rays on all sides.” Meanwhile, in the Levant, the appearance of this exploding star was linked to an epidemic that killed 14,000 people in Constantinople before spreading southward to Cairo.

The light hung around for 23 days before it finally fizzled out and dispersed, although it remained visible at night for 21 months. Today, we know it as the Crab Nebula—the brightest supernova remnant we can see.

However, up until recently, we didn’t know exactly what caused it—only that it was unlike any other supernova on record. It was neither an iron-core collapse (whereby the mass of a huge star flows into its core, causing it to collapse and explode) nor a thermonuclear supernova (whereby a small white dwarf siphons so much mass from a companion star that it explodes). It wasn’t until 2018 that a new type of supernova was discovered: electron capture. Previously only theoretical, it closely resembled the supernova of 1054. Electron-capture supernovae occur in stars 8–10 times the mass of the Sun, when internal pressures force electrons to merge with the nuclei of atoms, causing the core to collapse and explode.

We couldn’t see the 2018 supernova with the naked eye because it happened 30–40 million light-years away in the galaxy NGC 2146, whereas the supernova of 1054 happened in our own galaxy, just 6,500 light-years away.

8. The Total Solar Eclipse of 585 BC

The total solar eclipse of May 28, 585 BC was among the earliest predicted cosmic events. It was foreseen by the Greek philosopher Thales of Miletus, who studied patterns in earlier records.

But it’s remembered for another reason too. On the day of the eclipse, two kingdoms, the Medes and Lydians, were engaged in a brutal battle. But as the moon passed in front of the sun, blocking it out and turning day into night, the fighting suddenly stopped. Both armies interpreted the darkness as an omen—a sign of the gods’ displeasure. They didn’t just stop fighting; they came to a hastily brokered peace agreement that included the marriage of a Median prince to a Lydian princess.

Fittingly, the same eclipse that stopped the savagery on the field (albeit through superstition) ushered in the dawn of rational astronomy. Thales’ prediction showed that celestial events follow the laws of nature, not the whims of the gods. It laid the groundwork for future inquiry and marked a shift from superstition to science.

7. Halley’s Comet (1066)

Easily the most culturally significant object of its kind, Halley’s Comet is a part of the human story. One of its most famous appearances was in 1066, shortly before the Battle of Hastings, which imposed on the English a Norman aristocracy that remains in power today—almost 1,000 years later. 

The comet was seen as an omen at the time. The Bayeux Tapestry, an 11th-century embroidery, is thought to contain its earliest depiction, showing not only the comet but also men looking up in fear. But not everyone was afraid. Whereas the English saw it as a sign of their doom, the Normans under William the Conqueror took it as a sign of God’s blessing on their plan to enslave the English and steal all their land.

Halley’s Comet’s 1066 visit is a classic example of how celestial events have been perceived as harbingers of change. Its appearance not only influenced medieval beliefs and actions but also left a lasting legacy in art and history, symbolizing the intertwining of cosmic phenomena with human destiny.

6. The Great Fireball of 1783

On the night of August 18, 1783, a lone fireball set the skies of Britain ablaze. This bright, slow-moving meteor appeared to be roughly the size of the disk of the moon and was estimated to be half a mile across and traveling at 20 miles per second. It was only visible for a minute before it broke into pieces, leaving only its core continuing on its path.

This so-called Great Fireball, which sailed across the sky just 50 or 60 miles off the ground, inspired awe and curiosity worldwide. Astronomers like Charles Blagden gathered reports, hoping to identify its origin. At the time, however, meteors were seen not as rocks but as electrical phenomena in the upper atmosphere. Hence it didn’t seem to cross anyone’s mind that, given the size and speed of the object, the world had just narrowly avoided a catastrophic impact.

Nevertheless, the fireball’s appearance and subsequent studies marked a shift from this old view to one of meteors as extraterrestrial objects. 

5. The Great Comet of 1744

Also known as de Chéseaux’s Comet, the Great Comet of 1744 first dazzled observers on November 29, 1743. Although initially quite dim, it brightened as it neared the sun. By mid-January the following year, the comet had a tail seven degrees long (roughly four finger widths at arm’s length). By February 1, it rivaled Sirius in brightness, with a curved tail extending 15 degrees (roughly the distance between the tip of your index finger and pinky spread apart at arm’s length). Still, the comet continued to intensify. By February 18, it was as bright as Venus and had two tails. It peaked on February 27 at an apparent magnitude of -7 (for comparison, the full moon is about -13, and Sirius, the brightest star in the night sky, is about -1.5). It was visible even in daylight, despite being just 12 degrees from the sun.

The Great Comet reached its perihelion on March 1. But the show wasn’t over just yet. When it reappeared in the morning sky on March 6, it appeared to have six brilliant tails fanned out like a Japanese hand fan across 60 degrees of the sky (four times the distance between the tip of your index finger and pinky spread apart at arm’s length!). Interestingly, these six tails were really just the most visible parts of a single, enormous curved dust tail.

4. The Great September Comet of 1882

Often said to be the brightest comet on record, the Great September Comet of 1882 was first seen by Italian sailors. By the middle of the month, near the Sun, it was bright enough to see in broad daylight. It passed just 264,000 miles from the Sun’s surface—which, although it sounds like a lot, is a tiny fraction of Mercury’s distance of 28.5 million miles and not far off the distance between the Earth and the Moon. Hence the Great September Comet’s classification as a Kreutz sungrazer—a comet that passes extremely close to the Sun.

Spectacularly, this incredibly close approach made the comet shine 1,000 times brighter than the full moon. Observers called it a “blazing star” or “super comet” and watched in awe as its nucleus broke into fragments. It was visible in the sky for weeks and was witnessed around the world.

3. The Great Meteor Procession of 1913

The Great Meteor Procession of February 9, 1913, remains both rare and unexplained to this day. Unlike normal meteor showers, where meteors zip across the sky at blink-and-you’ll-miss-it speeds, this procession moved slowly with the meteors crawling across the sky in formation. Also, because of their nearly horizontal trajectory, they were visible for much longer than usual: up to a minute for individual meteors and several minutes for the whole procession. There was no radiant point from which they emerged, as with meteor showers.

Witnesses across North America, the North Atlantic, and even down to Brazil reported seeing the phenomenon. Some even reported rumbling sounds suggesting the meteors might have been close to Earth when they finally broke up and disappeared. Canadian astronomer Clarence Chant, who gathered more than 100 eyewitness reports, described the meteors as two bars of flaming material trailing sparks, followed by a bright, star-like ball of fire.

Theories about the procession’s cause vary. Some think the meteors may have been fragments of a temporary second moon—a small, short-lived, natural satellite of Earth. Despite extensive study, though, the Great Meteor Procession remains a mystery to science.

2. The 1833 Leonid Meteor Storm

Ever stayed up late for a meteor shower, only to be disappointed? The term—a favorite of the media—is misleading. Even “meteor drip” would overstate it. In most cases, you’ll be lucky to see 50 in an hour, which isn’t even one every minute. It’s nothing like what people imagine. That would be a meteor storm, which is worth staying up for.

On November 13, 1833, the skies of America were utterly transformed by as many as 20 meteors per second—72,000 meteors per hour. It was so intense that the region of the sky around Leo looked like an umbrella of falling lights. Of course, in those days, very few knew what it was, and the phenomenon caused widespread panic. People described the lights as falling “thick as snow in a snowstorm.” Fearing the end of the world, many fell to their knees and prayed. Others ran into churches to manically ring the bells. The spectacle didn’t end until dawn, fading with the first light of day.

Today, it’s remembered as the most stunning meteor storm on record. It also marked the beginning of meteor astronomy; before this, “shooting stars” were not considered worthy of study. Astronomers went on to identify the comet Tempel-Tuttle as the cause and predicted the storm’s return 33 years later. Right on cue, 1866 brought another spectacular Leonid meteor storm—this time over Europe.

1. The Carrington Event

The Carrington Event of September 1-2, 1859, remains the most powerful geomagnetic storm on record. It was caused by a coronal mass ejection, a cloud of superheated plasma flying out of the Sun toward Earth. Basically, the Sun shot a magnet at our planet. And when it collided with the Earth’s magnetic field, auroras usually only visible in the far north (like Iceland and Greenland) were seen as far south as the Caribbean. 

This geomagnetic storm also caused telegraph systems around the world to malfunction—giving electric shocks to operators, sending sparks flying, setting paper alight, and even transmitting telegrams without a power source. The distortion of the Earth’s magnetic field by the charged solar particles was so great that it electrified the air around us. If an event of this magnitude happened today, in our hyperconnected world, the fallout would be catastrophic. And there is every reason to expect another: a near miss in 2012 would, had it hit, have caused trillions of dollars in damage.

]]>
https://listorati.com/10-spectacular-cosmic-events-witnessed-by-your-ancestors/feed/ 0 13722
10 Bizarre Ways Our Ancestors Explained Disease https://listorati.com/10-bizarre-ways-our-ancestors-explained-disease/ https://listorati.com/10-bizarre-ways-our-ancestors-explained-disease/#respond Thu, 16 May 2024 06:34:58 +0000 https://listorati.com/10-bizarre-ways-our-ancestors-explained-disease/

We all admire and respect medical experts for their knowledge and ability to help us overcome various sicknesses and diseases. We forget, however, that doctors are only human and as capable of mistakes as the rest of us. This was especially true in the past, when the diseases that afflicted the human race led doctors and medical experts to some truly bizarre theories and explanations.

10Spread Of Diseases Caused By Night Air

05

In the Middle Ages, the theory of miasma was born. According to this theory, “bad air,” which emanated from decaying organic matter, caused diseases such as cholera, chlamydia, and the Black Death. The bad air seemed to worsen around swamps and during the night, so most people avoided it by going indoors and keeping their windows tightly shut.

When John Adams and Benjamin Franklin, two prominent American figures, were traveling together in 1776, they were forced to share a room in a crowded inn. Adams later noted in his autobiography that “the window was open and I, who was an invalid and afraid of the Air in the night (blowing upon me) shut it close.” However, Franklin objected and convinced Adams to reopen the window. The fact that a highly educated man like Adams, who later went on to become president, believed that nighttime air was noxious, shows us that the miasma theory was widespread and not solely limited to the poorer, uneducated classes. Indeed, doctors and other highly educated men supported the miasma theory for over a century.

Though the reasoning was flawed, closed windows did have some health benefits: they helped prevent malaria—or “the poison which produces autumnal fever,” as it was then known—and kept out the damp night air, which often chills the body.

In the second half of the 19th century, the miasma theory was replaced by the germ theory.

9Epilepsy Caused By Divine Visitation

02

The early Greeks thought that epilepsy (a word derived from the Greek verb epilambanein, meaning “to seize, possess, or afflict”) was caused by “divine” visitation. Epilepsy was also known as a “sacred disease,” and it went by several other names in Ancient Greece, including “seliniasmos,” “Herculian disease” (because it afflicted the demigod Hercules), and “demonism.”

Epilepsy was considered to be a miasma—a pollution or noxious form of “bad air”—cast upon the human soul. Thus epilepsy was regarded as divine punishment for sinners and was connected with Selene, the goddess of the Moon, since it was believed that those who offended her were afflicted with the disease.

The Ancient Greeks attributed the disease to different deities depending on the symptoms that occurred during an epileptic fit. If the fit included teeth gnashing, epilepsy was ascribed to Cybele, the goddess of nature. If the victim screamed like a horse, the disease was ascribed to Poseidon, god of the sea, earthquakes, and horses. The cure for epilepsy included a process of ritual purification as well as the recital of healing chants.

8Leprosy Caused By Divine Retribution

03

In the Middle Ages, leprosy was thought to be caused by divine retribution. Victims were believed to be suffering as a result of their wickedness and personal sin. This explanation was popularized especially by several biblical accounts in which leprosy is sent to sinners as a divine punishment. Leprosy was seen as a disease of both the body and the soul. Thus, lepers were considered a threat to society not only because of their physical condition but also because of their moral decay, which the morally upright were terrified of catching.

As a result, lepers were treated horribly during the Middle Ages—they were shunned by society, were often forced to wear bells to warn people of their approach, and sometimes had to attend their own funeral mass during which they were declared officially dead to the community.

7Colds Caused By Waste Matter

01

The ancient Greek doctor Hippocrates is often considered the father of medicine. He was the first to dispel the myth that diseases were caused by angry gods, insisting instead that illnesses were caused by nothing more than earthly factors. His teachings were so influential that physicians long took the Hippocratic Oath, swearing to uphold specific ethical standards.

However, in a time when absurd explanations for diseases abounded, Hippocrates was no exception and contributed some strange theories of his own, such as his belief that colds were caused by a buildup of waste matter on the brain. According to Hippocrates, when this waste matter overflowed, it resulted in a runny nose. This is where the Greek word for the common cold, catarrh, originated. In Greek, catarrh means “flow,” and the word is in fact still used in English today.

6Mental Illness Caused By Witchcraft

04

In the Middle Ages, people who suffered from mental disorders were thought to be either under the curse of witches or wizards or possessed by the devil. The most common medieval treatment of mental illness was exorcism. During the Renaissance, burning the body and saving the captive soul was the preferred method of “treating” the mentally ill.

During the Middle Ages and the Renaissance, all the tragedies of humanity were blamed on witches and diabolical possession. Women were condemned as witches far more frequently than men because it was widely believed that women were more susceptible to demonic possession due to their weaker and more imperfect nature. A woman’s reproductive system was held up as proof of this, with the uterus seen as a source of evil. Supposedly, during menstruation, women were full of venom that contaminated them and gave them the power to contaminate others.

It was also believed that the imagination could produce physical changes in the body, and thus imagination was seen as another form of witchcraft. The uterus was thought to receive pathological images that could not be subdued, while the principal process of imagination originated in the spleen. Because women thus had two organs—the uterus and the spleen—capable of producing pathological images, they had two sources of evil and were considered more powerful than men, who could practice evil only through the spleen.

5Hysteria Caused By A Wandering Womb

06

In Ancient Greece, women who suffered from any type of mental illness were considered to be victims of hysteria. And hysteria, according to the ancient Greek doctor Hippocrates, was caused by a wandering womb. According to the Ancient Greek physician Aretaeus, the womb could move upward and downward as well as left and right. So for example, if the womb moved up, it caused sluggishness, lack of strength, and vertigo. If the womb moved down, it caused a sense of choking as well as a loss of speech and sensibility. The womb moving downward could also cause a sudden, incredible death.

To cure a wandering womb, physicians applied pleasant scents, such as honey, to the vagina, because the womb supposedly advanced toward them. Alternatively, the womb could be driven away from the upper body, back to where it belonged, through the application of foul scents. Other prescriptions for a wandering womb included constantly chewing on cloves of garlic, hot and cold baths, regular sex, and frequent pregnancy to keep the bored womb occupied and less likely to migrate around the female body.

4Porphyria Explained As Vampirism

07

Many myths surrounding vampirism emerged during the Middle Ages. However, it is now believed that a rare genetic disease called porphyria—and not just the easily excitable minds of medieval peasants—may have started the bizarre tales concerning “creatures of the night.”

Scientific and medical knowledge was highly limited during the Middle Ages, so the effects of porphyria could easily have been misconstrued as something supernatural. Patients with porphyria are extremely sensitive to sunlight and thus may rarely go outside. If they do venture out, the Sun may cause terrible disfigurement to their hands, feet, or face. In worst-case scenarios, the face may appear mutilated or distorted: the nose, ears, or lips could recede or fall off, and excessive hair growth may occur, making the sufferer seem like a wolf or some other animal (hence the werewolf myth, another popular tale of the Middle Ages).

Porphyria can also cause erythrodontia (the red discoloration of teeth) as well as receding gums that could have created the illusion of fangs. As for garlic (we all know those blood-suckers hate it), its consumption results in the worsening of porphyria symptoms and might actually inflict pain and cause the patient to become sick.

Today, porphyria is sometimes treated with the injection of a blood product called “heme.” Of course, treatment like that did not exist in the Middle Ages so if we get a little creative with our imaginations, victims might have been instinctively seeking heme by biting human victims and drinking their blood. Brothers and sisters could have unknowingly shared the defective gene that caused porphyria, so a victim of the disease biting their sibling for blood might have triggered an attack of the disease in the bitten sibling, creating a new “vampire” (hence the myth that a vampire’s bite resulted in the victim becoming a vampire as well).

3Ulcers Caused By Stress

08

William Brinton was one of the first doctors to describe a stomach ulcer, in 1857, but the lack of diagnostic tools made ulcer detection incredibly difficult. Moreover, no causative agent could be found, and no single associated germ was known. Thus, doctors worldwide turned to psychic and environmental factors to explain the appearance of ulcers. Eventually, it was agreed that poor diet, smoking, and stress caused high acid levels and so were the cause of ulcers. Doctors Arvey Rogers and Donna Hoel even wrote that “a peptic ulcer used to be a badge of success. Up-and-coming professionals were expected to earn one, and if they didn’t maybe they weren’t working and worrying hard enough.” The medical advice dispensed worldwide was to take antacids and modify your lifestyle.

However, some patients with serious ulcers fell so ill that they had to have their stomachs removed, and some bled until they died. Appalled by this suffering, a physician named Barry Marshall and a pathologist named Robin Warren began working together in 1981, determined to get to the bottom of what really caused ulcers. Two years earlier, Warren had discovered that the gut could be overrun by a bacterium called Helicobacter pylori. By biopsying ulcer patients and culturing the organisms in the lab, Marshall traced ulcers (and stomach cancer) to this gut infection. The cure was antibiotics.

The world stayed skeptical until Marshall—who was unable to conduct his study in mice and was not allowed to experiment on people—drank Helicobacter pylori himself. Within days, he developed gastritis, the precursor to an ulcer. He felt sick and exhausted and started to vomit. Back in the lab, he biopsied his own gut, cultured the Helicobacter pylori, and proved to the world that bacteria, not stress, caused ulcers.

2Autism Caused By The Lack Of Maternal Warmth

09

The syndrome of autism was first identified by the child psychiatrist Leo Kanner in a 1943 paper. However, he went further than simply describing the schizophrenia-like features of these children, focusing heavily on their parents and the role they supposedly played in causing the syndrome.

Kanner had observed a small sample of children from educated families and concluded that the parents of autistic children tended to be highly intelligent but coldhearted and formal. He claimed that autistic children were raised in isolation, with no warmth emanating from their mothers or fathers. In fact, he went as far as to describe such parents as “just happening to defrost enough to produce a child.” Kanner was not the only one to blame the parents; numerous other psychoanalysts and child development specialists, such as Bruno Bettelheim, stressed the role of parents in causing autism, which gave rise to the “refrigerator mother” theory. Throughout the 1950s and 1960s, “refrigerator mothers” (and fathers) not only had to deal with their autistic children but also had to bear the guilt of supposedly having made them autistic in the first place.

In the early 1960s, however, the refrigerator theory came under fire as parents of autistic children began to fight back. Kanner eventually abandoned his original position, although other specialists, such as Bruno Bettelheim, continued to defend it. The bizarre refrigerator theory was mostly abandoned in the 1970s, but small pockets of its supporters remain scattered across Europe and places such as South Korea to this day.

1Birth Defects Caused By Maternal Impressions

10

According to the theory of maternal impressions, any fears, desires, or strong emotions a woman experienced during pregnancy could have significant effects on her child’s physical appearance. This theory was extremely popular in the 18th century and was often used to explain birth defects. If a child was born deaf, for example, it was because the mother had been shocked by a loud sound during her pregnancy. Consequently, pregnant women were advised to expose themselves only to pleasant stimuli and to visit galleries and concerts to ensure that their children would be cultured and healthy.

However, the theory of maternal impressions was not confined to the 18th century and in fact goes back millennia. The Greek physician Galen believed that if a pregnant woman looked at an image of someone, her child could come to resemble that individual, so mothers were encouraged to look at statues they admired in order to produce attractive children.

It was also believed that a pregnant woman’s mental state not only caused vascular birthmarks but also influenced their shape and location. Thus, if a woman craved or ate a lot of strawberries during her pregnancy, she could have a child with a birthmark that resembled a strawberry.

The maternal impressions theory thrived through the Middle Ages, the Renaissance, and the 18th century. It was eventually challenged by the physician and anatomist William Hunter in the mid-18th century, but most people still believed that maternal impressions had an impact on infants, and this rather bizarre theory continued right into the 19th century. By the end of the 19th century, however, it was dismissed completely.

Laura is a student from Ireland in love with books, writing, coffee, and cats.

]]>
https://listorati.com/10-bizarre-ways-our-ancestors-explained-disease/feed/ 0 12316
10 Amazing Ways We Study The Diets Of Our Ancestors https://listorati.com/10-amazing-ways-we-study-the-diets-of-our-ancestors/ https://listorati.com/10-amazing-ways-we-study-the-diets-of-our-ancestors/#respond Thu, 25 Apr 2024 05:17:05 +0000 https://listorati.com/10-amazing-ways-we-study-the-diets-of-our-ancestors/

When an archaeological site is discovered, the question “what food did these people eat?” might sound unexciting, especially to those used to the idea that archaeology is about discovering lost temples filled with hidden doors and haunted treasures. But understanding what our ancestors ate means understanding their subsistence strategies and their relationship with the environment. The cumulative effect sheds light on some big questions of the human past, such as the transmission of technological innovations, cultural contact, and even the spread of agriculture.

10Teeth Marks


Teeth, made of resistant tissue, tend to survive remarkably well. The food we eat leaves microscopic marks on tooth enamel, and the length and orientation of these marks depend on the food we consume. Modern Inuit groups in Greenland, whose diet is largely based on meat, display mostly vertical marks on the lateral surfaces, while groups living on an almost exclusive vegetarian diet display shorter marks, both vertical and horizontal.

Scientists can not only assess whether a particular group had a diet based on meat, vegetables, or a mix of both but can also arrive at more general conclusions. Fossil teeth from the early Stone Age onward (2.7 million–200,000 years ago) show interesting results: Newer fossils display a decrease in vertical marks (and their average lengths) and an increase in horizontal marks. As time went by, the diet of our ancestors became more varied and less reliant on meat.

9Remains Of Individual Meals


Under some rare and incredibly fortunate circumstances, archaeologists can find meals almost intact. The ancient city of Pompeii is arguably the most famous example of this kind: Many meals were found virtually untouched, still served at the table and preserved for centuries under thick layers of volcanic ash. We have also identified entire food shops with their products still well enough preserved to be documented.

Meals were also an essential part of funerary offerings in several cultures. In Egypt, we have tombs containing not only basic food ingredients such as fish and fruit but also intricate meals like cakes, cheese, and wine. In China, tombs belonging to the Han dynasty (206 BC–AD 220) are filled with meals and even labels attached to the dishes stating their composition.

8Animal Remains

Animal traces such as bones, antlers, and shells provide useful information on past diets. Based on animal bone samples, specialists perform statistical analyses of the age, sex, and season of death of the animals.

It may also be possible to assess whether these animals were wild or domesticated. Some aspects of domestication leave traces on the bones, particularly when the animal was used for traction (e.g. camels, cattle, horses), which tends to manifest as osteoarthritis and deformities in the lower limbs. Some domesticated animals, like alpacas and llamas, present a higher mortality rate among younger animals than their wild cousins.

7Digestive Tract Contents


The stomach is made of soft tissue and survives only under rare circumstances, such as extremely dry climate or cold temperatures. Sometimes, it is possible to retrieve samples of food traces from the stomach and other areas of the digestive tract, including the colon and even the intestines.

The famous prehistoric Danish Tollund Man had eaten only plants during the days prior to his death. Forensic examinations of the mummy of Lady Dai (the wife of the Marquis of Dai, second-century-BC China) retrieved 138 sweet melon seeds, suggesting not only that she had a phenomenal appetite but also that she died during the summer, when the fruit is in season.

6Tooth Decay


Changes in our diet have affected the rate of tooth decay significantly. Refined sugars and starchy foods have elevated the overall incidence of tooth decay and encouraged a specific pattern of tooth disease and tooth loss. Starch makes our teeth more vulnerable to dental disease, so societies that rely on cereal consumption present a higher rate of tooth decay.

The difference in tooth decay rates between ancient hunter-gatherers and farmers is so significant that some archaeologists and anthropologists use statistical analysis of tooth decay and tooth loss patterns to distinguish between prehistoric hunters and farmers. Human adults who died around 30,000 BC had an average of 2.2 teeth missing at the time of death; in 6500 BC, 3.5; in the Roman world, 6.6. The lesson is clear: the fierce Stone Age hunters had more beautiful smiles than the civilized Romans.

5Fecal Material


Some organic traces survive after they have passed through our alimentary tract. The brave specialist who digs into the secrets held by ancient poo is sometimes referred to as an analyst of desiccated paleofecal matter.

Human feces contains specific chemical markers that distinguish it from animal feces, and it can preserve a wide variety of food remains: pollen particles, plant fibers, seeds, bone fragments, egg traces, nuts, mollusk particles, and in some cases even insects. Animal hairs buried in the feces can be analyzed to determine the species. Traces of fecal material can also be retrieved by adventurous researchers who dare to access cesspits, latrines, and sewers.

4Food Processing Tools And Equipment


Food procurement and processing are normally assisted by tools. Evidence of fishing, for example, comes in the form of hooks, fish spears, fish traps, and nets. Stone tools used for butchering animals display a specific microscopic wear pattern, and this evidence can be cross-referenced with animal bone assemblages found at the site.

Evidence of hunting can be found not only by studying specific artifacts like bows and arrows but also by identifying arrowheads and other traces of hunting equipment embedded in animal bones. Evidence of cereal farming can be found in the presence of stone grinders, sickles, and pottery, which in some cases carry microscopic traces of foodstuffs.

3Isotopic Methods


We are what we eat. At least this holds true for isotopic methods, which rely on chemical traces that the food we eat leaves in our bodies. One of these techniques reads the ratio of nitrogen isotopes found in bone collagen.

Nitrogen-15 increases as it travels up the food chain, while nitrogen-14 decreases. Individuals who depend largely on cereals and vegetables display a low nitrogen-15 to nitrogen-14 ratio compared to individuals who ate animal meat, blood, and milk. Individuals who rely heavily on a marine diet display even higher ratios of nitrogen-15 to nitrogen-14, since the marine food chain has a higher number of levels compared to terrestrial environments.

Interestingly enough, nursing babies tend to display the highest nitrogen-15 to nitrogen-14 ratio, since they are technically preying on their mothers and are therefore on the very top of the food chain.
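The ratio arithmetic behind this method is simple to sketch. The snippet below (an illustrative sketch, not from the article: the standard value is the conventional atmospheric nitrogen ratio, but the diet thresholds and the sample measurement are hypothetical) converts a measured nitrogen-15/nitrogen-14 ratio into the standard delta-15N notation and buckets it into rough diet classes:

```python
# Illustrative sketch: expressing a nitrogen isotope ratio as
# delta-15N (per mil) and bucketing it with hypothetical thresholds.

# Atmospheric N2 is the conventional reference standard.
AIR_15N_14N = 0.003676

def delta_15n(sample_ratio):
    """Return delta-15N in per mil relative to atmospheric N2."""
    return (sample_ratio / AIR_15N_14N - 1) * 1000

def rough_diet_class(d15n):
    # Thresholds are made up for illustration; real studies calibrate
    # them against local plants and animals.
    if d15n < 6:
        return "mostly plant-based"
    elif d15n < 12:
        return "mixed/terrestrial animal protein"
    else:
        return "heavily marine (or nursing infant)"

ratio = 0.003700  # hypothetical bone-collagen measurement
d = delta_15n(ratio)
print(round(d, 1), rough_diet_class(d))
```

The marine and nursing-infant cases land in the same top bucket for exactly the reason the text gives: both sit an extra step up the food chain, so their nitrogen-15 enrichment is highest.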

2Botanical Remains


Phytoliths are minute botanical particles specific to certain plant species and may come from the flower, stem, or root. These particles can be recovered from sediments, pottery fragments, the surface of human teeth, or attached to stone tool edges. Phytoliths can identify specific botanical species linked to food consumption and differentiate between wild and domestic plant species.

Pollen grain particles are highly resistant and can be recovered from sediments, human feces, and teeth. If an archaeological site displays evidence of different levels of occupation, pollen particles from different phases can reveal changes in plant exploitation strategies over a given period of time.

1Tartar


Tartar can be a truly informative source on past diets. As it builds up on our teeth, traces of food get trapped in it. Having a dentist scrape away tartar is a recent privilege, so our ancestors carried a lot of it on their teeth. Tartar is especially informative on the plant species consumed, since pollen grains survive a long time. Microscopic fragments of fossilized plant species and bone fragment particles can also be found buried in the tartar.

One of the unique features of tartar is that it can act as a food biography: Traces from the early stages of life will be found in bottom and inner parts, while the upper and outer tartar layers hold evidence of food consumed during late stages. Although scientists are still perfecting this technique, we have examples of remarkably well-preserved food evidence contained in tartar coming from two-million-year-old hominids.

+Further Reading

Now that we have ancient diets covered, how about some more quirky lists from modern times?

10 Bizarre Modern Diets You Won’t Believe Exist
Top 10 Craziest Diets Ever
10 Eccentric Eating Habits Of Influential Figures
Top 10 Food Facts and Fallacies

10 Ordinary Domestic Things Our Ancestors Did That Killed Them https://listorati.com/10-ordinary-domestic-things-our-ancestors-did-that-killed-them/ https://listorati.com/10-ordinary-domestic-things-our-ancestors-did-that-killed-them/#respond Thu, 02 Nov 2023 15:17:20 +0000 https://listorati.com/10-ordinary-domestic-things-our-ancestors-did-that-killed-them/

People from the 19th and early 20th centuries were fascinating, to say the least. Inspired by the Industrial Revolution and groundbreaking scientific discoveries, the average householder developed an interest in inventions, “improving” their lives by using cutting-edge (untested) science and technology in every aspect of their social and domestic lives.

Unfortunately, their enthusiasm often ran away with them, and their disregard for health and safety often led to disaster. Here, we look at some of the ways the average person from the Victorian era through the early 20th century may have killed himself while enjoying the benefits of scientific and technological progress.

10 Going To The Bathroom


Visiting the bathroom shouldn’t be a dangerous undertaking. However, the Victorians had a number of perils to contend with. First was the water heater, which was gas-powered and often exploded, possibly because of the candles and oil lamps which were often used by residents who were caught short in the night.[1]

And then there were the toilets themselves. Prior to the Great Stink of 1858, when London was practically uninhabitable due to the hot weather and sewage smells, toilets with the s-bend design that we know today were rare. Toilets dropped their contents straight into the sewers below, and the smells from the sewer rose through the unimpeded pipes and, shall we say, lingered.

And the sewers didn’t only contain eliminations but other sorts of human waste, too. The cemeteries of the period were not well-regulated, and human remains frequently contaminated drinking water or flowed directly into the sewers. And decomposing matter produces methane and carbon dioxide. Methane, particularly when combined with a flame from a candle or water heater, causes explosions.

Straight up through the toilet.

In order to control the methane problem, a number of sewer gas lamps were installed. In a surprisingly green fashion, engineers attempted to power the city’s streetlights using methane gas in order to reduce the dangerous buildups. The lamps were only partially successful, but the widespread introduction of s-bend toilets after the Great Stink made using the facilities a little safer.

9 Eating A Sandwich

A lot of foodstuffs in Victorian England were made with contaminated ingredients. One report in 1877 showed that ten percent of butter, eight percent of bread, and 50 percent of gin had copper added to it, while red lead was added to cheese to give it a “healthy” color. Other adulterants included strychnine in beer, copper in pickles and jams, lead in mustard, iron in tea, and mercury in chocolate.[2]

Bread, however, was a particular problem. Very few poor people at that time had the facilities to bake their own bread and therefore bought their daily loaves from street vendors. Bread was cheap, so it was a staple food for many, and almost the only food for some. However, the majority of this bread was adulterated with alum. Though it was not poisonous in itself, alum acted to prevent the absorption of nutrients in food.

The alum bulked out the bread, making loaves appear larger for their weight and thus more attractive to poor families with many mouths to feed. Those who survived the sandwich fillings would have developed rickets or other diseases because of their inability to absorb nutrients efficiently.

8 Walking Down The Stairs


Though anyone can fall downstairs, the Victorians were more vulnerable than most to severe injury and even death.

There were very few building regulations at that time and none at all when it came to the construction of modest homes. Stairs were very narrow, often with several steep turns, which made navigation tricky. Also, the builders did not have a standard measurement when constructing their staircases, so steps within a single staircase were often of different heights and widths.

Not only that, but no one thought it necessary to install a handrail. Some staircases were nothing more than glorified ladders, up which women were expected to climb while wearing long dresses, often while toting a child or two on their hip.

Unsurprisingly, deaths from falling down stairs were common.[3]

7 Playing Billiards


Snooker and billiards were once considered games for gentlemen only. The balls were made from ivory and were therefore very expensive. However, when celluloid was developed as an ivory replacement, the possibility of billiards for the masses seemed a very real one.

There was a big disadvantage of using celluloid over ivory, though: It was volatile and flammable—very flammable, in fact. That was unfortunate, because one billiard ball striking against another was sometimes enough to cause an explosion. Players complained that the noise sounded like a gun going off.[4]

Which is enough to put you off your shot.

6 Wearing Makeup


Usually, when you tell a woman that they have a certain glow about them, it is a compliment. For the Radium Girls, however, it was more a sign of impending death.

During the early 20th century, radium was considered to be something of a miracle element. Cosmetics manufacturers claimed (without any evidence) that small amounts of radium were beneficial to health. Customers were sold face creams and soaps laced with radium that were guaranteed to make their skin glow. Other manufacturers added radium to energy tablets, butter, and even chocolate.

Radium was also added to paint, which was used to decorate clock faces with luminous dials. And during the 1910s and 1920s, women who painted them were told to lick their brushes after dipping them in the radium-laced paint in order to point the end of the brush.[5]

The radium was extremely dangerous, and those who were in regular contact with it often died painful deaths. The clock painters, known as the Radium Girls, suffered terribly. When the body of one was exhumed five years after her death, it was still said to be “glowing.”

5 Cleaning Out The Gutters


The Victorians loved their scientific discoveries and inventions, but they weren’t always careful about testing them before they went into full-scale production.

So when they discovered asbestos, a cheap, nonflammable material, they used it for everything. Its use in guttering was common, but it was also found all over the Victorian and Edwardian home in insulation, floor tiles, and heaters. It was also used in some more unlikely and disturbing products, such as children’s toys. The attractiveness of a nonflammable material in such products is obvious.

Unfortunately, though asbestos is wonderfully flame-retardant, it causes severe respiratory diseases and cancer.[6]

4 Waking Up To A Nice Cup Of Tea


Ever inventive, the Victorians and Edwardians were always looking for ways to save labor for even the simplest things. Some of their inventions were brilliant, but others fell into the wacky and useless category. And some of them were just plain dangerous.

They tried to develop bottles that babies could feed themselves with, to save parents the trouble of having to pick them up, and made a pump-action vacuum cleaner that needed such vigorous bellow-pumping that it would have given Charles Atlas a tough workout. But right at the top of the list of the inventive, ridiculous, and dangerous was Albert E. Richardson’s patented Automatic Tea Making Machine, which combined an alarm clock with a kettle set over a spirit burner.

The burner used methylated spirits, which were lit by the automatic striking of a match when the alarm went off. Another alarm rang when the kettle had boiled, and a spring mechanism tipped the water into the waiting cup. However, if the match failed to ignite, or if it ignited at the wrong time, the teasmade was potentially lethal.[7]

3 Setting The Table

The ingenuity, or stupidity, of Mr. Henry Cooper knew no bounds when, in 1902, he invented the self-illuminating table cloth. Why go to the trouble of putting a cloth over a table and then placing a lamp on top of it, he reasoned, when you can accomplish both at once with his patented electric tablecloth?

The cloth consisted of two layers of felt with an electrical circuit sandwiched in between them and six electric light bulb sockets poking out through the cloth. When plugged in, the cloth would give a lovely, intimate feel to his dinner party, without all the extra (two seconds) effort of using separate lamps.

Lovely. Unless, of course, a guest spilled their wine, in which case the whole thing would have gone up like a box of firecrackers. Back to the drawing board, I think, Mr Cooper.[8]

2 Stocking The Fridge


Keeping food fresh has always been a big domestic problem. Various nonmechanical methods had been developed, such as meat safes, but for the inventive Victorians, that wasn’t good enough. They wanted to produce a mechanical or electrical refrigerator that would keep food cool.

In 1834, American inventor Jacob Perkins unveiled the first-ever refrigeration unit. The fridge was billed as a vapor compression refrigeration unit and as an “apparatus and means for producing ice, and in cooling fluids.”[9] However, the fridge was not particularly reliable and very expensive and never caught on.

By the 1890s, however, the cooling process had been “improved” by the addition of methyl chloride gas. This cooled the fridge but was, unfortunately, extremely toxic. Manufacturing ceased when a fridge leaked while still in the factory, causing several deaths.

Though the Victorians were innovative and far-seeing in developing fridge technology, less than two percent of the population of Britain owned a fridge before the outbreak of World War II. Later, safer innovations, of course, demonstrated just how right the Victorians were about the usefulness of a fridge.

1 Doing A Bit Of Light Ironing

Being a laundry maid in the Victorian era was a tough job. Irons were made of (surprise, surprise) iron, which was heavy, and a set of irons were needed in different shapes and sizes to tackle different jobs. They were placed in a fire to heat and then cooled to the correct temperature. Steam was created by covering the garment with a damp cloth before ironing. It was hot, sweaty work.

So it was only to be expected that someone would try to make an electric iron to make the job easier. In 1882, Henry W. Seely of New York was the first person to patent a workable electric iron. The iron was wired, permanently, into a circuit.[10] However, it was not possible to regulate the temperature of the iron, which made it difficult to iron clothes without burning them. And it was a fire risk, which rather defeats the point.

Nevertheless, like many of these inventions, these early electric irons were the forerunners of something really useful and exciting (electric tablecloths excepted), which shows that perseverance can be the mother of success. Or dangerous table dressings.

Ward Hazell is a writer who travels, and an occasional travel writer.

10 Things Your Ancestors Did Better Than You https://listorati.com/10-things-your-ancestors-did-better-than-you/ https://listorati.com/10-things-your-ancestors-did-better-than-you/#respond Mon, 21 Aug 2023 01:44:50 +0000 https://listorati.com/10-things-your-ancestors-did-better-than-you/

We have a rather arrogant quality about ourselves in these modern times. We seem to truly think we have reached the acme of human society. Well, I hate to be the one to disabuse you of that notion, but I’m going to . . . Almost everything we do has been done better before us. Our ancestors hold the key to solving many problems of modern society and this list looks at those solutions.

See Also: Top 10 Reasons We Should Revive the Dark Ages

For clarity, when I say “ancestors” I mean the generations up to (roughly) the time of the end of World War II; for most us that means our great (or great-great) grandparents: modern people—just not as modern as us.

10 Save The Planet


While the most common plastic today (polyethylene) was actually invented in 1898, it didn’t go into general production until the 1930s. It was then that ICI opened the world’s first manufacturing plant—on the same day Adolf Hitler invaded Poland: September 1st, 1939.[1]

Nevertheless, it wasn’t until the 1960s that the production of plastic on a mammoth scale became truly viable. Since then . . . well, you know what happens next because you’re living in it. Ubiquitous products like straws, bags, mass-produced trinkets, and the many objects used in our daily lives, have led to a pollution nightmare. And the stuff we don’t just toss in the trash ends up in the dump anyway because of the high cost of recycling.

So what did our ancestors do? They stored their goods in glass containers, which were then reused. Most food was produced at home, so waste was almost nonexistent. We seem to be perpetually trying to find solutions to our modern problems with two caveats: any solution must make someone a lot of money, and it must not imply that things in days gone by were good. The way we once lived offers many solutions to our modern problems, but we seem dead-set against them.

This might also be a good time to correct an error from a previous list written in 2010: Top 10 Places You Don’t Want To Visit. On that list I discuss the great pacific garbage patch. Since writing that list, much information has come out about the patch. The reality is: it is basically a myth. Yes there is a concentration of micro-plastic under the surface, but it turns out the definition used to describe the “extremely high” concentration of plastic was somewhat open to interpretation and in our eagerness to point out the evil we had done to the ocean, no one seemed to notice that the “highest concentration of microplastic is around three pieces of plastic the size of a pencil eraser in a cubic meter.”[2] Oops. The photo of a giant patch of trash with men in boats is actually taken off the coast of the Philippines.

9 Live Within Their Means


In 1944 the Bretton Woods system was created, in which, instead of redeeming their money for gold, countries would redeem it for US dollars. This made the US dollar the new gold, in a sense. The dollar remained pegged to the value of gold, which was meant to keep the system stable while allowing the newly invented International Monetary Fund and World Bank a little more flexibility in the creation of money. Then, in 1971, it all fell apart. President Nixon ended the connection between gold and the US dollar, and the printing presses began to roll full steam ahead.[3] From that moment on, we began living off the earnings of the future. The reason our lives today don’t look like pages out of a book on the Great Depression is that we have all been made artificially rich. The terrifying thing is that it is all going to come crashing down—we just don’t know when.

So in the times before these systems were put in place, people earned money from their labour, and they saved that money. When they needed to buy something, they used the money they had saved. They did not use a Visa to pay for it with income from the future. They didn’t use installment plans, and they didn’t use equity in their homes to buy new drapes! At the very most, they may have used layaway, which lets you buy something on time payments, but unlike an installment plan, you don’t get the goods until you have paid in full. Delayed gratification is a far better thing than the instant type we are all used to now. Funnily enough, even layaway was considered bad until the newly created Federal Reserve bank (a private company in the US)[4] caused the Great Depression; it was then that layaway emerged.

The long and short of it is this: if you revert to the old ways of spending, you will be far richer than your neighbors. Use a system like the snowball method of debt reduction to eradicate your credit cards and installment plans and start living in the now. You can learn how to do that on Top 10 Tips For Achieving Financial Freedom which I wrote in 2007.
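The snowball method mentioned above is easy to sketch in code. The toy simulation below (illustrative only: the debt names and numbers are made up, and interest and minimum payments are ignored to keep it short) attacks the smallest balance first each month, so every cleared debt frees the whole budget for the next one:

```python
# Toy sketch of the debt-snowball method (hypothetical numbers;
# interest and per-debt minimum payments are ignored for brevity).
# Each month, the whole budget goes to the smallest remaining balance.

def snowball_months(debts, budget):
    """debts: dict of name -> balance; budget: total paid per month.
    Returns (payoff order as (name, month) pairs, total months)."""
    debts = dict(debts)  # don't mutate the caller's dict
    order, month = [], 0
    while debts:
        month += 1
        pay = budget
        while pay > 0 and debts:
            name = min(debts, key=debts.get)  # smallest balance first
            paid = min(pay, debts[name])
            debts[name] -= paid
            pay -= paid
            if debts[name] == 0:
                del debts[name]
                order.append((name, month))  # cleared this month
    return order, month

debts = {"store card": 400, "visa": 1500, "car loan": 6000}
print(snowball_months(debts, 300))
# → ([('store card', 2), ('visa', 7), ('car loan', 27)], 27)
```

The quick early wins (the store card is gone in two months) are the psychological point of the snowball: each payoff rolls its payment into the next debt, which is why the method recommends ordering by balance rather than by interest rate.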

8 Improve Their Mind


Books. Simple as that. We all have access to them—but instead of reading we stare at glowing pieces of glass all day. It is all well and good to fill your phone with apps like Google Books and Kindle, but it is hard to resist the pull of the thumbs-up on a photo of your latest cafe lunch or clothing purchase at The Gap. Why bother reading about “shirts of sheer linen and thick silk and fine flannel, which lost their folds as they fell and covered the table in many-colored disarray”[5] when you might get 15 likes on your photo of that funky political tee-shirt you wore today? “O quantum est in rebus inane!”[6]

There are many ways we can mimic our ancestors in terms of improving our minds; the first is to abolish the use of our electronic devices for at least a small part of the day. Set aside an hour to read a book—any book will do, though obviously I recommend The Ultimate Book Of Top 10 Lists by us! When you become engrossed in a good book, your mind opens up in a myriad of ways. Your rational and emotional minds need to be tended but more importantly perhaps is your imagination. It is from the imagination that so many great creations of man have come,[7] and movies and social media posts prevent the use of the imagination, replacing it instead with the ideas from another’s mind. You might be in possession of the brain that will invent the next great concept in human development, but you’re too busy scrolling through pics of incredibly hilarious sports photos to notice.

7 Eat


I might be sounding like a broken record at this stage, but in brief: the US government made everyone fat by advocating a ridiculous pro-vegetarian anti-fat diet in the 1960s and 1970s. Now that that is out of the way, the history of eating: breakfast didn’t exist as anything more than a drink or snack until the 19th century[8] and luncheon was the main meal of the day for most (this is all in reference to European culture of course). Then the Seventh Day Adventists got us eating cereal every day, the government got us eating chemical margarines instead of butter and fat, and the media (fake news long before the president used the term) sold us on coke and other delights. This is not to say there weren’t fat people before these things occurred, but, frankly, these days obesity seems to be less the thing to avoid and more the goal.

How do we fix this? When virtually everything the media promotes or says is to enrich someone . . . we should certainly not advocate for anything recommended or endorsed by them. That means no diet fads. There is one place the media hates: the past. The answer to eating right is to mimic the behavior of our ancestors (and I am not meaning restrictive Paleolithic diets). Edwardians were similar to us. They had grocery stores,[9] cars, and other fairly modern trappings of life. Perhaps the answer is to simply reject any dietary advice from beyond 1940 until someone without an ulterior motive funds some genuine studies into the ideal human diet. Just say no to diet myths.

6 Exercise

So you need to lose a few pounds or improve your health. What do you do? Sign up for a decade-long membership (“it was the best price!”) at a fancy gym and then attend one session. Yup: just the one. We have somehow been programmed with this awful notion that exercise is best done indoors and intensely for a short period of time. But our thinner (and probably longer-lived—if we exclude child mortality from age-of-death statistics) ancestors knew better.

While the photo above is incontrovertible proof that some evil madman in the Victorian times took inspiration from the tools of the Spanish Inquisition to devise the most horrific of all torture devices man would ever know: gym equipment, don’t be fooled! Our predecessors didn’t, as a general rule, take their exercise in this rigid manner. They simply walked everywhere they went and did the hard jobs themselves. Doing the laundry took all day[10] and the average housewife (unless she could afford a maid) would call upon her friends to assist. My own father as a child was required once a week to jump up and down on the sheets in a copper kettle full of boiled water and soap (I’m not a boomer: my parents were old when they had me).

How can we learn from our ancestors and ditch the gym membership? Add an hour a day (in addition to your daily reading hour as mentioned above) to walk around your neighborhood. Smell the flowers. Say hi to the guy with the Trump 2020[11] sign in his yard. Help an old lady across the street. You will find that not only does your health improve, but so does your relationship with your neighbors and you may even find it makes you a nicer person online! If you are particularly courageous, you might want to consider doing some shopping at the same time. It is certainly better to carry a can of beans around for an hour than spend 10 minutes sweating indoors with a dumbbell.

5 Have Manners


Long gone are the days of writing thank-you notes[12] for gifts. In some cases, people don't even bother with thank-you notes for wedding gifts, which were, for most people, the last remnant of formal thank-yous in society. Once upon a time (not so long ago), a thank-you note was issued for a million different reasons. If someone gave you a gift, you wrote a note. If someone had you to dinner, you wrote a note. It was the height of bad manners not to thank a person for the effort they had gone to for your pleasure. It makes a lot of sense, really. But alas, out come those shiny bits of glass again, and the most you can expect these days is a "thanks!" on Instagram or Snapchat (or Facebook if you are elderly).

In other areas of society, manners were also far better than now. If you were invited to a meal at someone's house, you always reciprocated at a later date by inviting them to your own home or a restaurant. And if you did invite them to a restaurant, the person who invited the guests paid for everyone's meal. That's right! You didn't split the bill when you were the host. Just as you don't charge your friends for the cost of the food when you cook at home, you shouldn't in a restaurant. Admittedly, things were a little different then: if you hosted, you might also have ordered a particular menu for everyone at the table so you could plan for the cost, but there is no reason you couldn't follow this nice old rule of good manners[13] these days with real friends who you can presume won't take advantage of you.

4 Build


New construction regulations, coupled with new building materials (such as styrofoam, plastics, and various metals), mean that we can build houses much faster and more unconventionally than in the past. But the end result is sometimes buildings that simply cannot stand up to age, the weather, or natural disasters the way old stone-and-timber construction can.

One good example of this was the Christchurch earthquake in 2011.[14] In that old city, 60% of the deaths were the result of the collapse of the modern CTV building, and while many of the old buildings were later condemned, they stood up to the quake better than some structures built under strict modern earthquake codes. Next time there's a big one, I sure as heck hope I'm in an old Victorian building.

And don't even get me started on how hideous a lot of modern buildings look in contrast to the beautiful neoclassical facades[15] that came to dominate London and Paris and inspired much of the beauty of American Colonial construction. Granted, there are occasional modern buildings that are quite beautiful in their own unique way, but by and large, giant concrete towers filled with glass can't really compete with the likes of Buckingham Palace (pictured).

3 Teach and Learn


The moment colleges began to teach that there is no objective truth, they revoked their own remit to educate. If there is no objective truth, there is no reason to teach.[16] You can now watch video after video on YouTube of young university students who don't know who Adolf Hitler was (but will call you a Nazi if they don't like your opinion). Feelings surpass facts,[17] and the entire history of Western civilization is rendered down to, and described as, "white male patriarchy" in a tone that is palpably hateful.

I am not even going to begin to try to describe a solution to that mess. It simply needs to be abolished. But when that day comes, perhaps we can think about the classical education given to generations not so long before ours:[18] a modern education system grounded in the Trivium and Quadrivium of the ancients. Not only did the ancients come up with some of the greatest notions in man's history, but successive generations through the Middle Ages expounded upon them. And then, in the late 20th century, we burnt it all to the ground. Perhaps it is time to bring it back.

2 Create Art


There will be, in time, a list about modern art and its decline (some would obviously not agree that it is indeed a decline). But until then, we shall content ourselves with this brief declaration: modern people don't know how to "art". What passes for fine art these days is astonishingly bad. Bottles of human urine with sacred objects in them, excrement smeared on canvases, and even an unmade bed are lauded as fine things.[19] Of all the items on this list, art is the single most extreme example of the difference between our ancestors and us.

Pretty much all art up until the 1900s is beautiful (and some beyond into the early stages of the 20th century). But at some point we got sidetracked and art ceased to be about beauty and became an expression of novelty or shock. Aristotle’s concept of mimesis (artistic mimicry) states that “art is not only imitation but also the use of mathematical ideas and symmetry in the search for the perfect, the timeless[.]”[20] We now live in a postmimetic world in which the ruination of the perfect is sought. And judging by some of the dreck we see on social media and in the news, Oscar Wilde rather presciently said “[l]ife imitates Art far more than Art imitates Life”.[21]

1 Have Fun


If future generations looked at an Apple advertisement, they would believe that we lived in one of the most fun-filled times.[22] The ads are full of laughing groups of diverse friends exploring all manner of fun hobbies. But the whole thing is contrived; the reality of our time is that we dutifully go to work, earn the money needed to take the occasional holiday, and spend a lot of time online getting rather upset at other people's opinions and feelings. And then we lap up mainstream movie interpretations of the likes of Charles Dickens's books, filled with dour and ugly old women beating children, or crippled men with misery in every line drawn across their faces.[23] Perhaps this allows us to be thankful we aren't them. Perhaps it lulls us into a false sense of comfort against the miserable parts of our own lives.

But life for the Victorians (and the generations before them) was anything but bleak. They lived in a time of discovery and invention. Museums opened to the public for the first time, and the Great Exhibition was held in a glittering Crystal Palace.[24] The delightful fairy tales that Disney so loves to wreck were penned and adored by happy children. The ghoulish tales of goblins and ghosts thrilled even the strongest of men. Homes were filled with the smells of exotic fruits and spices from mysterious newly colonized lands, and optimism filled the air like the clouds of incense that filled their churches. Those were the days.

Jamie Frater

Jamie is the founder of . When he's not doing research for new lists or collecting historical oddities, he can be found in the comments or on Facebook, where he approves all friend requests!


Ten Bizarre Discoveries about Ancient Civilizations and Our Ancestors (April 16, 2023)

When someone does something outrageous, people often rush to use the phrase, “What is the world coming to?” But the fact of the matter is, human beings have always been weird. They were doing strange things long before you and I were born, and they will carry on being odd long after we are gone.

Archaeologists are constantly uncovering all kinds of bizarre artifacts and discoveries from the crazy civilizations of yore. Here are ten of the most ludicrous things they have found out about our absurd ancestors.

Related: 10 Intriguing Cases Involving Rare Ancient Art And Writing

10 The Cannabis Smokers of Ancient China

In the mountains of ancient China, more than 2,500 years ago, people smoked cannabis to get high. Archaeologists found evidence of these long-gone stoners during an excavation of Jirzankal Cemetery, a historic burial place in the Pamir Mountains of Central Asia—an area now in China. Scientists studied incense burners from the site and found marijuana residue believed to date back those two and a half millennia. The samples are rich in THC, the psychoactive substance found in weed.

Scientists found the residue on the pipes had a higher concentration of THC than the wild cannabis in the region. Researchers suspect that, in ancient times, locals would gather or even domesticate certain strains of cannabis for their mind-altering properties.

The excavation of Jirzankal threw up some interesting discoveries. Scientists analyzed the bones buried at the site and found that many of them were not native to the region. These immigrant remains support the theory that the Pamir Mountains used to be connected to an ancient Silk Road-type trade network. According to co-author Robert Spengler, this suggests that marijuana may have been exchanged along the pathway.[1]

9 Peruvian Paint Contains Human Blood

Before the Incas, many civilizations lived in the area of South America we now know as Peru. For around 500 years, the Sicán culture occupied the region. Unfortunately, much of their history has since been erased, which means historians know relatively little about the ancient peoples.

In the 1990s, archaeologists uncovered a thousand-year-old Sicán tomb secreted under the Huaca Loro temple. The excavation turned up various bizarre objects, including an upside-down skeleton painted red and surrounded by the bodies of two women and two children, with a gold mask placed on the man’s disembodied skull.

The mask, just like his remains, had been painted red using cinnabar. Historians believe that the Sicáns only used the mineral for respected people. This suggests the skeleton was once someone of high status.

However, there was one mystery that left the scientists scratching their heads. How had the cinnabar stuck to the gold for such a long time? It took until 2021 for researchers at Oxford University to solve that conundrum. Infrared analysis revealed that the paint was bound using human blood and egg whites. Scientists believe that this had some cultural significance to the Sicáns, possibly something to do with reincarnation.[2]

8 European Salt Miners Loved Beer and Blue Cheese

It doesn’t sound like the most pleasant job, analyzing the excrement of 2,700-year-old miners. But scientists have learned a lot about the diets of ancient workers in the salt mines of the Alps.

Human feces tend not to last very long before starting to decompose. The excreta found in the Alps were only preserved due to the cool, dry atmosphere and the high level of salt.

Microbiologist Frank Maixner was stunned to see that the miners had the knowledge and ability to ferment their food. His team found two fungi in the samples, Penicillium roqueforti and Saccharomyces cerevisiae, the same species used today to make blue cheese and beer. Maixner, who works at the Eurac Research Institute in Bolzano, Italy, described the practice as "very sophisticated." Throw in some Buffalo wings, and they'd be ready for this century![3]

7 The Mysterious Mummies of the Silk Road

In the Tarim Basin, an area of desert in northwest China, lie hundreds of human corpses. The dry climate of the desert has preserved these bodies for thousands of years. The oldest is thought to date back to 2,000 BC, while the youngest arrived in AD 200.

Although they are buried in China’s Xinjiang Uyghur Autonomous Region, only a short distance from the Silk Road, the Tarim Basin mummies look nothing like the locals. Instead, they have features that historians have described as “Western.” They were buried in boat-like wooden coffins, which were covered in cowhide. And evidence found at the site suggests they farmed sheep and goats, grew wheat and barley, and even made cheese.

For years the origins of these mummies have been something of a mystery. But in 2021, genetic analysis revealed that the oldest are directly descended from the Ancient North Eurasians, who lived in the vast plains of North Eurasia (Northern Steppe and Siberia) many thousands of years ago.[4]

6 Remains Found of Non-Binary Finnish Leader

In 1968, archaeologists in Finland unearthed a 900-year-old grave containing a person in women’s clothing with a sword. They were lying on a soft feather blanket alongside other grave goods and furs, indicating they were likely a well-respected individual in the community. But scientists struggled to agree on the remains found within. Some argued that the body was that of a female warrior. Others disagreed, asserting that the tomb contained a man and a woman.

In 2021, over half a century after the grave was excavated, researchers finally learned the identity of the mysterious remains. DNA analysis revealed that the person was born with an unusual set of chromosomes, meaning they may not have been viewed as strictly male or female in their community.

Chromosomes play a vital role in determining the sex of a child. Girls are usually born with two X chromosomes, and boys with an X and a Y. But scientists believe the deceased Finn had two X chromosomes and one Y—a condition known as Klinefelter syndrome. People with Klinefelter syndrome generally have male characteristics, but many contend with low testosterone levels, enlarged breasts, and infertility.

As lead author Ulla Moilanen explained, “If the characteristics of the Klinefelter syndrome [had] been evident on the person, they might not have been considered strictly a female or a male in the early Middle Ages community.” [5]

5 Neanderthals Caught Birds with Their Bare Hands

Neanderthals were fascinating people. Scientists believe that these prehistoric humans ate, among other things, raven-like birds known as choughs. But this led researchers to wonder how Neanderthals caught their dinner without the help of modern technology. How do you chow down on a chough if all you have to catch it with is your bare hands?

To investigate further, a team of researchers decided to test it for themselves. Evolutionary ecologists from Estacion Biologica de Donana in Spain ventured into dimly lit caves and, without any tools to help them, managed to catch over 5,500 choughs. On a good night, the group could capture 200 birds. At other times, they only managed to grab a few dozen. All they needed to trap their feathered targets was a small source of light.

This might sound like something only a mad scientist would attempt, but the team says they found the experiment illuminating. Literally. Scientists now believe that Neanderthals could generate fire to light up their surroundings. It also points to them having had much higher cognitive abilities than we once thought.[6]

4 Did Humans Hibernate During Winter?

Half a million years ago, our human-like ancestors would survive harsh winters by curling up in caves and hibernating. Or at least they might have, according to a 2020 research paper by two European paleoanthropologists. Fossil evidence suggests that our long-gone forebears used to lie dormant during the winter months, although scientists doubt they were any good at it.

Archaeologists have unearthed at least 1,600 human fossils from the caves of Atapuerca in Spain. By studying bone structure and growth, the study's authors determined how people lived at the time. The records indicate an annual drop in nutrition and in vitamin D from the sun. Scientists say this suggests our ancestors may have spent the winter in hibernation.[7]

3 The Ancient Act of Exorcism

Deep in the British Museum lies a 3,500-year-old tablet. Drawn on it is what researchers believe to be the oldest ever depiction of a ghost, along with instructions on how to conduct an exorcism.

Dr. Irving Finkel, an expert in ancient civilizations, only discovered the image in 2021 on a tablet in one of the museum’s vaults. It shows, Finkel explains, a young man accompanying the ghost of a middle-aged woman back to the underworld. On the back are instructions for helping dead spirits out of the realm of the living.

The scribe recommends making two figurines, one male, one female. According to the tablet, they must be dressed and given various useful items, including a comb and a bed. The ritual involves waiting until sunrise to prepare two beer vessels and reciting an incantation to Shamash, the Mesopotamian god of the sun. The instructions end with one final piece of advice: “Do not look behind you.” [8]

2 The Iron Age Skiers of Norway

In the mountains of Norway, as the ice sheets melt away, strange discoveries rise to the surface. It was there that glacier archaeologists uncovered a remarkable pair of 1,300-year-old skis. The first of the two skis was found in 2014 on Digervarden Mountain in the southern county of Innlandet. The second turned up seven years later, only 5 meters (16 feet) from the first.

It took a lot of effort to free the second artifact from the ice. The team decided to return to Digervarden after spotting satellite images showing the receding ice sheets. Their first attempt to retrieve the ski left them empty-handed, but a mix of mild weather, pickaxes, and boiling water helped pry it loose.

This discovery is particularly noteworthy because the skis are in such astonishing condition. Secrets of the Ice, the treasure-hunting researchers who uncovered both skis, reckon they could be the best-preserved prehistoric skis ever found.[9]

1 The Prehistoric Origins of Genital Herpes

In 2017, scientists discovered the ancient human ancestor responsible for genital herpes. The genital herpes virus, also known as HSV2, dates back millions of years. It is a close relative of HSV1—the cold sore virus. For much of that time, genital herpes was only an issue for chimps and similar primates. Our ancestors were lucky enough that it did not affect them. But at some point in history, around 3 to 1.4 million years ago, the viral blisters jumped the species barrier and began infecting early humans as well.

Researchers in Britain used data modeling to pinpoint the primate that caused HSV2 to leap across species. It turns out the guilty party is Paranthropus boisei, a stocky, human-like species with small brains and dish-like faces. Scientists suspect that these robust primates picked up the virus while scavenging chimpanzee meat. At some point, early humans likely ate infected P. boisei, at which point they would have started contracting genital herpes too.

“Herpes infect everything from humans to coral, with each species having its own specific set of viruses,” explained University of Cambridge researcher Dr. Charlotte Houldcroft. “For these viruses to jump species barriers, they need a lucky genetic mutation combined with a significant fluid exchange. In the case of early hominins, this means through consumption or intercourse—or possibly both.”[10]

10 Unexpectedly Weird Ancestors of Animals Living Today (April 10, 2023)

We all know about woolly mammoths and saber-toothed cats, but such is evolution that all species have some pretty weird ancestors. Often they look nothing alike. From the least unexpected to the most, here are 10 of the weirdest of all.

10. The short-necked giraffe

The giraffe's prehistoric forebear was roughly the size of a bull moose, complete with similarly large antlers. Sivatherium (along with Bramatherium and others) grazed on treetops in Eurasia millions of years ago with a long neck, though one only about half as long as the present-day giraffe's. Nevertheless, it's thought to have been the largest ruminant (hoofed grazing animal) that has ever existed.

Interestingly, although the fossil evidence dates it to millions of years ago, it may have survived to much later. Not only do cave paintings depict the animal but a copper rein ring found by archeologists excavating the ancient Mesopotamian city of Kish also appears to feature a detailed image of Sivatherium.

9. The vested ant

Ants may be the most successful animal on Earth, comprising up to a quarter of the biomass in tropical regions and a fifth of the biomass in general. The ant family Formicidae has proliferated into more than 9,500 species known to science and an estimated 3,000-9,000 species yet to be described. They’ve also existed for millions of years, and continue to live in harmonious symbiosis with their planet. 

However, little is known about how they originated. The earliest fossil evidence is from the mid-Cretaceous just 100 million years ago, when their planetary dominion was still in its fledgling stages. And there are few clues as to what came before. Instead, our best theories come from comparing ants to species living today. Their hive-like colonies, for example, bear similarities to those of wasps and bees — especially given that all generally center on a single mother, the queen. 

But there's one species of wasp to which researchers think the ant is most closely related: the mud dauber. Female mud daubers are known to house their eggs in carefully built mud cylinders. Then they find a victim, paralyze it, and seal it inside the nest with their eggs so that the larvae have something to feed on when they hatch. It's thought the original proto-ants started out the same way, "building simple nests and delivering food to their offspring." Then, when the offspring grew up, they may have helped the mother raise more.

8. The four-legged fish

It might not be such a stretch to imagine that frogs evolved from fish, but the intermediate creatures did look bizarre. Ichthyostega was one of the first, living as long as 364 million years ago. It was, in many ways, a fish. It had scales, vestigial gill bones, and a dorsal fin along the length of its tail. But Ichthyostega, which grew to three feet, also had four fleshy limbs, each with digits, as well as strong ribs for dwelling on land. Unlike fish, it also had lungs.

Obviously, these traits emerged slowly. Most of them developed while Ichthyostega's forebears were still living fully aquatic lives. The limbs, for example, gradually evolved from 'lobe-fins', which looked like and served as fleshy paddles. The lungs also probably evolved underwater.

7. “Adam and Eve” the worm

Despite our differences, what all animals (except sponges and jellyfish) have in common is a bilaterally symmetrical body (mirrored left and right), along with a front side with a mouth and a back side with an anus. We are the ‘bilaterians’. And scientists think the earliest ancestor of us all was “a sluggish blob about the size of a grain of rice” called Ikaria wariootia

Discovered in the Australian outback from fossilized burrowing traces, it's dated to the Ediacaran Period (560-551 million years ago). It differs from other possible candidates, such as Dickinsonia, by its possession of a mouth and gut.

This, then, is the ancestor of all other creatures on this list, as well as the creatures reading it.

6. The horned horse

The prehistoric Brontotheres had a special place in Sioux mythology. Known from their fossilized bones, they were called Thunder Horses and were said to come down in storms and trample the buffalo. True or not, the Brontotheres were indeed fierce beasts — the largest mammals in the whole of North America during the Eocene.

One species, for example, the 8-foot-tall, 15-foot-long Megacerops, had a pair of long horns, which it probably used for headbutting. Embolotherium, meanwhile, had just one horn — long like a battering ram — containing its nasal bones. It's thought it may have been used to make loud vocalizations across long distances.

All Brontotheres were extinct by the end of the Eocene, but their relatives today include rhinos, tapirs, and… horses! In fact, aside from the horns and their common depiction as rhino-like, they may have looked quite similar to horses — at least in the head, on account of their elongated skulls.

5. The meat-eating ground sloth

The so-called “great beast from America,” Megatherium americanum, looked similar to the sloths of today — except ten times the size. Weighing roughly the same as a bull elephant, it stood up to 12 feet tall on its hind legs. Needless to say, it lived on the ground and not in the trees. 

Unlike present-day sloths, ground sloths ate meat in addition to plants to support this great size. But they probably scavenged from kills made by big cats, wolves, and so on, rather than hunting for themselves. 

They were still roaming the pampas of Argentina and elsewhere in South America as late as the Holocene 8,000 years ago, living with early humans. In fact, humans are thought to have hunted ground sloths to extinction. Although some think they survive to this day.

4. The towering hornless rhino

You may have heard of the woolly rhinoceros, which went extinct around 12,000 years ago. They were a common subject of cave paintings. As the name suggests, they all had woolly coats. And, curiously, one species of woolly rhino had two horns instead of one.

But they were nothing compared to the mighty Paraceratherium. Over 26 feet long, the rhino’s ancestor from 35-20 million years ago was tall with a long, brontosaurus-like neck. It weighed as much as five adult elephants (15-30 tons). And, weirdest of all perhaps, for the rhino’s distant forebear, it had no horns at all.

It’s thought that elephants (not humans for a change) destroyed Paraceratherium’s habitat by stripping and felling trees, driving the giant to extinction. But there’s still much we don’t know about this dino-like mammal. For example, we still haven’t even pieced together a full Paraceratherium skeleton.

3. The giant beaver

Imagine a beaver taller than a human, weighing 200 pounds, with six-inch incisors, and you've imagined the genus Castoroides. This shaggy-haired giant beavered away in North American woodlands from 3 million to 10,000 years ago, when it's thought to have been hunted to extinction by humans. It's likely that both their meat and their fur were in demand.

Like the present-day beaver, Castoroides had large gnawing teeth and lived on plants. It was also partially aquatic, probably because it was an easy mark on land for predators like the saber-toothed tiger. 

As to whether it built giant dams, though, it’s not entirely clear. No evidence remains except, possibly, a four-foot high one in Ohio.

2. The ferocious pangolin

The dominant carnivorous mammals 55-35 million years ago were the Creodonts, relatives of the present-day pangolin. What makes this all the more interesting for such a timid-seeming creature is that Creodonta means “meat teeth,” and the pangolin doesn’t have any. Instead, they gather up insects with their tongues, earning the nickname “scaly anteater” despite not being related at all.

So what were the phylogenetic ancestors of the pangolin like? Of the roughly 30 species, perhaps the most impressive are the Hyaenodontids. Named for their hyena-like teeth, adapted for shearing flesh as opposed to clamping down, these species hunted in packs like wolves — usually at night. Some of the larger Hyaenodontids, like the 4.5-foot-tall, 10-foot-long, 1,100-pound Hyaenodon gigas, may have hunted alone in the day.

1. The land-based whale

How do mammals end up in the sea? Whales, dolphins, seals, walruses, and so on all descend from species that once roamed the land. The pinnipeds, for example (seals, walruses, and sea lions), are thought to have evolved from primitive bears, just like their land-lubber cousins the weasels, otters, and skunks. The sirenians or sea cows, meanwhile, appear to be related to elephants, as well as that most unlikely of elephant relatives the hyrax. 

The most iconic group of ocean-dwelling mammals, however, the cetaceans (whales, dolphins, porpoises, narwhals), descend from something unrecognizable — a creature that "ran like a wolf … waded like a hippopotamus … put its ear to the ground to hear distant rumbles … [and] had the ankles of a cow." Pakicetus had the body of a land mammal but the distinctive long skull of a whale. Preying on animals both on land and in water, it lived around the edges of the shallow Tethys Sea 50 million years ago.
