Century – Listorati
https://listorati.com

10 Apocalypses That Didn’t Happen This Century
https://listorati.com/10-apocalypses-that-didnt-happen-this-century/
Sat, 05 Oct 2024 19:49:19 +0000

Mankind has always made predictions about the future, and quite a few people throughout history have claimed to know what lies ahead. Even during the 20th century, sci-fi writers made some surprisingly accurate predictions about the computerized 21st-century space age.

Then there are the doomsday prophets who, over the centuries, have repeatedly predicted the end of the world. Given the hundreds of end dates provided, we fortunately seem to have a pretty good record of surviving apocalypse theories. Some predictions were based upon close analysis of biblical references. Others involved planetary movements and collisions with comets. Many were just plain weird, and when the anticipated apocalypse failed to arrive, the date was often simply changed.

Today’s Internet age has provided a means of delivering these theories to a vast audience within a short time frame. With websites dedicated to imaginary planets, fake news sites, and social media shares, these messages of imminent doom can quickly go viral. It’s surprising how many people have actually believed the world was going to end tomorrow. Again.

10 Y2K
2000


As the calendars clicked over to the year 2000, a combination of uncertainty about the dawning of a new era and widespread scaremongering saw millions of people hoarding supplies and hunkering down for the end of the world as we know it.

Some believed that computer coding issues meant that programs would be unable to recognize the year 2000, crashing systems and causing worldwide chaos. Many feared that on the stroke of January 1, 2000, computers would recognize the new date as 1900.
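The feared bug itself is easy to sketch. Many legacy systems stored only the last two digits of the year to save memory and hard-coded the century as 19xx. A minimal, hypothetical illustration (not any specific system’s code) of how such a program would misread the rollover:

```python
def parse_two_digit_year(yy: int) -> int:
    """Naive legacy logic: the year is stored as two digits,
    and the century is hard-coded as 19xx."""
    return 1900 + yy

# A record stamped on December 31, 1999 ...
before = parse_two_digit_year(99)   # 1999 -- correct
# ... and one stamped the next day, January 1, 2000:
after = parse_two_digit_year(0)     # 1900 -- the feared rollover
# Any interval computed across midnight goes badly wrong:
elapsed_years = after - before      # -99 instead of roughly 0
```

Real remediation efforts typically avoided rewriting every record by "windowing": treating, say, 00–49 as 20xx and 50–99 as 19xx.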

Millions of dollars were spent worldwide preparing for “Y2K” or the “Millennium Bug,” which was supposed to affect everything from banking and retail to emergency services and airplane safety systems. Even the skeptics were hoarding extra groceries, water, and flashlights “just in case” essential services actually did fail.

On the stroke of midnight, the predicted global meltdown failed to materialize, and the world went on as before.[1]

9 Nibiru Collision
2003


The fictional planet “Nibiru” (also referred to as “Planet X”) first failed to collide with Earth on May 27, 2003. Nibiru is said to be a planet which sits on the outskirts of our solar system. Some followers claim that the planet’s orbit is controlled by a giant UFO.

Nancy Lieder claimed in 1995 that a brain implant enabled her to communicate with aliens in the Zeta Reticuli system. She believed it was her chosen mission to warn the world of the impending end to humanity.[2]

NASA scientists have repeatedly refuted the existence of the planet, suggesting that at most, it’s a small, inconsequential comet, if it’s anything at all. This has sparked claims of a cover-up from believers in Nibiru.

A number of dates have been proposed for the apocalypse since 2003. When the planet fails to arrive, the date is shifted to a new estimated time of arrival. It would seem that Lieder’s extraterrestrial communications may be a little unreliable.

8 Live On The Internet
2008


Ohio-based pastor Ronald Weinland took to the Internet to warn everyone that the world would end on September 30, 2008. The minister of the “Preparing for the Kingdom of God” church also released a book in 2006, stating that he and his wife had been appointed as witnesses to the end of the world.

Citing biblical references and a complex series of events which would occur in the lead-up to Armageddon in a series of sermons streamed live on the Internet, Weinland urged his followers to prepare for the end in 2008.

Unfortunately, there had been an error in Weinland’s calculations, so the apocalypse didn’t appear as expected. Under his new calculations, it was due on May 7, 2012. After that, it was May 19, 2013. Weinland was found guilty of tax evasion in 2012.[3]

7 Catastrophic Earthquake
2011


Evangelical broadcaster Harold Camping advised the world that it was due to end on May 21, 2011. His radio ministry invested in an extensive advertising campaign to warn the world of impending doom. Billboards, motor vehicles, and radio advertising spread the word that the end of the world was nigh.

According to Mr. Camping’s revelations, true believers would ascend into Heaven on that date, while the rest of the world would suffer a catastrophic earthquake. A number of followers divested themselves of their worldly goods in preparation, only to be sadly disappointed when the day passed without the forecast doom.

A review of the scripture revealed to Mr. Camping that he had failed to factor in God’s mercy, which would extend the date of impending doom until October that year. Mr. Camping’s previous predictions of a 1994 apocalypse had also failed to come to pass.[4]

6 Comet Elenin
2011


Throughout history, the appearance of comets has always been seen as a harbinger of doom. So when Russian astronomer Leonid Elenin first spotted Comet Elenin late in 2010, the doomsday theorists hit the Internet. The armchair astronomers predicted everything from earthquakes and tidal waves to a full-on collision with Earth between August and October 2011.

This was despite the fact that the comet was some 647 million kilometers (402 million mi) from Earth when it was first spotted. Mainstream news all but ignored the existence of the comet, as there wasn’t really very much to report, according to the facts provided by space scientists.

NASA astronomers assured the world that the comet posed no threat to life as we know it.[5] In fact, the object broke into small pieces during its journey through the inner solar system in 2011.

5 A Transformation Of Sixes
2012

Followers of the Miami-based sect “Growing in Grace” proclaimed the end of the world would occur on June 30, 2012. According to their leader, Jose Luis de Jesus Miranda, on that date, his followers were to be transformed into magical beings that would fly and walk through walls. The cult leader claimed he was a reincarnation of Jesus, a fact which was revealed to him through talking to the prophets.

According to the cult, on June 30, 2012, the Earth’s rotation would accelerate to a speed of 107,289 kilometers per hour (66,666 mph). Jose Luis de Jesus Miranda turned 66 in 2012. All cult members were tattooed with the numbers “666.” These inexplicable coincidences all pointed to the inevitable end of the world as we know it on the predicted date.[6]

Advertising billboards were erected to proclaim the date on which his followers would be taking over the world. As usual, July 1, 2012, arrived without incident.

4 Maya Doomsday
2012


According to some scholars, the ancient Maya calendar indicated that the end of the world was coming shortly before Christmas 2012. Misinterpretations of the ancient calendar suggested that it ended on December 21, 2012, signaling the end of time.

So popular was the “2012 Phenomenon” that many were pointing to natural disasters and world events at the time as indications that the prophecy was unfolding. Once again, the Internet was flooded with theories of galactic collisions that would mark the end of the world on that date.

Yet again, NASA scientists were quick to debunk the doomsday theories, seeking experts to explain the complexities of the Maya calendar. The Maya perception of time was infinite, and therefore their calendar could not be said to pinpoint a specific date in time or be read in the context of our modern calendars. Nor were there any other indications in Maya culture of a cataclysm on that date or of any potentially Earth-shattering comets or planets currently on NASA’s radar.

NASA was so confident that the Earth was safe that they issued their press release a day early. As was the case with the previous predictions, the world continued to turn on December 22, 2012.[7]

3 Rasputin’s Apocalypse
2013

Grigori Rasputin was a holy man most famous for his connections with the ill-fated Russian royal family. Dubbed the “mad monk,” Rasputin was claimed to have used his “mystical powers” to cure the Russian prince of the blood disease hemophilia. In letters to the Russian royal family during the revolution, he made a number of accurate “predictions,” such as his death at the hands of government officials and the subsequent murder of the Russian royal family.

These, however, may have been more of an astute understanding of the political turmoil of the time than any mystical revelation. Among the predictions in his final letters was the suggestion that the “second coming” would occur on August 23, 2013, and that the Earth would be consumed by fire—yet another apocalypse that failed to materialize.[8]

2 Blood Moon Prophecy
2014


In 2014, we once again managed to escape the “Blood Moon Prophecy.” In fact, this was the 62nd time in 2,000 years that we have escaped a lunar-induced apocalypse, specifically the end that’s supposed to come after a series of four lunar eclipses, referred to as a tetrad. Biblical scholars have long cited references from Acts and Revelation where “the sun shall be turned into darkness, and the moon into blood” as a biblical verification that the lunar eclipse signifies the imminent end of the world.

Christian pastor Mark Biltz predicted that a series of eclipses in 2014 would mark the beginning of the apocalypse, while John Hagee, author of Four Blood Moons, also suggested that the string of blood moons would mark the end of the world. Both these predictions attracted wide attention, with some people actually hoarding supplies in preparation for the impending apocalypse. However, like every lunar eclipse before them, nothing happened other than the Moon being temporarily shadowed.[9]

1 Nibiru (Again)
2015


In 2015, Nibiru was once again threatening life on Earth, with claims that its collision course with our planet would end on September 23 that year. According to conspiracy theorist David Meade, NASA was hiding information on the planet from the general public.

Biblical verses indicated that the apocalypse would definitely arrive shortly. When Nibiru appeared to miss its connecting flight in September 2015, the estimated time of arrival was revised to October 15 that year. Nibiru’s latest no-show was on April 23, 2018.

NASA once again continued to reassure worried stargazers that Nibiru was nothing more than an Internet hoax. Given the number of times the mythical planet has failed to show up this decade, you tend to believe them.

Lesley Connor is a retired Australian newspaper editor who provides articles to online publications and through her travel blog.

10 Things We’ve Learned About Schizophrenia In The 21st Century
https://listorati.com/10-things-weve-learned-about-schizophrenia-in-the-21st-century/
Fri, 09 Aug 2024 14:16:01 +0000

Psychology is odd; it’s old enough to seem as though it’s been around forever but young enough that there’s still an almost terrifyingly large number of things that even professional psychologists don’t fully understand yet. When it comes to the general public’s popular image of psychology, a lot of folks hardly ever move past the timeless mantra of “lie down on the couch, and tell me how you feel” or Dr. Phil repeating people’s problems back to them in a louder and slower voice on TV. While psychology and psychological disorders today are no longer kept in the corner like forbidden books of black magic, there’s still a long way to go in terms of actually spreading comprehensive knowledge.

Society has moved from tepidly prodding psychology with a long stick to an almost cult-like fascination with and macabre fetishization of the concept. Comic books, TV shows, and movies almost always generate a buzz with the strategic use of the word “crazy.” Were that not the case, how differently would Heath Ledger’s Joker performance have been received? Buzzwords like “psychopath,” “insane,” and “sociopath” are top contenders for words that are most frequently used despite a general misunderstanding of their meanings, right up there with “ironic” and “rhetorical.”

Schizophrenia is another one of those hot buzzwords that gets passed around corners in hushed tones but is rarely (accurately) expounded upon. The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) defines schizophrenia as the presence of “two or more of the following symptoms for a significant portion of time during a 1-month period (or less if successfully treated): (1) delusions (2) hallucinations (3) disorganized speech (e.g., frequent derailment or incoherence) (4) grossly disorganized or catatonic behaviour (5) negative symptoms, i.e., affective flattening, alogia (poverty of speech), or avolition (lack of motivation).” (Note: The DSM-V terminology is essentially the same.) The psychological community has examined schizophrenia with an increasingly stronger lens since the beginning of the 21st century, and they have made many surprising findings.

10 Schizophrenia Is The Result Of Over-Intense Mental Processing

Hot Brain
A common misconception about schizophrenia is that those who suffer from it have weaker mental processing skills, which many believe is to blame for paranoid delusions and inaccurate memories. On the contrary, neural activity tests have provided evidence supporting the exact opposite explanation.

If you’ve ever been guilty of throwing back a shot or five too many on Cinco de Mayo, then you might be familiar with the phrase, “Follow my finger.” The follow-my-finger sobriety test is an example of more than just why you should usually have more water in your body than tequila; it’s an example of what psychology wizards refer to as saccadic eye movement. To put it simply, your brain processes resources and memories differently when your eyes are in motion, as opposed to a static point of view.

Scientists at the UC Davis Center for Mind and Brain conducted a test that focused on saccadic eye movement. The purpose of the study was to distinguish between the eye-movement (EM) brain activity of people with schizophrenia and healthy control subjects without the disease. All of the participants were asked to shift their eyes to a “target” in their peripheral vision, while avoiding a “non-target” closer to the center of vision—the catch was that they all had to keep a certain random color in mind during the exercise.

The hypothesis was that the non-target would be more distracting to the participant if its color matched the one that they were asked to keep in mind during the exercise. The results showed that the effect of matching color between the non-target and the imagined color was much more intense in participants with schizophrenia than those without. It was also observed that participants with schizophrenia were prone to hyper-focus on the space surrounding the main target’s position.

The findings served as more support for the belief that schizophrenic symptoms might actually be the result of super-narrow, abnormally intense resource processing rather than weakened processing.

9 Schizophrenia Is Linked With Brain Areas That Process Cannabis

Marijuana Plants
Whenever somebody suggests that cannabis “kills the brain,” chances are that they’ve never heard of something called the endocannabinoid system (ECB). The ECB is a part of the brain that modern science has found to be specially fine-tuned to the reception of cannabinoids for emotional processing, memory maintenance, and learning.

The existence of the ECB is not evidence that lighting up in your parents’ basement actually makes you a genius, but its discovery helped us understand the brain a lot better and also raised many more questions. The existence of cannabinoid receptors provoked questions such as “why do we have cannabinoid receptors in the first place?” and “how do cannabinoid receptors interact with mental diseases?” Scientists at the Department of Anatomy and Cell Biology at the University of Western Ontario conducted a study to address the latter question, specifically focusing on schizophrenia.

The heavily cited report states that the medial prefrontal cortex (PFC) and the basolateral nucleus of the amygdala (BLA) are not only both cannabinoid receptor–heavy areas that are extremely important for emotional regulation but are also prone to serious distortions in cases of schizophrenia. In addition to the relationship between cannabinoids and schizophrenia-affective brain regions, research conducted at the University of Western Ontario’s labs also reported a strong interaction between cannabinoid transmission and dopamine. Dopamine is a neurotransmitter that’s been found to be essential in explaining addiction and schizophrenic pathology.

8 Schizophrenics’ Memories Are More Resilient To Long-Term Substance Abuse

Substance Abuse
Up until very recently, there hadn’t been much research done on the effects of long-term substance abuse on the working memories of people with schizophrenia. The relationship between schizophrenia and poorer memory is well documented, as is the relationship between substance abuse and forgetting your entire weekend. Far less well studied is how substance abuse impairs the base-level memory of schizophrenics.

Drs. Jessica A. Wojtalik and Deanna Barch of the Washington University School of Medicine conducted a study to provide some much-needed data in this area. Thirty-seven schizophrenia patients (17 with a history of substance abuse and 20 non–substance abusers) and 32 non-schizophrenic controls (12 with a history of substance abuse and 20 non–substance abusers) completed a working memory task while being scanned with an fMRI. The results of the study showed that the control group was much more divided in neural activation rates between past substance abusers and non–substance abusers than the schizophrenia group.

Whereas the memory-processing brain regions of the formerly substance abusing participants in the control group were far more active during memory tests than the non–substance abusing controls, there was little to no difference in neural activity between the formerly substance abusing schizophrenic participants and non–substance abusing schizophrenic participants. Schizophrenia patients were much less accurate than the controls on all tasks, but these findings indicate that substance abuse may have a relatively smaller impact on the base-level working memory of schizophrenics compared to those without.

7 Schizophrenics Have Trouble Identifying Facial Expressions But Process Them More

Facial Recognition
How many times have you awkwardly run into that one person whose name you just can’t ever seem to recall, but you recognize their face every time? It’s moments like those that seriously make you wonder about what your memory will be like a decade from now.

In a report on the interaction between cognition and emotions in schizophrenia, Dr. Quintino R. Mano and Dr. Gregory G. Brown cited a number of peculiar findings about the working memory patterns of schizophrenia patients, one of which had to do with simple facial recognition. It was found that while schizophrenia often causes those with the condition to have difficulty expressing and identifying facial emotions, schizophrenia patients also show a significantly heightened rate of automatic and implicit processing of facial emotions.

6 Siblings Of Schizophrenics Have Different Brain Activity Than Others

Brain Activity
Dr. Alan Ceaser and associates conducted working memory tests with the participants split into three groups: schizophrenia patients, their siblings without schizophrenia, and a control group of healthy participants without the condition or any direct relationship to people with the condition. The results showed that the patient and sibling groups, but not the controls, exhibited abnormal neural reactions to changes in dopamine availability. This supports the hypothesis that excess dopamine is a key player in the emergence of schizophrenic symptoms.

The most important implication of the study is that there are abnormal neural activity spikes in the dorsolateral prefrontal cortex (DLPFC), cerebellum, and striatum in both schizophrenia patients and those at risk for schizophrenia—this includes the brothers and sisters of those with the condition.

5 Male Schizophrenic Smokers Are More Susceptible To Nicotine Withdrawal

Smoking
The subtle neurocognitive deficits of schizophrenia patients can even be observed in the brain’s reaction to nicotine withdrawal. The Clinical Psychiatry Research Center at Tabriz University of Medical Sciences conducted a study to examine the effect of short-term nicotine abstinence on schizophrenic smokers.

The 45 participants, all male schizophrenic smokers, were split into three groups: one group that would abstain from smoking for one night, a second group that would use a nicotine patch after avoiding smoking for a night, and a third control group with no intervention at all. Each participant was given a visuospatial memory test at the beginning of the experiment and the following morning, after the intervention.

The nicotine patch group and the freely smoking group showed no significant difference in scores between either test, but the group that was withheld from both smoking and nicotine patch use exhibited significantly worse test scores after the intervention. The study concluded that nicotine abstinence causes visuospatial deficits in male smokers with schizophrenia.

4 Gender Affects Schizophrenia Symptoms

Genders
Few people really take into account the subtle differences that gender can make on the manifestation of a psychological disorder, let alone schizophrenia, but the effects are very real. It’s understood by many in the field of psychology that schizophrenia often, if not always, accompanies visual perceptual organization impairment—particularly in those patients with rough social histories. Until recently, there was not a complete understanding of just how intensely gender differences can affect the visuospatial deficits in question.

Dr. Jamie Joseph and associates at Rutgers University conducted a study to investigate the relationship between disorganized schizophrenic symptoms and gender. The tools used to measure the relationship were specially designed perceptual organization tasks: the Contour Integration Task and the Ebbinghaus Illusion.

The participant sample consisted of 43 females and 66 males. The results showed that while females (with more relatively intact bottom-up grouping skills) performed more impressively on the Contour Integration Task, males (with more top-down-oriented grouping skills) performed better on the Ebbinghaus Illusion task. This supports the notion that sex differences are an important factor to consider when weighing in on the visual-perceptual impairments caused by schizophrenia.

3 Younger Schizophrenics Aren’t Being Treated As Effectively

Distressed Young Person
Psychological treatment has come a long way since the mid-19th century. These days, we tend to lean more toward the clinical communication and behavior-analysis method of approach than the “let’s try poking your crazy out with a literal icepick” approach. Despite the advancements in technology and basic human decency, there is some evidence to show that the relationship between age and quality of psychological care doesn’t necessarily improve in a linear fashion as one gets older.

In 2013, the Canadian Journal of Psychiatry published findings that show the results of medical-administrative data analyses run over adult schizophrenia patients in Quebec for two years. The results showed that 77 percent of patients aged 30 and over were receiving adequate pharmacological treatment, compared to only 47 percent of patients aged 18–29. The fact that schizophrenia has been documented to be better-treated in the earlier phases of the disease makes this a concerning discovery.

2 Schizophrenics Have Lower Sex Drive

Low Sex Drive
Scientists at the Clinic for Young Schizophrenics ran a study in 2014 measuring the psychosexual tendencies of 45 young adults with schizophrenia. The 45 young adults were compared to 61 young adults without the disease as a control group.

The results found that a smaller number of the schizophrenia patients had a sexual partner or had ever had sexual intercourse compared to the control group. More men with schizophrenia who were being treated with risperidone or olanzapine reported issues with arousal than men in the control group. Proportionally, the schizophrenia patient group demonstrated an increased chance of developing negative psychosexual tendencies compared to the control group.

This doesn’t mean that anyone should start using “schizophrenic” as a synonym for “prude”—schizophrenia doesn’t erase sexual urges or instantly overstimulate them. These findings only serve to play down the misconception that mental pathology instantly implies hypersexuality.

1 Schizophrenia Is Related To Low Appetite Control

Hungry
At the Department of Psychiatry at the University of Montreal, a study was conducted in 2012 in order to examine appetite regulation and metabolic differences between schizophrenia patients and a healthy control group. Even if you don’t have a doctorate hanging on your wall, chances are that you’re somewhat familiar with the horror stories that center on metabolism dysfunctions caused by psychiatric treatment gone wrong; the study took this into account as well, measuring the relationship between food cravings and antipsychotic medication dosages.

The results showed that only schizophrenic patients demonstrated specific cerebral responses in the parahippocampus, thalamus, and middle frontal gyri to appetite stimulation. Schizophrenic patients’ parahippocampal activity and related hunger levels both increased linearly over time. It was found that medication dosage had a strong positive correlation with food cravings and that the severity of the disease was negatively correlated with dietary restraint.

The findings show that not only does schizophrenia lend itself to a weakened ability to control the appetite, but the antipsychotic drugs used to treat the disease may also drastically exacerbate the dietary symptoms.

I was raised on the nation’s capital’s concrete, spent three years eating swordfish and terrible barbeque in the Massachusetts mountains, took a nap, and woke up in this weird dimension they call Long Island to get a doctorate in psychology. I can give you a rough play-by-play on what’s happening in a Spanish soap opera and teach you a lot of Chinese curse words. Slam poetry, rock climbing, marathon running, Muay Thai, Buddhism, electronic music, and stupid YouTube videos keep me sane—if you like any of those, you go on the pretty cool person list. You can also check out my website, Instagram, or Twitter.

10 Lesser-Known Transport Disasters Of The 20th Century
https://listorati.com/10-lesser-known-transport-disasters-of-the-20th-century/
Mon, 10 Jun 2024 08:07:59 +0000

The sinking of the Titanic, the collision of the SS Mont-Blanc, and the Hindenburg explosion are all well-known transport disasters that are remembered and talked about to this day. They’ve become icons, have been made into movies, and have secured their place in history, never to be forgotten. But there are many more disasters out there, each of which mattered just as much to the people involved. Each one made our world a safer place.

10 The Iolaire

HMS Iolaire

On January 1, 1919, two months after the end of World War I, British sailors who’d survived the perils of both the ocean and the war were returning to their families on the Isle of Lewis and Harris, only to tragically perish within miles of reaching home.

The Iolaire (which means “eagle” in Gaelic) was built as a luxury yacht in 1881. During the war, it was equipped with guns and performed anti-submarine and patrol work. The Isle of Lewis and Harris saw a fifth of its population of 30,000 killed in World War I; the sailors aboard the Iolaire were the lucky ones, eager to celebrate the New Year with their families.

Before anyone could celebrate, the ship struck the rocks known as the Beasts of Holm. It was only meant to carry 100 people, but there were almost 300 aboard, with only 80 life jackets and two lifeboats. It was expected to dock in Stornoway Harbour, but due to low visibility, it hit the rocks at the harbor entrance and quickly sank, less than 1 kilometer (0.6 mi) from shore. While 205 perished, 40 were saved by a brave man who improvised a rescue line from a rope, and 39 more were able to make it to shore on their own.

A naval inquiry was held in private on January 8, its results not being released to the public until 1970. It reached the conclusion that due to the fact that no officers survived, “No opinion can be given as to whether blame is attributable to anyone in the matter.” Numerous other inquiries, both official and unofficial, were held, none of which settled the matter. The weather wasn’t very bad, but those in charge should have taken safety precautions, like slowing down while approaching the harbor and having more lifeboats.

The site of the wreck is marked today by a pillar that reminds everyone who enters Stornoway Harbour of the cruel irony that befell those who survived the war and were so close to enjoying peace.

9 USS Akron

USS Akron

Following the example of Germany’s giant zeppelins, the US built two helium-filled airships, each 239 meters (784 ft) long and carrying enough fuel to travel 16,900 kilometers (10,500 mi). One of them was named the USS Akron and was commissioned by the US Navy in 1931. Its mission was to provide long-distance scouting in support of fleet operations, and after a number of trials, the airship was equipped with reconnaissance aircraft and a system designed for in-flight launch and recovery of Sparrowhawk biplanes.

On a routine mission, disaster struck. During the early hours of April 4, 1933, off the coast of New Jersey, a storm began, which caused the airship to strike the water with its tail. The Akron quickly broke apart. What’s intriguing is that it carried no life jackets and only one rubber raft, which dramatically diminished the crew’s chances of survival. Of the 76 onboard, 73 drowned or died of hypothermia.

Although the weather was certainly a factor, Captain Frank McCord is also considered responsible for flying too low and not taking into account the length of his ship when he tried to climb higher. It is also believed that the barometric altimeter failed due to low pressure caused by the storm.

Akron’s sister ship, the USS Macon, was also lost off the California coast in 1935. Fortunately, that time, only two people perished. These events prompted the US to end its rigid airship program.

8 Junyo Maru Tragedy

iStock_000075026905_Small
The Japanese military was notoriously cruel to its captives during World War II, especially prisoners of war, who were moved around the Pacific in rusted ships and used for forced labor. The problem with these ships was that they were not marked with a red cross to identify them as prison ships per the Geneva Convention, which left them vulnerable to being sunk by Allied aircraft and submarines. What was then the largest maritime disaster of World War II occurred because of this.

On September 18, 1944, the Junyo Maru was torpedoed in the Indian Ocean by the British submarine HMS Tradewind, whose crew couldn’t have known what cargo the ship was carrying. Of the roughly 6,500 Dutch, British, American, Australian, and Javanese slave laborers and POWs onboard, 5,620 died. The Junyo Maru was sailing up the west coast of Sumatra from Batavia (now called Jakarta) to Padang, where its prisoners were to be put to work on the Sumatra Railway.

Conditions onboard were indescribably bad. Many prisoners were literally packed into bamboo cages like sardines. The guards put on their life jackets as soon as the ship left port, whereas the POWs could count on only two lifeboats and a few rafts.

Even more tragically, the approximately 700 POWs who were pulled from the water were still taken to work in the Sumatra Railway construction camps. Only about 100 survived.

7 MV Wilhelm Gustloff Disaster

Nazi Germany created a state-controlled leisure organization, Strength Through Joy, to show its citizens the benefits of living under a national socialist regime. Working-class Germans were taken on holiday tours aboard the MV Wilhelm Gustloff, and the program became the largest tour operator in the world in the 1930s.

This all ended when World War II began. In 1945, the Wilhelm Gustloff became part of Operation Hannibal, the German evacuation of over one million civilians and military personnel ahead of the advancing Red Army in East Prussia. Over 10,000 people, 4,000 of whom were children, were crammed onto the ship, all of them desperate to reach safety in the West. The ship was only meant to carry 1,800 people.

The Wilhelm Gustloff set off on January 30, 1945, against the advice of military commander Wilhelm Zahn, who argued it was best to sail close to shore with no lights. Instead, Captain Friedrich Petersen decided to head for deep water. He later learned of a German minesweeper convoy heading their way and turned on the navigation lights to avoid a collision in the dark. This would prove to be a fatal decision. The Gustloff was carrying anti-aircraft guns and military personnel, so she wasn’t marked as a hospital ship, which would have protected her. The Soviet submarine S-13 needed no second invitation to torpedo the brightly lit target three times.

Ample rescue efforts were made, which saved approximately 1,230 people. Over 9,000 perished in the cold waters of the Baltic Sea, the largest loss of life in a single ship sinking.

6 Gillingham Bus Disaster


On the evening of December 4, 1951, 52 Royal Marine cadets, boys between 10 and 13 years old, were marching from a barracks in Gillingham, Kent, to one in Chatham to watch a boxing tournament. Their dark military uniforms bore nothing to make the cadets visible. The entrance to the Chatham Royal Naval Dockyard had a malfunctioning light, which made it impossible for the driver of an approaching double-decker bus to see the boys. He plowed right through them before stopping.

The driver, John Samson, had 40 years of experience behind the wheel, but inexplicably, given the foggy weather, he didn’t have his headlights on. He claimed to have been traveling at no more than 32 kilometers per hour (20 mph). According to the only adult with the boys, Lieutenant Clarence Carter, Samson was going at least twice that fast.

Regardless of the bus’s speed, 17 boys died on the spot, with seven more sent to the hospital. Never before had there been such a tragic loss of life on British streets, and the victims were given a grand military funeral at Rochester Cathedral. Thousands of locals attended. The incident was ruled an accident despite the driver not turning on the headlights or braking until he was a few meters away. Samson was later fined £20 and had his right to drive revoked for three years.

Every such disaster is followed by improvements intended to prevent further loss of life. This time, it was decided that British military personnel marching at night would wear rear-facing red lights.

5 Harrow & Wealdstone Rail Crash

October 8, 1952, is remembered by Londoners as the day of the worst peacetime rail crash in the UK, exceeded only by the Gretna Green disaster of 1915, during World War I, in which 227 Scottish soldiers headed for the front perished. The Harrow & Wealdstone rail crash involved three trains: a local passenger train from Tring, a Perth night express running late because of foggy conditions, and an express train from Euston.

The driver of the Perth train passed a distant yellow signal, meaning “caution,” without slowing, possibly because he couldn’t see it through the fog. He also passed a later semaphore indicating “stop” and only hit the brakes when it was already too late. Meanwhile, the train from Tring was waiting at Harrow & Wealdstone Station for its passengers to board. The Perth train struck it at approximately 80 kilometers per hour (50 mph). The disaster didn’t stop there: the fast-moving express from Euston, approaching on a different line, hit the debris from the initial impact and derailed.

In total, 16 carriages were destroyed, 13 of which were compressed into a pile only 41 meters (134 ft) long, 16 meters (52 ft) wide, and 9 meters (30 ft) tall. The human casualties would total 112 (102 immediately after the accident and 10 more later at the hospital), and 340 were injured.

Although the exact causes and persons responsible were hard to determine, it is believed that a combination of fog, misread signals, and out-of-date equipment caused the horrific crash. All the equipment was working, and the drivers were experienced men; all they needed was an updated system to back them up. The accident sped up the introduction of British Railways’ Automatic Warning System, which gives a driver who passes a caution or danger signal automated feedback, whether he saw the signal or not, and applies the brakes automatically.

4 USS Thresher Sinking

USS Thresher

The USS Thresher was the first in a new class of nuclear-powered attack submarines. It was commissioned in 1961 and went through numerous sea trials to test its new technological systems. As if foreshadowing the disaster to come, these trials were interrupted by a generator failure while the reactor was shut down, which caused the temperature in the hull to spike and prompted an evacuation. Another setback occurred when the Thresher was hit by a tug and needed extensive repairs.

On April 10, 1963, the sub was conducting drills in the Atlantic Ocean off the coast of Cape Cod when it suddenly plunged to the seafloor and broke apart. All 129 onboard were killed—96 sailors, 16 officers, and 17 civilians. The investigation into the accident uncovered a leak in one of the joints in the engine room, which caused a short circuit in the electrical system and made it impossible to surface the Thresher. The sub sank and imploded under the increasing water pressure.

The disaster mobilized the US Navy to put more effort into SUBSAFE, a program designed to rigorously control the quality of nuclear submarine construction.

3 MV Derbyshire Sinking

The MV Derbyshire is the largest British bulk carrier ever lost at sea. Built in 1976, it was a majestic ship, 281 meters (922 ft) long, 44 meters (144 ft) wide, and 24 meters (79 ft) deep. It had been in service for only four years when it set sail toward its doom on July 11, 1980, carrying 150,000 tons of ore.

On September 9 or 10, Typhoon Orchid struck the Derbyshire in the East China Sea, just as the ship was approaching its destination. It was carrying 44 people on its voyage from Canada to Japan, where it was to deliver its cargo; all of them perished.

What sets this disaster apart is that the ship seemed to be lost forever, with initial searches for the wreckage turning up nothing. The absence of any mayday call or distress signal also troubled the families of those lost. A formal investigation was finally conducted in 1987, seven years later. It concluded that no structural or other failures were to blame; the weather conditions alone were responsible.

The grieving families were not convinced, and they decided to form the Derbyshire Families Association (DFA) to work together toward the truth. They managed to raise enough funds to finally find what remained of the Derbyshire in 1994, lying on the seabed more than 4,000 meters (13,000 ft) down. DFA members continued to push for further investigations, which resulted in increased ship safety over the years. While the 1970s were plagued by bulk carrier sinkings, with 17 lost each year, the numbers are much lower today.

2 Bihar Train Accident

iStock_000060299812_Small
Were it not for British rule over India, which aimed to improve the transport system among other things, the Bihar train accident would never have happened. On June 6, 1981, a train with around 1,000 passengers crowded into nine coaches was traveling through the Indian state of Bihar, 400 kilometers (250 mi) from Calcutta. It was monsoon season, which meant heavy rains had made the tracks slippery, and the river below was swollen.

It is believed the tragedy that followed was caused by the driver, who saw a cow on the tracks and braked hard. Cows are sacred animals in the Hindu religion, and he was a devout follower. Due to the rain, the tracks were too slippery, and the wheels failed to grip, causing the carriages to plunge into the Baghmati River below and sink fast. Rescue efforts were hours away, and by the time they arrived, almost 600 people had died, and another 300 remained missing.

1 Ufa Train Explosion

iStock_000018377864_Small
The 1980s were difficult times for Russian leader Mikhail Gorbachev, who was trying to hold together the Soviet Union and maintain the Communist Party’s commanding role. At the same time, a series of disasters couldn’t hide the fact that the country’s infrastructure was old and dangerous. One of these disasters happened on June 4, 1989.

Two Russian passenger trains with hundreds of people onboard were passing one another near the city of Ufa, close to the Ural Mountains, when they met an extremely flammable cloud of gas leaking from a nearby pipeline. Sparks from their passing blew both trains to pieces. Seven carriages were reduced to dust, while 37 more were destroyed along with the engines. More than 500 people perished, many of them children returning from a holiday on the Black Sea. The force of the explosion was estimated at around 10 kilotons of TNT, approaching that of the atomic bomb dropped on Hiroshima. The resulting fireball was 1.6 kilometers (1 mi) long and destroyed all trees within a 4-kilometer (2.5 mi) radius.

The pipeline running along the rail lines was full of propane, butane, and other hydrocarbons, and the pressure within was high enough to keep them in a liquid state. On the morning of June 4, a drop in pressure was observed, but instead of investigating, the people in charge simply increased the pressure. Consequently, clouds of heavier-than-air propane formed, escaped the pipe, and traveled along the rails. All they needed was a spark.

As with many disasters, the Ufa train explosion happened because finishing something quickly at minimal cost mattered more than the long-term consequences. The pipeline had suffered more than 50 leaks in three years, but the Soviet Ministry of Petroleum didn’t want to admit its negligence. Worse, railway traffic controllers didn’t have the authority to halt trains on the Trans-Siberian railway, even if they smelled gas.

Teo loves animals, chocolate, and constantly finding out more about this magnificent and diverse world.

]]>
https://listorati.com/10-lesser-known-transport-disasters-of-the-20th-century/feed/ 0 12894
Top 10 Bizarre Smells From 18th Century England https://listorati.com/top-10-bizarre-smells-from-18th-century-england/ https://listorati.com/top-10-bizarre-smells-from-18th-century-england/#respond Wed, 03 Apr 2024 06:18:25 +0000 https://listorati.com/top-10-bizarre-smells-from-18th-century-england/

The majority of people are aware of humanity’s less-than-hygienic history. Between fabricating eyebrows from the skin of a mouse, a British monarch’s belief that a bath would be detrimental to his health, the Romans’ use of lye (an ash and urine mixture) to wash clothes, and taking more than 300,000 years to invent toilet paper, we’ve had far from a sterile track record.

What most people are not aware of are the smells involved in such methods of ‘hygiene’. The following list is a collection of delightful aromas from a period that coincided with the Romantic Era, but they are not nearly as charming. Brace yourself—it’s Georgian England.

10 Perfume


The word ‘perfume’ comes from the Latin for ‘to scent by smoking’. The first perfumes were used to protect against the plague, as it was believed that disease could be prevented by ‘purifying the air and warding off bad odours’. By the second half of the 18th century, Otto of Rose had become the most popular perfume. It was concocted by heating rose petals and water in a copper still and then extracting the oil from the mixture.

Because of its popularity, there was great paranoia amongst the Georgian English about counterfeit perfume. According to an 1831 guide for servants, to check for fake perfume: “Drop a very little otto on a clean piece of writing paper and hold it to the fire. If the article be genuine, it will evaporate without leaving a mark on the paper; if otherwise, a grease-spot will detect the imposition.”

The Otto of Rose’s popularity was largely owed to the horror of prior perfumes. Civet was a perfume extracted from a gland near the anus of the civet cat. Seemingly, Georgians began to feel that the wearing of substances removed from a cat’s bum was rather ungentlemanly.

9 Tobacco


Georgian England saw an utter explosion of social life. In the 17th century, men often congregated to smoke their pipes in coffee houses, but by the 1700s, tobacco had earned itself an unsavoury connotation. The Georgians believed that women could not tolerate tobacco smoke, which led to tales of women leaving their husbands if they refused to part with their pipes. Puffing a pipe in public was also regarded as impolite.

As a replacement, Georgians began to take snuff: finely ground tobacco snorted up one’s nostril. While this was fashionable, some believed it to be abhorrent. There were a number of unpleasant side effects that went along with it: coughing, grunting, and spitting. The main benefit was that it didn’t invade other people’s personal space the way a wafting cloud of tobacco smoke would. That said, churchgoers were reported to be disgruntled by the noises produced by those taking snuff during mass!

8 Fish


Marketplaces in Georgian England were very different to the supermarkets of today. There were no food safety standards, packaging, or use-by dates, so ‘caveat emptor’ (‘let the buyer beware’) applied. Purchasing gone-off food could cause disease and offend your house guests. To prevent this, household manuals were printed to inform Georgians how to test their food. Meat and fish were tested by sniffing; if they had a ‘slimy’ smell, they were to be avoided.

Pheasants were examined around the neck to check that they didn’t have a ‘tainted’ smell. Butter also needed to be checked before purchase, but buyers were warned to bring their own knife to test, as a merchant could simply offer the best piece of the stick.

Billingsgate women, who sold fish, were notorious for being sweaty and quick to anger at their customers. For some, testing their goods appeared insulting, as it suggested they were not to be trusted. Because of this, some Georgians wouldn’t be cod dead participating in such practices.

7 Paint


Although this smell may seem like a rather strange inclusion, the smell of paint is one that is regularly referenced in Georgian diaries. Because redecorating was not a common occurrence, the smell was a memorable one. Georgian paints were concocted from a mixture of linseed oil and turpentine, and as a result, they had a particularly noticeable and pungent odor.

Bernardino Ramazzini was an Italian physician who first suggested that the ingredients used in paint production often led to producers losing their sense of smell. In fact, he was so engrossed in smells that he believed that someone should write, ‘A natural and physical history of odors’!

6 Ammonia


The smell of ammonia is one that is particularly difficult to mistake. The substance itself is composed of a combination of hydrogen and nitrogen, most commonly found in fermenting urine.

This smell provokes our trigeminal nerve—the nerve responsible for sensation in the face. The Georgians became obsessed with nerves. A person with sensitive nerves was held in higher regard in society. Women were believed to be particularly disposed to anxiety. Ammonia was used to ‘revive the senses’. Georgian novels and dramas even depicted heroines sniffing corked bottles of ammonia! Georgians later believed that smelling salts could be used to revive people who had drowned or asphyxiated.

However, ammonia was not the only maddening technique tested—one method even involved pumping a tobacco smoke enema up one’s rear end—indeed an unpleasant wake-up call.


5 Marzipan


The Industrial Revolution coincided with Georgian England, bringing massive urbanization. Townspeople gained the opportunity to purchase exotic ingredients and create more sophisticated sweets. Marzipan became a particular favorite. Crafted from almonds, sugar, and rose water, it was easily made and not unaffordable. Marzipan sweets were generally eaten at the end of a meal, and they had a distinct almond-like smell.

While delicious as treats, marzipan was also used in sculpture—including people, animals, and castles, the last a decidedly fashionable model. The creations were then left at the centre of the dining table and became a hallmark of Georgian decoration.

4 Wigs


The Georgians may have used marzipan in sculpture, but the real artwork occurred with Georgian wigs. Hair was piled on top of pads and wire structures to create intricate masterpieces for the drawing room. For most, simply using their own hair was not enough, and so it was infused with horse hair. 1760s styles included an egg shape, but this later elongated into the classic pouf.

The Duchess of Devonshire became famous for her extravagant bouffant when she built a three-foot tower of hair including stuffed birds, waxed fruit and even model ships. These styles were incredibly expensive to make, and so they were worn for weeks on end without cleaning. Inevitably, creepy crawlies came to nest and Georgian women developed a scratching rod to ward off the pitter-patter of their miniature tenants.

3 Body Odor


The conditions Georgians lived in did not easily coincide with cleanliness. Despite their glamorous aura, people were utterly filthy. Hands and faces received a daily dousing, but full immersive washing was regarded as bad for the health. The dresses worn by women caused particular issues.

Because of their heavy material, they caused the wearer to sweat excessively. Deodorants were non-existent, and the resulting stink was horrendous. On top of this, clothes themselves were washed only once each month. Undergarments were washed and changed more often, but they were cleansed using lye—the same ash and urine mixture used by the Romans. Classy.

2 Bad Breath


Furthermore, the Georgians were prone to carrying around a waft of rotting teeth. Tooth-cleansing powders had begun to be used, but these contained sulfuric acid, which stripped the enamel from teeth. The best method of warding off even more stench was to chew herbs or parsley. When a tooth became a lost cause, it was pulled from the gums with pliers. No anesthetics, of course.

To avoid ending up with a gummy smile, Georgians sought porcelain replacements. Where possible, though, they preferred to purchase live dentures. The poor often sold their teeth to support the market—a viable business proposal, for those so inclined.

1 Bodily Fluids


The unsolved mystery of sanitary hygiene among Georgian women is one that has puzzled countless historians. With no knickers to which any sort of protection could be attached, they were seemingly forced to rely on Mother Nature. Better documented are their toilet habits.

Ladies at the royal court relied upon a porcelain jug, a device called a bourdaloue, to carry out their business. It was clenched between the thighs, beneath the skirt. It was not unheard of for a woman to continue conversing with those around her while she urinated!

These ten distinct scents are indeed revolting. Unfortunately, however, some of them have prevailed through history. While scented wigs are rather less common nowadays, body odor is still unequivocally with us, as is bad breath. The difference is that today, showers and deodorants are widely available in many countries. That these smells can still be smelled goes to show just how disgusting we really are.


]]>
https://listorati.com/top-10-bizarre-smells-from-18th-century-england/feed/ 0 11274
Top 10 Crazy Facts About Psychiatry In The 19th Century https://listorati.com/top-10-crazy-facts-about-psychiatry-in-the-19th-century/ https://listorati.com/top-10-crazy-facts-about-psychiatry-in-the-19th-century/#respond Sun, 17 Mar 2024 01:09:03 +0000 https://listorati.com/top-10-crazy-facts-about-psychiatry-in-the-19th-century/

The treatment of the mentally ill has a notorious past filled with misunderstanding, torture, and theology. With the dawn of the 19th century, the path to comprehension began to be paved, ultimately leading to the psychological breakthroughs of Sigmund Freud and the study of neurology. This is not to discount the terrible therapies these poor souls had to endure but to take a closer look at how the 19th century led us to where we are today, and to highlight the few who really tried to help the mentally ill.

10 Moral Treatment

The period of Enlightenment changed how scientists, philosophers, and society looked at the world. Psychiatry faced this new enlightened look, and moral treatment came out of it. This treatment was a moral disciplinary approach to those with mental illness instead of using chains and abuse.

According to Dr. James W. Trent of Gordon College, before moral treatment, people with psychiatric conditions were referred to as insane and treated inhumanely. Philippe Pinel, at Bicetre Hospital in Paris, advocated for the moral treatment of the mentally ill. In place of physical abuse, Pinel called for kindness and patience, which included recreation, walks, and pleasant conversation. Pinel arrived at this change through reading, observation, and reflection rather than accident or experiment.

Moral treatment began to spread around the world. In the United States, Benjamin Rush, a Philadelphia physician, began to practice moral treatment. Rush saw one of the causes of mental disease as the hustle and bustle of modern life, so taking those inflicted away from those stresses could help restore their minds.

Rush did employ some of the methods of moral treatment; however, he also used bloodletting and invented the tranquilizer chair.[1] Pinel had high hopes for his new type of therapy, but there were still those who used torturous techniques to restrain those they considered mad.

9 Booming Asylums

We all have a terrible image of insane asylums, and a number of us have heard a ghost story or two surrounding these foreboding buildings. Previously cared for by their families or housed in almshouses and jails, the mentally ill began to be sent to asylums in dramatically increasing numbers in the 19th century.

During the start of the century, cities became more populated, and mental illness shifted from being a spiritual punishment from God to a social issue. Communities responded by building more and more institutions, which were set up to handle the growing numbers.[2] For example, in England, the number of “patients climbed from 10,000 in 1800 to ten times that in 1900.”

Historians have agreed upon three main bodies of thought to answer why the numbers catapulted in one century. The first credits modernization and the increased stresses that came with it. The second holds that the population grew more and more intolerant of disruptive behavior. The third points to the growing power given to physicians and alienists—or mad-doctors. Looking back, it appears to have been a combination of all three.

With the booming numbers of asylums, lurid tales of torture and abuse began to leak out from the ominous structures placed on the outskirts of cities and towns. The mad-doctors set up a number of classifications intended to help the asylum “cure” those with psychiatric conditions. For example, men had to be separated from women, the curable from the incurable, and so on. Nevertheless, despite the rules and the best of intentions, asylums earned their infamous nickname, Bedlam, “a byword for man’s inhumanity to man.”

8 Rise in Research

Going to university to study a specific subject has become commonplace these days. In the 19th century, the increase in asylums and new treatments produced a rise in those wanting to research psychiatry and answer the plaguing question of why some people went “mad.”

An earlier pioneer was the Oxford-educated physician Thomas Willis, who coined the term neurology and strove to pinpoint the mental functions that correspond to particular parts of the brain. Willis proposed that the central and peripheral nervous systems depended on the operations of animal spirits, or chemical intermediaries between the mind and the body.

Another doctor about this time, Archibald Pitcairn, who taught at Leiden in the Netherlands, treated mentally ill patients and argued that they suffered from “false ideas induced by the chaotic activities of those volatile animal spirits; these, in turn, fed back into the muscles to produce confused and uncontrolled movements in the limbs.”[3]

Today, we know the brain contains no animal spirits and that they do not cause mental illness; rather, it is chemical imbalances in the brain. For a time when X-rays were just being discovered, and the only way to study the brain was to remove it from a person’s skull, these doctors laid the foundation for modern neurology and present-day treatments.

7 Nervous Disorders

Today, someone describing a nervous disorder usually means high blood pressure, heart problems, trouble breathing, and the like.[4] Back in the 19th century, nervous disorders referred to shattered nerves, nervous collapse, nervous exhaustion, or a nervous breakdown. The symptoms were not heart problems or trouble breathing but rather a sense of emptiness, hopelessness, obsessive thoughts, sluggishness, and a general indifference.

This is where we get the saying about having “strong” or “weak” nerves. The idea of nervous disorders as a “functional illness” that only affected “superior” people came from the scientific emphases that ran rampant during this time.

On both sides of the Atlantic, Victorian men wallowed in hypochondria and Victorian women fell into hysteria. Private “nerve” clinics sprang up to treat this malady, where the rich could go to the spa to recover from their nervous breakdowns. These disorders only glamorized mental illness and detracted from a real understanding of what the truly ill had to endure.

6 Monomania

The 19th century was full of scientists seeking reasons and answers for why the mentally ill became that way. Doctors commonly believed that insanity was a defect of reason: the person’s inability to rationally comprehend reality.

With the rise in research and study of the mentally ill, Jean Etienne Esquirol brought another hypothesis to the question: monomania. This is partial delirium, in which the patient suffers from a false perception, which they then pursue with logical reasoning. These false perceptions can be illusions, hallucinations, or false convictions. Monomania is not an absence of reason but the presence of a false idea.

For example, mentally ill patients can suffer from illusions and hallucinations, and it is these that convince the patients of an incorrect reality, to which they act out logically to this false perception.

Esquirol developed the diagnosis of monomania to explain paranoia disorders such as kleptomania, nymphomania, and pyromania, all of which could supposedly be detected by a trained eye.[5] Monomania provided the foundation scientists and doctors needed to develop concepts such as obsession and psychopathy.

5 The M’Naghten Rules

On January 20, 1843, a Scottish craftsman, Daniel M’Naghten, believed that the Tories were intent on murdering him for his involvement in the early workers’ movement in Great Britain. In response, M’Naghten set out to kill the sitting prime minister, Robert Peel.[6] However, mistaking Peel’s secretary, Edward Drummond, for the government leader, M’Naghten shot and fatally wounded Drummond.

During the trial, M’Naghten pleaded not guilty due to “moral insanity” in the form of monomania. The tactic worked, and M’Naghten was found not guilty by reason of insanity.

Outraged, Queen Victoria and the public demanded this case be reviewed. As a result, many questions were posed to all the judges regarding the case and verdict. The responses have become known as the M’Naghten Rules, and serve as the basis for “determining legal insanity throughout many parts of England and the United States to this very day.”

4 The Opal, the Lunatic’s Literary Journal

The moral treatment movement started by Pinel in Paris gave rise to the opportunity for patients at the New York State Lunatic Asylum at Utica to create their own literary journal, The Opal.

The first issue, in 1850, was given only to members of the asylum; however, subsequent issues were sold at an asylum fair, and by 1851, the journal was being published in the American Journal of Insanity, the professional forum of the time. At the end of its first year, The Opal had over 900 subscribers and was in circulation with 330 periodicals, and all the profits went into the asylum’s library.

Moral treatment calls for kindness, patience, and recreation. The creation of The Opal demonstrates an essential element of this treatment: preventing sickness and sorrow.[7] Along with fairs, theatrical shows, debating societies, and lectures, The Opal drew the minds of the insane away from morbid trains of thought and toward the rational, orderly, polite faculties of the mind.

The journal was an important outlet for the patients, providing them a platform to showcase their own voices. However, The Opal lasted only until 1860, when it too “fell victim to the demise of the moral treatment movement.”

3 India’s Insane Asylums

Great Britain has historically colonized numerous countries around the world, and during the 19th century, India was one of them. As the number of mentally ill patients grew in Europe and the United States, so did the numbers in India.

As the asylums went up, the British Crown adopted the same treatment styles as Pinel and Esquirol for its Indian asylums. Even so, the British colonials and authorities considered themselves superior to the locals and were unwilling to share facilities with them. Letting prejudice and bigotry take hold, the physicians separated the locals from the British, and those deemed “insane” in India were sent to decrepit public institutions.

Surgeon R.F. Hutchinson, MD, superintendent of the Patna Lunatic Asylum, sent a report to the Inspector General explaining the need for more space and better sanitary conditions at his asylum. He explained that, at 138 patients, the asylum was already overcrowded, and the number had risen to 151 with no larger buildings to accommodate the growing population. Hutchinson also stated that because the drainage carried everything to the low ground where the Indian patients resided, parts of the buildings were unusable and unfit for occupation.

In his report, Hutchinson put it rather bluntly: “this evil cannot, of course, be remedied without either raising the plinth or removing the Asylum bodily to a higher site.”[8] Hutchinson was one man struggling to care for a growing population of mentally ill patients, and instead of sitting back, he did what he could to make their lives a bit better.

2 Phrenology

Many of us have seen the pictures of a human head with words written all over it. Some of us might even have one as a knickknack; however, none of us would pull it off the shelf and use it to predict how a perfect stranger will act. Back in the 19th century, this was a popular study known as Phrenology.

Phrenology is the study of the relationship between character and the shape of the skull. Austrian physician Franz Joseph Gall, a founder of modern neurology, put Phrenology together, holding the belief that the shape of the skull influences behavior.[9] Gall studied mathematicians, coachmen, sculptors, and the like, searching for commonalities between the shapes of their skulls.

However, Gall’s theory faced two significant problems. First, he based his claims on single happenstances; for example, “cautiousness” sat above the ears because he felt a large bump there on a cautious priest. Second, Gall only looked for cases that conformed to his hypothesis and simply ignored those that contradicted him.

Gall was way off when it came to Phrenology and how the brain actually works, yet he still laid foundations that future neurologists could pick up to better understand the miraculous organ.

1 Dorothea Dix

The 19th century brought insane asylums, increased research, advanced treatments, and new schools of thought. While some of these did good for those in need, many brought the mentally ill more misery than comfort.

An amazing woman, Dorothea Dix, saw the suffering of those in asylums, poorhouses, and jails and sought to expose the cruelties of their confinement.[10] Taking her fight to Boston, Dix found powerful allies, including Rev. William Ellery Channing, the leader of Unitarianism, a movement that sought social reform.

In 1841, Dix began traveling around Massachusetts, examining the conditions those with mental illness had to live in. She found them in “cages, closets, cellars, stalls, pens! Chained, naked, beaten with rods, and lashed into obedience.”

By January 1843, Dix had taken her petition to the state, seeking increased funding for these establishments. Hers, however, was the only voice seeking sympathy and help for these people. But she did not give up, and eventually, the state passed legislation to expand the state insane asylum in Worcester.

Dix did not stop there. She went on to lobby for the better treatment of the mentally ill in many states. At a time when those deemed “insane” or “mad” were treated worse than animals, Dorothea Dix was their voice, and she fought for them.

Just a person who loves to write

Top 10 Unknown History Lessons Of The 20th Century (https://listorati.com/top-10-unknown-history-lessons-of-the-20th-century/, Mon, 04 Mar 2024)

The 20th century was a time of immense global achievement but also of global unrest. It saw the fall and rise of empires, borders redrawn, and nations created. It was a time when the world enjoyed a better standard of living than any human civilization of any previous era, yet it also stood on the brink of nuclear apocalypse.

You all knew this, but what about the lessons that you didn’t know? From one man’s attempt to create better human beings through DNA to the vote of a single man deciding the fate of the entire human race, these are the history lessons you may not have known but that have left their mark on the entire world.

10 The Curse Of Timur

Timur (aka Tamerlane) declared himself a great khan in 1369 and made it his mission to rebuild the Mongol Empire into what it had been at the time of Genghis Khan. After his death, he was buried with an alleged curse on his tomb stating that whoever disturbed it would face an invader far worse than himself. According to local lore, the curse took three days to affect the cursed.

In 1941, under the command of Joseph Stalin, Soviet anthropologist Mikhail Gerasimov excavated the tomb for the purpose of recreating Timur’s face. Three days after the excavation, the Nazis launched Operation Barbarossa, invading the Soviet Union without warning. This resulted in some of the deadliest battles in human history and many Soviet defeats. Only after Joseph Stalin ordered the reburial of Timur did the tides of war change: the Soviets gained the upper hand against Nazi Germany, pushed its forces west, and helped turn the war in the Allies’ favor.[1]

9 South African Nuclear Disarmament

In a session of the South African parliament in 1993, President F.W. de Klerk revealed that at one point, the nation had developed a small arsenal of nuclear weapons. This confirmed the fears of many other African nations that the South African government had done so. However, in 1990, all South African nuclear warheads were dismantled in order to create “international cooperation and trust,” in the words of President de Klerk.

This was the first time a nation voluntarily dismantled its nuclear arms, and it remains one of only four such instances in history. The other nations dismantled theirs because they could no longer keep up with the maintenance after the collapse of the Soviet Union. South Africa’s decision, however, was made in order to extend an olive branch to the global community and show the nation’s devotion to peace.[2]

8 The Carnation Revolution

The Carnation Revolution took place on April 25, 1974, when the Armed Forces Movement, led by General Antonio Spinola and backed by many Portuguese military members and prominent civilians, ended the Estado Novo regime in Portugal.[3] The regime was replaced with a democratic system which restored many civil rights in Portugal. Over the course of the next ten years, a constitution was established, and a stable two-party system was formed.

One of the revolution’s main triggers was a large portion of Portugal’s budget being spent on fighting wars in the African colonies. After many years of fighting, the Portuguese military had become fed up with the large loss of life in an unwinnable war, thus motivating the coup. After the revolution, the new government gave independence to Portugal’s colonies.

7 1993 Russian Constitutional Crisis

Not since the 1917 Communist revolution had Russia seen the level of political violence that it did during this crisis. It began due to the growing tensions between Russia’s parliament and President Boris Yeltsin. The peak of the crisis came on September 21, 1993, when President Yeltsin attempted to dissolve the parliament despite not having the authority to do so. As a direct response, the parliament impeached Yeltsin and named Vice President Aleksandr Rutskoy as the acting president.

The situation escalated on October 3, when demonstrators attacked Russian police surrounding the parliament, seized the Moscow mayor’s office, and attempted to storm a Russian television network. The conflict ended on October 4, when the previously neutral army stormed Moscow’s White House (where the members of parliament were holed up) and arrested the leaders of the resistance under Yeltsin’s orders.[4]

6 George Wallace And Arthur Bremer

George Wallace was the 45th governor of Alabama and the last third-party presidential candidate to win pledged electoral college votes. He was also one of the most controversial politicians in US history due to his extreme segregationist views. During his third presidential campaign, in 1972, he was shot by Arthur Bremer at a campaign rally.[5] At the time of the assassination attempt, Wallace was polling well due to his more moderate platform, but the shooting derailed his campaign.

Bremer served 35 years in prison before being released on parole in 2007. Wallace returned to Alabama and served out his remaining term as governor despite being paralyzed from the waist down. He ran unsuccessfully for president once more and was later elected to a final term as Alabama’s governor before retiring from politics.

5 Rhodesian Bush War

Founded as a British colony in 1890 by South African entrepreneur Cecil Rhodes, Rhodesia was nicknamed “The Breadbasket of Africa” due to its extremely fertile farmland and hospitable climate. These features attracted many European settlers, primarily from Britain, during the 20th century. Rhodesia quickly became one of the most prosperous and developed regions in Africa and produced large amounts of natural resources, such as chrome and nickel. Rhodesians served Britain in both World Wars, and many Rhodesian men served as mercenaries during the Congo Crisis.

In spite of the development in Rhodesia, much of the native population felt disenfranchised by the white minority government, which largely excluded them from the political process. This gave rise to communist black nationalist groups, which waged guerrilla war on Rhodesia. During the conflict, Rhodesia unilaterally declared independence from Britain in 1965, resulting in heavy international sanctions.[6] The rebel government, led by Prime Minister Ian Smith, received aid from South Africa as well as the Portuguese colonies in the region, allowing the Rhodesian forces to fight for as long as they did. After the Portuguese colonies fell and the US government successfully pressured South Africa to stop aiding Ian Smith and his government, Rhodesia was left on its own.

The war lasted from 1965 to 1980, when the Lancaster House Agreement gave the African natives control of the government, resulting in Rhodesia being renamed Zimbabwe and Robert Mugabe coming to power, which he would hold for nearly four decades.

4 B-59 Submarine Incident

During the Cuban Missile Crisis, the Soviet submarine B-59 was operating in the Caribbean while US military forces, as part of their Cuban blockade, were dropping depth charges. Panicked, the captain and senior officers debated what to do. On board B-59 was a nuclear warhead, and as the charges fell, the submarine’s captain felt it was necessary to launch it.

Thankfully, Soviet protocol required all senior officers aboard to approve any launch of a nuclear warhead. Two of the officers voted in favor of a launch, but one senior officer, Vasili Arkhipov, voted against it, thus saving the world from nuclear destruction.[7]

3 Genius Babies


From 1980 to 1999, over 200 babies were born using the sperm of Nobel Prize winners, high-IQ individuals, and athletes. The project was founded by a Southern California tycoon named Robert K. Graham, whose goal was to create a better generation through the use of positive eugenics. At the time, the project was controversial, with opinions ranging from it being elitist to straight-up genocidal. Nevertheless, the effort lasted for almost 20 years.

Largely due to the secrecy of the project, no evidence has been brought forward to suggest that the children are above average in any way, shape, or form, though the families who have spoken about their children have described them as “wonderful.”[8]

2 Attack Of The Dead Men

In 1915, World War I was raging in full force, with never-before-seen weapons and tactics being employed by both sides of the conflict. According to one story, during a German offensive on the Russian fortress of Osowiec, located in modern-day Poland, the German forces used chemical weapons against the Russians. The effects of the deadly gas on the Russian soldiers were catastrophic, causing them to cough up blood onto their uniforms.

When the Russians launched a counteroffensive, the invading German forces were horrified to see what appeared to be zombie-like soldiers. The terrified German army retreated, despite outnumbering the Russian forces.[9] This story, although virtually unknown in the West, is a symbol of Russian military power and is commonly told and taught in Russia.

1 The Wall Street Putsch


The months between the election and inauguration of President Franklin Delano Roosevelt were a time when American democracy hung in the balance. This was the Great Depression, and President Roosevelt’s plan to minimize its effects angered legislators on both the left and the right. The left argued that he hadn’t gone far enough, while the right believed that the new president’s policies were evidence that he was a socialist or communist, with some even going so far as to claim that due to his Dutch descent, Roosevelt was a Jew and part of a larger Jewish plot. This led many to call for an end to American democracy and the institution of a communist or fascist regime.

These calls were taken a step further by a group of right-wing financiers. They hoped to convince President Roosevelt to step down and leave a military-led fascist government in his place. This group was able to gather millions in funds and also stockpile weapons in preparation for their new government. Their plan was derailed when they approached former Marine general Smedley Darlington Butler to lead their forces. Instead of joining the conspiracy, Butler reported the conspirators to Congress, recognizing them as traitors, thus putting an end to their plot.[10]

I am just a guy writing lists and trying to make money.

10 Reasons Why Life Sucked In The 19th Century (https://listorati.com/10-reasons-why-life-sucked-in-the-19th-century/, Thu, 29 Feb 2024)

People pine for the good old days, when humans somehow lived better, more fulfilling lives than they do today. The sad fact is that there were never any “good old days.” The only thing that has changed over time is our ability to express compassion for other living beings and the safety measures that we have put in place to help protect lives.

As a whole, we’ve forgotten what life was really like long ago. The 1800s, for example, were dangerous times when disease and lack of education could kill the innocent, the vulnerable, and even the strongest among us. Life was fragile, and death was always right around the corner.

10 Mangled By Machinery


Working in the mills and factories before the age of safety regulations was deadly. Newspapers reported numerous instances of women, children, and men being mangled by exposed machinery.

Most of the accidents could have been prevented with appropriate clothing and safety barriers. For example, a young Wisconsin woman was inspecting the machinery in a flour mill in 1861 when “her clothing came in contact with an upright shaft.” She could not break free, and by the time word got out to shut down the mill, her body was “horribly mangled.”

In a report published in 1892, we learn that a young man was ground to death in a California paste factory. When he began to fix the “dough,” the wheel inside the paste tub spun and caught his hand. He was pulled in between the tub and the grindstone, where he was ground to death.[1]

9 Strychnine Ale


Strychnine was considered a tonic in the 1800s and was used as such well into the 20th century. It was also added to beer, in small amounts, of course, as a flavoring. However, there were plenty of instances where too much strychnine was used, and the beer drinkers would become sick and sometimes die.

Such was the case in 1880, when two men ordered some beer in Prahran, Victoria, Australia. A bottle of ale was obtained from a store owner, and the men poured it into two glasses. When they took a drink of the stuff, it proved to be too bitter to finish off. Soon afterward, the men began to feel sick and showed signs of strychnine poisoning. They were taken to the hospital, and under good medical care, they survived the poisoning. When the brewer was informed of the incident, he was able to remove all bottles of his ale from the stores, thus preventing any more poisonings from the bad batch.

In 1892, Catherine Waddell of Maryborough, Queensland, was not so fortunate. After drinking a small quantity of very bitter ale, she panicked, believing she had been poisoned by strychnine, and died shortly after.

A postmortem examination convinced a doctor that the silly woman had died from fear, and the case might have been dismissed if law enforcement hadn’t collected the bottle of ale. It was found to contain the equivalent of 12 grains of strychnine. A half-grain of strychnine was enough to kill a healthy person, so the deceased woman was not wrong when she announced that she had been poisoned.

Further investigation into her death showed that the bottle had not been properly washed at the brewery and that it must have had the strychnine residue in it when the ale was bottled.[2]

8 Hydrophobia: Not Real

Hydrophobia and rabies were often used interchangeably during the 1800s, but what is most fascinating about this deadly disease is that some doctors of the period believed there was no such thing as hydrophobia. For example, in 1897, a paper was read by Dr. Irving C. Rosse before the American Neurological Association, and the doctor “did not hesitate to speak of hydrophobia as a purely imaginary disease, with no more reality to rest upon than . . . witchcraft . . . ”[3]

In spite of the doubt as to the existence of rabies, cases were being reported in the newspapers, especially when it came to pets and wild animals. By 1899, doctors were publishing articles once again, assuring the public that hydrophobia was indeed a real disease and that it could be spread from animal to animal and animal to man.

It is not known how many people died from rabies simply because so many doctors did not believe that the disease actually existed.

7 Drowning Dogs


An article published in a Wisconsin newspaper in 1876 gave the following description of “healthy” boys in nature:

The boy is a part of Nature. [ . . . ] He uses things roughly and without sentiment. The coolness with which boys will drown dogs or cats, or hang them to trees, or murder young birds, or torture frogs or squirrels, is like Nature’s own mercilessness.

With this attitude, it is little wonder that drowning dogs was a common method for getting rid of abandoned or lost pets.

The local dog catcher of Saint Paul, Minnesota, announced in 1893 that he was no longer going to kill unlicensed dogs with “charcoal gas.” Instead, he was going back to drowning them.[4] The US wasn’t the only country drowning unwanted dogs. It was reported in 1891 that stray dogs found in South Brisbane would also be drowned.

6 Infanticide


A Melbourne newspaper published an article in 1897 asking what the government could possibly do to stop the growing trend of killing unwanted babies.[5] Whether it was family members murdering the infants or their lives being taken by the baby farms, something certainly had to be done because the bodies of babies were being discovered at an alarming rate on land and in water.

In 1873, a young boy fishing in Tasmania got his line caught on something. He struggled with it and eventually pulled up a wooden box held together by a bit of chain. When the box was opened, the body of an infant was discovered inside.

Three infants were discovered in New South Wales in 1887 in a single day. The first, less than a week old, was wrapped in shirting before being left in the roadway. The second was a five-day-old female, left in a paddock. The third was a newborn male, left on a vacant lot. All three of the infants had either string or tape wrapped around their necks to cut off their air supply. Fortunately, the third infant was still struggling to breathe when he was found and was immediately revived and taken to a hospital.

5 The Grinning Death

Lockjaw, more commonly known as tetanus, was not a preventable disease until the early 20th century. Before the invention of the vaccine, people died horrible “grinning deaths” when the tetanus bacteria entered their blood stream. Victims of lockjaw would be overcome with vicious muscle spasms and seizures, until death gave them mercy.

A lockjaw epidemic was reported in the summer of 1899 in New York. Between July 4 and July 22, there were 83 deaths from the disease, caused by “careless handling of fireworks and toy pistols.”[6] The mortality rates at that time were anywhere from 85 to 90 percent, meaning that anyone who was punctured by contaminated material was highly likely to die.

Doctors were searching for a cure for the disease, but it was with little success. One doctor in Tours, France, reported that “the symptoms of tetanus were relieved immediately by nerve stretching,” but the patient died a few hours after the ordeal.

4 Swallowing Pins


Women kept a large assortment of pins handy in the 19th century. While mending clothes, they would often hold the pins in their mouths, leading to numerous reports of people accidentally swallowing them. For example, in 1897, a 56-year-old housemaid swallowed a brass pin. She was taken to the hospital but died six weeks later after the pin had perforated her intestines.

Children were also victims of pin swallowing, but the subject was treated almost nonchalantly in newspaper reports. For example, in 1881, it was reported that a boy just coughed up a pin he had swallowed six years previously.

In another case, also reported in 1897, an infant swallowed an open brass safety pin. The parents watched over him for the first few days but quickly forgot about the whole thing until six months later, when their boy started coughing. When the baby was picked up, “he coughed up considerable blood, and with it came the long looked for pin. The pin was badly corroded and blackened.”[7]

3 Carcasses Dumped Into Bay

New York City had a tremendous problem with animal carcasses, as reported in 1870. The New York Rendering Company and other contractors would collect the bodies of cats, dogs, horses, and the remnants left over from the butcher shops and dump them all into the Lower Bay. There were so many dead animals that they were washing up on the shores. Tenants who lived along the Hudson River were getting sick. At any time, up to 15 dead horses could be seen floating, bloated, in the water.

People started to complain about the awful smell and gruesome sights. It was then decided that the carcasses had to be dumped outside the city limits, but they continued to wash up on shore, and “Gothamites who go down the bay for a sail often [had] a very disagreeable experience of dead horse odors after they [returned].”[8]

2 Gruesome Experiments On People And Animals


There was very little oversight when it came to medical experiments in the 19th century. Both people and animals, either voluntarily or involuntarily, were used in procedures that we would rightly view as cruel or gruesome by today’s standards.

In 1893 in France, a 45-year-old woman suffered from “a tumor in the frontal bone.” Her doctor had to cut open her skull and remove the tumor. He was then faced with the problem of what to use in place of the original skull bone. As part of a novel experiment, he had a piece of skull bone removed from a living dog and, “taking antiseptic precautions,” fitted it into the woman’s head.

In 1889, there was also a growing experimental trend of injecting people with “matter from certain glands of the lower animals.” This was done to increase vitality in aging people.

Animals were at the mercy of medical doctors. While some countries had laws against certain cruelties to animals, it was still being decided whether those laws should apply to doctors.

In one case that went to trial in 1888 in Victoria, Australia, a doctor was experimenting on dogs. He would make an extract from meat and inject it under the dogs’ skin. His goal was to see if dogs could forego ingesting food through the stomach. The dogs were given as much water as they wanted, and the doctor claimed that the dogs were not experiencing any pain.

At the close of the trial, it was decided that although some cruelty had been inflicted on the dogs, the bench could not determine the exact extent of the suffering involved. The doctor was told to register and pay fees to continue his experimentation on animals.[9]

1 Wearing Items Made Of Human Skin


Wearing gloves or belts made of human skin is something that would make most of us shudder, but it was actually quite common long ago. An article published in 1899 tells us that the skin was taken from the bodies of the poor who were not claimed by friends or relatives when they passed on.

Unclaimed bodies were often handed over to the medical schools, where they were dissected. Medical students would then collect the skin and sell it to tanners and jewelers. There was a high demand for items made of human skin in the United States, and the skin sold for a good price because it was in short supply.

Perhaps one of the more gruesome stories of wearing leather from human skin was published in 1888. A physician living in New South Wales had his shoes made from the skin of Africans. According to him, Africans made the softest and most durable leather.

The man had no ill feelings toward Africans and was a foreign-born US citizen who fought in the Civil War to free African Americans from slavery. In his own words, “I would use a white man’s skin for the same purpose if it were sufficiently thick and if anyone has a desire to wear my epidermis upon his feet after I had drawn my last breath, he has my ante-mortem permission.”[10]

Elizabeth, a former Pennsylvania native, recently moved to the beautiful state of Massachusetts where she is currently involved in researching early American history. She writes and travels in her spare time.

10 Important Wars of the 20th Century (https://listorati.com/10-important-wars-of-the-20th-century/, Thu, 25 Jan 2024)

You’d think that as the world got more modern, we’d find ways to solve our differences that didn’t involve blowing each other’s brains out. And, luckily, that’s true! But there was still plenty of horrific violence, including the deadliest wars in human history, in the 20th century, when mankind entered a period of shocking technological advancement and increasing levels of interconnectedness. Let’s take a look at some of the wars that defined the 1900s. 

10. World War I 

Often referred to as the Great War, World War I (1914-1918) was triggered by the chain reaction activation of a complex web of political alliances, militarism, and imperial rivalries, all kicked off by the assassination of Archduke Franz Ferdinand of Austria-Hungary. 

Nationalistic pride and 19th century tactics, however, soon clashed with the reality of modern warfare. The carnage was so severe that it forced men on all sides to dig down into the mud in order to survive. Trench warfare has thus come to symbolize the conflict as a whole. Machine guns, advanced artillery, poison gas, tanks, and aircraft all made their battlefield debuts and contributed to staggering casualties and a seemingly endless succession of failed offensives on all sides. The Eastern Front witnessed fluid and dynamic battles, while the conflict in the Middle East, Africa, and Asia added global dimensions. The war ended with an Allied victory in 1918, the signing of an Armistice, and the Treaty of Versailles in 1919. To this day, the war is often looked at as even more wholly unnecessary and tragic than other wars.  

9. Russian Civil War 

The Russian Civil War unfolded between 1918 and 1922, and was kicked off by the Russian Revolution of 1917. It marked a struggle for power and ideology among various factions, primarily the Bolshevik Red Army, anti-Bolshevik White Army, regional nationalist forces, and foreign interventionist troops.

The conflict began with the Bolsheviks, led by Vladimir Lenin, trying to consolidate national power after the October Revolution in 1917, during World War I. Opposition to the Bolsheviks coalesced into the White Army, composed of diverse elements ranging from monarchists to liberal democrats, seeking to resist a communist takeover. 

The frontlines of the Russian Civil War stretched across vast expanses, from the western borders to Siberia. The Red Army, despite facing internal and external challenges and numerous setbacks, ultimately secured victory. In place of the fallen Russian Empire rose the Soviet Union, which would last until the end of the Cold War in 1991.

8. Spanish Civil War

In the same way that the Mexican-American War served as a proving ground for many of the tactics and commanders who would later define the American Civil War, the Spanish Civil War, fought from 1936 to 1939, gave the great powers of Europe a chance to test their mettle before being thrown into the furnace of World War II.

The war erupted when General Francisco Franco, leading a coalition of conservative, monarchist, and fascist forces, sought to overthrow the democratically elected Second Spanish Republic. The conflict was characterized by a deep ideological divide, with the Republicans, a coalition of left-wing and anti-fascist forces, opposing Franco’s Nationalists. The International Brigades, composed of volunteers from various countries, including anti-fascist activists and intellectuals, joined the Republicans. Meanwhile, Nazi Germany and Fascist Italy supported Franco’s Nationalists. 

Like many wars in the first half of the 20th century, this one was particularly savage. It was defined by roughly equal numbers and frontline stalemate until close to the end of the conflict, when the Nationalists surged forward and destroyed the remaining Republicans. It was one of the first times the world saw the brutality of fascism firsthand.

7. World War II 

The largest, most widespread, and deadliest war in history, World War II (1939-1945) was defined by the violent expansion of the Axis Powers (Nazi Germany, Fascist Italy, Imperial Japan, and their allies) followed by counterattacks by the Allies (Britain & Commonwealth forces, France, the Soviet Union, China, the United States, and their allies), and Allied victory. The vast majority of the world was involved. As many as 85 million people, a large majority of them Allied civilians, were killed. Attacks against civilians were carried out by both sides. The Allies did this mainly via aerial bombing, including firebombing and nuclear bombing, as a way of breaking the enemy’s will to resist. The Axis mainly used ground forces to carry out bloody genocides, including the Holocaust, in which 6 million Jews were murdered by the Nazis and their partners. 

The war raged from the Pacific Ocean to the jungles of Southeast Asia, the Russian steppe, the Sahara, and the beaches of France. It saw massive technological leaps forward, brought an end to centuries of European geopolitical dominance, and saw the rise of the United States and the Soviet Union as rival superpowers, making it arguably the single most significant event in human history.

6. Chinese Civil War

The Chinese Civil War raged between 1927 and 1949, between the Nationalist Party and the Communists. The initial phase of the Chinese Civil War began in 1927, when Chiang Kai-shek, leader of the Nationalists, turned against the Communists, leading to a violent purge known as the Shanghai Massacre. The conflict then entered a period of intermittent truces and alliances, with both sides nominally cooperating against the Japanese invasion during the Second Sino-Japanese War (1937-1945).

After Japan’s defeat in World War II, the resumption of the civil war in 1946 saw the Nationalists and Communists vie for control of China. The Communists, led by Mao Zedong, garnered widespread support among peasants, while the Nationalists struggled with corruption and inefficiency. The decisive turning point came in 1949 when the Communists emerged victorious, leading to the establishment of the People’s Republic of China on October 1, 1949. Chiang Kai-shek’s Nationalists retreated to Taiwan, where they continued to rule, while the mainland underwent significant political, economic, and social transformation under Communist rule. The Chinese Civil War had profound implications for the course of Chinese history and the global balance of power during the Cold War era and beyond. 

5. Korean War

The Korean War unfolded from 1950 to 1953 on the Korean Peninsula. The war began when Communist North Korean forces, backed by the Soviet Union and China, invaded South Korea, which was supported by the United Nations and the United States.

The conflict was triggered by the political division of Korea after the end of Japanese occupation in World War II, with the Soviets occupying the north and the United States occupying the south, divided along the 38th parallel. The North, led by Kim Il-sung, sought to reunify the peninsula under communist rule, while the South, led by Syngman Rhee, aimed to maintain independence.

The war saw significant involvement from international forces, with the United Nations sending a multinational coalition, primarily composed of U.S. troops, to support South Korea. In response, China intervened on the side of North Korea, escalating the conflict. The war’s frontlines fluctuated along the 38th parallel, with intense fighting and trench warfare reminiscent of World War I. The armistice agreement signed in 1953 established a demilitarized zone near the original border, solidifying the division between North and South Korea. However, a formal peace treaty was never signed, and the Korean Peninsula remains divided to this day. 

4. Six-Day War

The Six-Day War, a brief but transformative conflict in the Middle East, took place from June 5 to June 10, 1967, involving Israel and its neighboring Arab states. Tensions had been escalating due to territorial disputes, political friction, and military buildups in the region.

The immediate catalyst for the war was the closure of the Straits of Tiran by Egypt, effectively cutting off Israel’s access to the Red Sea. Additionally, Arab rhetoric and troop movements had heightened the sense of an impending conflict. In a pre-emptive strike, Israel launched Operation Focus, targeting Egyptian airfields, which resulted in the destruction of a significant portion of the Egyptian air force.

In the ensuing six days, Israel swiftly secured victories on multiple fronts. Israeli forces seized the Sinai Peninsula from Egypt, the West Bank and East Jerusalem from Jordan, and the Golan Heights from Syria. The war reshaped the political and territorial landscape of the region, marking a turning point in the Arab-Israeli conflict.

The aftermath of the Six-Day War had lasting implications. Israel’s victory significantly expanded its territorial control, leading to occupation and settlement activities in the captured territories. However, the war also intensified regional hostilities. 

3. Iran-Iraq War 

The Iran-Iraq War, one of the longest and bloodiest conflicts of the 20th century, took place between 1980 and 1988, involving the Islamic Republic of Iran and Saddam Hussein’s Iraq. The war had deep-seated roots in territorial disputes, historical grievances, and ideological differences.

The conflict began when Iraq, under Saddam Hussein, invaded Iran in September 1980, seeking to exploit what it perceived as Iran’s weakened position after the Iranian Revolution and the subsequent political turmoil. The war quickly escalated, with both sides engaging in WWI-style trench warfare. The conflict saw the extensive use of chemical weapons, causing significant human suffering and long-term health consequences.

The war’s dynamics were complex, with shifting alliances and international involvement. Various countries supported either Iran or Iraq, with the United States and the Soviet Union supplying arms to Iraq (weirdly enough) at different points in the conflict. The war finally concluded in 1988 with a UN-brokered ceasefire.

The Iran-Iraq War had profound consequences for both nations. It resulted in immense human and economic losses, with estimates of casualties ranging from hundreds of thousands to over a million. 

2. Vietnam War

The Vietnam War, spanning from 1955 to 1975, was a protracted conflict involving Ho Chi Minh’s Communist North Vietnam, supported by the Soviet Union and China, and Ngo Dinh Diem’s South Vietnam, backed by the United States and its allies. The war was rooted in the struggle for control of Vietnam, complicated by Cold War geopolitics and ideological differences.

The war witnessed guerrilla warfare tactics by the communist forces, including the Viet Cong, and intensive bombing campaigns by the United States. The use of chemical defoliants, most notably Agent Orange, had severe environmental and health consequences. The conflict also spilled over into neighboring countries, with the US conducting arguably illegal bombings in Laos and Cambodia.

As public opposition to the war grew in the United States, a gradual withdrawal of American troops began in the early 1970s. The Paris Peace Accords in 1973 aimed to end US involvement, leading to a ceasefire. However, the war continued between North and South Vietnam, culminating in the fall of Saigon in 1975, leading to reunification under communist rule.

1. Cold War 

Although (thankfully) not a war in the traditional sense, the Cold War was a geopolitical, ideological, and military standoff between the United States and its allies, representing the democratic and capitalist Western bloc, and the Soviet Union and its allies, representing the communist Eastern bloc, that persisted from the end of World War II in 1945 until the dissolution of the Soviet Union in 1991. This ideological confrontation was characterized by intense political and military rivalry, violent proxy wars around the globe, a nuclear arms race that had the world on edge for decades, a space race, and ideological competition between capitalism and communism.

The origins of the Cold War can be traced to the differing postwar visions of the Allies. While they had been wartime allies against Nazi Germany, ideological differences and competing spheres of influence soon emerged. The division of Germany, the establishment of the Iron Curtain in Europe, and the containment policy formulated by the United States deepened tensions.

The Cold War concluded with the dissolution of the Soviet Union in 1991, symbolized by the fall of the Berlin Wall in 1989. The United States emerged as the sole superpower, and a new era of international relations began.

10 Unusual Deaths of 21st Century https://listorati.com/10-unusual-deaths-of-21st-century/ https://listorati.com/10-unusual-deaths-of-21st-century/#respond Fri, 17 Nov 2023 13:53:32 +0000 https://listorati.com/10-unusual-deaths-of-21st-century/

This is a list of 10 unusual deaths: unique or extremely rare circumstances of death, each noted as being unusual by multiple sources.

Here are the 10 truly unusual deaths of the 21st century.

10. Death of Brittanie Cecil, struck by a deflected hockey puck.


Brittanie Nichole Cecil, a 13-year-old American hockey fan, died from injuries suffered when a puck was deflected into the stands and struck her in the left temple at Nationwide Arena in Columbus, Ohio, on March 16, 2002. It was the first fan fatality in the NHL’s 85-year history.

9. A girl died when she was sucked down the intake pipe of a pool.


Erika Tomanu, a seven-year-old girl in Saitama, Japan, died when she was sucked head first 10 meters down an intake pipe by the powerful pump of a water park’s current pool. Even though the grille covering the inlet had come off, lifeguards considered the pool safe to swim in. It took rescuers six hours to recover her body.

8. A South Korean man died after playing a video game online for 50 consecutive hours.

Lee Seung Seop, a 28-year-old from South Korea, died after playing the video game StarCraft online for almost 50 consecutive hours.

In 2005, Lee visited an Internet cafe in the city of Taegu and played StarCraft almost continuously for fifty hours. He went into cardiac arrest and died at a local hospital. A friend reported: “…he was a game addict. We all knew about it. He couldn’t stop himself.” About six weeks before his death, his girlfriend, also an avid gamer, had broken up with him, and he had been fired from his job.

7. British man in webcam suicide.


Kevin Whitrick, a 42-year-old British man, committed suicide by hanging himself live in front of a webcam during an Internet chat session.

He was in a chatroom on PalTalk and was joined by about 60 other users in a special “insult” chatroom where people “have a go at each other”. He stood on a chair, punched a hole in his ceiling, placed a rope around a joist, tied the other end around his neck, and then stepped off the chair.

Some people thought this was a prank, until his face started turning blue. Some people in the chat room egged him on while others tried desperately to find his address. A member in the room contacted the police, who arrived at the scene two minutes later.

6. Woman died of heart attack caused by shock of waking up at her OWN funeral.

Fagilyu Mukhametzyanov pictured with her husband Fagili.

Fagilyu Mukhametzyanov, from Kazan, Russia, had been wrongly declared dead by doctors after a suspected heart attack. The horrified 49-year-old began screaming when she realized mourners were getting ready to bury her. She was rushed back to the hospital, where she died of a heart attack brought on by the shock.

Now her husband is suing the hospital. “I am very angry and want some answers. She wasn’t dead when they said she was, and they could have saved her,” he said.

5. A man who was killed by his own lawn mower.


A man died while mowing a lawn outside a train station in Kungsbacka in southern Sweden. “We think he was mowing the grass on what turned out to be a severely steep incline,” a witness said. The man fell with the machine, went under the mower, and was badly ravaged by the blades.

4. The bride who drowned during her nuptial photo shoot.

Maria Pantazopoulos and her new husband Billy laugh with friends during their wedding.  But just over two months later, she drowned.

A bride who drowned during a photo shoot in her wedding dress had been in the water on purpose, taking part in a popular ritual called “Trash the Dress,” it has emerged. In August 2012, while having her wedding pictures taken, she tumbled from a cliff into a waterfall while wearing the dress.

Her body was recovered about four hours after she slipped from a rock and fell into Darwin Falls in Rawdon, north of Montreal. She had chosen the site as the backdrop for her wedding pictures, just over two months after her wedding.

3. Newlywed woman killed in freak explosion caused by horse.

Erica Marshall with her husband at their wedding.

Erica Marshall, a 28-year-old British veterinarian in Ocala, Florida, died when the horse she was treating in a hyperbaric chamber kicked the wall, released a spark from its horseshoes and triggered an explosion.

2. A husband who was ‘raped to death’ by five wives.


Uroko Onoja, a Nigerian businessman, died after being forced by five of his six wives to have s.x with each of them. Onoja was caught making love with his youngest wife by the remaining five, who were jealous of him paying her more attention.

The remaining wives demanded that he also have s.x with each of them, threatening him with knives and sticks. He had s.x with four of them in succession but stopped breathing before having s.x with the fifth.

1. University student beheaded in go-kart crash.


Tuğba Erdoğan, a 23-year-old Turkish university student, died on February 8, 2013, in a go-kart accident at a circuit located in Adapazarı, a town east of Istanbul. Her scarf became caught in the kart’s rear wheel, and she was decapitated.

10 Ghastly Prison Practices Of The 19th Century https://listorati.com/10-ghastly-prison-practices-of-the-19th-century/ https://listorati.com/10-ghastly-prison-practices-of-the-19th-century/#respond Sat, 07 Oct 2023 11:39:38 +0000 https://listorati.com/10-ghastly-prison-practices-of-the-19th-century/

Most of us are pretty keen on staying out of prison. We have good reason, too. Part of the idea behind imprisonment is to deter criminal offenses. But this wasn’t always the case in Western societies. In the 16th and 17th centuries, a prison existed only as a place to hold offenders before their trial was conducted. Once punishment, be it corporal or capital, was carried out, the prisoner was no longer held. The closest thing to a modern prison was a house of correction, a place to reform beggars and unwed mothers, or a debtors’ prison, a place to keep people until their debts were paid. But in the late 18th century, the prison population of Great Britain exploded.[1] The Revolutionary War in America cut them off from their prisoner dumping grounds, and it was roughly another decade before Australia’s biggest import became convicts.

During this decade in which the British had to keep their prisoners to themselves, the public began to take notice of the condition of the prison system. In 1777, John Howard took an inventory of prisons and reported that the whole system was a mess. Prisoners were heaped on top of one another regardless of gender, age, or illness. Many died from violent attacks or the rampant spread of disease. Jailers were corrupt. They charged prisoners exorbitant fees while keeping them locked up with no way to make a living. Howard suggested the model that would inform imprisonment during the 19th century, focusing on security, health, separation, and reform.

An outcry for prison reform would drastically shape the establishments of that century toward reforming convicts rather than keeping them locked up indefinitely or physically punishing them. Ideas would be batted back and forth across the pond and would lead to interesting new ways of keeping prisoners. However, the new ways wouldn’t necessarily be less brutal or exploitative than the old. In fact, they would be much, much worse in ways that the 18th-century prisoner could never have imagined.

10 Sanitation


Despite the push to safeguard prisoner health, the squalid conditions persisted long into the Victorian era. Towns and cities were growing at rates that caused huge infrastructure problems in already cramped places like London. The biggest question on everyone’s mind was sanitation as human waste literally filled the streets. The second biggest question on everyone’s mind was how to control the criminal population as they also filled the streets. More people meant more anonymity for criminals, a luxury most were experiencing for the first time. Before the majority of the population started to worry about things like prison reform, they were worried about imprisoning as many criminals as possible as fast as possible. This led, initially, to cramped quarters where prisoners were practically on top of one another with little to no waste removal or clean water.

Outbreaks of typhus ravaged the small holding prisons known as gaols, so thoroughly that it earned the nickname “gaol fever.” A prisoner’s chances of dying doubled when they entered the building. Even though the 19th century brought reform, the prisons built then weren’t much more sanitary. Sing Sing, opened in New York in the 1820s, started off poorly from the very beginning.[2] Situated in a hollow between the Hudson and a hillside, it was doomed to be damp even when it wasn’t flooding. Prisoners were kept in tiny cells of stale air with a bucket of their own waste in the corner. Worse still, no pipes in the building had a double bend to stop filthy air coming back in, or proper ventilation to let it out. In the winter, when windows were closed, the only air supply came from sewage pipes. Every whiff of air in the place would have been suffused with human filth. Besides being a huge health hazard, it must have been a true olfactory nightmare.

9 Overcrowding


The poor sanitation stemmed directly from the overcrowding in 19th-century prisons. Initially, overpopulation was solved in London by shipping inmates to far-off colonies. But by the 1830s, both Australia and the United States refused to be dumps for Great Britain’s criminals. That was one more thing they didn’t need to worry about while settling new communities and unsettling indigenous peoples. As this form of exile was taken off the table for Great Britain, imprisonment itself was becoming an acceptable form of punishment. As can be imagined, this didn’t help the overcrowding in the least. Ninety new or expanded prisons cropped up between 1842 and 1877.[3]

Around the mid-Victorian period, two types of prisons had formed. The first was the county and shire gaols, small lockups and houses of correction administered by justices of the peace. The second type of prison was the convict gaol. These were bigger prisons run by the central government in London. They were initially large buildings set into the heart of London but were gradually built more and more near ports. This was because Great Britain had a unique solution to their overcrowding problem that efficiently recycled their earlier system of exile. When prisons on land became too stuffed to fit another inmate, massive decommissioned warships were refitted to house prisoners. They were aptly named “hulks.”

8 Hulks

These huge ships-turned-prisons didn’t disappear when banishing criminals to colonies became impossible. Instead, they evolved into traveling labor camps that operated on much of the same protocol as naval vessels.[4] Prisoners were not locked in tiny cells but instead were locked in communal decks at night where they slept in hammocks. Inside, they were free to walk around, converse, argue, have sex, and trade illegal goods among themselves. Later on in the operation of many hulks, evening classes where convicts learned to read and write became standard. During the day, the inmates of the warships would be mustered about to bathe, clean the ship, cook, eat, and go ashore to work at the ports. The work would easily qualify as hard labor. Prisoners would unload ships and dredge canals while wearing leg irons.

Given the nightly freedom and the chance to learn new skills, many free men contemplated getting arrested for the opportunity to work on a hulk. Prisoners got three meals a day and sometimes got pay for their work when they were released. However, conditions aboard the ships were not more sanitary than off, and the food was disgusting and monotonous. Breakfast would be toasted bread and a cup of cocoa, and dinner would be 6 ounces of meat with a large helping of bread and potatoes. Fruits and vegetables were rarely part of the plan. The water used to make cocoa and clean off was often pulled up from the Thames, which isn’t exactly known for its sparkling, clean waters. Deaths from cholera and work injuries were common, but officials refused to admit it. The freedom enjoyed by prisoners had its drawbacks as well—being unsupervised in a large deck stuffed with various criminals isn’t everyone’s idea of a good time.

In 1857, the last of the hulks was decommissioned and burned. Originally brought in to help with overcrowding, hulks actually ended up making the problem worse. Many assumed that the hulks would always be there to take on excess prisoners, so jails and prisons on land were often built too small and cramped. Bedford was one prison where the staff had assumed they would always be able to cart local convicts off somewhere. When exile and prison hulks disappeared, the gaol was far too small for the local population.

7 Debt Spirals


Debtors’ prison was a fear somewhere in the mind of every person, poor or wealthy, in the 19th century. These prisons were run for profit and were viewed as an investment by those who built them. Thus, they were run like businesses. Prisoners were given the opportunity to pay for better lodgings and food while working on their debts, but the poorest were forced into damp cells with no windows. These prisoners would often be children or the mentally ill.[5] Sometimes, entire families would end up in debtors’ prison only to be separated by gender, age, and monetary value. The spread of disease went entirely unchecked, and sanitation was less than a passing daydream. If the debtor was lucky enough to pay back their debts, they would still have to pay off the jailer’s fees.

Yes, being in debtors’ prison meant accruing fees during your stay. On top of paying for better lodging and food if one could, prisoners had to pay even if they couldn’t afford better accommodations. This meant that debtors would be racking up new debts constantly, including the rent on the damp, disease-ridden cells and board for the diet of bread dissolved in water. The jailers weren’t paid by the owner of the jail or the state, so their pay came from fees imposed on the prisoners. This led to a system of corruption in which prisoners were forced to pay for every single service provided, from having food and water delivered to being shackled in irons as punishment. The fees weren’t limited to the debtors’ prison, either. Every prison, gaol, and lockup at the time had a system of fees that ensured the destitute would die in a debtors’ prison.

6 The Separate System


This system came from the United States, where there had been significant debate on the matter of prison reform. There was a prevailing fear that the institutions of society and family were breaking down in the 1820s, which fueled a push toward rehabilitation of convicts. Most thought that criminals were simply lacking in discipline, so the focus shifted to teaching it.[6] Earlier prison reformers had called for humanitarian efforts to improve the conditions of prisoners. Reformers felt that those efforts had failed to make any change, so harsher methods were needed. The harsher methods that they had in mind were a little like a grown-up time-out. Prisons were reformed so that inmates were forced to sit alone and think about what they’d done.

Proponents of this system felt that criminals had come to their life of crime through wicked influences, so the solution was to limit their influences going forward. Prisoners were isolated from their peers under this system. Other inmates were considered a wicked influence, and officials did their absolute best to limit that influence. Their absolute best was bizarre and dehumanizing. When gathered for exercise, prisoners were forced to wear caps that covered their faces, and they were assigned numbers to replace their names. Prison yards kept long ropes knotted at 4.6-meter (15 ft) intervals. That was how far away from one another they wanted inmates at all times, even when they exercised in silence. The second leg of the separate system was to expose inmates liberally to good influences. In the 19th century, that meant Christianity. Chapel was mandatory, and the only time prisoners were allowed to use their voices was while singing hymns. But even there, inmates weren’t able to sit next to one another. Instead, they were seated in tiny cubicles with a wall between each two inmates. This system, unsurprisingly, led to more than a few cases of insanity, delusions, and suicides.

5 The Silent System

The silent system existed alongside the separate system. Silent system proponents didn’t believe that criminals would be or even could be reformed. Their hope was that prisons could scare potential offenders and scar repeat offenders so much that they would rethink their choices in the future. The assistant director of prisons in the United Kingdom, Sir Edmund du Cane, made a promise to the public that prisoners would get three things during their incarceration: hard bed, hard fare, and hard labor. The usual hammocks prisoners used before were swapped for hard planks of wood with minimal padding, and the food was intentionally bland. Hard labor was the most prominent feature of silent system prisons, and it was mandatory whether or not there was actual work to be done.

The Auburn Prison in New York was a model for the prisons that would adopt this system and eventually became known for its own unique take on it.[7] The Auburn System, as it was later recognized, involved intentionally breaking a prisoner’s spirit. The sort of striped uniforms you generally only see in costume shops now originated at Auburn as a way to distinguish the prisoners from the guards and humiliate them at the same time. Silence was enforced with a whip, prisoners were marched around in lockstep so that they couldn’t look at each other, and even the prison’s keepers never spoke to prisoners. Instead, they gave orders by tapping their canes on the ground. The work was often considered worse than the whippings. A short day was ten hours, and the work was always monotonous and sometimes even pointless. Eventually, the beatings and whippings were outlawed. The officials at Auburn were quick to replace them with more creative punishments, including ice water showers and tying inmates’ hands to a yoke hung behind the neck.

Auburn was considered wildly successful at the time. The prisoners were well and truly broken by the monotony and silence, so rebellions were few and far between. Prison shops were lucrative, and the profit often covered the upkeep. Prison officials from all over the US and Europe visited to take notes on the system used there. Civilians also flocked to Auburn to observe the silent prison for themselves. The officials quickly started to charge admission fees, which they had to double just to cut down on the number of visitors. This brutal system wasn’t abolished entirely until 1914.

4 The Rotary Prison

The change from keeping prisoners crowded together in larger rooms to keeping every inmate in their own tiny isolation chamber meant that new architecture had to be explored. Early on, in 1791, Jeremy Bentham published plans for the panopticon, a round prison with cells facing inward and a guard tower in the center. It allowed fewer guards to keep an eye on more prisoners at one time, minimizing the chances of escape. Just shy of a century later, the rotary jail was introduced. It was a complete inversion of the panopticon, in that the cells were at the center, and each was shaped like a single slice of a whole pie. As the name “rotary jail” suggests, the cells could be rotated by a hand crank. Only one cell could be opened at a time because the door would need to line up with the opening in the bars. The biggest and most famous of these merry-go-round jails was the Pottawattamie County Squirrel Cage Jail in Council Bluffs, Iowa.[8]

The Squirrel Cage was the perfect example of a rotary jail in every way. A small town population in a rural, lawless stretch of the US didn’t want to pay for a huge conventional jailhouse that would need to be staffed by several jailers in perpetuity. Instead, they opted for one jailer and a massive 45-ton rotating drum cut up into cells like slices of a layer cake. The town boasted the biggest of the rotary jails, having three levels instead of the usual two. That proved to be one of its biggest problems, and it was the reason that the jail was dubbed a huge failure within the first two years of its existence. The 45-ton drum balanced precariously on a 0.9-meter (3 ft) square base that was itself balanced precariously on the bare soil. Whenever the ground shifted under its massive weight, the entire thing would jam and trap inmates in their cells.

But that wasn’t nearly where the problems with the rotary jail started. The tiny spaces and isolation was still driving inmates insane, but things were even more dire when the jail had to pack people two to a cell. Petty criminals were housed right alongside ax murderers. Being trapped in a tiny, wedge-shaped cell would probably get to anyone, but being literally stuck with a vicious murderer when the drum jammed would easily have been a waking nightmare. Inmates would do anything to get out of the Squirrel Cage. One inmate, Willie Brown, died by eating glass while trying to get a medical transfer to anywhere else. Others stuck their arms or legs through the bars while the rotary jail moved to injure or amputate the limb. Still others reached through the bars in their sleep only to be rudely awakened when the limb was lopped clean off.

With the residents of Council Bluffs still reluctant to pay for a proper modern jail, this place existed and ran right up to 1960, when the fire marshal officially shut it down. An inmate had died in a cell, and due to the jammed drum, no one could reach the body for two days. The residents, by the way, still didn’t want to pay for a new jail and just let inmates run free in the building while the jailer watched TV in his office.

3 The Treadmill

Rotary jails were the strange and complicated answer to prisoner isolation and limited resources, but another rotating device would be used to provide never-ending, monotonous work for inmates under the silent system. The treadmill, now better known as the most boring exercise machine in any gym, once put that mind-numbing effect to use for torture.[9] The first treadmills were huge and resembled a StairMaster more than a running machine. Invented in England in 1818, the treadmill was a punishment designed to push prisoners to the brink of collapse without quite killing them. Famously, it almost killed Oscar Wilde, who left prison weak and lingered just three years before he died. What made the treadmills of the 19th century so different from our own?

Most notably, the shifts were about six to eight hours long. That’s more than ten times the length of a brisk 30-minute workout. Inmates also didn’t have the luxury of setting their own pace or incline. Each climbed 762 meters (2,500 ft) per hour with no exceptions. Six to eight hours of that could kill on its own, so the prisoners worked the treadmill in pairs: One would climb for two minutes, and then the other would climb while the first rested. Rather than truly resting the prisoners, this arrangement seemed to keep them on death’s door without pushing them over the threshold.

The treadmill was initially a literal mill that could grind grain into flour to help support the prison system, but many did nothing at all. This monotony and pointlessness was exactly the aim of the treadmill. Prison guard James Hardie credited the contraption with breaking even the most defiant inmates in New York. He wrote chillingly that it was the machine’s “monotonous steadiness, and not its severity, which constitutes its terror.” Convicts would step off of a shift on the treadmill weakened and vacant, only to go back to their tiny, isolated cells. Despite the glowing reviews from prison staff, American prisons phased the treadmill out in favor of bricklaying, rock-breaking, and cotton-picking. The practice was outlawed in England in 1902, after it was finally acknowledged to be extremely cruel.

2 Picking Oakum

Installing a huge building full of stair-driven mills was pretty expensive. Not every prison could afford to add in a treadmill or a proper shop. But for prisons still looking to make a profit, convicts could be made to pick junk into oakum. “Junk” referred to old ropes coated in waterproof tar that could be teased out into bunches of fiber. The fiber would then be mixed with more tar or even grease to make a waterproofing paste, which was used to seal the gaps in the hulls of wooden ships.

Able-bodied prisoners would have to cut ropes into 0.6-meter (2 ft) sections and then beat those lengths with a mallet until the tar broke up. Once the tar was shattered, the junk would often be passed along to inmates who were deemed weaker: the elderly, women, and children.[10] They would be tasked with breaking the rope up into fibers. First, it would be attached to an iron hook that was held between the thighs. Then, inmates would use an iron nail, a scrap of tin, a knife, or, more often than not, their bare hands to break up the fibers. The ropes had to be uncoiled, unraveled, picked apart, and then shredded. Prisoners quickly learned that fingers made the task go fastest but left them with tar-stained fingers and open, dirty sores. Since oakum had to be traded for food, most opted to suffer the pain rather than starve.

Oakum-picking was often done in a workhouse, so some prison officials felt it allowed for too much socializing. Inmates often sat in rows under the watchful eye of a warder with a whip. There was little opportunity for socializing as it was, but the paranoia of prison officials was hard to calm. Many prisons across England would adopt the treadmill regardless of the expense, and picking oakum would be relegated to women and children. The mass switch to treadmills happened to coincide with the switch to iron ships. Where wooden ships were made of planks with gaps that needed to be sealed, the new metal ships could be welded shut. The demand for oakum plummeted, and prison staff decided at just that moment that stair-climbing was more moral than shredding one’s fingers on old rope.

1 The Crank

Some prison administrations felt that having inmates occupy the same space to work the treadmill or pick oakum was far too much mingling. When they wanted to keep them properly isolated, inmates had to do work alone in their cells. But officials had also noticed something they found very interesting: Inmates hated a pointless task more than a meaningful one. This presented them with an obvious solution: the crank.[11]

The crank was literally a crank that stuck out of a small wooden box that was usually set on a table or pedestal in the inmate’s cell. Despite its innocuous description, it was a truly soul-crushing monstrosity designed to exhaust inmates mentally and physically. Inside the box was a drum or paddle that turned nothing but sand and rocks. The axle on which the crank turned had a screw, which warders could tighten or loosen depending on how much punishment they wanted to mete out or, possibly, their mood that day. The screw would make the crank easier or harder to turn, and warders who came in to adjust it earned themselves the nickname “screws” for the suffering they brought with them.

A prisoner left in isolation with the crank usually didn’t have to worry about a beating if they just ignored the machine. Instead, they could worry about starvation. Each crank had a counter somewhere on the box that kept up with the number of turns. An inmate had to reach a certain number of turns before they were allowed to do basic things like eat and sleep. Most were expected to make at least 10,000 rotations a day—2,000 for breakfast, 3,000 each for lunch and dinner, and 2,000 more before bed. Some prisons would keep the inmate in the isolated crank cell well into the night if they had not completed the number of turns required, meaning that the inmate would miss supper and get very little sleep. The next day, that prisoner would have to operate the crank again while hungry and exhausted. Ultimately, the crank would be outlawed along with the treadmill, but not before it jellied the brains of many a Victorian prisoner.

Renee is an Atlanta-based graphic designer and writer.
