10 Times The Olympic Games Weren’t So Noble

Every four years, the Olympic Games attract millions of viewers across the globe who watch the best men and women go head-to-head in one of the greatest battles of mental and physical toughness on the planet.

There is something magical about the fact that any competitor who enters the stadium, ring, pitch, or velodrome could be crowned the champion athlete in their chosen discipline. But over their 120-year history, the Games have seen many people try to tilt the level playing field—or prevent equal participation—before the competition even began.

10 Pierre de Coubertin


Pierre de Coubertin, known by many as the founder of the modern Olympiad, has an international reputation for embodying fair play and the Olympic spirit. However, the International Olympic Committee (IOC) doesn’t like to talk about Coubertin’s rather offensive view of female competitors.

Along with a few other IOC members, Coubertin tried his best to make sure that women were not invited to compete at the Olympic Games. In a 1912 letter, he wrote, “In our view, this feminine semi-Olympiad is impractical, uninteresting, ungainly, and, I do not hesitate to add, improper.”

Coubertin’s sexist ideas stemmed from the view that the ancient Olympic Games tested male strength, endurance, and mentality—and that women had no place interfering in that. However, Coubertin didn’t get his way for long.

At the second Olympic Games, in 1900, women were allowed to enter the tennis and golf events. His feeling that “the Olympic Games must be reserved for men” was finally put to rest when women took part in boxing, the final male-only sport, at the 2012 Olympics.

9 British Cheating In London


The British team expected to win everything at the 1908 London Olympic Games and got a bit miffed when they didn’t. Biased officials, heavily weighted shoes in the tug-of-war, and a suspicious rerunning of the 400m after the Americans looked set to win all contributed to the cries of cheating from other nations.

Despite this, the Americans dominated in most events. One American newspaper reported, “The American victory at the Olympic Games in London, won in spite of unfairness and in some cases downright cheating, will be celebrated by a national welcome to the athletes on their return to New York.”

8 The Banning Of The Women’s 800m


In 1928, women were allowed to compete in Olympic track and field for the first time. Although the men had been competing on the track since the first modern games in 1896, women had fought for inclusion and celebrated joyfully when the Olympic stadium was finally opened up to them after 32 years.

However, the women didn’t expect that one of the events, the 800m race, would be banned for 32 years after the final in Amsterdam. The cause of the drama: women falling over with exhaustion after crossing the finish line.

Newspaper reports claimed that the women lay in desperate states after the race, having pushed their bodies to the absolute limit. Coaches and officials called for the event to be pulled, claiming that they were looking out for the safety of female competitors who could seriously damage their weak, feminine bodies if they ran too hard.

It wasn’t until the 1960 Olympics that everyone realized that they were being rather stupid and allowed the middle-distance competitors back on the track.

7 Cycling In The Marathon


It’s every athlete’s dream to win an Olympic gold medal in front of a home crowd. In 1904, American Fred Lorz appeared to do just that when he crossed the finish line of the marathon in first place before a St. Louis crowd. But everything was not quite as it seemed.

The sweltering temperatures and dusty conditions on the road caused vomiting, cramps, bleeding, and dehydration among many of the 32 competitors. Lorz was also struggling by the 9-mile mark and had slowed to a walk when, luckily for him, a passing car offered to give him a lift.

At mile 11, he felt recovered enough to go it alone. A few hours later, he crossed the line first to rapturous applause. The jubilant home crowd celebrated as Lorz was presented with the winner’s wreath by President Roosevelt’s daughter Alice.

Then an official interrupted the proceedings to reveal the truth. Lorz claimed that it was all a joke and that he had never really intended to accept the victory. But the officials didn’t see the funny side and banned him from athletics.

6 Losing To Win In Badminton

You usually expect competitors in the Olympics to do everything in their power to win, but four pairs in the 2012 women’s doubles in badminton didn’t quite agree. Due to the round-robin format of the first stage of Olympic doubles, teams were well aware that they had a better chance of advancing to the final if they lost an early round and faced an easier competitor in the following round.

Two South Korean pairs, one Chinese pair, and one Indonesian pair had already qualified for the knockout stage of the tournament. They attempted to purposefully lose their final round-robin matches to best position themselves in the match play rounds.

Despite the protests of these teams, the Badminton World Federation deemed their behavior unsporting and dishonest and threw them out of the tournament.

5 Marathon Legend Banned For Accepting Expenses


It is not often that athletes will campaign to reinstate a fellow competitor, but that is just what the marathon runners of the 1932 Olympics in Los Angeles did. Long-distance legend Paavo Nurmi was disqualified from the Olympic Games after officials claimed that he had accepted too much money for travel expenses and was now a professional athlete.

During Nurmi’s incredible long-distance running career, he had become the first athlete to win five Olympic gold medals at one Olympic Games. But his success made him a global superstar, which caused suspicions over the payments that he was receiving to appear at competitions.

After he was branded a professional in a competition reserved for amateur athletes, Nurmi was suspended and never competed in the Olympics again.

4 Unwanted Violence In Tae Kwon Do


Photo credit: DG–VISION via YouTube

Kicking is usually encouraged in martial arts. But Cuban competitor Angel Matos was disqualified from the 2008 Olympic Games and banned for life for kicking a referee in the face.

During his bronze medal match, Matos took too long over a medical time-out and was disqualified by referee Chakir Chelbat as a result. However, Matos wasn’t impressed by the ruling and unleashed a powerful kick right into the referee’s face.

3 A Controversial Fine In Cycling


The 1936 Berlin Olympics was filled with controversy. However, alongside the political and racial problems on the track, a bizarre incident occurred in the velodrome.

Toni Merkens, a German cyclist, was competing in the sprint match final against Arie van Vliet of the Netherlands when Merkens blatantly interfered with van Vliet’s line. No foul was called, and Merkens went on to win the gold medal.

The Dutch team protested. But after much debate, it was decided that Merkens should still receive his medal and simply pay a fine of 100 marks.

2 Dodgy Refereeing In Boxing

If a fighter in the boxing ring fell to the ground five times in one round, you would think that he probably wouldn’t win his match. That’s what the whole crowd at London’s ExCeL arena thought during the match between Azerbaijan’s Magomed Abdulhamidov and Japan’s Satoshi Shimizu in 2012.

However, Ishanguly Meretnyyazov, the referee from Turkmenistan, thought differently and declared Abdulhamidov the winner after ignoring three knockdowns and helping him to fix his headgear. Shimizu was reinstated after appeal, and the referee was sent home the next day by the International Boxing Association.

1 Cheating Track Twins

Identical twins can get up to some clever tricks when people can’t tell the difference between them. In that vein, Madeline and Margaret de Jesus pulled off a spectacular illusion at the 1984 Olympics when Margaret posed as Madeline in front of the eyes of the world.

It all started after the long jump when Madeline injured herself and couldn’t run in the heats of the 4x400m relay in which she was scheduled to participate. With the sisters looking so alike that their own coach couldn’t tell them apart, it was easy for Margaret to step in for the heats and help the Puerto Rican team qualify for the final.

The twins nearly got away with their plan. But after finding out about the scheme, their own coach pulled the team from the final.

Natalie is a history student at St. Andrews University with a keen interest in all things sport.

Ten Countries That Weren’t Countries for Very Long

The world’s oldest countries take great pride in how long they have been successful nations, nation-states, or republics. Even if they aren’t the same countries now that they were way back when, the history of a place is held in high regard when that place has had such a formative impact on society.

Take ancient Greece, for example, or ancient Rome. There is still understandably and rightfully a great deal of pride in present-day Greece and Italy for the impact of those two cultures and the legacy they left behind. And even relatively “young” countries across the world (looking at you, USA!) have proud histories and very vocal supporters for all that they have achieved.

But what happens when a country only exists for a few short years? Or even shorter than that? Not all independent nations were made to last. Some were annexed, absorbed, invaded, overtaken, or otherwise destroyed in an incredibly short amount of time after declaring independence and sovereignty. And in this list today, that’s what we’re going to take a look at! The following ten countries—if you can even call them that—were not countries for very long at all. But they made this list, and they made a (small) mark on history, so that counts for something, we suppose!

Related: Top 10 Countries Held Back By Their Geography

10 The Republic of West Florida (1810)

The Republic of West Florida was a very short-lived nation covering a region that stretched from what is now the far-western Florida panhandle westward along the Gulf coast into present-day Louisiana, including the area known as the “Florida Parishes.” That region had recently been claimed by the United States as part of the Louisiana Purchase. Still, the people there didn’t care much for the governments around them—either the U.S. or the Spanish, who were on the way out.

Before the Spanish could go, though, in September 1810, the residents of these so-called “Florida Parishes” took up arms and violently overthrew the Spaniards once and for all. In turn, they declared themselves an independent nation, naming it the Republic of West Florida.

That didn’t last very long, though. Up in Washington, D.C., the Americans were watching the situation carefully. They did not care for an armed insurrection going on within what they now considered their borders. Even if it was an insurrection against the Spanish, the Americans didn’t really want to encourage off-shoot movements and other fledgling nations. So, they moved in quickly.

The West Floridians made St. Francisville their capital, and they even elected a president named Fulwar Skipwith (yes, really) to run the new country. But by December 1810, that was all over. The area was forcibly annexed by the United States, and the Republic of West Florida was no more.[1]

9 The Paris Commune (1871)

The Paris Commune was an independent socialist government that was abruptly and violently set up—and then abruptly and violently quelled—during the spring of 1871 in Paris, France. This whole thing began during the Franco-Prussian War of 1870. By early 1871, the French National Guard had spent months defending Paris through a grueling Prussian siege, and there was major discontent within its ranks.

In September 1870, French leaders established the Third Republic. But its authority over Paris didn’t go unchallenged for long. On March 18, 1871, French soldiers of the National Guard seized control of Paris. They killed two French army generals and then refused to submit to the authority of the Third Republic. Instead, they established an independent government and declared sovereignty as the Paris Commune.

Over the next two months, the Paris Commune governed the famous city. The soldiers enacted a series of mostly progressive policies drawn from several different schools of 19th-century political thought. Those policies included the separation of church and state, the abolition of child labor, self-policing, and other pro-worker labor measures. All Roman Catholic churches and schools were shut down, too.

But whatever the Paris Commune hoped to achieve simply didn’t happen. It lasted just two months and three days: on May 21, 1871, the “Bloody Week” began. Still known in France as “La Semaine Sanglante,” the “Bloody Week” saw the French national army suppress and destroy the short-lived Paris Commune once and for all.[2]

8 The Republic of Mahabad (1946)

The Republic of Mahabad was a Kurdish ethnic state that existed very briefly in Iran—for most of 1946, in fact, and not a second longer. Also sometimes called the Republic of Kurdistan, this short-lived and self-governing nation began in the western portion of present-day Iran on January 22, 1946.

With World War II having just ended and the Soviet Union exploring its geopolitical options in the Middle East, the Republic of Mahabad attracted early financial, logistical, and political support from the Soviets. And it wasn’t the only breakaway state in the region. There was also a very short-lived (and entirely unrecognized) Soviet puppet state called the Azerbaijan People’s Government operating nearby. But the Republic of Mahabad was a bit more significant—and it had more significant dreams.

The Republic of Mahabad didn’t have much territory to its name, covering just a strip of present-day northwestern Iran and running down the western side of that nation. But it did contain some sizable cities, including Oshnavieh, Bukan, Naghadeh, and Piranshahr, and it claimed three other contested cities—Urmia, Khoy, and Salmas. The people who backed this Kurdish state were fiercely devoted to their cause, too.

But just about two months into the Mahabad experiment, in late March 1946, the United States and other Western powers put pressure on the Soviets to leave the region. The Soviet Union acquiesced, and just like that, Mahabad’s biggest ally was gone. Iran soon re-asserted its power over the rest of the region, isolating Mahabad economically and socially. By the middle of December, the government had imploded, and the nation’s brief life was snuffed out.[3]

7 The Republic of South Maluku (1950)

At the end of World War II, the Netherlands began the process of pulling out of their colonies in what is present-day Indonesia and relinquishing control of those islands, the surrounding territory, and their half of New Guinea. During that process, Indonesia, as we know it today, gained independence in 1949. There was just one (big) problem with that.

Indonesia is composed of tons and tons of islands, both large and small, and various ethnic groups who did not see eye-to-eye with the new nation’s early rulers. That included a group of Moluccan people who, in 1950, created the independent, sovereign Republic of South Maluku.

Neither the Dutch nor the Indonesians cared for this, having very much feared the potential destructive power of separatist states. And when it came time to disband old Dutch colonial forces that had been serving in Indonesia, the fate of several thousand pro-Moluccan soldiers who wanted to fight for their local region’s independence was suddenly a big worry.

Interestingly, with the Republic of South Maluku officially declared, the Indonesians were now forced to do something to get them back in line. So they made a fascinating move with all those Moluccan soldiers: They transferred them thousands of miles away to the Netherlands. More than 12,500 Moluccans were sent forcibly to live in the Netherlands.

This created a massive problem for Indonesia, as it fanned the flames of the Republic of South Maluku’s desire for permanent independence. It also created a major problem for the Dutch, who now had to house thousands of Moluccan immigrants in Amsterdam and other cities that had only just recently been devastated by World War II.

As far as the fate of the Republic of South Maluku goes, that independent state was quickly and forcibly brought back under the control of Indonesia before the end of 1950. Today, in Indonesia, various separatist groups spring up from time to time (most recently and most violently, the ones in West Papua), but the Republic of South Maluku itself is no more.

Here’s where things get really interesting: Most of the 12,500 Moluccans who were moved to the Netherlands in 1950 never went back home! Today, census estimates are hard to come by since all their descendants are full Dutch citizens. However, estimates hold that there are somewhere around 40,000 or 50,000 Moluccans who are now two, three, and four generations into calling the Netherlands their (no longer adopted) homeland![4]

6 The State of Katanga (1960–1963)

On July 11, 1960, a man named Moïse Tshombe and his very influential political party in the southern region of the Congo declared that they were beginning a new, independent nation. Called the State of Katanga, this breakaway nation did not want to have any part of the Congo after the Belgian colonial government left and the craziness of independence set in.

Katanga, you see, was a very mineral-rich area in the far southern section of what is now known as the Democratic Republic of the Congo. There, minerals in the “Copperbelt” were mined ruthlessly by all kinds of international conglomerates, which then picked up big profits off the backs of Katangan laborers.

Tshombe understood this, recognized that his region of the world had real financial potential, and wanted out of the craziness of the Congo. Upon declaring independence on July 11, 1960, Tshombe infamously said, “We are seceding from chaos,” a direct rebuke of the lawlessness that was rife across the rest of the Congo.

There was just one problem: literally nobody else in the entire world wanted the State of Katanga to exist. Not the US government, not the CIA, not the KGB, not the Soviet Union, and not any of the other fledgling African nations worried about secessionist elements within their own borders. International diamond, copper, and other mining corporations were making far too much money in the area at the time to risk political strife, too.

As such, Tshombe’s idea for full independence was a bad one as far as everybody but him and his supporters were concerned. By 1963, Tshombe was driven into exile in Spain—he reportedly took over a million gold bars with him on the way out. While he eventually returned a few years later as the Prime Minister of the Congo, the State of Katanga didn’t survive long enough to see 1964.[5]

5 The Republic of Biafra (1967–1970)

The Republic of Biafra was a short-lived independent state in present-day Nigeria that seceded from that nation after major ethnic friction. Historically, the north of Nigeria was far more prosperous and economically connected than the south and west. The north was also dominated by the Hausa ethnic group, while the minority Igbo people living there were vastly outnumbered.

By late 1966, tens of thousands of Igbo people had been massacred in Nigeria’s north, and the country was devolving quickly toward a full-scale civil war. After yet more Igbo people were driven out of northern Nigeria, a lieutenant colonel (who later became a general) named Odumegwu Ojukwu declared that a new independent nation, the Republic of Biafra, had been formed.

General Yakubu Gowon, the head of Nigeria’s federal government, flat-out refused to acknowledge Biafra as an independent state. Others did, though. Several African nations, including Côte d’Ivoire, Gabon, Tanzania, and Zambia, officially recognized Biafra and opened diplomatic relations with it in 1968. France even sent the new nation a major stash of weapons with which to defend itself.

That was the other thing, too: because Nigeria didn’t want Biafra to secede, the whole affair hurtled toward a brutal internal struggle. Thus began the awful Nigerian Civil War, which ran through the rest of the 1960s and claimed at least a half-million lives—and possibly more than three million, according to some estimates.

Through it all, Biafra was not meant to be. The breakaway region was soon cut off from the sea, and without ports or shipping lanes of its own, it struggled to carry out even basic trade in wartime. Worse yet, it was very hard to get supplies to Biafra and its people. By 1969, famine and disease began to horribly ravage the area as the civil war raged on interminably.

Nigerian forces were finally able to completely rout Biafran forces in a series of key battles in December 1969 and January 1970. Fearing for his life, Ojukwu fled to Côte d’Ivoire. On January 15, 1970, with Biafra on the brink of collapse anyway, its remaining generals fully surrendered to Nigeria.[6]

4 The Republic of Formosa (1895)

The Republic of Formosa was a very short-lived nation that existed for just a sliver of time before it was swallowed up by the Japanese. And when we say “short-lived,” we really do mean short-lived! In the year 1895, the emperor of the Qing dynasty of China formally ceded the island of Taiwan to the Empire of Japan. As part of the Treaty of Shimonoseki, the island was meant to be taken over and occupied by Japanese troops.

From there, the Japanese would go on and administer it full time, taking over the job that had previously belonged to China and the emperor of its Qing dynasty. But while the Japanese were interested in Taiwan (then known as Formosa to many in the West), the locals were very much NOT interested in having Japan come through and administer their lives.

So, on May 23, 1895, locals in Taiwan proclaimed the beginning of their new nation, known as the Republic of Formosa. A democratically elected government was installed, which at that time and in that part of the world was a notable rarity. But it didn’t have any staying power. Japanese troops landed on the island within days, and on October 21, 1895—just 151 days after the Republic of Formosa was declared—they captured its final capital, Tainan. And thus, that was that for the Republic of Formosa.

There is one interesting (and minor) sidenote here for all of you history buffs out there. As we already noted in this section, people like to point to the Republic of Formosa’s democratic leanings as a point of pride. That’s great, but some go so far as to proclaim it to be the first East Asian republic ever formed—and that part is not true.

The Lanfang Republic in Borneo was established way back in 1777 and lasted for more than a century. The Republic of Ezo in Japan was formed in 1869 and, although it survived only a matter of months, it still predates Formosa’s republic. Still, the Republic of Formosa is an important part of Taiwanese history. And it’s one of the shortest-lived nations in all of world history![7]

3 East Timor (1975–1976)

East Timor was a breakaway region that declared its independence in late 1975, only to be swallowed almost immediately by the island nation that surrounds it, Indonesia. Historically, while most of the rest of the archipelago had been administered by the Dutch before Indonesia’s independence from the colonial rule of the Netherlands in 1949, East Timor had always been separate.

Centuries before, the Portuguese had landed in East Timor, and even as the Dutch took over rule across the rest of Indonesia, East Timor remained a Portuguese colony. But in 1974, the Carnation Revolution way back in Portugal led to a series of colonial consequences—most notably, the Portuguese completely pulling out of East Timor.

In turn, the Timorese people had absolutely no desire to live under the rule of the Indonesians. So they didn’t, and in late 1975, they declared East Timor an independent nation. The Indonesians acted swiftly. On December 7, 1975, Indonesian troops invaded and occupied East Timor. Over the next several months, they completely dismantled the already powerless government that East Timor had hastily installed. By early 1976, East Timor had completely ceased to be an independent nation, swallowed up whole by Indonesia.

Now, if you are a geography buff, you may be saying to yourself, “I’m pretty sure East Timor exists as a country right now, though.” And you’d be right! For the next 23 years after early 1976, the Indonesians brutally administered the area and committed wanton acts of violence. By 1999, a referendum called for East Timor to become independent again. And by 2002, it was so.

Today, East Timor (also commonly called Timor-Leste) is a sovereign nation once more, and a stable one at that. It was that first go-around in 1975, though, when its time as a nation lasted only a few months before total destruction.[8]

2 The Republic of Hatay (1938–1939)

For about nine months, the Republic of Hatay existed as an independent, completely sovereign state on territory that today lies within Turkey. It all started on September 2, 1938, when an assembly in the breakaway Sanjak of Alexandretta, then under French mandate, proclaimed the region the independent Hatay State.

Alexandretta was named the capital city, and for a while, things were peaceful. The French and the Turks even provided joint military supervision over the state as it got its bearings as a country and worked out how to stand on its own.

Unfortunately for the Republic of Hatay, though, things weren’t meant to be. On June 29, 1939—only about nine months after the nation was first formed—the Hatay legislature voted to disestablish Hatay State following a public referendum. That referendum came through overwhelmingly in favor of the Hatay region joining Turkey.

Both at the time and in the years since, observers and historians have wondered whether the referendum was “phony” or “rigged” in the first place. Regardless, the French saw Hatay’s union with Turkey as a possible way to keep Turkey from allying with Nazi Germany as the rumblings of World War II picked up. And legitimate or not, the referendum ended Hatay’s nine-month run as a sovereign nation just like that.[9]

1 The Republic of Slovene Styria (1941)

World War II threw all of Europe into chaos. Millions of people were killed—both soldiers and civilians—and the sheer displacement and mistreatment of populations for years on end was absolutely staggering. There were also major political upheavals across the continent, even beyond the ones most often taught in your history books. Take, for example, the case of Slovene Styria.

That region roughly corresponds to the northeastern part of the modern-day nation of Slovenia. At the time, just before World War II broke out, it was part of Yugoslavia, as decreed by the revised Yugoslav Constitution of 1931. And for a while, things worked out just fine like that for Slovene Styria. But in April 1941, Nazi Germany invaded Yugoslavia and immediately annexed Slovene Styria as its own territory. As you might expect, the Slovene locals didn’t care for that at all.

What followed from spring 1941 through late May 1942 was a vicious battle within the greater battle of World War II. Nazi units swept across Slovene Styria and prohibited the use of the Slovene language and of historically Slovene cultural symbols. They demanded everyone speak German and pledge their undying support for Hitler.

Intellectuals, clergymen, and other public figures were expelled or killed. But the Slovene people fought back. They declared themselves a sovereign state and put together battalions of loyal troops. Over the next year, they fought viciously against the Nazis as a Republic bent on guarding their land and their way of life.

Of course, we know how things eventually ended for the Nazis. But not before tens of thousands of Slovene men gave their lives for their (very short-lived) nation. After World War II ended, the Slovenes were content to be reorganized within the reformed Yugoslavia, with Ljubljana as their capital city.

There, Slovene Styria came to be an integral and economically prosperous part of the Yugoslav republic officially called the Socialist Republic of Slovenia. Today, the area is part of an independent nation once again—known simply as Slovenia—and it remains one of the best-kept travel secrets in all of Europe.[10]

Top 10 Scientific Theories That Were Certain (And Then Weren’t)

We often think of science as absolute. Something either is or it isn’t, and through scientific study we have proof to back up that stark black and white. The reality, though, is that the universe as measured through science is much more complex and unusual. Sometimes the facts just refuse to be consistent at all. Sometimes the results of one study directly or indirectly contradict the findings of another. This is a collection of such scientific contradictions.


This is probably also a good time to remind people of the fallacy of argumentum ad populum – the theory that because a majority believes something, it is true. “Scientific consensus” is often used as a kind of proof for something, when it is not at all. In fact, when a person uses the “consensus” argument you should scrutinize everything they say as potentially questionable, because a person using one fallacy to “prove” a point is very possibly clinging to many other fallacies also.

10 Beer: Health Food and Poison


There’s nothing we love more than to hear our vices are actually virtues. What else causes more sensational headlines than hearing that science has proven beer is ultimately good for us and that we should drink more of it? Thankfully, this isn’t just limited to headlines. There is very real science that suggests various health benefits linked to the world’s favorite hoppy drink.

One such example was published by the International Journal of Endocrinology where evidence of a link between silicon and bone strength was recorded. The theory presented was that silicon dioxide (SiO2) helps with the body’s ability to calcify. Whatever the exact method, rats with a healthy supply of silicon had improved calcium incorporation in their bones compared to rats that were deficient in silicon. Silicon is found in such foods as grains, cereals, green beans, and of course—beer. In short, beer makes bones stronger.[1]

Aside from Silicon there are other useful chemical properties to beer. A paper published in Mutation Research/Fundamental and Molecular Mechanisms of Mutagenesis outlines the effects of the chemical Xanthohumol, which is also present in beer. What makes this worthwhile is that Xanthohumol has demonstratively been shown to protect the liver and colon from cancer causing[2] mutagens found in cooked food.[3] In other words, beer helps fight cancer.

Other studies of beer found evidence that moderate amounts stop inflammation,[4] helps prevent kidney stones,[5] and the same silicon from beer also helps fend off Alzheimer’s Disease.[6] One would think beer is the perfect health food, except–

In 2018 an exhaustive study was published about the effects of alcohol on the health of individuals and entire populations. It included 500 collaborators assisting from 40 different nations and 694 different sources of data covering huge swaths of Earth’s total population. What this study concluded was that despite the limited health benefits discussed above, 3 million people died from Alcohol related health problems in 2016 alone. In fact, for males between the ages of 15-49 Alcohol was responsible for 12 percent of all deaths. If you take all people together as one group, Alcohol was the 7th largest cause of death in the world.[7]

The data was summarized by it’s senior author, Dr. Emmanuela Gakidou, who said “The health risks associated with alcohol are massive. Our findings are consistent with other recent research, which found clear and convincing correlations between drinking and premature death, cancer, and cardiovascular problems.”

What amount of alcohol would be safe? She concluded, “Zero alcohol consumption minimizes the overall risk of health loss.” In other words, there is no amount of alcohol that will not increase your chance of premature death.[8]

9 Coffee Will Both Give You and Protect You from Glaucoma


A research study published in the Journal of Agricultural and Food Chemistry showed that one of the main components of raw coffee—chlorogenic acid—shields eyes against the retinal degeneration caused by glaucoma, aging, or diabetes. This protection can slow eyesight deterioration and even blindness. To discover this, researchers exposed mice’s eyes to nitric oxide, which causes retinal degeneration; the mice treated with chlorogenic acid suffered no ill effects, unlike the mice with no prior treatment.[9]

The chair of the American Osteopathic Association, Dr. Robert Bittel, said of this study, “As with any study that cites commonly used food items as therapeutic in some way, caution has to be taken so that the public understands the negative as well as the positive potential implications of drinking coffee.”[10]

While coffee is shown to protect against the effects of glaucoma, it also has the dubious side effect of increasing the likelihood of developing the disease in the first place among certain groups of people. Thankfully, for most of us the increase is not statistically significant, but one study published in Graefe’s Archive for Clinical and Experimental Ophthalmology showed that for those who have already developed glaucoma, the intake of coffee made the condition worse.[11] Yet another study showed that women with a family history of glaucoma (but who had not yet developed it themselves) had an increased risk of getting the disease if they were coffee drinkers.[12]

For some people then, coffee is both the poison and the cure.

8 Stretching Before Exercise Either Hurts Performance or Does Nothing


For years, stretching before exercise was a given. Everyone acknowledged the benefits, and it was even taught as part of physical education in schools. However, there was little to no research to back up the supposed benefits. When research started being conducted on the matter, the common knowledge was upended. For example, one study had two groups of trained athletes run one mile on three different occasions. One group performed a series of six different lower-body static stretches, and the other group sat without stretching. The group that did not perform the stretches finished their mile significantly faster (about half a minute sooner on average, compared to the stretching group). The study concluded, “Study findings indicate that static stretching decreases performance in short endurance bouts…Coaches and athletes may be at risk for decreased performance after a static stretching bout. Therefore, static stretching should be avoided before a short endurance bout.”[13]

Meanwhile, another study published in Medicine & Science in Sports & Exercise sought to learn the effects of stretching on more than just running. This study had 20 participants perform a well-rounded series of stretches and warm-ups covering seven lower-body and two upper-body regions, alongside a control group. Afterward, these athletes were put through a battery of tests that measured flexibility, run times, vertical jumps, and even the ability to pivot directions. The study concluded that the stretching had no measurable impact on the athletic ability of any of the participants.

Interestingly enough, though, it did have a mental effect. Those who stretched believed it would improve their performance significantly more than if they hadn’t stretched, but besides boosting their confidence, the stretches did no such thing. According to this study, then, we can still stretch to feel good, but we shouldn’t expect it to give us any edge.[14]

7 Picking Your Nose Is Harmful, Eating Your Boogers Is Healthy


Though widely considered a disgusting habit, more of us pick our noses than we might like to admit. A quick survey among 200 Indian teenagers showed that literally all of them commonly participated in rhinotillexomania (the medical word for nose picking).[15] But this is concerning for more reasons than just social protocol. A study published in Infection Control & Hospital Epidemiology tested and questioned 238 healthy patients and 86 hospital employees about their nose picking habits. The test showed that frequent nose pickers had an increased presence of the dangerous bacteria Staphylococcus aureus in their nasal passages.

Though about 30% of the population carry the staph bacteria with them, usually with no ill effects, if a wound[16] allows the bacteria into the body, it can cause potentially fatal infections.[17] This study shows nose picking is bad for you because it increases the chances of getting one of these dangerous infections.

But what if we don’t just stop with picking our nose? Imagine if we put what we find to good use. One study, titled “Salivary Mucins Protect Surfaces from Colonization by Cariogenic Bacteria,” showed the positive impact of mucins throughout the body. This mucus helps, among other things, to protect the surface of our teeth from the myriad attacking bacteria that strive to dismantle them. Where in our body is a healthy supply of salivary mucins? In our dried nasal mucus, a.k.a. boogers.[18] Not only does this mucus protect our teeth if eaten, but there’s also evidence to suggest that it may help prevent respiratory infections, stomach ulcers, and HIV.[19]

An Austrian lung specialist named Friedrich Bischinger commented on the findings of this study and said, “In terms of the immune system, the nose is a filter in which a great deal of bacteria are collected, and when this mixture arrives in the intestines, it works just like a medicine.”

Whether or not these benefits outweigh the increased risk of staph infections is up to you, but the author of the mucus eating article also suggests that we could make an artificial version of the salivary mucus to get similar benefits.[20] Eventually we could have our boogers and eat them too.

6 Chocolate is a Miracle Food That Ruins Your Health


Chocolate is a worldwide favorite, with some 72 million metric tons of it consumed every year.[21] It’s no surprise, then, how well studied this food is and how often we hear reports about its medical benefits. When browsing scientific papers, there seems to be no end to the health benefits of chocolate.

Some studies that have examined this sweet treat concluded that it can help fend off cardiometabolic disorders and cardiovascular diseases,[22] improve cognitive function in older adults,[23] lower blood pressure,[24] and even protect your skin from UV-induced erythema.[25]

One study even showed chocolate slowing colon cancer in rats![26] Chocolate has dozens of minor health benefits to enjoy.

However, alongside chocolate’s many small health benefits, there are substantial consequences to eating it. Most notable is its high sugar and fat content, which leads to obesity. One study concluded that for postmenopausal women, every additional ounce of chocolate eaten per week was linked to an extra 1 kg of weight gain over a period of three years—the more chocolate consumed, the higher the three-year weight gain.[27] This is concerning because obesity can result in diabetes, hypertension, heart disease, respiratory disorders, cancer, cerebrovascular disease, stroke, and many other conditions.[28]

Meanwhile, rats may avoid colon cancer by eating chocolate, but in humans chocolate eating seems to be linked with higher chances of prostate cancer.[29] Like so many foods, chocolate is both good for us and bad for us in different ways.

Alice H. Lichtenstein, professor of nutrition science and policy at Tufts University in Boston, summed it up nicely when she said, “If you enjoy chocolate the important thing to do is choose the type you enjoy the most and eat it in moderation because you like it, not because you think it is good for you.”[30]


5 Self-Control Can and Can’t Be Depleted


Ego depletion is a concept in psychology that has been widely accepted and tested. The theory suggests that self-control is a resource that can be stockpiled as well as depleted.[31] The study that first suggested this theory had students participate in multiple tasks that required self-control. First, they were shown two foods: radishes and chocolate cookies. The paper notes, “Chocolate chip cookies were baked in the room in a small oven, and, as a result, the laboratory was filled with the delicious aroma of fresh chocolate and baking.” Some participants were instructed to eat only the radishes, some were told to eat only the cookies, and another group was shown no food at all. The group told to eat only radishes had to exercise self-control so as not to eat the cookies. Afterward, the subjects were given an impossible puzzle to solve (though they weren’t told that) and a bell to signal to the researchers if and when they wanted to give up. It would require self-control to continue working on a task with no positive results.

Ultimately, the study showed that the group that first had to exercise self-control by eating only radishes and not the tasty-looking cookies also gave up sooner on the impossible puzzle. The conclusion was that their self-control had been slightly depleted by the first test, which left them with less self-control to use in the second.[32] The same concept has been replicated with different factors. Some labs found self-control could be reduced by forcing people to make purchasing decisions or to discuss racial politics with someone of a different race. Some labs even tested for ego depletion in dogs and found it.[33]

On the other hand, a more recent study chose to definitively test ego depletion and sought a task that required self-control but wasn’t affected by things like personal taste (after all, what if someone hates chocolate chip cookies?) or culture. It involved 24 labs from Australia, Belgium, Canada, France, Germany, Indonesia, the Netherlands, New Zealand, Sweden, Switzerland, and the United States. Rather than cookies and radishes, it used computerized tasks that required self-control. Namely, participants played digital games that required quick responses but where the answers weren’t immediately obvious. The participants had to control their impulses and instead find the correct answer. This study found that there was no significant depletion of performance from task to task associated with self-control.[34]

4 Red Meat is Unhealthy. Maybe. We’re Not Sure


A rack of ribs at a summer barbecue or a hotdog at a spring baseball game—red meat is a staple in many a diet, and to some, its taste elevates it above other foods. There’s something truly satisfying about a perfectly cooked and seasoned steak, but our reverence for red meat has historically been met with caution from the scientific community. Studies have shown that processed red meats like hotdogs[35] increase the risk of glioma, a tumor that occurs in the brain and spinal cord.[36] Still other findings showed an increased risk of colorectal cancer from eating red meat.[37] Yet another negative of the food is the accumulation of trimethylamine N-oxide,[38] which is linked to heart disease.[39] All of this led most health groups to recommend we limit our intake of red meats, especially the processed variety.

However, a recent and controversial meta-analysis (a scientific study that examines many different studies on a topic and comes to a conclusion based on all of the results put together)[40] published in Annals of Internal Medicine argued that there was not enough scientific evidence to support the recommendations to eat less red meat. Based on this meta-analysis, the authors concluded that “the certainty of evidence for the potential adverse health outcomes associated with meat consumption was low to very low” and that “there was a very small and often trivial absolute risk reduction based on a realistic decrease of 3 servings of red or processed meat per week.”[41] Their conclusion was not that red meat was healthy, but that there was not yet significant enough proof of harm to justify recommending limits on red meat intake for health reasons.

3 Video Games Improve or Impair Children’s Social Skills


For gamers, it is an age-old frustration to hear that “video games rot your brain.” This sentiment has followed the industry since the first wave of video games hit the arcades, but while it was often repeated, there was no evidence to back it up. With time, the scientific community turned its attention to video games and studied the effects on children directly. One paper published in Social Psychiatry and Psychiatric Epidemiology studied children ages 6–11. The study measured each child’s video game play time per day and compared it to data gathered from questionnaires given to their parents, teachers, and the children themselves. The researchers also looked at each child’s academic performance. After certain factors were accounted for, they found that higher video game play time among the children was associated with 1.75 times the odds of high intellectual functioning and 1.88 times the odds of high overall school competence, as well as fewer relationship problems with their peers.[42]

Katherine M. Keyes, PhD, assistant professor of epidemiology at the Mailman School of Public Health, commented on the results of this study, saying, “Video game playing is often a collaborative leisure time activity for school-aged children. These results indicate that children who frequently play video games may be socially cohesive with peers and integrated into the school community. We caution against over interpretation, however, as setting limits on screen usage remains an important component of parental responsibility as an overall strategy for student success.”[43]

And she was wise to suggest parents stay involved in limiting children’s screen time, because another study also examined the effect of video games on children—this time following their lives over the course of six years, starting when they were six years old.[44] A total of 873 Norwegian schoolchildren were involved; every two years, their parents reported how much time was spent gaming, and teachers evaluated the children’s social competence using such factors as how well they followed directions, controlled their behavior, and showed confidence in social situations.

The results showed that poor social performance was linked to an increase in video gaming later on, but that video gaming itself did not lower future social skills—except for one notable group. Ten-year-old girls in the study who had high levels of video game play time were found to be less socially competent at 12 years old than the girls who didn’t.[45] While video games are shown to be helpful for most, it isn’t a universal truth. For some, they increased social skills, but for others, they depleted them.

2 Early Rising Is a Blessing and a Bane


“The early bird gets the worm” is a common idiom often heard from the mouths of early risers, some of whom wake up and get productive before the sun. What they do, they do for good reason, as one survey has shown. Published in the Journal of Applied Social Psychology, the survey questioned 367 university students about their sleeping habits and proactivity. It asked them to agree or disagree with statements such as “I spend time identifying long-range goals for myself” and “I feel in charge of making things happen.”

Ultimately the survey found that, “Morning people were more proactive than evening types, and people with small differences in rise time between weekdays and free days were also more proactive persons.”[46] The author of the survey said, “When it comes to business success, morning people hold the important cards. My earlier research showed that they tend to get better grades in school, which get them into better colleges, which then lead to better job opportunities. Morning people also anticipate problems and try to minimize them, my survey showed. They’re proactive. A number of studies have linked this trait, proactivity, with better job performance, greater career success, and higher wages.”[47]

On the other side of the coin, a study published in The Journal of Clinical Endocrinology & Metabolism examined 447 men and women between the ages of 30 and 54 who worked at least 25 hours outside their homes and found that, for many, their early rising did not line up with their natural circadian rhythms.[48]

Patricia M. Wong, MS, from the University of Pittsburgh said about this study, “Social jetlag refers to the mismatch between an individual’s biological circadian rhythm and their socially imposed sleep schedules. Other researchers have found that social jetlag relates to obesity and some indicators of cardiovascular function. However, this is the first study to extend upon that work and show that even among healthy, working adults who experience a less extreme range of mismatches in their sleep schedule, social jetlag can contribute to metabolic problems. These metabolic changes can contribute to the development of obesity, diabetes and cardiovascular disease.”[49]

1 Eating Eggs Does and Doesn’t Contribute to Cardiovascular Disease


Eggs are a dietary staple for much of the world. In fact, 73% of adults are considered whole egg consumers.[50] Naturally, then, the effect eggs have on our health is a topic of concern for many and so has been a topic of study for the scientific community. The trouble is, the information hasn’t been entirely conclusive.

Historically, one of the major health complaints against eggs has been the 185 milligrams of cholesterol in the yolk. Certain forms of cholesterol are shown to increase the risk of heart disease,[51] and science seems to support this concern. A 2019 study that monitored participants over a 17.5-year period found that each additional half egg consumed by an adult per day increased the likelihood of cardiovascular disease by 6% and even increased the general mortality rate by 8%.[52][53]

Confoundingly, though, the science is not always consistent. On the same subject, during the same year, a separate study examined the effect of eggs on the likelihood of cardiovascular disease and found no statistically significant link.[54] An author of the study, Maria Luz Fernandez, professor of nutritional sciences at the University of Connecticut, described eggs as having high cholesterol but low saturated fat. The point, she said, was this: “While the cholesterol in eggs is much higher than in meat and other animal products, saturated fat increases blood cholesterol. This has been demonstrated by lots of studies for many years.”

In other words, the cholesterol in eggs may not be the killer we thought.

“There are systems in place so that, for most people, dietary cholesterol isn’t a problem,” said Elizabeth Johnson, a research associate professor of nutritional sciences at Tufts University.[55]


10 Popular TV Characters That Weren’t Part of the Original Cast

Adding new characters to a long-running television show is no easy task. In fact, there’s an entire trope called the Cousin Oliver, named after the character from the 1970s classic The Brady Bunch. The trope covers the long list of TV characters created late in the game to “spice things up.” It usually carries a negative connotation, but not all late additions are denounced by fans. For every Scrappy-Doo, there’s a gem that goes on to become a darling of fans and critics alike.

Let’s look at ten fan-favorite TV characters who weren’t part of their show’s original cast. But be warned: there are a few spoilers ahead.

Related: 10 Iconic Characters Who First Appeared In Ads

10 Frank Reynolds: It’s Always Sunny in Philadelphia

The hit FX show It’s Always Sunny in Philadelphia is one of the longest-running comedies on television. As of 2022, it has 15 seasons under its belt, with no sign of slowing. Considering its humble beginnings, it’s easy to call this the little show that could. In the early 2000s, Charlie Day, Glenn Howerton, and Rob McElhenney were aspiring actors who crossed paths while auditioning for other films and TV shows. They eventually started shooting their own home movies on a Panasonic DVX100A, out of which the idea for It’s Always Sunny was born.

After the trio shot a pilot on a camcorder, the show was picked up by the cable channel FX. It was slow to attract an audience at first, but the execs at FX believed in it. They realized something was missing from the cast and decided to add a big name.

Enter Danny DeVito.

Despite the cast’s initial hesitation, DeVito was added in season 2 as Dennis and Dee’s father, Frank Reynolds. The character is the polar opposite of the lovable persona DeVito is publicly known for. Instead, Frank is crass, profane, and cynical, making him the perfect addition to this dark comedy. The move pulled the show back from the brink of cancellation, and most long-time fans agree that DeVito’s character was the cherry on top that elevated a good show into greatness.[1]

9 Ben Linus: Lost

For fans of the hit ABC show Lost, it’s almost hard to remember that Benjamin Linus was not part of the original cast. This serialized drama had audiences hooked from the get-go with its unfolding mysteries. Beyond its successful first season, the show only continued to grow in popularity with its sophomore outing. And much of that growth is thanks to the addition of actor Michael Emerson in the role of Ben.

For much of the second season, Ben was held prisoner and fooled the main group into thinking he was a man named Henry Gale. When his lies were unearthed at the end of the season, it was revealed that Ben was actually the leader of the Others, a shadowy group inhabiting the unexplored side of the island. Worst of all, Ben had spent his captivity exactly where he wanted to be—observing the group. Ben’s grey morality, thirst for power, and often murderous tendencies are just a few of the qualities that make him a fan favorite.[2]

8 Fin Tutuola: Law & Order: Special Victims Unit

Much like Danny DeVito in It’s Always Sunny in Philadelphia, Ice-T was a well-known entertainer before joining the cast of Law & Order: Special Victims Unit at the beginning of its second season. In the show, he plays Odafin “Fin” Tutuola, a street-wise cop who transfers to the SVU from narcotics. His character has a tough exterior but a passion for helping abused children and victims of rape and assault. He is initially paired with Munch, a cynical and jaded older detective with a penchant for conspiracy theories. Despite being polar opposites on the surface, many SVU fans felt these two characters had a chemistry that matched that of Benson and Stabler, the show’s leads at the time.

Fin has now been a main character on SVU for 22 years, making him the longest-tenured non-original cast member on this list. After Elliot Stabler left the squad back in 2011, Fin became the now-Captain Benson’s right-hand man and longest-running supporter.[3]

7 Rafael Barba: Law & Order: Special Victims Unit

The role of the assistant district attorney on Law & Order: Special Victims Unit has long been a revolving door. It’s tough to say which ADA has been the most popular with fans, considering how loved Alexandra Cabot and Casey Novak were, but Rafael Barba easily gave them a run for their money.

Barba, played by Broadway vet Raul Esparza, first appeared in the season 14 episode “Twenty-Five Acts,” making him the latest arrival, relative to his show’s run, of any character on this list. The actor was bumped up to a series regular the following season. Barba was known for being a no-nonsense strategic thinker who looked sharp in a three-piece suit. Fans of the show quickly embraced him for his wit, sass, and charisma. His character was the first male ADA to join the main cast.

In 2018, Esparza decided to leave the show and revive his stage career. His character received a rather divisive send-off in the episode “The Undiscovered Country” but has since made guest-starring appearances across the 21st through 23rd seasons.[4]

6 Desmond Hume: Lost

Desmond Hume is one of the most enigmatic characters to come out of the show Lost, and that’s saying a lot. His first scene alone is considered one of the show’s most iconic, revealing him to be living inside the hatch, one of the central mysteries of the show’s first season. Despite appearing in the first scene of season 2, Desmond soon takes off and isn’t seen again until the season finale. He becomes a regular cast member the following year.

At first, Desmond appears to have lost his sanity, which is unsurprising since he has spent years in solitary confinement, thinking the world outside the island no longer exists. But as the series progresses, we learn more about his backstory, and a beautiful love story between him and his wife Penelope unfolds. His character is the main focus of the much-beloved season 4 episode “The Constant.” This surreal episode ties with “Through the Looking Glass” as the top-rated episode of the entire series, according to fans on IMDb. Desmond’s story is quite different from that of the vast majority of characters on Lost, but that uniqueness, paired with his affable and kind nature, is what makes him a favorite.[5]

5 Tommy Oliver aka the Green/White Ranger: Power Rangers

The first season of Mighty Morphin Power Rangers became a surprise smash hit with millennials back in the mid-1990s. Kids flocked to these five karate-chopping, color-coded superhero teens. But the show shocked fans when it introduced an evil ranger in the 17th episode of the first season. Tommy Oliver was the new kid on the block, who also happened to be under the spell of Rita Repulsa, the archnemesis of the show’s heroes.

What made Tommy popular with audiences was not just his long hair and bad boy looks—Power Rangers fans also empathized with his quest to find family and belonging. Beyond this, the show’s writers flexed their skills by crafting a great redemption arc in which he eventually regains control of his mind and goes on to lead the group as the White Ranger. The character also had an epic romance with Kimberly, the Pink Ranger.[6]

4 Michonne: The Walking Dead

Michonne Hawthorne is one of the most popular, and most lethal, characters from the hit AMC show The Walking Dead. Yet many long-time fans of TWD forget that this katana-wielding assassin was not part of the original cast. She makes a brief cameo as a cloaked figure at the end of season 2 but does not become a regular cast member until the following season.

In the beginning, Michonne travels with Andrea, one of the main characters from the first two seasons. But the two quickly part ways when Andrea decides to stay in the mysterious suburban community of Woodbury, which Michonne rightfully doesn’t trust. So she goes out on her own and happens to cross paths with Rick Grimes and the rest of Andrea’s original group of survivors. Despite initial hesitance to trust her, Michonne quickly befriends Rick’s son Carl and eventually ends up in a relationship with Rick himself.

Michonne, played by Black Panther star Danai Gurira, remained a main cast member on The Walking Dead until its 10th season. She is believed to have a kill count that rivals that of Rick Grimes and Daryl Dixon.[7]

3 Lexa: The 100

There aren’t many characters on this list whose death almost tanked the entire show. The 100 is a post-apocalyptic teen series that aired on the CW from 2014 to 2020. The show focused on a group of 101 juvenile delinquents sent down to Earth from a space station 97 years after the end of the world. While never becoming a ratings juggernaut, the show maintained a fairly healthy viewership and garnered a passionate online fan base throughout its seven-year run.

In its second, and arguably best, season, The 100 introduced what would eventually become its most iconic character: a warrior queen named Lexa, played by Alycia Debnam-Carey. Lexa is introduced in a similar manner to Ben from Lost. The audience is led to believe she’s a limping servant girl, but it’s soon revealed that she is the leader of the grounders, the main antagonists (and eventual allies) of the first two seasons.

Lexa quickly became the love interest of Clarke, the show’s main character. In the third season, she was killed by a stray bullet meant for Clarke soon after the two consummated their relationship. Off-screen, Debnam-Carey had been cast as a lead in the AMC series Fear the Walking Dead and was unable to continue shooting The 100. Many fans were furious, and the show faced a wave of public backlash, a drop in viewership, and the loss of sponsors. Though it limped on for four more seasons, The 100 never quite regained its popularity.[8]

2 Spike: Buffy the Vampire Slayer

Buffy the Vampire Slayer is easily one of the most critically acclaimed supernatural teen dramas ever to grace the airwaves. The show was originally conceived by creator Joss Whedon as a retooled version of the 1992 movie of the same name, which he also wrote.

In the third episode of the second season, the hit WB show introduced a handsome, bleach-blond vampire named Spike, played by actor James Marsters. Spike is a fast-talking, charismatic bad boy who dons a leather jacket and rides in on a motorcycle. He is, in many ways, the antithesis of his old friend Angel, Buffy’s boyfriend and the only vampire with a soul. While Angel tries to live up to his name, Spike revels in being bad.

Despite his edgy exterior, Spike is a hopeless romantic at heart who believes in the beauty of love and poetry. Spike also has a contentious and controversial romance arc with the lead character Buffy, which is something that continues to divide the fanbase decades later. The character not only spent six seasons on Buffy the Vampire Slayer, but he also spent time as a lead character on the spinoff show Angel.[9]

1 Klaus Mikaelson: The Vampire Diaries

Rounding out the list is yet another vampire—well, hybrid, to be correct. The Vampire Diaries quickly became must-see television for teens when it premiered back in 2009. And despite its successful and fast-paced first season, this CW outing massively upped its game in season two by centering the plot around a family of vampires called the Originals.

Klaus is essentially the patriarch of the Originals, who are the original family of vampires within TVD lore. This makes him different from your run-of-the-mill vampire. While most vampires in The Vampire Diaries universe can be killed with any wooden stake, original vampires can only be killed with a stake made of wood from an ancient tree. Klaus also becomes part-werewolf, making him the first hybrid in the TVD universe.

Power and strength aren’t the only things that made Klaus such an unforgettable character. For one thing, he’s played by classically trained actor Joseph Morgan, who many say is one of the best actors to grace the CW. On top of being a dominant alpha, Klaus is a tortured artist who puts family before everything. His character became so popular that he was chosen to star in his own spinoff series, aptly titled The Originals.[10]

Ten Horrific Shipwrecks That Weren’t the Titanic https://listorati.com/ten-horrific-shipwrecks-that-werent-the-titanic/ https://listorati.com/ten-horrific-shipwrecks-that-werent-the-titanic/#respond Sat, 11 Feb 2023 19:37:19 +0000 https://listorati.com/ten-horrific-shipwrecks-that-werent-the-titanic/

While less well known than the sinking of the Titanic, the ten nautical disasters on this list often eclipse the Titanic story in terms of sheer horror, scandal, and loss of life. With human nature itself proving either the salvation or doom of the castaways, here are tales of heroism, cannibalism, endurance, murder, and disappearance without a trace.

10 SS Arctic, 1854

If you are familiar with the sinking of the Titanic, then you are aware of the principle of “women and children first.” But what if that principle was ignored? On September 27th, 1854, the SS Arctic, a passenger paddle steamship of the Collins Line, entered a dense fog off the Newfoundland coast and collided with the French fishing vessel the Vesta. Attempts to patch the hole in the hull with sailcloth and mattresses failed, and over the course of four agonizing hours, the sea crept in, finally extinguishing the ship’s boilers and, with them, the pumps.

With 250 passengers and 150 crew on board, the Arctic’s six lifeboats, which could hold no more than about 180 people, were woefully inadequate. At first, the process of loading the women and children went as planned, until panic began to spread amongst the ship’s crew. As discipline broke down, a wild melee ensued, and one boat after another was swarmed by mobs of men. One tipped over, sending most of its dozen occupants (mostly women) into the sea to drown. Desperate to restore discipline, the captain attempted to launch another boat on the opposite side of the ship, only to see it too filled with male crew rather than women and children.

The two remaining boats (and a makeshift raft built by loyal officers) were likewise taken by the ship’s crew, one boat stolen by the engineering staff who, brandishing firearms, told the crowd that they needed it to patch the hole in the ship. No sooner had the boat launched (only half full) than it rowed away, leaving the waiting women and children to their fate. Of the 400 aboard, only 85 survived (61 crew and 24 male passengers). All the women and children drowned.[1]

9 SS Pacific, 1856

As if things could not get worse for Collins Line founder Edward Collins, who had lost his wife and two children in the sinking of the SS Arctic, her sister ship, the SS Pacific, disappeared into the Atlantic without a trace in January of 1856. She left Liverpool for New York City with 45 passengers and 141 crew, and no word of her fate was ever heard again, save for a message in a bottle that washed up on the coast of the Hebrides islands in 1861. Whether authentic or a hoax, the message within offers us one possible explanation for the Pacific’s destruction:

“On board the Pacific, from L’pool to N. York. Ship going down. Great confusion on board—icebergs all around us on every side. I know I cannot escape. I write the cause of our loss that friends may not live in suspense. The finder of this will please get it published.”[2]

8 Empress of Ireland, 1914

Among the beneficiaries of updated lifeboat regulations in the wake of the Titanic disaster was the ocean liner RMS Empress of Ireland of the Canadian Pacific Steamship Company. She was equipped with watertight doors and enough lifeboats to accommodate 280 more people than she was built to carry. Yet the fact remains that when she collided with the Norwegian ship Storstad in a fog at the mouth of the Saint Lawrence River on the night of May 29th, 1914, she sank in scarcely 15 minutes, taking 1,012 of the 1,477 people aboard to their deaths.

The water poured into her side so quickly that there was no time to shut the watertight doors, and the list to starboard increased so rapidly that the port side lifeboats, which could not be lowered, were rendered useless. Many passengers sleeping on the starboard side drowned in their cabins, but some who made it to the boat deck were able to successfully launch five of the lifeboats.

Some five minutes after the collision, the power failed, plunging the ship into total darkness. Five minutes after that, with the useable lifeboats gone, the Empress of Ireland rolled onto her starboard side, allowing hundreds of the doomed to take refuge on the exposed port side hull, where they sat for a few agonizing minutes watching the frigid water slowly creep up the hull to claim them “like sitting on a beach watching the tide come in,” as one survivor put it.[3]

7 Essex, 1820

While falling far short of the death toll of the Titanic or any other entry on this list, the tale of the whaling ship Essex eclipses all the rest in terms of sheer horror. The real-world inspiration for Herman Melville’s Moby-Dick, the Essex was twice rammed by a sperm whale in November 1820, some 2,000 miles west of the South American coast. The twenty-man crew was forced to take to three whaleboats with what food and water they could carry and set off to reach South America.

In two weeks, the food was gone, and water so scarce that they were forced to drink their urine. They were temporarily saved by water and food foraged on barren Henderson Island, but after eating the island dry, they set off once more in the boats, less three men who decided to stay. By January, the men in the boats began to die. The first two corpses were consigned to the sea, but when a third died, the men were so hungry they decided to resort to cannibalism. When more men died, they did likewise. Soon, even this dire infusion of food became insufficient, and the surviving men drew lots to see who was to be killed and eaten next.

An 18-year-old named Owen Coffin drew the black spot and was soon shot and butchered by the others, one of whom died ten days later and was likewise consumed. It would not be until late February 1821 that the five dazed survivors were rescued off the coast of Chile, having eaten no fewer than seven of their comrades.[4]

6 Sultana, 1865

Imagine this. You have just spent years in Andersonville, the notorious Confederate prison where starvation, disease, and ill-treatment have killed some 13,000 of your comrades (a staggering death rate of 29%). You have just learned the Civil War is over after four brutal years, and you have just been told that you are finally going home. And so some 1,953 released Union prisoners of war were crowded onto the groaning decks of the Sultana, a northbound Mississippi river steamboat designed to carry only 376 passengers. With some 177 additional passengers and crew aboard as well, the Sultana crept slowly up a Mississippi swollen by one of the worst floods in living memory.

All went well until 2 am on the 27th of April, 1865, when the ship’s faulty boilers suddenly exploded under the strain. The jet of scalding hot steam blew out the center of the boat, destroying the pilothouse and knocking down the smokestacks, trapping hundreds in the wreckage, which soon caught fire. Those trapped under the collapsing decks were scalded or burned to death, while hundreds of the ex-prisoners who jumped overboard quickly drowned, unable to keep afloat in their weakened state. When the hulk of the Sultana finally sank by the Arkansas shore around 7 am, some 1,169 men had died, making this the greatest maritime disaster in U.S. history.[5]

5 SS Central America, 1857

On September 9, 1857, the SS Central America, carrying 477 passengers, 101 crew, and over nine tons of newly mined gold from the California Gold Rush, found itself trapped in a hurricane off the coast of the Carolinas. For two days, she rode out the storm, her steam-powered paddle wheels keeping her pointed into the 100 mph (161 km/h) winds. But by September 11, the boilers were failing, the sails were torn to ribbons, and leaks had developed that threatened to overwhelm the pumps.

When the boilers finally failed, the engines and pumps fell silent, and the ship was adrift at the mercy of the storm. Red-eyed passengers spent the long night passing buckets of water up through the dark ship, but they were fighting a losing battle with the sea. The eye of the hurricane brought momentary calm, allowing the doomed to contemplate their fate, but when the storm returned, the ship continued to sink by the stern.

In the morning light, another ship was sighted, and women and children were loaded into the lifeboats and set off through the perilous sea. In this way, some 153 people were saved, but when the Central America finally sank after its three-day struggle, it took some 425 souls with it.[6]

4 SS Princess Alice, 1878

There could be nothing more pleasant than taking an evening excursion by paddle steamer up the river Thames, which is exactly what some 700 Londoners were doing on the evening of September 3, 1878, when the SS Princess Alice was cut in two by the oncoming collier SS Bywell Castle in Gallions Reach, just east of London. Those who had been below decks at the time of the collision had no chance of survival, as it took a mere four minutes for the broken ship to slip beneath the river.

Although boats were launched from both the Bywell Castle and riverfront residences and factories, hundreds of people, weighed down by Victorian clothing, were washed under and away by the currents. Terrible as this was, what happened next transformed the scene into an unfathomable horror. The pumping stations for the London sewer system discharged their raw sewage into the Thames at the very spot where the Princess Alice sank, and a mere hour before the disaster, over 90 million gallons of raw sewage had been dumped into waters already polluted by local gas works and chemical factories.

The Times cited a local chemist who reported the outflow as “two continuous columns of decomposed fermenting sewage, hissing like soda-water with baneful gases, so black that the water is stained for miles and discharging a corrupt charnel-house odour.” The toxic slime proved fatal even to those who did not drown in it. Of the 130 survivors of the disaster, some 16 died later from ingesting the putrid waters.[7]

3 SS Atlantic, 1873

Prior to the 1912 loss of the Titanic, the White Star Line’s greatest catastrophe was the loss of the SS Atlantic on a different April night some 39 years earlier. En route to New York from Liverpool with 952 passengers and crew, the Atlantic was diverted to Halifax, Nova Scotia, to load more coal. Approaching what they believed to be the harbor entrance in a howling storm, the ship was, in fact, over 12 miles (19 kilometers) off-course, heading straight for underwater rocks.

Failing to spot a familiar lighthouse west of the harbor, the helmsman relayed his concerns to the officer on the bridge, only to be told to stay the course. When the ship struck the rocks and the hull was smashed inward, the passengers clung to the listing vessel and watched as the 10 lifeboats were launched one after another, only to be crushed against the hull or swept away by the raging sea. With no other way off the swiftly capsizing ship, crewman John Speakman swam to nearby rocks with a line of rope, creating a lifeline by which the strongest were able to pull themselves to shore.

In this way, some 429 passengers and crew survived to watch the remaining 535 people drown, including all 156 women and 188 of the 189 children aboard the ship. Commemorated in artwork by Winslow Homer and Currier & Ives, the loss of the Atlantic was the deadliest civilian maritime disaster of its day, only eclipsed 25 years later by our next entry.[8]

2 SS La Bourgogne, 1898

Speeding through a fog bank southeast of Halifax, Nova Scotia, in the pre-dawn hours of July 4, 1898, the SS La Bourgogne, a French ocean liner bound from Le Havre to New York, was struck midship by the iron-hulled sailing vessel Cromartyshire. Those passengers sleeping on the starboard side either had no chance of escape from their berths or woke to find their compartments rapidly filling with water.

With the starboard side lifeboats damaged or destroyed by the collision, the crew attempted to launch the port side boats, only to find the task imperiled as the list to starboard increased and the port side rolled up into the air. As discipline collapsed, passengers and crew fought to gain space in the undamaged lifeboats, and within 30 minutes, the ship had settled and slipped stern first under the waves.

It was only when the sun rose and the fog lifted that the crew of the Cromartyshire (still afloat) realized that the La Bourgogne had been far more badly damaged than their own ship and began to render assistance to the survivors. But it was too late. Of the 726 souls aboard, only 173 survived, and of those, all but 70 were male crew members. Of the 300 women aboard, all but one would perish, along with each and every one of the children.[9]

1 Batavia, 1629

In June 1629, the Dutch East India Company’s ship Batavia struck a reef off Beacon Island, a remote coral island 50 miles (80.5 kilometers) west of Western Australia. While her fate was a common enough occurrence in the age of sail, it is what happened next that earned the Batavia a spot on this list. Though 40 people drowned, the rest of the 322 passengers and crew got ashore on a desert island only to find no fresh water and nothing to eat but birds.

When the captain, senior officers, and some crew embarked in the longboat on a 33-day journey to Batavia (modern-day Jakarta, Indonesia) to seek help, the hundreds of survivors elected one Jeronimus Cornelisz, a senior company merchant, to leadership. They could not have made a worse choice.

He ordered 20 of the soldiers to explore a nearby island, ostensibly to search for food, but then abandoned them to die. After confiscating all the weapons and then all the food, he began a two-month reign of terror, marooning more of his rivals on nearby islands and forcing seven of the surviving women into sexual slavery. Then, with food becoming scarce, he began to openly murder the survivors. Around 110 men, women, and children were drowned, hacked, strangled, or beaten to death before the 20 soldiers, having refused to die on their desert island, set up a fort and refuge from the mutineers.

Cornelisz declared war on the soldiers, and a battle ensued. It was in the midst of this inter-island war that the Batavia’s captain returned in the rescue ship, arrested the mutineers, and tortured them into a confession. Cornelisz and his followers were executed, and the nightmare was finally over for the 122 souls that remained.[10]

10 Actors Who Got Paid for Films They Weren’t In https://listorati.com/10-actors-who-got-paid-for-films-they-werent-in/ https://listorati.com/10-actors-who-got-paid-for-films-they-werent-in/#respond Tue, 07 Feb 2023 18:13:50 +0000 https://listorati.com/10-actors-who-got-paid-for-films-they-werent-in/

Actors are some of the most well-compensated members of society, or at least those who make it to Hollywood are. And all actors who work on Screen Actors Guild movies (which is pretty much everyone in Hollywood) get paid, whether they appear in the finished product or not. Thanks to pay-or-play contracts, many top-tier actors get a check even if they don’t actually work at all.

But the reasons actors end up not starring in movies they’ve been paid for are many. Some are cut out of films, some get fired before the director has ever called action, and some, perplexingly, never have anything to do with the production in the first place.

10 Shailene Woodley, The Amazing Spider-Man 2

Before Tom Holland donned the red spandex, Andrew Garfield was the world’s principal web-swinger in a film series that began with The Amazing Spider-Man in 2012 and ended with The Amazing Spider-Man 2 a mere two years later. Garfield’s fifteen minutes of Spidey fame have been looked upon more fondly in recent years, but some of the series’ stars didn’t even get fifteen seconds!

Shailene Woodley, best known for the Divergent series, signed on to play Mary-Jane Watson in the second Amazing Spider-Man, but her feet hardly touched the ground before she was whisked out the door again. Director Marc Webb (the puns write themselves) shot three scenes with the actress, intending MJ to be a minor presence and secondary romantic interest for Garfield’s Peter Parker. But ultimately, he decided to leave her parts on the cutting room floor when streamlining an already 142-minute movie. Even though her time on set was brief, and the public never got to see the actress on screen, Woodley still got paid for paying her dues.[1]

9 Johnny Depp, Fantastic Beasts: The Secrets of Dumbledore

The third installment in the Fantastic Beasts series, The Secrets of Dumbledore, exploded onto screens in 2022, but most of the series’ magic had already evaporated. The writing was lax, the film felt superfluous, and several public controversies plagued the film, including Potterverse author and Fantastic Beasts screenwriter JK Rowling’s ongoing friction with some of the trans community. But few controversies were as large as the domestic abuse allegations made by Amber Heard against Johnny Depp (who starred as antagonist Gellert Grindelwald in the first two Fantastic Beasts films) and the subsequent libel case that he lost against UK tabloid newspaper The Sun.

Seeking to cut their losses and save face following the outcome of the court case, in which the judge found that 12 of the 14 domestic abuse allegations had occurred, Fantastic Beasts studio Warner Bros. cut ties with Depp and hired Mads Mikkelsen to replace him. Lucky for Depp, he had only filmed one scene for The Secrets of Dumbledore; unlucky for Warner, he had a pay-or-play contract entitling him to his full salary for the film—an eye-watering $16 million.[2]

8 Bob Hoskins, The Untouchables

The production for Brian De Palma’s The Untouchables, a film about the team who took down Al Capone, brought together four of the 20th century’s biggest acting talents in Sean Connery, Kevin Costner, Robert De Niro, and Andy Garcia. The film has gone down in history as one of the best crime films set in the Prohibition era.

While the entire cast is pitch-perfect, De Niro’s Al Capone is a particular highlight, with the veteran gangster actor carrying off both the physicality and voice of Capone with ease. Though De Niro was the first choice for the part, the actor was in high demand, and De Palma hedged his bets by getting English actor Bob Hoskins to agree to take the role if De Niro couldn’t make it. When De Niro took the part, Hoskins thought nothing of it until a check for $200,000 arrived in the mail with a note saying, “Thanks for your time. Love, Brian.”[3]

7 Paul Rudd, Bridesmaids

It’s not often a big-name actor gets cut from a film for little to no reason, but that’s precisely what happened during the editing process for Paul Feig’s 2011 all-star smash hit comedy Bridesmaids. Bringing together comedians from SNL, various comedy shows, and the films of Judd Apatow (who served as producer on Bridesmaids), the film is a who’s-who of the U.S. comedy scene at the turn of the decade.

But some of its best comedic talent didn’t even make it onto the screen. Paul Rudd was originally set to cameo in the film as the blind date of protagonist Annie (Kristen Wiig), a man who loses touch with reality when a child skates over his fingers. Rudd shot with the crew for a day and, by all accounts, threw himself into a sequence that saw him pratfall across an ice rink and swear liberally at children. But in terms of screen time, it amounted to nothing: Feig cut him from the film to keep the narrative tight and give more screen time to child actor Blake Garrett, who was reportedly comedy gold. Thanks to the Screen Actors Guild, though, Rudd still got paid.[4]

6 Kevin Spacey, All the Money in the World

Fantastic Beasts was not the first film to drop an actor thanks to controversies going on away from the set, but few films have gone as far as All the Money in the World did when it comes to eradicating a star from the celluloid. Filming had completed, and the film was in the end stages of post-production when sexual misconduct allegations were made against star Kevin Spacey by fellow actor Anthony Rapp.

Recognizing that this could sound the death knell for his film, director Ridley Scott sprang into action, used his industry clout to secure an extra $10 million in financing, and hired Christopher Plummer to replace Spacey as billionaire oil tycoon J. Paul Getty. They filmed an additional 400 shots over nine days in two countries, and Scott used digital and practical effects teams extensively to help blend the new actor into the existing picture. As Spacey had completed his work on the film and fulfilled his contract, both actors ended up getting paid despite only one appearing in the final film.[5]

5 Tim Roth, Once Upon a Time… in Hollywood

Once Upon a Time… in Hollywood broke many of director Quentin Tarantino’s long-established tropes and filmic traditions, serving up a tale about the Manson family murders that contained surprisingly little violence and more tension and meditative contemplation than merchandisable quick quips. One of the telltale signs of a Tarantino flick, however, was the presence of a familiar cast of actors, bringing together Leonardo DiCaprio, Brad Pitt, Kurt Russell, Bruce Dern, and others from the director’s previous features.

Conspicuously absent, however, was Tim Roth—who has been with Tarantino since Reservoir Dogs—despite being paid to be in the film. Roth filmed sequences as Hollywood hair stylist Jay Sebring’s (Emile Hirsch) British butler, serving little more than a cameo role, which is ultimately why Tarantino decided to drop his part altogether. Given the film is already nearly three hours long, perhaps it’s a good thing he did.[6]

4 Tobey Maguire, Life of Pi

It’s not often you end up being booted from a film for being too well-known, but that’s what happened to Tobey Maguire in Life of Pi. Ang Lee’s adaptation of Yann Martel’s Booker Prize-winning novel sees a young Indian boy, Pi (Suraj Sharma), stranded in the middle of the Pacific Ocean aboard a lifeboat with only a tiger named Richard Parker for company. In the film, the story is told by a grown-up Pi (Irrfan Khan) to a fictionalized version of the author, known only as The Writer.

Tobey Maguire was originally on board to play the fictionalized Martel, but after filming had begun, Lee didn’t feel Maguire was a good fit for the part and replaced him with the far less recognizable Rafe Spall. It should come as no great surprise that Lee changed his mind to “be consistent with other casting choices made for the film” because the majority of the cast, save perhaps for Gérard Depardieu’s Cook, were not as well known to Western audiences.[7]

3 Harrison Ford, E.T. the Extra-Terrestrial

There’s hardly an adult or child in the Western world who doesn’t associate the name Steven Spielberg with his family-friendly alien flick E.T., but back when it was being made, the director was better known for more adult-oriented films like Jaws, Close Encounters of the Third Kind, and Raiders of the Lost Ark. It makes sense, then, that he wanted to carry over some of what he knew from those films into this one, and while parallels with the likes of Close Encounters are easy to draw, E.T. doesn’t bear a whole lot of resemblance to Indiana Jones.

Nonetheless, Spielberg’s filmmaking comfort food in this feature was Harrison Ford himself. The director cast Ford against type as an uptight school principal who reprimands young protagonist Elliott (Henry Thomas) in his office. Shot over Ford’s shoulder and in dim lighting, this cameo was always meant to be an Easter egg for fans, but it didn’t make it into the film at all.

Notoriously unsentimental in the editing room, Spielberg didn’t think the scene added anything to the film and left it on the cutting room floor. While details of the sum Ford actually got paid are scarce, it wouldn’t have been spare change, given he was just concluding a run starring in Raiders, The Empire Strikes Back, and Blade Runner.[8]

2 Michael Biehn, Alien 3

David Fincher’s Alien 3 received a lambasting from fans and critics upon release for a multitude of reasons. The film didn’t serve the series’ trajectory well, the studio interfered throughout and denied Fincher a final cut, and above all else, the footsteps of Ridley Scott and James Cameron (directors of the first two Alien films) were awfully big for a first-time director to follow in.

Among the film’s issues was the killing of legacy characters Newt (Carrie Henn) and Hicks (Michael Biehn) at the beginning of the film when an alien egg hatches on board their ship, and their escape pod crashes into a nearby planet. Neither Biehn nor Henn had signed on for another film, so Fincher couldn’t use the actors, but the production team initially worked with a facial cast of Biehn taken in Aliens to create a prosthetic doppelganger. Biehn got wind of this and put his agent onto the studio, saying they couldn’t use his image. Ultimately, he agreed to let them use his likeness, but only for a hefty fee.[9]

1 Eric Stoltz, Back to the Future

Back to the Future remains one of the foremost time-travel sci-fi films of the 20th century, and its conception of time travel has become the go-to exemplar of the single, alterable timeline, in which changes to the past produce changes in the future. As a result, Back to the Future deals with the idea that if we went back in time and changed things, we might end up accidentally erasing ourselves. Unfortunately for Eric Stoltz, his approach to Robert Zemeckis’s script resulted in him being erased from the production.

Originally cast as teen hero Marty McFly, Stoltz brought an air of melancholy and darkness to the role, treating the part as a tragedy rather than playing it with the family-friendly comic stylings Zemeckis had in mind. Not getting laughs from the dailies and concerned that this casting decision could sink the whole film, Zemeckis opted to fire Stoltz six weeks into filming, costing the studio two actors’ salaries and adding millions of dollars to the budget. But Michael J. Fox took over as McFly, and the rest is history, or the future, or something like that…[10]
