Top 10 Things We’ve Learned From History

These ten entries illustrate how humanity’s confidence often outpaces reality. As history shows, we’re not nearly as smart as we think we are. Witch‑burning, bloodletting, and eugenics are just a few examples of once‑established practices that proved not only incorrect but outright insane.
10 Pain, Pain, Go Away
It began with noble intentions. Roughly thirty years ago, a concerted push emerged to discover drugs that could more effectively quell chronic aches—think back problems, arthritis—and also smooth the recovery curve after surgery or ease the final throes of terminal illnesses such as late‑stage cancer.
Fast forward to 1995, when Purdue Pharma rolled out what was quickly hailed as a wonder drug: OxyContin. It joined older opioids like Percocet and Vicodin on pharmacy shelves, and throughout the late ’90s pharmaceutical firms reassured physicians that prescription opioid painkillers would not hook patients. Consequently, doctors started writing them at an ever‑escalating pace.
The pills proved deadly efficient at both dulling pain and delivering death. Opioid misuse and overdose fatalities surged. By 2017 an estimated 1.7 million Americans were living with a substance‑use disorder tied to prescription opioid pain relievers. At least, their arthritis wasn’t acting up, right?
That same year, the prestigious Mayo Clinic published a post‑mortem analysis of the opioid crisis’s sweeping… well, mortality. The abstract summed it up succinctly: “Good intentions to improve pain and suffering led to increased prescribing of opioids, which contributed to misuse of opioids and even death.” Perhaps the original premise—that pain should be eliminated entirely—was flawed from day one.
In the twelve‑month span ending September 2020, a record 87,000 Americans died from drug overdoses. By October 2020, Purdue Pharma pleaded guilty to federal criminal charges, part of an $8 billion‑plus settlement. Those numbers are no coincidence.
9 Back in the USSR: Russia’s Relapse
It’s almost unbelievable that, barely two decades ago, optimism ran high across the West that Russia—fresh from the ashes of the Soviet Union’s 1991 collapse—would embrace its former rivals and integrate into the community of mature democracies.
The honeymoon period looked bright. After President Ronald Reagan thawed the Cold War chill and forged a working rapport with Soviet leader Mikhail Gorbachev, his successor, George H. W. Bush, oversaw the USSR’s disintegration with a dignified, gloat‑free poise—effectively hitting the diplomatic reset button. (In fact, many rank Bush among the finest one‑term U.S. presidents for his handling of the Soviet breakup.)
Regrettably, Russia’s inaugural post‑Soviet leader proved a naïve, inebriated figure. President Boris Yeltsin attempted to thrust the nation’s state‑run economy into near‑total capitalism at breakneck speed. The fallout was brutal: throughout the 1990s Russia’s GDP shrank by roughly half, whole sectors vanished, inequality and unemployment spiked dramatically, and average incomes plummeted.
Yeltsin, the first democratically elected Russian president, proved to be both one of the nation’s poorest leaders and, to date, its last democratic one. In 2000, his prime minister, a young ex‑KGB operative named Vladimir Putin, took the helm. Two decades of dissent‑squashing, Syrian meddling, Ukrainian invasion, and election interference later, Russia once again stands as the nuclear‑armed menace it was throughout much of the late twentieth century.
8 Not So Fast
Throughout the ages, humanity has clung to wildly inaccurate notions about physics. At one point people believed the Earth was both flat and the center of the cosmos. Even the brilliant John Adams dismissed human flight until he saw one of the inaugural hot‑air balloon ascents over Paris in 1783.
A lesser‑known 19th‑century myth revolved around the advent of rail travel. Many asserted that speed itself could be lethal, specifically that fetuses would die in the womb.
New technologies invariably spawn baseless anxieties (remember the cell‑phone‑brain‑tumor scare?). When railways arrived, they promised to move people far faster than ever before. Women, however, became the focus of a bizarre panic: at speeds around 50 mph, some claimed a woman’s uterus would simply drop out. Talk about a Plan B!
Cultural anthropologist Genevieve Bell points out that misogyny has long drummed behind tech‑centric moral panics. When electric lighting first spread, “experts” warned that illuminated homes would endanger women and children by alerting predators to their presence. Later, as automobiles roared onto streets, many argued that women, perceived as weak, faint‑hearted, and prone to hysteria, couldn’t handle high‑speed responsibility and should be barred from driving. Saudi Arabia concurred until 2018.
7 The Original Social Distancing
Remember the universal optimism that social media would knit humanity closer together?
Various forms of online gathering have existed since the internet’s infancy, but platforms like AOL chatrooms, Meetup, and MySpace merely paved the way. It was Facebook’s 2004 debut that truly mainstreamed social networking.
Facebook’s declared mission sounded innocent enough. Suddenly, we could curate a custom list of “friends,” grant them access to our thoughts and photos, share links to articles we found intriguing, crowdsource recommendations from trusted contacts, organize hobby‑based events, and stay in touch with acquaintances we seldom meet in person.
Seventeen years and 2.8 billion users later, Facebook—alongside its 280‑character sibling, Twitter—has become one of the chief culprits fueling the relentless culture wars gripping much of the Western world, especially the U.S. and U.K. Sites designed for togetherness have achieved the opposite, driven by a simple motive: money. To keep users glued, algorithms feed each person more of what they already like. That’s harmless when a pregnancy article triggers a stroller ad, but in politics it’s disastrous. Liberals are bombarded with woke nonsense and flawed logic; conservatives are flooded with anti‑liberal memes and cancellation campaigns, plunging us into an online uncivil war.
6 High Crimes
“Instead of a war on poverty,” Tupac Shakur rapped, “they got a war on drugs so the police can bother me.” The War on Drugs represents a U.S. government initiative aimed at curbing illegal drug use and distribution by dramatically stiffening prison sentences for both dealers and users.
Among its many outcomes, the campaign succeeded in stigmatizing and worsening drug abuse while propelling America to the world’s highest incarceration rate—outpacing nations like El Salvador and Turkmenistan. Not all substances were treated equally. When the crack epidemic first erupted—largely confined to Black neighborhoods—the 1986 Anti‑Drug Abuse Act introduced a notorious 100:1 sentencing disparity: merely five grams of crack triggered a mandatory five‑year term, whereas the same penalty for powder cocaine—a drug more prevalent among white users—required five hundred grams.
Although that law originated under Republican Ronald Reagan, the most flagrant misstep arrived in 1994 with the Violent Crime Control and Law Enforcement Act, authored by then‑Senator Joe Biden (who would later become president) and signed by Democratic President Bill Clinton. The legislation funneled billions to states for new prisons and set up grant programs that incentivized police to pursue more drug‑related arrests, even for low‑priority substances like marijuana, effectively feeding the prison‑industrial complex.
5 The Facemask Facepalm
Hello, medical science—call me Common Sense. Have we met? No one faction in the COVID‑19 saga holds a monopoly on virus‑combatting protocols. Liberals advocated lockdowns until both the virus and the global economy vanished, while conservatives decried the simple act of donning a fabric mask as an unforgivable infringement on freedom.
Nonetheless, it falls to experts to issue health recommendations, and when they tarnish their own reputations early in a crisis, the fallout can be lethal. Such a misstep occurred when, as the virus gained momentum in the U.S. in late February and early March 2020, Dr. Anthony Fauci and fellow authorities declared that wearing facemasks was unnecessary.
Fauci later clarified, “We were not aware that 40 to 45 percent of people were asymptomatic, nor were we aware that a substantial proportion of people get infected from people who are without symptoms.”
That explanation falls short. By late February it was evident a novel virus was spreading with a speed and ferocity unseen in decades. Any seasoned contagious‑disease expert should have recognized that such rapid proliferation unmistakably signaled airborne, asymptomatic transmission. Common sense dictates that a virus would struggle to achieve that scale without such a route. This miscalculation stands as the most consequential early error in responding to a disease that has claimed over three million lives.
4 Smoke and Mirrors
“If excessive smoking actually plays a role in the production of lung cancer, it seems to be a minor one.”
You’d assume such a dismissive statement would hail from a tobacco‑industry executive, yet it was actually voiced by Wilhelm Carl Hueper, director of the Environmental Cancer Section at the National Cancer Institute, who gave the public the green light to light up in 1954.
From the 1930s through the 1950s, a potent advertising slogan—“doctors recommend”—helped peddle what would become one of humanity’s deadliest consumer products: cigarettes. One Camel advertisement boldly proclaimed, “Give your throat a vacation, smoke a fresh cigarette.”
Even as a persistent smoker’s cough spread and concerns grew about the health implications of inhaling smoke, Camel bolstered its claims by asserting that “more doctors smoke Camels than any other cigarette.” In fact, the company even conducted a survey of physicians, compensating participants with—what else?—Camel cigarettes themselves.
Before the mid‑20th century, such health‑conscious claims seemed unnecessary because, oddly enough, societies worldwide believed that no long‑term physical harm could arise from sucking on burning sticks all day. It wasn’t until the 1940s that international teams of epidemiologists began linking smoking to lung cancer, spurred by the clear parallel between soaring cigarette consumption and a surge in what had once been a rare terminal disease.
3 America’s Delusional Decade
Many look back on the 1990s as the quintessential American decade. Starting roughly with the November 1989 fall of the Berlin Wall—a death knell for the collapsing Soviet Union—the ’90s seemed a brief but brilliant era in which Western civilization triumphed over Soviet socialism and totalitarianism, leaving the United States as the world’s sole superpower.
Perhaps the height of our hubris was captured in Francis Fukuyama’s 1992 bestseller, The End of History and the Last Man. The influential work argued that the rise of Western liberal democracy represented not merely the conclusion of a post‑war epoch, but the very endpoint of humanity’s ideological evolution, ushering in a universalized liberal democracy as the final form of government.
That proclamation encouraged complacency. With the belief that history had reached its terminus, Western powers felt free to lower their guard, neglecting, for instance, the security of commercial airline cockpit doors, which remained as flimsy as a mall restroom’s.
First President Clinton, then President (W.) Bush largely ignored the burgeoning Islamic terrorism threat. Sitting atop the global food chain and buffered by two oceans, America seemed untouchable… until, with a plane crash followed by another just 17 minutes later, the decade ended a year late but abruptly on September 11, 2001. The rest, as they say, is history—and certainly not the end of history.
2 An Invincible Earth
Perhaps the starkest illustration of humanity’s hubris is the belief that our planet is so extraordinary we can inflict countless wounds without sealing its fate.
The irony deepens because we once knew better. By the 1960s, developed nations recognized the dire side effects of the Industrial Revolution. Air quality in cities was abysmal, and rivers and lakes reeked with pollution.
Collective action followed, as the issue wasn’t yet controversial. In 1970, Republican President Richard Nixon established the U.S. Environmental Protection Agency. Two years later, the Clean Water Act became law after Congress overrode Nixon’s cost‑based veto. Nations worldwide launched similar initiatives to heal a wounded Earth.
Fast‑forward half a century, and we find ourselves still treating the planet as if landfills were infinite, still razing rainforests—vital carbon sinks and oxygen factories—for agriculture, oil drilling, and other short‑sighted pursuits. Meanwhile, we cling to the notion that we can continue burning fossil fuels, producing massive pollution, even as cleaner alternatives like nuclear power beckon.
1 The Food Pyramid Scheme
In 1992, the U.S. Department of Agriculture proudly rolled out the Food Pyramid—a triangular chart meant to guide nutritional balance. Its design was rooted in the disastrously misguided dietary recommendations of a non‑scientific committee chaired by Democratic Senator George McGovern, who had also run for president in 1972.
At the pyramid’s summit sat a narrow tip labeled “Fats, Oils & Sweets,” indicating foods to be consumed sparingly, with no specific daily allowance provided. Descending toward the base, the bands broadened to show food groups meant to be eaten liberally, culminating in a recommendation that Americans eat a staggering 6‑11 servings of bread, pasta, rice, or cereal each day.
That’s right—over a decade of official guidance instructed citizens to consume six to eleven portions of carbohydrates daily. Meanwhile, the guidelines still allocated room for two to three servings of dairy and another two to three of protein, a broad category ranging from red meat to red beans, with considerable leeway in between.
Additionally, the plan suggested two to four servings of fruit and three to five servings of vegetables, even allowing potatoes to count as vegetables so as not to encroach on the allotted carb quota.
In 2005, the USDA overhauled the recommendations, yet curiously retained the pyramid shape—perhaps to give the illusion of continuity rather than a true redesign. Later, the institution reshaped the guidance into a “food‑plate” graphic, but the fundamentally flawed dietary advice persisted, and attempts to revert to a high‑fat, low‑carb regimen are routinely attacked by media, politicians, and self‑styled “academics.”
Fast‑forward to today, and 42 percent of Americans are classified as obese—a stark testament to the lasting impact of these misguided policies.

