Technology – Listorati: Fascinating facts and lists, bizarre, wonderful, and fun

10 Ways Technology Is Ruining Your Love Life
https://listorati.com/10-ways-technology-is-ruining-your-love-life/
Fri, 10 Jan 2025

Technology has completely changed how we live our lives, and it’s happening at an ever-increasing rate. Our worlds are completely different from the world of even 10 years ago. This obviously has resulted in differences in the way we work and play, but did you know it’s even changed the way we have sex?

10 Netflix Adultery


A new issue causing strife between couples is “Netflix adultery”: watching TV shows and movies alone that they promised their partner they would watch together. Twelve percent of those surveyed said they do it, and 59 percent of cheaters even reveal spoilers, which means that more than 7 percent of us are dating huge jerks.
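That last figure is just the two survey percentages multiplied together, which a quick sanity check confirms:

```python
# The "more than 7 percent" figure is the two survey results combined:
# 12 percent watch ahead, and 59 percent of those also reveal spoilers.
watch_ahead = 0.12
reveal_spoilers = 0.59

jerk_fraction = watch_ahead * reveal_spoilers
print(f"{jerk_fraction:.1%}")  # 7.1%
```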

Netflix’s director of public relations Jenny McCabe says that couples are reporting some serious drama over the phenomenon, commenting “We hear people say, ‘We made a pact, we were going to watch this together.’ ” It’s a real violation of trust and lack of consideration that can cause tension every bit as real as fights over money or other relationship matters.

9 Internet Infidelity


The Internet has made actual infidelity easy and guiltless. Cybersex offers a convenience and anonymity that can prove too tempting for many to resist, even if they have someone who is generally willing to have sex with them right there in the other room. There’s no physical contact, so what’s the problem? It’s not like cybersex is really cheating, right?

Wrong: 77 percent of people surveyed said that cybersex infidelity is unacceptable. Despite the many reasons a cheater could use to rationalize their activities, an overwhelming majority of people agree that cheating is cheating, period. This new and puzzling gray area is such a big problem that it was responsible for a full third of divorce cases in 2009.

8 We’re All Creepy Stalkers


The Internet has given us unprecedented access to the personal lives of prospective and past partners, and boy are we making use of it. Almost 90 percent of us admit to “stalking” the social networking activities of our ex-partners, and 60 percent of us admit to doing so to a crush.

This can have catastrophic effects on our well-being, because the information often doesn’t fully satisfy our curiosity and causes even more anxiety. Stalking an ex can significantly hamper our recovery from the breakup, and even spur us to make really bad decisions like hopping back into bed with them (yes, scientists actually studied this stuff). It might be best to keep them out of feed, out of mind.

7 Fear Of Intimacy


Harvard professor Craig Malkin has coined the term “cybercelibacy” to describe the increasing number of people who turn to online games and networks to satisfy their social needs without having to face scary real people. It creates a vicious cycle, he explains, where people aren’t forced to face their anxieties about relationships, which makes those anxieties grow and causes them to retreat further.

How bad is the problem, exactly? Well, 28 percent of people surveyed admitted that they spend less time with meatspace friends in favor of online activities, and almost as many (20 percent) say they’re having less sex. It turns out that going outside occasionally is a really important step to taking up residence in someone else’s underpants.

6 Facebook Provokes Your Jealousy


Following your partner’s Facebook feed creates needless jealousy, one study says. Even after controlling for other factors (that’s science for “weeding out the crazy people whose unbridled jealousy would exist either way”), the study found that the more time you spend reading your partner’s boring status updates, the more likely you are to turn into a raging psycho.

This happens because a good chunk of your partner’s social interaction becomes visible to you, but you don’t have the in-person cues that give the exchange context. For example, when your lady’s gay co-worker or the best friend who loves her like a sister leaves an innocent “You look great!” on her picture, they know it’s a harmless compliment—but all you see is some dude hitting on your girlfriend.

5 Too Many Points Of Contact


A lack of communication can be a big problem in a relationship, but one study suggests that communicating too much can be a strain as well. A survey of 24,000 married people found that using more than five channels (such as social media, texting, instant messaging, etc.) to communicate with your partner actually decreases relationship satisfaction.

The stress of never being more than a series of ones and zeroes away from your partner and monitoring so many incoming data streams is a killer. Think about how easy it is to step over that threshold. You follow your partner’s Facebook and Twitter feed, obviously, and of course they have your phone number for calling and texting—if you regularly use even one more communication tool, you’re screwed.

4 The Online Pornsplosion


With porn so easily accessible, convenient, and increasingly hardcore, many women are feeling either neglected or pressured to adhere to male-centric sexual scripts that they don’t enjoy. It turns out that many ladies don’t actually enjoy being sprayed in the face or poked in the butt (acts that are simply a matter of course in even mainstream porn nowadays) but feel like they have to if they want to please their man.

That is, if they’re being asked to please them at all: More women are reporting that they can’t compete with the blonde, tanned, and augmented video vixens, and their partners neglect them in favor of prerecorded thrills. It’s never a good thing if one person is unhappy with the naked-time routine, but the problem is so bad that in 2003, it was reported that online porn played a major role in a quarter of all divorce cases that year (and we’re pretty sure the amount of porn available hasn’t decreased any since then).

3 Gadgets


Some people are literally addicted to their smartphones; they can’t even leave the room without carrying them around like a colicky baby. Or maybe you like to bring your laptop to bed for some late-night work, or even just watch a little Letterman before tucking in. Well, all of those things could be wreaking havoc on your sex life, studies show. The mere act of having a phone nearby is so distracting that we can’t focus on the person we’re with, and simply having a TV in the bedroom can cut the amount of sex you have in half.

2 Dubious ‘Matching Algorithms’


Matching algorithms, such as those used by OkCupid and eHarmony, use questionnaire information about users’ personality and interests, which may help the strangers find things to talk about, but won’t in any way guarantee relationship success. Hold on, you say, isn’t it important that my partner likes Star Wars and skydiving as much as I do? If I end up with a scaredy cat who hates sci-fi, how are we even supposed to relate to each other?

Actually, the former has little to do with the latter. The way two individuals interact with each other specifically—i.e., plain ol’ chemistry—is the best indication of a good match, something that can’t be determined until two people meet. Maybe that overly cautious person keeps you grounded without holding you back, or the foreign film nut knows intuitively just what kind of support you need when you’ve had a bad day. Furthermore, the sites encourage users to objectify potential partners, “shopping” for matches based on these superficial and insignificant traits.

1 Googling Your Date


There’s really no such thing as a blind date anymore: 48 percent of women will not hesitate to Google you before they agree to go out with you, and just as many are willing to decline if they find unsavory information. Sure, some serious bullets can be dodged this way, like if you find your potential date’s incoherent, violent blog about his serial killer fantasies, but in many cases, you might be rejecting your soulmate based on a false (or at least meaningless) representation.

According to one study, the more information we dig up about our suitors, the more likely we are to reject them. You might think that just saves everyone some time—you’re going to find out about her online shrine to Hanson eventually, right?—but before you judge too harshly, take a minute to Google yourself. Did anything potentially off-putting come up? That embarrassingly naive op-ed piece about Objectivism you wrote for your college newspaper, say, or videos of your misguided attempt at hip-hop superstardom? How representative are those things of you as a person?

The fact is, someone who’s had a chance to get to know all the virtues and quirks that come packed in the you-shaped bundle is probably going to find those things endearing, but someone whose first impression of you has been based on them is going to run away screaming. As study coauthor Joanna Frost, PhD, says, “Your disillusionment with someone during a conversation might take hours, during which your date has the opportunity to explain himself, whereas online that disillusionment can happen almost instantly.” So give that freak a chance to explain herself over a beer—it might just be a charming quirk in an otherwise flawless package.

Manna writes and wrongs from sunny Portland, Oregon. Check her out on Cracked, Twitter, and her blog.

10 Brilliant Feats Of Scientific Technology
https://listorati.com/10-brilliant-feats-of-scientific-technology/
Tue, 08 Oct 2024

Sabine Hossenfelder is not a household name, but a recent article of hers has stirred up a host of debate among some scientific specialists. In her piece, published by the magazine New Scientist, the journalist and theoretical physicist argues against investing an enormous sum of money in a new particle collider. Research organization CERN has announced plans to build a €21 billion supercollider, a proposal that Hossenfelder says does not justify its hefty price tag.[1]

Her article has split opinion among theoretical and particle physicists. Many agree with Hossenfelder’s well-reasoned conclusion. Others argue that investment is needed for the evolution of cutting-edge technology; without fresh areas of work, the research will simply dry up.

Whether or not the high-cost supercollider will be built remains to be seen. However, in the midst of this forward-thinking debate, we must not lose perspective on the here and now. The Large Hadron Collider, CERN’s main boast, opened only a decade ago. In that time, we have witnessed the discovery of gravitational waves, the Higgs boson, and various quantum mechanical phenomena.

These bold leaps forward have only been possible due to a wealth of state-of-the-art technology. The following are all incredible feats of engineering that have helped to revolutionize our understanding of the world around us.

10 Dark Energy Camera

What is dark energy? The brief answer is that nobody is really sure. In a sense, dark energy is the antithesis of gravity, exerting a negative, repulsive pressure that is believed to accelerate the expansion of the universe. The elusive energy form is said to account for around two thirds of the total mass-energy of the universe, the rest being mostly dark matter.

That said, the mystery of dark energy may not be a mystery for much longer. A group of researchers at the Cerro Tololo Inter-American Observatory is exploring dark energy in an attempt to understand the universe at a fundamental level. Situated high in the Chilean Andes, their Dark Energy Camera (DECam) captures high-definition images of the cosmos. It is one of the most sophisticated digital cameras on the planet.

It took scientists from six different countries over a decade of designing and testing to come up with DECam. The project has mapped out roughly an eighth of the sky in exceptional clarity, while also cataloguing 300 million galaxies. Experts are currently in the process of analyzing the images.[2]

9 Einstein Tower

As extravagant to look at as it is scientifically vital, the Einstein Tower in Potsdam, Germany, has spent nearly a century studying the Sun. The observatory was opened in the 1920s with the aim of validating Einstein’s then-recently published theory of relativity. Housed in the tower is an unconventional style of telescope, immovable and bolt upright, that measures spectral shifts in solar rays.

Even more bizarre than the theory it was commissioned to verify is the building itself. The Einstein Tower is a renowned example of expressionist architecture that brought its creator, Erich Mendelsohn, to fame. Observatories are generally fronted by bland, purely functional exteriors, but Mendelsohn’s vision was far more avant-garde.

The outcome of this outlandish approach to architecture is a curvaceous, science fiction-esque structure that juts up out of the German landscape. The building’s namesake, Albert Einstein, is said to have disapproved of the futuristic design.[3]

8 Stonehenge


By modern standards, it may be a prehistoric artifact, but when the rocks were first erected on Salisbury Plain some 5,000 years ago, Stonehenge was state-of-the-art technology. Historians have found strong evidence to suggest that the stone circle was a primitive sort of observatory used to monitor the sky.

In fact, some claim that the builders of Stonehenge must have used Pythagoras’s theorem, two millennia before the Greek philosopher was born. The original henge is said to have been surrounded by 56 wooden posts. Ancient astronomers would have used these to map out solar and lunar eclipse cycles.[4]

7 Pierre Auger Observatory


Cosmology is brimming with mysteries. How did our universe come into being? What is it made of? How do you explain its unusual expansion?

One such mystery is cosmic rays. Our planet is being bombarded by a continuous stream of high-energy particles, hurtling toward Earth at close to light speed. This barrage of subatomic particles is a phenomenon known as cosmic rays. The lower-energy rays are known to be born out of stars dying in our Milky Way galaxy. Far less is known about the higher-energy rays. Thought to originate from far-off galaxies, their exact source has eluded scientists for decades.

Cosmic rays are also extremely rare. On average, a square kilometer (0.39 mi2) will be hit by just one high-energy particle per century. To combat this issue, researchers have constructed an enormous detector that spans for miles across Argentina. The Pierre Auger Observatory covers a detection area of around 3,000 square kilometers (1,200 mi2)—roughly 30 times the size of Paris. Completed in 2008, the observatory picks up the cosmic rays after they have struck the atmosphere and showered down to Earth in a cascade of various secondary particles.[5]
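The two figures above explain why the detector has to be so huge; a back-of-the-envelope estimate from the article's own numbers gives the expected detection rate:

```python
# Expected rate of high-energy cosmic ray hits over the Pierre Auger
# Observatory's detection area, using the figures quoted above.
rate_per_km2_per_century = 1
area_km2 = 3_000

hits_per_year = rate_per_km2_per_century * area_km2 / 100
print(hits_per_year)  # 30.0 -- about 30 events per year instead of 1 per century
```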

6 Lovell Telescope


In a rural village in the heart of England, a renowned radio telescope has spent the last 60 years examining the cosmos. Found at Jodrell Bank, an observatory run by the University of Manchester, the Lovell Telescope is one of the most powerful radio telescopes ever built.

Its standout feature is the fully steerable white bowl, 76 meters (250 ft) in diameter, that adorns two motor-powered towers. This enormous bowl acts like a giant satellite dish, collecting and focusing radio waves from sources in the sky to be transformed into electrical signals.

Still the third largest of its kind over half a century since it was first assembled, Lovell has played a pivotal role in furthering our understanding of astronomy. The theories now being explored by Lovell were virtually unimaginable when it was first opened.[6]

5 Super-Kamiokande

Neutrinos have been at the heart of a number of fascinating scientific discoveries in recent years. The minuscule subatomic particles are thought to be among the most abundant in the universe and also one of the most difficult to detect. In 2015, Takaaki Kajita and Arthur B. McDonald were awarded the Nobel Prize for Physics after demonstrating that neutrinos change their intrinsic properties as they travel.

This fluctuation requires the particles to have some mass, contrary to the long-held belief that neutrinos are massless. Particle physicists now have to reassess their understanding of the nature of matter. It will likely lead to an expansion of numerous scientific theories.[7]

Kajita’s groundbreaking discovery was only possible due to Super-Kamiokande (model pictured above), an enormous underground detector tank filled with 50,000 tons of water. As neutrinos sprint through the tank, the vast majority of them leave no trace, but a few emit dazzling bursts of Cherenkov light (the optical equivalent of a sonic boom). By analyzing the bursts, researchers are able to examine the properties of the neutrinos themselves.

4 Hubble Telescope

Orbiting 547 kilometers (340 mi) above our heads, the Hubble Space Telescope has been described by NASA as the most important step forward in astronomy since Galileo introduced his telescope in 1610. When Hubble was first launched and deployed in April 1990, having a permanent telescope outside of Earth’s atmosphere was seen as revolutionary. Almost three decades later, the technology remains right at the cutting edge of modern science.

Unlike traditional ground-based telescopes, Hubble surveys the depths of space unimpeded by Earth’s dense, distorting atmosphere. The telescope’s sophisticated cameras can view astronomical occurrences with better clarity and consistency than any observatory on the planet.

The steady stream of observations being fed back from Hubble has entirely altered our understanding of the universe around us. On average, around 150 scientific papers per day will cite research that in some way incorporates Hubble data. The telescope has enabled astronomers to explore all manner of topics in great depth, from supermassive black holes to dark energy. It’s a mammoth achievement, especially for a satellite that is only the size of a large bus.[8]

3 Large Hadron Collider


CERN’s Large Hadron Collider (LHC) is, for the moment at least, the most powerful particle accelerator ever built—although, as addressed at the start of the article, researchers are currently debating whether to build another almost four times larger.

Inside the 27-kilometer (17 mi) magnetic ring, two beams of particles are thrust together at close to light speed. Researchers in Geneva have been smashing subatomic particles into one another since 2009. In 2012, after the LHC had been in operation for only a few years, they made global headlines by confirming the existence of the Higgs boson.

It was initially hoped that the LHC might also be able to shed light on string theory and dark matter. As time passes with no evidence found, this is looking increasingly unlikely.

In order for the ring to maintain its magnetism, coils of superconducting cable need to be chilled with liquid helium, which keeps them at a frosty minus 271.3 degrees Celsius (–456.3 °F). At these extremely low temperatures, the cable conducts electricity perfectly, without losing any energy.[9]
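The two temperatures quoted above are the same value on different scales; the standard conversions check out:

```python
# Verify the LHC magnet operating temperature across scales.
celsius = -271.3
fahrenheit = celsius * 9 / 5 + 32   # Celsius to Fahrenheit
kelvin = celsius + 273.15           # Celsius to Kelvin

print(round(fahrenheit, 1))  # -456.3, matching the figure above
print(round(kelvin, 2))      # 1.85 -- less than two degrees above absolute zero
```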

2 LIGO

Gravitational waves are distortions in the fabric of space and time that radiate from high-energy interstellar bodies. They emanate from accelerating objects and spread through the cosmos like ripples across a pond. The largest waves stem from massive, turbulent events like a supernova explosion or two black holes colliding. There is even believed to be some gravitational radiation still lingering from the birth of the universe.

Albert Einstein first imagined these celestial ripples in 1916 as part of his general theory of relativity. However, their existence was not confirmed until 1974, and then only indirectly, through observations of a binary pulsar. For the first gravitational wave to actually be detected, researchers at the Laser Interferometer Gravitational-Wave Observatory (LIGO), with detectors in Louisiana and Washington state, had to construct a style of high-precision instrument known as an interferometer. Interferometers are able to take minuscule measurements by comparing two near-identical beams of light. They are often used to determine small changes in position.

While interferometer technology has existed since the late 19th century, LIGO’s are the most sensitive ever built. The twin detectors are made from two 4-kilometer (2.5 mi) steel vacuum tubes and measure fluctuations thousands of times smaller than a proton.[10]
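To put that sensitivity in numbers: taking a textbook proton radius of roughly 1e-15 meters (a figure assumed here, not given in the article) together with the 4-kilometer tubes described above, the fractional length change LIGO must resolve is astonishingly small:

```python
# Rough scale of LIGO's sensitivity. The proton radius (~1e-15 m) is a
# standard textbook value, not taken from the article.
arm_length_m = 4_000                      # each steel vacuum tube is 4 km long
proton_radius_m = 1e-15
displacement_m = proton_radius_m / 1_000  # "thousands of times smaller"

strain = displacement_m / arm_length_m    # fractional change in arm length
print(f"{strain:.1e}")  # 2.5e-22
```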

The first gravitational waves to be sensed by LIGO came from two black holes crashing into each other almost 1.3 billion years ago. This momentous achievement earned three of LIGO’s researchers the 2017 Nobel Prize in Physics, along with mass acclaim from the media and their peers.

1 International Space Station

Around the same size as a football field, the International Space Station (ISS) is the largest artificial structure that we have ever put into space. Since November 2000, the station has been continuously inhabited, hosting over 200 individuals from 18 different countries. In a day, the distance traveled by the ISS is the equivalent of flying to the Moon and back.

Onboard the ISS, research projects are conducted into a wide array of topics. On one mission, the crew was tasked with burning small, spherical droplets of fuel as part of a study into flames in microgravity. Another grew large crystals of protein in the name of medical research.

What’s more, the ISS is mounted with an exceptionally sensitive particle detector known as the Alpha Magnetic Spectrometer (AMS). Unlike the Pierre Auger Observatory, this instrument is able to measure cosmic rays before they fragment in the atmosphere. Data from the AMS may help enlighten cosmologists as to the source of cosmic radiation, while also supporting some theories on the composition of dark matter.[11]

Writer from Britain.

10 Jobs We Already Lost To Technology
https://listorati.com/10-jobs-we-already-lost-to-technology/
Sun, 15 Sep 2024

We are all concerned about robots and artificial intelligence taking over our jobs. However, nobody seems to be worried about the jobs that were taken over by earlier advancements in technology and even modern robots.

A few centuries or even decades ago, some of these professions were mainstream and profitable. They disappeared when some easier-to-use technologies were invented to take over their roles. In rare instances, advancements in technology proved that the profession should not even exist.

10 Gong Farmer

A few centuries ago, what we consider a bathroom (or toilet in Britain) was called a privy. It was a raised board with a hole in the middle instead of a flush toilet. Users sat there to do their business. Their feces went down the hole into the cesspit below.

The cesspit soon filled up and needed to be emptied. This was the job of the gong farmer.

Gong meant “going,” while farmer was used to refer to the act of “harvesting” the “goings.” The gong farmers entered the tight cesspits where the feces reached their waists. Sometimes, they employed a smaller boy to do the job. The boys scooped the feces into carts for transport to dumps where the waste was converted to fertilizer.

The gong farmers stank a lot, which isn’t surprising considering that bathing was alien to the people of the Middle Ages. They smelled so bad that they were often confined to their homes and only allowed to work at night.

The job was also dangerous. The feces produced poisonous gases that could kill the gong farmers inside the cesspit. However, the handsome pay made up for any humiliation or danger they faced. The job went extinct after sewage pipes and treatment plants sprang up in the 19th century. Gong farmers still exist in some parts of the world, though.[1]

9 Knocker Upper

Decades before the alarm clock went mainstream, people depended on the knocker upper to wake them from sleep. Interestingly, the profession lasted until the 1970s.

The first knocker uppers knocked or rang at the doors of their paying customers. However, they soon discovered that this was bad for business. Neighbors often complained that the noise woke them up. The knocker uppers also realized that they often woke nonpaying clients during the rounds. So the knocker uppers started tapping on the windows of their clients with long poles.

The tap was loud enough to wake the paying client but quiet enough to wake no one else. The knocker uppers did not hang around to make sure that their clients were awake and left after three or four taps. The profession started to disappear as electricity and alarm clocks became common.

Most knocker uppers went out of business in the 1940s and 1950s, and they were extinct by the 1970s.[2]

8 Ice Cutter

From 1800 to 1920, people preserved their foods with ice harvested from frozen ponds by ice cutters who used ice axes and, later, handheld ice saws. The industry boomed so well that ice cutters started using large ice saws that required horses to tow.

Most of the ice came from the natural fresh water of the northeastern United States between January and February. The work was tedious. Ice cutters worked seven days a week in 10-hour shifts to harvest enough ice before temperatures rose in March. The ice cutters were always at risk of falling into the frozen water.

The horses were not spared from the dangers of the trade, either. They were equally at risk of falling into the frozen ponds. Their dung also contaminated the ice. Most ice cutting businesses even employed a “shine boy” to clean up after the horses. The shine boy put the dung into a waterproof wooden sleigh that he always carried along.

The harvested ice was stored in warehouses called icehouses for export to other regions of the US and Europe. The icehouses were built with double walls, raised off the ground, and filled with sand, straw, sawdust, hay, charcoal, bark, and whatever would stop ice from melting. These structures were also located far from trees over concerns that the ice could become damp and melt.

However, the industry was not very predictable because ice could melt or form improperly. It was unusual for ice cutters to record two profitable harvest seasons back-to-back. The winners were the farmers who owned the ponds. They sometimes made more from selling ice from their frozen ponds than they did from peddling their crops. The industry disappeared after the invention of the electric refrigerator.[3]

7 Match Maker

Centuries ago, match manufacturers employed all-female labor forces to make matches. These women were called “matchstick girls.” The job was dangerous and tedious. This was especially true at companies like Bryant and May, which paid low wages, overworked their employees, enforced strict, regressive rules, and used dangerous white phosphorus in manufacturing matches.

Matchstick girls at Bryant and May worked 14 hours a day. They were often fined for minor infractions like dropping a match, talking to coworkers, or clocking in late. However, their biggest threat was the white phosphorus they worked with.[4]

White phosphorus is toxic. It could cause a disease called “phosphorus necrosis of the jaw,” which the ladies called “phossy jaw.” The disease rotted the jawbone. Sometimes, it spread to the brain, causing the sufferer to die a slow and agonizing death. It could only be treated by removing the damaged jaw. However, this could also lead to death.

6 Rectal Teaching Assistant

While we were immersed in debates over whether robots and artificial intelligence would seize our jobs someday, robots crept behind us and snatched the profession of the rectal teaching assistant.

Medical personnel often diagnose prostate cancer by inserting their fingers into the anus to feel the prostate gland. At one time, they were trained with the rectum of a living human who was called a rectal teaching assistant. Only one man in all of the UK was licensed to become one.

Concerned about the shortage of rectal teaching assistants, scientists at Imperial College London developed a robotic rectum that mimicked a real human rectum. Unfortunately, the creation of the robot meant that the only licensed rectal teaching assistant in the UK lost his job.[5]

The inventors say the robot is better than a human. Cameras inside the robot allow medical personnel to see the internals of the robot rectum on a computer screen. There was no way they could have done that with a human.

5 Human Computers

In 1939, the Jet Propulsion Laboratory in California hired Barbara Canright as its first human computer. Canright was responsible for calculating everything from the amount of force required to get an airplane airborne to the amount of rocket propellant required to get a rocket into space.

The complex calculations were done with pencils and paper. Determining the travel time of a rocket took a whole day, and some other calculations could take a week. A single calculation could fill up to eight notebooks. Canright was later joined by Melba Nea, Virginia Prettyman, and Macie Roberts after the US got involved in World War II.

The human computers concentrated their efforts on the space race after the war. They were responsible for the calculations that put the first US satellite in orbit, the Voyager probes into space, the first unmanned rover on Mars, and Neil Armstrong and the Apollo 11 crew on the Moon.

The human computers still had the upper hand when NASA started experimenting with mechanical computers in the 1950s. Most people believed that humans were more reliable than machines. However, the humans soon lost their jobs to computers.[6]

4 Pin Boy

A few decades ago, bowling alleys employed young boys called pinsetters, pin spotters, or pin boys to manually reset fallen bowling pins and return the bowling balls to players. The work had low pay and was often part-time. However, it was highly demanding as the boys often worked until midnight.

This started to change when Gottfried Schmidt invented the mechanical pinsetter in 1936. The pinsetter was semiautomatic and still required human intervention. Some bowling alleys did not adopt the mechanical pinsetter and continued using the pin boys. However, the pin boys and the semiautomatic pinsetter soon gave way to fully automatic pinsetters.[7]

3 Lamplighter

The first public streetlights appeared in the 18th century. They used fish oil for fuel and required a lamplighter to light them at night and put them out in the mornings. The fish oil streetlight was later improved, leading to the invention of the gas lamp. However, that also required a lamplighter.

These professionals used long poles to light the lamps at night and extinguish the flames in the mornings. Lamplighters were also responsible for cleaning, maintaining, and repairing the lamps.

The profession started to die off in the 1870s when the first electric streetlamps appeared. Electric streetlights rendered the gas models obsolete in the US. However, the UK remained stuck with gas lamps for several more decades because electric lamps were controversial there at the time.

Critics complained that their lights were blinding, ugly, and too bright for the night. Others pointed out that electricity was expensive. The British Commercial Gas Association promoted gas lamps as the better alternative and deliberately sabotaged the adoption of the electric lamp.[8]

Electric lamps only took over in the 1930s. However, around 1,500 gas lamps still exist in London for historical reasons.

2 Log Driver

Long before trains and trucks came along, timber that had been cut down deep in forests was rolled into rivers and left to drift downstream. However, it sometimes got stuck in miles-long logjams that could involve tens of thousands of logs and required dynamite to break up. An entire industry sprang up around escorting the drifting logs downstream and clearing the logjams. The men were called log drivers.[9]

The job was dangerous and tedious. These men often followed the logs in special boats. Sometimes, they even jumped from log to log as the timber drifted downstream. Unlucky log drivers fell into the water and drowned while escorting the logs or trying to end logjams. Some were crushed to death after falling between the logs.

1 Leech Collector

The leech collector was a brief profession that sprang up and disappeared in the 1800s. At the time, bloodletting, the practice of draining blood from the body in the belief that it cured disease, was widespread, and physicians applied leeches to suck blood from their patients.

Leech collectors soon appeared to cater to the high demand for leeches. These jobs were often done by poor women who obtained leeches from ponds and other areas where the creatures were plentiful. The collectors used their legs (the preferred, cheaper method) or those of old horses as bait to attract the leeches.[10]

The women allowed the leeches to suck their blood for around 20 minutes before pulling them off. This was because a full leech was easier to detach than a hungry one. Nevertheless, this often caused injuries that bled for hours and resulted in significant blood loss. But the bleeding attracted more leeches, which was good for business.

The profession started to die out after leeches became scarce. Around the same time, doctors started to doubt that bloodletting really worked. Medical advancements soon proved that the procedure was not only ineffective but actually dangerous. Bloodletting became history, and the leech collectors followed. The real winners were the leeches, which were saved from extinction.

]]>
10 Ways Modern Technology Is Destroying Natural Selection https://listorati.com/10-ways-modern-technology-is-destroying-natural-selection/ https://listorati.com/10-ways-modern-technology-is-destroying-natural-selection/#respond Tue, 02 Jul 2024 13:15:58 +0000 https://listorati.com/10-ways-modern-technology-is-destroying-natural-selection/

Charles Darwin used his theory of natural selection to explain evolution: organisms that are better adapted to their environment survive and reproduce more successfully than their competitors. The survivors pass their most favorable traits to their offspring, which continue the cycle.

However, technology is already interfering with natural selection—at least in humans. Almost every tech item out there today—from our smartphones to the massive advances in medicine—is altering our lives faster than natural selection can.

People with unfavorable conditions are surviving and passing these traits to their offspring. At the same time, others are developing new health challenges caused by overreliance on technology.

10 Cesarean Sections Make Women’s Hips Narrower

Cesarean sections are leaving women with smaller pelvises. Centuries ago, women with small pelvises died during childbirth along with their children, who could possess the genes for small pelvises.

However, these women survive these days as C-sections have become mainstream. They can also give birth to children who carry the genes for narrow pelvises, including daughters who will pass the trait to their own offspring. Studies estimate that 36 of every 1,000 babies born today are too large to fit through their mother's birth canal, up from just 30 of every 1,000 in the 1960s.[1]

At this point, some would wonder why natural selection did not simply leave all women with large pelvises. That is because the human body evolved toward a compromise, favoring smaller babies that could pass through narrower pelvises over larger babies that would require wider ones.

Interestingly, C-sections are slowly changing this. Babies are becoming larger despite their mothers’ smaller pelvises. This means that C-sections will become more common in the future.

9 Mobile Phones Are Causing Horns To Grow In Our Skulls

We frequently bend our necks downward to interact with our smartphones. This is causing the development of a bony, hornlike structure at the lower end of the back of our skulls. Scientists call these little horns “external occipital protuberances.”

The horns are growing because the bent head delivers severe pressure at the point where the neck muscles meet the skull. The skull responds by elongating the bone at its rear tip, causing the extension. People with an external occipital protuberance can often feel it with their fingers. You may even be able to see it on a bald person.

An external occipital protuberance may appear no matter what we have in our hands or right in front of us. The only condition is that we bend our heads frequently, which is what smartphones cause us to do. Books do, too, but not as frequently as smartphones. Besides, not everyone reads books.[2]

8 Search Engines Are Making Us Forgetful

Imagine you were asked a random question, such as when Martin Van Buren became president of the United States. What do you do: recall the answer without batting an eye, or fire up a search engine? Most people will use a search engine because they probably do not remember the date offhand. Some do not even know he was a former US president.

This is what researchers call the “Google effect,” the likelihood of forgetting information you could quickly search for on the Internet. The condition was revealed in a 2011 study by researchers Betsy Sparrow, Jenny Liu, and Daniel Wegner.[3]

The researchers found that people often considered checking the Internet whenever they were asked questions they could not answer. They were also likelier to forget information if they knew it would be available somewhere—even if it was not the Internet. An example would be your spouse’s phone number stored on your phone.

The Google effect happens because we often remember significant information and forget unimportant facts. However, we can also forget the important info if we can access it somewhere. As to our original question, no need to Google it—Martin Van Buren became president in 1837.

7 Farming Made Our Jaws Smaller

Early hunter-gatherers had large faces with prominent jaws and teeth. However, all these started to disappear when we dumped the hunter-gatherer lifestyle for farming 12,000 years ago. Today, we are left with small jaws without enough space for our teeth.

Hunter-gatherers had large jaws because they did a lot of chewing. They ate uncooked meat and plants that were often tough and required lots of jaw strength to cut and chew. This made their jaws stronger. However, our jaws became weaker as we switched to farming softer crops that did not require superior jaw strength to chew. Our jaws further weakened as we switched to cooking our meals.[4]

The effects of our newfound farming lifestyle did not stop there. The switch also made our bones lighter and less dense, especially around the joints. However, this was caused not by the softer food but by the less strenuous lifestyle of farmers, who did not need to stalk, chase, and kill prey like the hunter-gatherers.

6 Processed Foods Are Changing Children’s Faces

The food eaten by children often determines the strength and shape of their faces—skulls, jaws, and all. However, the majority of children born nowadays have abnormal faces caused by the massive amounts of processed foods they start to consume right after birth.

This is because natural foods contain the nutrients required for proper facial development. Besides, as we mentioned earlier, natural foods force children to chew vigorously, making their jaws and skulls considerably stronger. Processed foods require far less chewing, leading to considerably weaker jaws.[5]

Today, our overdependence on processed foods has made our skulls 5–10 percent smaller than those of early humans from the Paleolithic Era. This problem has been observed in animals, too. Young animals reared on processed foods often end up with jaw problems that are similar to those of humans.

5 Social Media Is Destroying Our Lives

Social media has been linked to a myriad of problems, including depression, hyperactivity, anxiety, low self-esteem, and loss of concentration. This is worse in teenagers, who form the bulk of social media users. They often suffer from the fear of missing out (FOMO), which makes them check their social media accounts more than necessary.

However, the link between social media and these health problems is blurred as there is not enough research to establish such associations. Some critics even say that social media only appears to cause depression and loneliness because most users already have those characteristics and only turn to social media to meet people.

The critics say this even though a study has already found that social media does cause depression and loneliness. The study involved 143 University of Pennsylvania students split into two groups. One group reduced their time on social media while the other continued to use it normally.

The study revealed that the individuals who spent less time on social media enjoyed improved mental health and lower levels of depression and loneliness than people who used it more often.

Interestingly, FOMO and anxiety levels decreased in both groups even though researchers expected higher levels in people who used social media more frequently. Researchers believe that this was because the users from both groups became more aware of their use of social media while the study lasted.[6]

4 Smartphones Have Reduced Our Attention Span

Our brains have a very advanced concept of time. They are able to predict future events as we engage in our daily activities. For instance, the brain determines the best time to stretch your hand for a handshake—just so your hand will meet the other person’s hand at the right moment.

Our brains also apply this concept of time when we interact with our smartphones. For instance, if you check your phone every five minutes, your brain soon predicts this behavior and reminds you to check your phone at five-minute intervals.[7]

Before long, the habit interferes with your attention span, making you unable to concentrate on whatever you are doing. Instead, you are thinking about grabbing your phone to see the latest happenings on social media. Studies have found that phone addicts make less use of the brain regions responsible for focus. They also require greater efforts to concentrate on a task.

3 The Internet Is Making Us Unable To Cope Offline

In 2011, Professor David Levy of the University of Washington’s Information School coined the term “popcorn brain” to describe the effects of technology on our cognitive abilities (that is, our abilities to think and recall information). People with popcorn brain are so engrossed in their online lives that they become uninterested and unable to cope with their lives offline.

Levy came up with this concept after investigating how the Internet affects our lives offline. In his study, he discovered that we are always eager to read every new email and message and to visit websites in the hope of finding new information. Our brains soon become accustomed to this pattern and push us to seek out new information constantly.

This often leaves us with short attention spans, high expectations of finding new information, and the inability to live our normal lives offline. Interestingly, Levy's study supports earlier research revealing that students who spent 10 hours a day on the Internet had lower cognitive abilities than students who spent just two hours.[8]

2 Technology Is Causing Nearsightedness In Children

Myopia (nearsightedness) is the latest health problem linked to the infiltration of technology into our daily lives. The statistics are alarming in high-tech countries like China, where 90 percent of teenagers suffer from myopia. Sixty years ago, only 10–20 percent of Chinese teenagers were nearsighted.

Myopia levels are also rising in Europe, the United States, and South Korea. In Seoul, over 96 percent of 19-year-old males suffer from myopia. Estimates indicate that 2.5 billion people (one-third of the world’s population) will suffer from myopia by 2020.

Teenagers are developing myopia because they spend so much time indoors and away from natural sunlight, which is important for the perfect development of the eyes. This is why myopia levels are low among Australian teenagers, who spend lots of time outside. Researchers believe that this trend could be reversed by exposing children to three hours of sunlight every day.[9]

1 Smartphones Are Causing Insomnia

Smartphones always get a bad rap for interfering with sleep. Well! That is because they really do. Taking your smartphone to bed is the best way to end up with insomnia.

Smartphones cause insomnia because they are distracting. The sounds and vibrations of calls, notifications, and messages can stop people from sleeping or even wake them from sleep. People who take their phones to bed can also end up checking social media and more, causing them to sleep much later than they intended.

If that was not enough, smartphones, like almost every other tech gadget with a screen, emit blue light that the brain mistakes for daylight. This causes the brain to lower the secretion of melatonin—the hormone that tells our body it is time for bed. This is usually not a problem during the day but quickly becomes one when we are trying to get some sleep at night.[10]

]]>
10 Cutting-Edge Uses Of Laser Technology https://listorati.com/10-cutting-edge-uses-of-laser-technology/ https://listorati.com/10-cutting-edge-uses-of-laser-technology/#respond Tue, 07 May 2024 06:37:47 +0000 https://listorati.com/10-cutting-edge-uses-of-laser-technology/

First conceived by Gordon Gould during the 1950s, lasers are one of the most popular and widely used devices in modern society. The optical beams can be found in everything from weapons guidance systems to hair removal surgery. However, that was not always the case. Originally, Gould and his peers struggled to find practical applications for their new invention. In fact, one of the pioneers, Irnee D'Haenens, once described the device in jest as "a solution looking for a problem."[1]

Nowadays, the technology is practically ubiquitous, and lasers—or, to give them their full title, light amplification by stimulated emission of radiation—are a cutting-edge tool used in developing all manner of brilliant innovations. With them, we are able to cool atoms to a fraction of a degree, create incredibly advanced systems of data storage, and even detect elusive astronomical phenomena like gravitational waves. Sixty years since their invention, laser technology remains as relevant and exciting as ever.

10 Chirped Pulse Amplification

Chirped pulse amplification (CPA) is one of the most remarkable innovations in modern technology. The groundbreaking technique is used to produce high-intensity laser pulses without destroying the material through which the light is moving. Optical bursts are stretched in time to bring down the peak power and then amplified before being compressed, forming a pulse of light with phenomenal intensity.[2]

First developed in the mid-1980s, CPA has become commonplace in corrective eye surgery, in which high-intensity lasers are used to reshape the cornea. Other potential fields of application include quantum computing and data storage. In fact, scientists hope that the principles of CPA could be used to build computers that operate with unprecedented efficiency—up to 100,000 times faster than current models.

Although it is far from being realized to its full potential, the laser technique has several significant applications and has made a number of considerable scientific contributions. As such, two of its key developers, Donna Strickland and Gerard Mourou, were awarded the 2018 Nobel Prize in Physics. The decision was welcomed by large swaths of the scientific community; Strickland is the first female physicist to be made a Nobel Laureate since 1963, and only the third in the prize’s history.

9 Clearing Train Lines

It might not seem like a major issue, but wet leaves on the track wreak absolute chaos on the rail system. The regular pressure of passing trains tears and compresses the leaves, which, over time, forms a slippery coating. The treacherous debris reduces friction on the line, creating dangerous conditions for oncoming trains.

Typically, rail companies will attempt to blast the leaves from the line with jets of water or use sand to bolster the friction. However, both of these solutions have been known to damage the tracks, plus water and sand are both cumbersome materials to carry.

The company LaserThor has come up with an alternative solution to the leaf issue: Blast the leaves off using lasers. A 2-kilowatt Nd:YAG laser vaporizes organic material by heating it to an eye-watering 5,000 degrees Celsius (9,032 °F). In 2014, Dutch company Nederlandse Spoorwegen agreed to try the system on one of their DM-90 trains. As well as searing off leaves and other debris, the heat of the beam also dries the rails, preventing them from rusting.[3]

8 Laser Cooling


It might sound counterintuitive to use lasers to cool a substance down—after all, don’t they usually make objects hotter? However, in the mid-1980s, pioneering physicist Steven Chu demonstrated how laser beams can be used to cool atoms down to extremely low temperatures.

Particles in a gas exhibit frenzied behavior—darting about at high speeds with an abundance of energy. However, as the gas is cooled, the particles begin to lose their energy, and there is a marked drop in speed. In other words, slowing down the atoms causes the temperature of a gas to fall.

This is the essence of laser cooling. When an atom is moving toward a laser, it absorbs photons from the beam and begins to slow down. As they slow down, the decelerating particles will lose some of their energy and thus begin to drop in temperature.

The basic theory suggests that 20,000 photons would be required to bring the momentum of a sodium atom down to zero. This may seem difficult to achieve, but Chu has stated that, with the right level of tuning, lasers can induce around ten million absorptions every second. With laser cooling, atoms can be brought to a near-halt in milliseconds.[4]
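As a quick sanity check on those figures, the two numbers quoted above are all you need to see why "milliseconds" is the right order of magnitude. The following sketch simply divides photons required by absorptions per second (both values taken from the text; the variable names are our own):

```python
# Back-of-the-envelope check of the laser-cooling figures quoted above.
photons_needed = 20_000               # photons to cancel a sodium atom's momentum
absorptions_per_second = 10_000_000   # absorptions a tuned laser can induce per second

stopping_time_s = photons_needed / absorptions_per_second
print(f"Stopping time: {stopping_time_s * 1000:.0f} ms")  # prints: Stopping time: 2 ms
```

Two milliseconds to halt an atom moving at hundreds of meters per second, which is consistent with the claim that laser cooling brings atoms to a near-halt in milliseconds.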

Over the past three decades, the technique has developed to an astounding extent. Physicists are now able to cool atoms to a billionth of a degree above absolute zero.

7 Manipulating Rodents


In recent years, scientists have developed multiple techniques to alter the behavior of rodents. A handful of these involve lasers.

Laser technology has enabled scientists to reverse alcoholism in rats. A group of experts from Scripps Research, a medical institute in San Diego, California, managed to reduce the creatures’ dependence on alcohol. The team’s paper, published in March 2019, describes how they implanted fiber optics into the rats’ brains and targeted specific neurons with a laser beam. Scripps professor Olivier George has described the technique as being as quick and effective as a “flip of a switch.”[5]

Although their work is nothing short of remarkable, Scripps Research is not the first institute to use lasers to manipulate small, furry creatures. Two years earlier, researchers from Yale University developed a similar technique to activate predatory instincts in mice. By shining blue light at neurons in the rodents' temporal lobes, researchers were able to stimulate biting, grabbing, and other killer behaviors.

6 Holographic Data Storage

Since the launch of the compact disc in the 1980s, laser technology has played an essential role in the recording, storage, and retrieval of data. However, this technology has limitations. In all optical storage techniques that are currently available, data is written onto the surface of a disc. This means that the total amount of data that can be stored in a device like a DVD is constrained by its surface area.

To overcome the issue, scientists are looking to a new technique to improve the storage space of optical devices: holographic data storage. These devices would be able to hold masses of information in the form of three-dimensional holograms. Not only does this significantly raise the amount of data that can be held in a given area, but it is also said to be a more efficient and reliable method.

However, there is one major setback preventing you from picking up the latest blockbuster or hit album in holographic form: This technology only exists as a prototype. In recent years, developers have tried to produce a commercial technique for holographic storage, but none have taken off as yet.[6]

The most promising development comes from the Northeast Normal University in China’s Jilin province. Researchers at the university have developed a semiconductor film made from titanium dioxide and silver nanoparticles. To write data onto the film, a laser system alters the charge of the silver nanoparticles, and the particles are affected differently, depending on the wavelength of the light.

5 Contact Lenses

Although it cannot hold a candle to Superman’s laser vision, scientists have developed a sophisticated contact lens that shines laser light out of your eyes. This impressive feat is possible due to the creation of an ultrathin film—only a thousandth of a millimeter thick—which can be attached onto or embedded into a contact lens.

The technology was revealed in May 2018 by a team of physicists from the University of St Andrews, who told reporters that it could be used to create wearable security tags. When tested on a cow’s eyeball, the lens generated a laser beam with around a nanowatt of power.[7]

4 Military Drone Defense

They might sound like something from the imagination of a Star Trek fan, but laser weapons are the future of military technology. In the past few months, the US Marine Corps has begun testing its Compact Laser Weapons System (CLaWS), a vehicle-mounted system designed to bring down enemy drones (aka UAVs). Compared to traditional firepower, the laser weapon is not only better value for money, but it also makes it far more difficult for drones to track and target ground troops.

As tensions escalate in the Middle East, Turkish forces are also using weaponized lasers to display their military innovation and prowess. In fact, they have become the first country to bring down an opposition vehicle using a ground-based laser system in combat. In August 2019, Turkey deployed their laser weaponry to attack and destroy an armed UAE drone circling over the Libyan district Misurata.[8]

3 Detecting Gravitational Waves


When researchers from LIGO announced that they had detected gravitational waves for the first time in 2015, physicists and astronomers around the world held their breath. Until that point, gravitational waves had proved incredibly elusive. Although they were first predicted in 1916, it took almost a century of technological innovation before scientists were actually able to catch a glimpse. That astonishing feat—the first direct observation—was only possible due to laser technology.

Gravitational waves are cosmic ripples echoing across the universe at the speed of light. As they travel through space, the waves warp and distort their surroundings in a way that can be measured using sophisticated, highly sensitive laser detectors.[9] LIGO, the Laser Interferometer Gravitational-Wave Observatory, uses lasers and mirrors to detect the infinitesimal changes that occur whenever a wave passes through.

2 Bioprinting Stem Cells

Bioprinting is an emerging and highly sophisticated process used by medical experts to manufacture synthetic organs and tissues. Typically, these artificial replicas are created by depositing droplets of bioink layer by layer to build up working 3-D structures.

In 2018, a team of researchers from Laser Zentrum Hannover developed a technique for bioprinting a type of stem cell known as hiPSCs—human-induced pluripotent stem cells. These cells are incredibly versatile, able to change into any other type of cell in the human body. As such, they have excellent potential as a material for building replacement organs or personalized drug testing systems.

Prior to bioprinting, the hiPSCs are suspended in bioink and layered onto the surface of a glass slide. A second glass slide is positioned directly beneath the first. Droplets of biomaterial are then ejected from the higher glass slide onto the lower using brief pulses of laser light.

So far, the preliminary tests have proven successful. Almost all of the cells survived the laser printing process and retained their properties, too.[10]

1 Optical Tweezers

Earlier in the article, we touched on Donna Strickland and Gerard Mourou, two of the physicists who were awarded the Nobel Prize in 2018 for their pioneering work on chirped pulse amplification. That year, a third optical trailblazer also received the prize: Arthur Ashkin, the American physicist renowned for inventing optical tweezers.

Optical tweezers are remarkable instruments with a broad array of biological applications, ranging from investigating the movement of live bacteria to examining the properties of DNA. Ashkin’s technique uses a highly focused infrared laser beam to suspend microscopic objects in midair, capturing them in the center of an optical trap. As the object interacts with photons from the laser, it is acted upon by the scattering and gradient forces that keep it locked in position.[11]

]]>
Top 10 Signs We’re Embracing Cyberpunk Technology https://listorati.com/top-10-signs-were-embracing-cyberpunk-technology/ https://listorati.com/top-10-signs-were-embracing-cyberpunk-technology/#respond Sat, 04 Nov 2023 17:27:32 +0000 https://listorati.com/top-10-signs-were-embracing-cyberpunk-technology/

Cyberpunk dystopias have been written about in sci-fi books since the ‘70s. The genre often paints a sort of futuristic take on a “Tale of Two Cities,” where society is divided between mega-rich conglomerates and tech-savvy slum dwellers. Citizens routinely augment their own bodies with cybernetic enhancements. The cream of high society uses its technological and militaristic superiority to crush dissent. And legions of transhuman hackers fight back against their corporate overlords.

Many of the horrors spawned from Philip K. Dick’s bustling, neon-lit streets are yet to come to fruition. But while the idea of android replicants and all-seeing pre-cogs may seem like a distant future, cyberpunk tech is gradually starting to take root. So let’s take a look at just some of the ways that society is embracing cyberpunk tech, for better or worse.

10 Beaming Ads from Space

Many companies are already leveraging a marketing tool known as “space advertising.” Since NASA’s decision to allow private flights to the International Space Station, PR opportunists have shuttled all manner of products to the orbiting research station, including Adidas soccer balls, bottles of wine, and shower heads. The Russians have even launched rockets plastered with ads for toothpastes, diapers, and TV networks. But now they want more.

Imagine the night sky filled with a giant, luminescent brand advertisement – something billions of people will see (and cannot avoid). That’s the ambition of a Russian company called StartRocket. The startup aims to put a series of satellites in low Earth orbit, each fitted with a reflective solar sail. The orbital display is then generated when the Sun’s rays bounce off the satellite array.

In 2019, the Russian arm of PepsiCo said it had struck a deal with StartRocket to devise a space billboard for one of its energy drinks, Adrenaline Rush. The idea was to launch a “campaign against stereotypes and unjustified prejudices against gamers.” The plans prompted an immediate backlash, however, and Pepsi was forced to respond. The drinks company said it had worked with StartRocket to test “stratosphere advertisements” using one of its logos but had no further plans to continue the project.[1]

9 Signing Virtual Influencers

Online influencers have one goal: advertise corporate swag for money. They use their lofty follower counts on Instagram and Snapchat to sign lucrative deals with big brands. The more followers, the better the pay. But Silicon Valley is starting to realize it can just "design" its own influencers in a digital laboratory. In 2016, a Los Angeles-based company, Brud, concocted a teen influencer named Lil Miquela. The CGI character immediately gained internet fame, attracting hordes of die-hard fans.

Lil Miquela's Instagram account is where the money is made. With nearly three million followers, the 19-year-old character has enabled her handlers to strike deals with a number of big names, including Calvin Klein, Prada, and Samsung. In 2019, Lil Miquela featured in a commercial for Calvin Klein, in which she kisses American model Bella Hadid. She was even signed by the Creative Artists Agency (CAA) in 2020 – the agency's first virtual client.

Miquela claims to fight for social change, tackling issues on homelessness, racial equality, and LGBT rights. In 2018, Brud orchestrated an internet feud between Miquela and a digital pro-Trump influencer named Bermuda. The company made it seem as though Bermuda had hacked Miquela’s account and deleted all of her Instagram pictures. These types of stunts have paid dividends. To date, it is estimated that Brud has bagged a cool $6 million worth of investment.[2]

8 Electronic Tattoos

A number of researchers are hoping to use electronic tattoos (a.k.a. tech tats) as a means of gaining a foothold in the diagnostics industry. In April 2018, engineers at the University of Minnesota released a paper describing a method of applying 3D-printed electronics to a person’s skin. The team hopes the technology will deliver biomedical devices that can monitor a patient’s vital signs, including blood pressure, heart rate, and temperature.

According to Professor Michael McAlpine, one of the paper’s authors, the tech tats have military applications, too. Given that the circuits can be constructed using a basic $400 printer, McAlpine predicts the tattoos will become commonplace in the field. “You can start to think about printing life-saving devices on the body, like a solar panel on the wrist, or a chemical or biological warfare sensor on an arm,” he explained.

Meanwhile, a private company called Chaotic Moon aims to put the tech in the hands of the consumer. In 2015, the group’s hardware specialist, Eric Schneider, explained how his company’s tattoos would revolutionize the industry. “Rather than going to the doctor once a year to get a physical, this tech tattoo can be something you just put on your body once a year. And it monitors everything that they would do in a physical and sends that to your doctor.”[3]

7 Implanting Workers with Microchips

BioTeq, as part of its professional microchip service, has been implanting British workers with radio-frequency identification (RFID) chips for years now. The tiny devices, which sit underneath the skin near the thumb, allow workers to open electronic doors, pay for cafeteria food, and log in to company computers – all with a wave of the hand. They can also hold small amounts of data, including important medical information.

A Wisconsin-based tech company, Three Square Market, has also chipped around half of its employees. Those who accepted the implants were awarded “I Got Chipped” t-shirts. The company president says he got the idea after a business trip to Sweden. Today, thousands of Swedes are volunteering to undergo the procedure. One of the country’s national railways allows passengers to pay for train tickets using the chip implants. Businesses and gyms use Near Field Communication (NFC) scanners to grant building access to tech-savvy citizens. And a number of vending machine companies have started rolling out the contactless technology.

Consumers are even buying microchips online for the purposes of self-chipping – a practice known as biohacking. A company in Seattle, Dangerous Things, currently sells microchips to people across the globe. The latest model allows customers to select from a range of LED colors, with the device blinking into life during the scanning process.[4]

6 Brain-Computer Interfacing

Back in 2006, scientists at the University of Pittsburgh hit the headlines after hooking up a series of electrodes to the brain of a rhesus monkey. The electrodes fed impulses from the creature’s motor cortex to a nearby robotic arm. The monkey was then able to feed itself pieces of banana by controlling the arm with its thoughts.

Since that time, brain-computer interfaces (BCIs) have gained commercial appeal. In December 2020, NextMind started shipping a headset that connects its customers’ brains to their digital gadgets. The headset interprets different types of commands from the electrical activity of the brain and beams them wirelessly to controllable devices. The user can remotely play video games, control light switches, and change the TV channel using nothing but the power of their own mind.

Meanwhile, Elon Musk has ploughed roughly $100 million into a BCI startup called Neuralink. In the future, the company aims to embed coin-sized implants in the human brain, allowing users to become one with the digital world. Musk recently demonstrated a very rudimentary form of Neuralink’s tech in a pig called Gertrude. The tech has a range of possible uses, from controlling neuroprosthetic limbs to uploading memories. “Everything that’s encoded in memory, you could upload,” Musk explained. “You could basically store your memories as a backup and restore the memories. And ultimately you could potentially download them into a new body or into a robot body. The future’s gonna be weird.”[5]


5 3D Printing Chicken Nuggets

Aside from the obvious ethical concerns, intensive animal farming may become increasingly untenable as the population continues to grow at an exponential rate. One solution comes in the form of 3D-printed food. In 2020, KFC Russia announced that it was partnering with 3D Bioprinting Solutions to cook up the “meat of the future.”

Using a combination of chicken cells and plant matter, the fast food chain aims to reproduce the same taste and texture of its fried chicken. Biomeat, the company says, could reduce its environmental footprint, eliminate the need for additives in food, and lead to better treatment of animals. “Crafted meat products are the next step in the development of our ‘restaurant of the future’ concept,” explained KFC Russia’s General Manager, Raisa Polyakova.

Production of bioengineered meat starts with extracting stem cells from the desired animal. After a simple biopsy is conducted, the cells are multiplied and funneled into a bio-cartridge. The 3D printer then uses the live cells to craft the meat, layer by layer.

Several Israeli and Spanish companies are also experimenting with 3D-printed meat products, including Redefine Meat. Redefine is using plant-derived proteins to mimic the constituents of your average steak, right down to the blood, fat, and muscle. The company is currently working with restaurant owners to bring its “alt-steaks” to diners.[6]

4 The Rise of the Megacorp


A cyberpunk future envisions a time when big corporations become too big to fail. These powerful, ultra-rich behemoths assume monopolistic control over markets, dictate government policy, and often possess their own private foot soldiers. While today’s companies haven’t quite reached megacorp status, some worrying trends are starting to emerge.

Much of the post-manufacturing West has developed an intense corporate culture that demeans a particular subset of low-paid, overworked employees. Amazon is perhaps a good place to start. Amazon warehouse operatives have allegedly urinated in bottles to avoid falling behind on their daily quotas. Workers routinely complain about unsafe working conditions. And depression is rampant. The company has even patented a wearable wristband that tracks a worker’s progress and, via a series of vibrations, warns them when they are slacking off.

Over the decades, other major players have outsourced much of their manufacturing bases to places like China, Malaysia, and Taiwan. Lower wages, cheaper materials, and lax regulations all make for a better bottom line. This also means that much of the suffering is outsourced. Take Foxconn, for example. Since 2010, the electronics manufacturer has fitted a series of “suicide nets” around a number of its buildings and made workers sign agreements promising not to take their own lives.

Those companies that choose to operate in America, in whatever capacity, have their own sophisticated lobbying machines. They fund think-tanks, form connections with Congress, and point politicians in the “correct” regulatory direction. In November, a report surfaced that suggested Apple, Coca-Cola, and Nike had spent huge sums of money lobbying against the Uyghur Forced Labor Prevention Act. The bill seeks to ban importation of goods made using forced labor in the Xinjiang region of China. If passed, it will likely affect the supply chains of some of America’s biggest companies.[7]

3 Deploying Police Robo-Dogs

It’s no secret that Boston Dynamics has been working on perfecting its robotic tech. One of the engineering company’s most renowned products is Spot – a robotic dog that has a 360-degree field of view. In July 2019, the Massachusetts State Police revealed that it had begun testing the use of Spot in the field. The Bomb Squad leased the $75,000 device for a period of 90 days, reserving it for “the purpose of evaluating the robot’s capabilities in law enforcement applications, particularly remote inspection of potentially dangerous environments which may contain suspects and ordinances.”

In 2020, the New York Police Department deployed Spot in response to a shooting. The gunman, following an altercation over a parking ticket, missed his intended target and accidentally shot a woman in the head. Spot was used to assess the situation after the man hid in a nearby building. In Singapore, the quadruped was recently spotted broadcasting a pre-recorded message about social distancing rules. And the U.S. Air Force is testing similar machines for use in monitoring its bases.[8]

2 Gene-Edited Babies & De-Extinction

In 2018, He Jiankui became infamous for his role in producing the very first genome-edited babies. The Chinese scientist used CRISPR gene editing techniques in an attempt to create babies that are immune to HIV. Two of the babies, Lulu and Nana, were born in October 2018, prompting considerable backlash from the scientific community.

The Chinese authorities sentenced He Jiankui to three years in prison for his role in the illegal experiments. But in December 2020, the U.S. Director of National Intelligence John Ratcliffe claimed China had already carried out “human testing on members of the People’s Liberation Army in hope of developing soldiers with biologically enhanced capabilities.” Beijing responded by accusing Ratcliffe of peddling “fake news.”

Scientists around the world have already made a number of breakthroughs with CRISPR, from disease-resistant chickens to mushrooms that remain fresh for longer. Then there is CRISPR’s use in “de-extinction” initiatives. One Harvard geneticist, George Church, is currently trying to use the technology to make a hybrid between a woolly mammoth and an Asian elephant. By giving the Asian elephant some of the mammoth’s genes – those coding for its shaggy coat and thick layer of fat – Church hopes the creature will prosper in the unforgiving temperatures of Siberia. “[I]t will be able to live at -40 degrees comfortably… It will look and behave like a mammoth,” explained Church.[9]

1 Flying Cars and Air Taxis

A number of flying cars are expected to hit the market in the coming years, including models from Terrafugia, Aeromobil, and PAL-V. The PAL-V Liberty is a gyroplane that can reach speeds of up to 112 mph (both flying and driving). Starting at an eye-watering $400,000, the Liberty’s base model is already available for preorder. The first batch of flying cars is slated to ship in 2021.

The tech industry has come up with a more practical solution: the Vertical Take-Off and Landing (VTOL) air taxi. The German aircraft maker Volocopter plans to have an air taxi service running by 2023. Tickets for the company’s first round of flights have already sold out. A partnership between Toyota and SkyDrive also holds promise. The Japanese companies recently performed a flight test using their SD-03 prototype vehicle, with plans for a commercial launch in 2023. Likewise, Uber has invested heavily in the air taxi business, partnering with Hyundai and Joby Aviation to bring about its own prototypes.

Although the aviation industry is currently working on regulations for VTOL flights, a number of challenges remain. Coordinating the flight patterns of congested airways will eventually require automated systems, along with designated airlanes and landing zones. But with the industry expected to reach a value of $1.5 trillion by 2040, it might be worth the hassle.[10]


Top 10 Astounding Uses For Genetic Technology https://listorati.com/top-10-astounding-uses-for-genetic-technology/ https://listorati.com/top-10-astounding-uses-for-genetic-technology/#respond Fri, 15 Sep 2023 08:26:29 +0000 https://listorati.com/top-10-astounding-uses-for-genetic-technology/

Genetic technology is changing the world as we know it. As you read this, scientists are working on fascinating ways to modify DNA. Recently, a form of advanced gene-editing technology known as CRISPR has opened up new avenues of genetic experimentation. CRISPR is held in such high regard that, in 2020, its creators were awarded the Nobel Prize in Chemistry. Their new tool allows researchers to alter DNA with unprecedented precision. Already, it is helping produce new forms of cancer therapy. Experts reckon it could one day be used to cure genetic conditions.

Of course, gene editing is a controversial practice that raises many ethical concerns. Bioscientists have been accused of “playing god” with the genome. But genetic technology has also inspired all manner of mind-boggling scientific innovation. Here are just ten of its most astounding uses.


10 Modified goats produce cancer drugs in their milk


Scientists in New Zealand have created genetically modified goats that produce cancer drugs in their milk. The goats have been specifically altered to create cetuximab medication, which is used to treat cancer in the colon and lungs. Currently, the drug can cost as much as $13,000 a month without insurance. Scientists hope their new method of production will help lower the price, thereby making it more accessible.

Manufacturing cetuximab is an expensive process. Its elaborate chemical structure means producers have to rely on proteins from inside mouse cells to cultivate the drug. But these genetically modified goats have offered the pharmaceutical industry a way to mass-produce cetuximab.

“It’s a lot more economic to make cetuximab in animals because their mammary glands can produce large amounts of proteins,” explained Götz Laible, the researcher in charge of the project at New Zealand’s AgResearch institute.

9 Scientists store data inside living DNA


Data storage is a difficult business. Day in, day out, we rely on electronic devices like hard disks, optical drives, and memory sticks to store vast amounts of information. But perhaps other materials could be better suited to data storage. Now, scientists in New York have come up with a new method that uses gene editing to store data in the DNA of live bacteria.

In 2021, researchers at Columbia University demonstrated that live E. coli cells can store up to 72 bits of data. At its core, a data file consists of a long line of ones and zeros. The scientists were able to encode ones and zeros into the E. coli DNA by inserting specific genes into the cell. They were even able to write the simple message “Hello world!” into the DNA of an E. coli cell, then decode it by sequencing the DNA.

DNA is surprisingly well-suited to data storage. Biological molecules can store a massive wealth of information. Scientists estimate that if a strand of DNA were the size of a grain of salt, it could hold the equivalent of ten feature-length movies. What’s more, the technology needed to read and write DNA is becoming increasingly powerful with time. That said, DNA data storage is still in its infancy, which means it is unlikely to take off any time soon.
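The basic arithmetic behind DNA storage is easy to see: with four nucleotides to choose from, each base can represent two bits. The toy Python sketch below is our own illustration of that principle (it is not the Columbia team’s actual CRISPR-based scheme), mapping bit pairs to bases and back:

```python
# Toy illustration of DNA data storage: two bits per nucleotide.
# Assumed mapping (00->A, 01->C, 10->G, 11->T) is for demonstration only.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(message: str) -> str:
    """Turn a text message into a synthetic DNA sequence."""
    bits = "".join(f"{byte:08b}" for byte in message.encode("ascii"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    """Recover the original text from the DNA sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

dna = encode("Hello world!")
print(dna)          # 12 ASCII characters -> 96 bits -> 48 bases
print(decode(dna))  # round-trips back to "Hello world!"
```

At two bits per base, the 12-character message “Hello world!” becomes 96 bits, or a 48-base sequence – which gives a sense of why a tiny amount of DNA can hold so much data.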

8 Increasing the lifespan of dying mice


Scientists at Harvard University have extended the lifespan of dying mice using gene editing, more than doubling their life expectancy.

As part of the study, led by Professor David Liu, the mice were given progeria—a rare disease that causes premature aging in children. On average, children with progeria live to the age of fourteen. The condition is caused by a rare genetic mutation and cannot be treated using regular gene therapy. Instead, the Harvard team is developing a way to change the fundamental coding of the progeria sufferer’s DNA.

This technique was trialed on the terminally ill mice, and it significantly improved the length of their lives. The rodents, which were expected to survive 215 days, went on to live for a median of 510 days. Liu and his team hope to use these findings to develop an effective treatment for progeria and similar genetic conditions.

7 Gene therapy in one eye enhances vision in both


Scientists have discovered a form of gene therapy for sight loss that, when injected into one eye, improves vision in both. The genes travel from the injected eye into the untreated eye, although eye specialists are unsure about the implications of this discovery.

The scientists were attempting to treat a condition known as Leber’s hereditary optic neuropathy (LHON), a form of progressive sight loss mostly found in young men. This rare type of blindness stems from a genetic mutation that attacks and destroys cells in the eye’s retina.

As part of a recent trial, 37 patients with LHON received gene therapy injections into one of their eyes. But remarkably, after two years, 29 of the patients reported improvement in vision in both eyes. At first, the scientists were taken aback by these results until they discovered that the gene therapy viruses were sneaking out of one eye and into the other.

Repeating the experiment on macaque monkeys, they found the genes were traveling down the optic nerve of one eye, crossing over to the other optic nerve, then traveling into the other eye.

6 Harmless bull with no horns


Researchers have found a way to create hornless bulls by editing the DNA of the father. This new method provides farmers with a painless alternative to current dehorning techniques. Currently, cattle need to have their horns physically removed. This is a lengthy and difficult process that can be extremely painful for the bull. But it does need to be done. Not only are the hornless bulls less likely to harm other animals, but they are also easier to transport and take up less room at the feeding trough.

In 2016, two baby bulls were born with a genetic mutation that means they will never grow horns. This was achieved by introducing a short string of DNA into the father’s cells. After analyzing DNA from all three bulls, scientists confirmed that the genetic alterations had been passed down to the young cattle without causing any accidental side effects.

“We’ve demonstrated that healthy hornless calves with only the intended edit can be produced, and we provided data to help inform the process for evaluating genome-edited animals,” explained Alison Van Eenennaam, an expert in animal science working at the University of California, Davis.

5 Cows are made more resilient to heat stress


As temperatures rise, cows begin to feel the strain. Bovines are particularly susceptible to heat stress. If left in the blazing sun for too long, cows start to lose their appetite, produce less milk, and are less likely to conceive. As you can imagine, the knock-on effects for farmers can be terrible. Each year, heat stress is said to cost the US dairy industry $900 million. In poorer countries, where farmers may only own a few livestock, it can be the ruin of many.

But now scientists in New Zealand have found a potential solution to this cattle-based conundrum. They are using gene-editing techniques to change the color of the cows’ coats. By altering their pigmentation genes, the researchers managed to lighten the dark, heat-absorbent hair of common dairy cows. Holstein-Friesian cattle are usually white with jet black patches, but the genetically altered calves were born covered in light silver-colored markings.

The researchers hope to refine their research using DNA from tropical cattle that are more resilient to high temperatures.

4 Overweight mice lose body fat


Gene editing could one day be used to treat obesity, say scientists at Harvard University. In August 2020, the researchers revealed a new method for combating weight gain in mice: transforming unhealthy white fat cells into energy-busting brown fat cells using CRISPR gene editing.

Stodgy white fat cells are full of unhealthy lipids that build up inside the body. An excess of white fat can lead to diabetes. But brown fat cells are much healthier. They break down some fat to create energy and store the rest in a smaller space.

The Harvard team was able to help the mice lose weight by altering their DNA. The scientists genetically altered the white fat, giving it the characteristics of healthy brown fat. The experiment focused on UCP1, a protein found in brown fat that turns chemical energy into heat.

Over the twelve-week study, the mice with white fat cells piled on the pounds, whereas the gene-edited mice found it much more difficult. There is even a suggestion that the gene-editing process helped the mice stave off diabetes.

Scientists predict that eventually this method could be developed into a treatment for obesity, although human trials are still a long way off.

3 Scientists cure mice of hearing loss


In 2019, researchers from Harvard Medical School and Boston Children’s Hospital announced a novel treatment for hearing loss in mice that could one day be used on humans.

The study focused on so-called Beethoven mice, which suffer from a genetic mutation that also affects humans, causing progressive hearing loss and eventual deafness. The name is a reference to the German composer, who began to lose his hearing when he was in his twenties.

The hearing loss that the mice experience is caused by a minor alteration in their DNA. Using sophisticated biological technology, scientists can pinpoint the defective gene without harming any of the remaining healthy genes. This means they can cure the Beethoven mice of their deafness without causing any unwanted side effects.

The scientists do warn people not to get their hopes up too quickly. There are still years of research to be done before this therapy can be tried on humans. “We believe our work opens the door toward a hyper-targeted way to treat an array of genetic disorders that arise from one defective copy of a gene,” explained Harvard’s Jeffrey Holt. “This truly is precision medicine.”

2 Killer moths help New York with pest problem


In January 2020, New York State officials released swarms of genetically modified (GM) male moths to curb the number of pests. Young female diamondback moths are capable of inflicting a massive amount of harm to farmers’ crops. Despite their short lifespan, the larvae consume a huge amount of brassica plants, including kale, cabbage, and oilseed rape. The moths and their rapacious diets are said to cause $5 billion of damage each year.

Typically a pest like this would be dealt with using pesticides, but the diamondback moth is remarkably quick at developing resistance. So Oxitec, a biotechnology company based in the UK, has developed a fleet of killer GM moths to wipe out the young pests.

Scientists added a gene to the male moths that causes newly-hatched larvae to drop dead, but only affects the females. This means the harmful young females will perish before they can do any damage. The young males, on the other hand, will go on to mate with other wild females, passing on the killer gene to their larvae. This should continue for a few generations, after which Oxitec says the lethal gene will fade away.

1 Gene editing leads the fight against superbugs


Antibiotic-resistant superbugs are a global crisis waiting to happen. Destructive pathogens that, only a few decades ago, were easy to treat with penicillin are building up immunity to antibiotics. Unless scientists can create new antibiotics quickly, we could be facing 10 million deaths a year by 2050 due to these hostile germs.

But there is hope on the horizon. Researchers from the University of Manchester have uncovered a new way to produce antibiotics using CRISPR gene editing. By combining several cutting-edge biological techniques, the team produced an unusual type of antibiotic known as malonomycin. This novel technique could help scientists develop new forms of antibiotic medication—drugs that are better suited to fighting off highly resistant superbugs.

“We are now optimistic that our findings might lead to the discovery of new antibiotics,” explained the leader of the study Jason Micklefield, “and may also provide new ways of making antibiotics which are urgently needed to combat emerging drug-resistant pathogens.”


10 Innovative Pieces of Technology That Failed Miserably https://listorati.com/10-innovative-pieces-of-technology-that-failed-miserably/ https://listorati.com/10-innovative-pieces-of-technology-that-failed-miserably/#respond Sun, 30 Jul 2023 18:25:48 +0000 https://listorati.com/10-innovative-pieces-of-technology-that-failed-miserably/

Ever since Zeus invented technology (that’s what happened, right?) mankind has been constantly inventing. Some creations – like the iPod or the electric nose hair trimmer – become so ingrained into our everyday lives we can’t imagine a world without them. Others, meanwhile, are… these. Some inventions, no matter how brilliant, are mostly remembered for failing incredibly hard.

Intellivision

The Mattel Intellivision was a home game console released in 1979. Development began less than a year after the introduction of its main competitor / arch-nemesis, the Atari 2600. It had graphics and sound capabilities that put the 2600 to shame, but that was only the beginning of its innovations. Intellivision was the first 16-bit gaming system, the first to feature voice synthesis, and easily the first to feature downloadable games via cable.

But poor marketing, along with a poorly designed, non-ergonomic 16-direction control pad, led to Mattel selling only 3 million units over its lifetime. Not bad, you say? The underpowered Atari machine sold ten times that number. In 1983, the video game market crashed – only to be revived by the Nintendo NES, a system with all of the Intellivision’s innovations but none of its shortcomings.

Laserdisc

The first digital home video format was introduced in 1978, known as Laserdisc or “DiscoVision” because this was the ’70s. Brought to market just two years after the introduction of videocassettes, this high-capacity digital storage format meant video and audio quality far exceeding any VCR. Compact Discs, still four years down the road, were based on this technology. Laserdiscs boasted extremely sharp images – the likes of which had not yet been seen on home video – as well as digital surround sound.

Unfortunately, the discs were heavy and easily damaged, and the players quite loud compared to VCRs. There was no recording capability, and the discs and their players were super expensive. VCRs reigned supreme until the advent of the DVD, which was a kind of mini Laserdisc.

Cinerama

The very first widescreen projection format, Cinerama, made IMAX look like a wussy. Showing a Cinerama film meant running three synchronized 35mm projectors simultaneously, throwing the image onto a gigantic curved screen. While technically challenging to present and requiring a skilled projectionist (or three), the results were visually astounding and far ahead of any other method of the time.

Did we say “technically challenging”? We meant “damn near impossible.” Projecting three films with perfect synchronization is every bit as hard as it sounds, and there was no means of automation. The projectionists just had to be that good. Plus, very few theaters were willing to make the necessary and expensive modifications. As a result, only a couple of dozen films ever used the format.

Betamax

As the other home videocassette format, Beta is synonymous with “also-ran.” Sony’s format offered much smaller, more durable cassettes and better resolution than JVC’s competing VHS format. Betamax even beat VHS to the US and Japan markets by over a year. So what went wrong?

The “format wars” between Beta and VHS (see: Sony and everyone else) are the stuff of tech legend. Sony misjudged the home video market in a number of ways, but the most likely cause of Beta’s failure was Sony’s reluctance to license its technology. JVC had no such qualms, resulting in a wide array of manufacturers selling VHS machines much more cheaply than Betamax. Also, Beta machines could initially only record for 60 minutes, compared to the 3 hours of VHS. VHS wins…

Quadraphonic Sound

Simply put, Quad would now be called 4.0 surround sound. Like stereo, but… twice as much. It was intended to replicate the experience of live sound on speakers, which it did. Debuting in 1971, several quad vinyl records were released in differing (and incompatible) formats. Played on the right system, the “3-D audio” result was pretty spectacular. 

But there are about a billion ways to produce quadraphonic sound, and no single format was ever agreed upon. Dolby surround sound, which does pretty much the same thing, was standardized and eclipsed quad quite quickly. Of course, surround is primarily used for movies. For listening to music, most people think stereo is just fine.

QR

Look familiar? That is a QR code, short for “Quick Response”, and they’ve been popping up all over the place for the last ten years or so. Storefronts, packaging… you probably have one tattooed on your butt. They’re like barcodes on crack; they serve the same purpose (as barcodes), but hold a lot more information. They were originally used to track parts during manufacture by the auto industry, but what fun is that when they can be used in advertising?

The problem is, nobody knows what the hell to do with them. PR for the QR was severely lacking. A recent study showed that about 80% of college students, that most tech-savvy of demographics, have no idea what to do with a QR code. Hint: scan them with a third-party app on your smartphone.

And once we do figure it out, what’s our reward? Why, intrusive, in-your-face advertising, of course… what tech-savvy consumers love most of all. I have no idea what went wrong.

Digital Audio Tape (DAT)

DATs were introduced in 1987. They were tiny little cassettes that record digitally at CD quality or better, meant to replace standard cassette tapes. They were far superior to cassettes, more durable and portable than even CDs, capable of 16-bit sampling and widely varying recording lengths. Why, they’re the Superformat of the future! And since the music industry is never scared of new technology, we can’t figure out why – oh, wait.

The failure of DAT as a format for selling music was (of course) mainly due to piracy concerns. Music industry types were concerned that piracy would skyrocket with a high fidelity, recordable medium – and effectively buried the technology for consumer use. This paved the way for all-digital formats like mp3, which of course are much easier to pirate. Great job, music industry!

VR

As seen in every ’90s sci-fi movie, fully immersive 3-D computer generated imagery is essentially virtual reality. Even in the early ’90s, companies like the cleverly-named Virtuality were releasing VR arcade games like “Dactyl Nightmare” that placed you right inside the cheesy, blocky action.

The technology simply wasn’t advanced enough to meet the vision, and attempts at true VR were underwhelming to say the least. While technology has obviously come a long way, we’re still pretty damn far off from having a real-life Holodeck which – let’s face it – is what we all really want.

Newton

Long before Apple released the iPod and began dominating mobile devices up one side and down the other, there was this ill-advised 1993 attempt at that market. The Apple Newton was essentially the father of all PDAs, and was innovative in many ways, but was ultimately a spectacular failure.

The Apple Newton PDA never quite caught on due to a hugely inaccurate handwriting recognition system and an exorbitant price tag, not to mention that it looks like a Commodore 64 mated with a tape recorder. The 1995 debut of the smaller, cheaper and more functional Palm Pilot was the final nail in its coffin. The Newton was discontinued in 1998.

DIVX

In its first incarnation, DIVX will likely go down as one of the biggest tech flops of all. Its innovation was in catering to those who wanted a way to “rent” movies digitally (you know, like everyone does now), but the way it was implemented was a misstep, in the same way that falling down the stairs is a misstep.

Piloted by electronics retailer Circuit City, the idea was simple enough. You rent a disc, watch it for two days then throw it away. Simple, right? Except that it was like a DVD without all the features, required a separate player that consumers had to buy, and the video rental industry fought against it tooth and nail.

By the time Netflix and Blockbuster came along to make digital rentals simple, DIVX was but a memory, having been sold only between 1998 and 1999. Its legacy lives on in annoying, unnecessary software constantly trying to get you to download it for some reason.

Mike Floorwalker’s blog is freakin’ sweet.

Mike Floorwalker

Mike Floorwalker’s actual name is Jason, and he lives in the Parker, Colorado area with his wife Stacey. He enjoys loud rock music, cooking and making lists.

Top 10 Twisted Theories About the Future of Technology https://listorati.com/top-10-twisted-theories-about-the-future-of-technology/ Wed, 19 Jul 2023 16:59:22 +0000

In the blink of an eye, technology has taken us from prey to predator and from caves to the cosmos. It’s moved so fast that parent and child no longer live the same life, and it’s only getting faster. Every day technology grows and mutates.

Some futurologists see a technological utopia in our future while some predict catastrophe. All of them agree, however, that things are changing more rapidly and dramatically than they ever have before. It’s about to get really weird on Earth.

From digital beings to computers in the air, from mind control to the end of death, this list gathers ten of the more twisted predictions about the future of technology.

Related: 10 More Incredible Ways Nature Beat Us In Technology

10 Forced Neurohacking

Neurohacking is the process of interfacing with and improving the human mind. At the moment, it ranges from simple at-home attempts at cognitive improvement like nootropic supplements all the way to large-scale studies of brain-machine interface. Many labs across the world have already designed machines that are controlled solely by human thought, and so one question poses itself: can they also design machines to work in reverse and control human thought?

The surprising answer is: almost certainly yes. Some labs have decoded the brain’s electrical activity enough to allow otherwise non-communicative people to “speak” through machines, and our brain’s electricity is not some special variation; if understood well enough, it can be manipulated right back. Though one obvious end result of forced neurohacking is zombie-like slaves, a more realistic and likely imminent effect could be neuro-marketing, by which our devices could actively emit signals that alter our brainwaves to make us want one product or another.

9 Designer Humans

This topic has generated a lot of press in recent years. As our understanding of the genetic basis for human abilities and disabilities improves, scientists and lawmakers are becoming forced to ask the question: is it appropriate to edit people’s genes before they’re born? That’s the central premise of designer humans, and its implications are as numerous as they are frightening.

On the one hand, designer humans could mean an end to most congenital diseases, including cystic fibrosis, Huntington’s, heart disease, and many cancers. This could easily then extend to common genetic annoyances like poor eyesight and baldness. But the unsettling implications remain. Who will have access to this technology? Who won’t? Will we allow designer humans to compete in sports against the non-designed? Will we (and should we) go further than prevention and alter genes to make us smarter, faster, and stronger? Will there be tiers of different designer humans? Is any of it ethical in the first place?

8 Every Atom a Computer

Moore’s law is a commonly cited prediction about the future of technology: that the number of transistors in an integrated circuit will double about every two years. More broadly, Moore’s law holds that computers will become faster and more efficient even as they become smaller. Many futurists predict that this trend will continue until we create what is known as zero-size intelligence, a computer more powerful than any ever built and yet with almost (or even literally) zero mass.
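The compounding at the heart of Moore’s law is easy to underestimate; a minimal sketch of the doubling math makes it concrete (the 2,300-transistor starting figure for the 1971 Intel 4004 is a commonly cited value used here purely for illustration, not something from this article):

```python
def moores_law_estimate(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming a fixed doubling period in years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Doubling 2,300 transistors every two years for 50 years is 25 doublings,
# which lands around 77 billion transistors -- the same order of magnitude
# as real flagship chips of the early 2020s.
estimate_2021 = moores_law_estimate(2300, 1971, 2021)
```

The takeaway is that exponential doubling, not any single breakthrough, is what carries the projection from a few thousand transistors to tens of billions in fifty years.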

This could lead to any number of utopian or dystopian scenarios, as with enough time, energy, and resources, potentially every atom around us could become a supercomputer. One natural fear is the artificial intelligence singularity, as fighting armies of atom-sized supercomputers in our air does not sound fun. Another, perhaps more likely fear, is that the machines don’t become self-aware and rise up but rather stay under human control. This leads us to…

7 Smart Dust

Smart dust, a term first coined by physicist Kris Pister in the 1990s, refers to clouds of nano-robots that would be able to generate an unparalleled amount of data with an equally unparalleled amount of secrecy. Essentially, the very air around us could become swarms of nano-bots that relay our every action and thought to whomever they serve.

This needn’t wait until we reach zero-size intelligence, however. Computers as large as insects could swarm around us even now, drawing little to no attention, recording our every movement.

6 The Coding Olympics

Predictions about the future of technology tend to focus on end-of-the-world and redefined-humanity scenarios, but what will happen to the day-to-day, fun parts of life? Sports, for example, have been a topic of discussion for futurists recently, and most think they’ll be very different in the near future.

Robots already exist which can sink basketball shots, kick field goals, and run laps. Though they may end up replacing human athletes, some futurists predict an even weirder change. Inventor and futurist Dean Kamen notes that the original Olympic games were created around skills useful to that time period, like chariot racing and wrestling. Future Olympics could then hold contests relevant to their own time period, like coding, wiring, and physical calculations.

5 Second Life

Popularized by movies like The Matrix, the idea of a simulated reality is one of the most common predictions of humanity’s future. In The Matrix, humans are farmed as resources and kept docile with the simulated reality. In the real world, however, humanity might willfully retreat into a simulated world simply to escape the ever-deteriorating “real” Earth.

Even without the scary spider-machines forcing us into the illusion, the idea of a complete retreat from reality is frightening. The new digital reality could be created using entirely different rules than we’re used to. Even worse, whoever designs it could imbue it with shortcuts and cheats known only to themselves or a chosen few. In a universe completely created by the few, inequality could reach unimaginable new heights (or lows).

4 The Dead Will Walk

The concept of replicating human consciousness into a digital avatar raises another twisted concept: anyone could be stored forever, and so death would come to mean little. So-called “dead” consciousnesses could be stored in video portraits, a la the Harry Potter universe’s magically interactive portraits.

They could also be stored in lifelike robots, and if the robots become lifelike enough, death would mean essentially nothing, just a quick hop from one body to another. Eventually, the robots containing uploaded consciousnesses could replace biologically-born consciousnesses and, if their bodies were durable enough, functionally end the cycle of birth and death.

3 No More Sky as We Know It

This prediction is actually two different predictions, and both are almost guaranteed to come true. First, the blue sky of our atmosphere will begin to vanish as more and more drones, and potentially personal flying machines, fill the skies. Second, the blackness of space at night will begin to shine and glow as our planet begins to be orbited by more and more—and larger and larger—satellites.

As drones begin to dominate industries like home delivery and long-range tracking, and as satellites likewise begin to establish competing networks of communication and defense, our skies as we know them will disappear. Though at first, that will mean only the occasional speck in the otherwise clear sky, eventually every square mile of the sky could end up as crowded as your average metropolitan street.

2 No More Animals as We Know Them

Currently, the human-led Anthropocene Extinction Event is causing millions of species on Earth to go extinct at a rate hundreds to thousands of times faster than the natural background rate. It’s entirely possible that the near future may be free of wild animals and plants completely. Perhaps the most logical species to prioritize saving over the rest are those that produce food and materials, but already, lab-grown substitutes have hit mass markets as alternatives.

That leaves a world with little to no non-microbial life and little practical, economic reason to regenerate it. This may remind some of the world depicted in Blade Runner, in which animals (like the replicants) are biological entities designed from the ground up, and with good reason; that is entirely possible. After the extinction of the original species, designer animals may be the new norm. And as microbe farms would sustainably produce food, the only purpose those animals may serve would be entertainment for the wealthy and bored.

1 No More Technology as We Know It

Technology is usually thought of as mechanical and electronic, and likewise, technological advancements are thought of as advances in computing and data transfer. In reality, the future of technology may be more biological in nature. Perhaps even completely so.

Traditional manufacturing has proven disastrous to our planet, and alternative processes to generate materials like plastics and fuels are already starting to gain widespread use. Microbes, in particular, are proving useful in generating materials, fuels, and even sustainable food. A natural solution to traditional manufacturing, as well as a natural synthesis of the designer human, designer animal, and microbe materials concepts, is to replace electronic machines with biological machines. Many futurists imagine a human society wherein our food, homes, and even vehicles are entirely grown by designer microbes and perhaps even living entities themselves.

10 Everyday Things That Grew Out of Military Technology https://listorati.com/10-everyday-things-that-grew-out-of-military-technology/ Mon, 10 Jul 2023 15:39:51 +0000

Many things people use every day contain technology first developed for military use. Over human history, warfare has proven to be a primary driver for innovation. It’s a shame killing other people provides the necessity to play mother to humanity’s best inventions. But multiple technologies originally developed to help destroy enemies and protect allies have shaped modern society in unmistakable ways.

The following is a list of ten everyday items born from the battlefield.

Related: 10 More Incredible Ways Nature Beat Us In Technology

10 Duct Tape

Much like every toolbox, broken car window, and down-home first aid kit, no list of military breakthroughs is complete without duct tape. Originally imagined by a factory worker to seal and waterproof ammunition boxes, duct tape quickly became a handy solution to all sorts of problems encountered on the battlefield and far beyond.

That worker, Vesta Stoudt, nearly failed to get her world-shaking invention into the hands of interested parties. When those in charge around her paid her tape little attention, Stoudt pulled a boss move and wrote directly to President Franklin D. Roosevelt. Within a few weeks of her letter, Stoudt was informed that Johnson & Johnson would immediately begin manufacturing duct tape for military uses.

From 1942 to today, duct tape remains a mainstay for rapid repair, impromptu bandaging, and, for a bit in the early 2000s, making shiny wallets for eighth graders to sell to each other.[1]

9 Microwave Oven

Somewhere right now, a college student is preparing a cup-o-noodle for lunch in three minutes, a writer is heating up their coffee for the fourth time in two hours, and a kid is creating a destructive laser light show using old CDs. What makes all of these modern tasks possible? The microwave.

In 1940, while Nazi Germany’s aircraft terrorized Great Britain, a team of British physicists arrived in the U.S. bearing top-secret cargo that significantly increased the capabilities of contemporary radar. The British team’s research advanced the usefulness of radar throughout WWII. Then, in 1946, an American engineer named Percy Spencer filed patents that used a primary component from the British radar designs, the cavity magnetron, to rapidly heat food. Spencer allegedly got his idea when a peanut bar melted in his pocket while working around radar equipment.

By 1955, the first commercial microwaves made their way onto the market for over $1,200 (around $12,000 in 2021 dollars). Thankfully for later generations of frozen food and ramen connoisseurs, the microwave gradually became affordable. Now, even the dingiest motel room is incomplete without a little piece of WWII tech.[2]

8 Super Glue

True to its name, super glue sticks all kinds of surprising things together with serious gusto. A short internet search shows many uses of the glue; most are impressive, and some are downright hilarious. Like so many of the world’s best inventions, this extraordinary adhesive was discovered by accident in the pursuit of more efficient weaponry.

In 1945, a group of scientists, including Dr. Harry Coover, worked with chemicals called cyanoacrylates to create clear gun sights for use in WWII. The sights never materialized from their initial experiments, but six years later, Coover began researching the chemicals again. He later released his “Super Glue” to the market in 1958. Even after the glue went public, the military continued to find new uses for the product, including sealing aircraft canopies and closing flesh wounds.

From its roots as a failed way to help soldiers shoot better, to an incredibly adherent research chemical, and finally, to the recognizable and sensationally sticky household item of today, super glue is a case of truth in advertising and a textbook example of military R&D spilling into everyday life.[3]

7 Global Positioning System

Long, long ago, GPS did not exist as an acronym nor a technology, and the idea that a person could be directed to a destination of their choosing by a talking computer remained a concept buried deep in science fiction. Little did anyone know that one day the technology would arrive that placed fairly accurate directions in the palm of every distracted driver’s hand.

The glorious directional revolution humbly began as a way to track submarines during the Cold War. By the 1970s, the rest of the DoD joined the effort, sending up the first NAVSTAR unit in 1978, with the final satellite entering orbit in 1993. The project proved its worth definitively in 1991 during Operation Desert Storm. Even though the system was incomplete with only 19 of the 24 satellites operational, the limited coverage of the unproven technology allowed U.S.-led Coalition forces to outmaneuver Iraqi units over a trackless desert devoid of landmarks.

The technology’s original uses remain relevant even today, and the military is now only a slice of the expanding GPS pie chart. Everyone with a cell phone uses this technology in the modern world without even having to think about it. GPS is used across the globe, on most internet-connected devices, and for all sorts of tasks the inventors of the tech could not have imagined: see “location-based marketing,” “ridesharing,” and “on-demand restaurant delivery.”[4]

6 The Internet

The impact of the internet on human society is impossible to quantify, but it has undoubtedly altered civilization. Loved and hated by so many in equal measure, the modern internet is an accessible web of useful applications, great information, meaningful connections, echo-chamber politics, cyberbullying, and so much pornography. The cursed blessing that is the internet started as a much simpler concept to connect computers and share scientific data.

In the 1960s, the DoD Advanced Research Projects Agency received a proposal from computer scientist J.C.R. Licklider for a globally connected computer network for sharing data. From there, new programming languages sprang up from universities and labs across the world, the network became more uniform, and the web of connectivity extended into the private sector. A computer manufacturer registered the first domain in 1985, opening the internet to the wider world of consumers, and from that point through today, the internet has grown at incredible speeds.

The internet must be considered for the “most impactful military tech” list for how much it has altered humanity. The deceptively simple concept of sharing data across multiple connected systems is now embedded into life so deeply that to lose internet access is less of an inconvenience and more of a crisis for modern Homo sapiens.[5]

5 Canned Food

Today, many foods can be found in some shape or form in a can. Corn, tuna, pig’s feet, tamales, and so much more pack the canned food section of the grocery store. Middle-class pantries fill with canned foods year after year until it’s all thrown out one spring cleaning. This modern preservation method, which helps feed so many, developed from that pesky, crucial requirement that soldiers have to eat too.

The well-known idiom, “an army marches on its stomach,” explains the absolute necessity for food to accompany soldiers on a campaign. Food, however, spoils, causing a major problem for anyone planning to eat it after a certain amount of time. The French army showed great interest in extending the shelf life of their soldiers’ rations after seeing more men die from eating rotten food than succumbing to combat injuries. In 1806, Nicolas Appert pursued a sizeable monetary prize offered by the French government for developing a way to preserve food long-term. With his invention, he became “the father of canning,” and military rations were forever changed. Quickly following their development for the military, canned foods entered the private market and proved to be an effective, sanitary way to store food for extended periods of time.

Modern canning and preservation are direct contributors to human population growth. Vienna sausages, Spam, and corned beef hash may not be considered gourmet, but the same process that seals those sweaty meat products lets armies march across nations and, more importantly, feeds those that otherwise might starve.[6]

4 Bagged Salad

Another slightly less salty military technology that translated seamlessly into everyday life is bagged salad. Though short on taste, nothing says freshness like a crispy piece of iceberg lettuce on a flash-fried chalupa.

A German man named Karl Busch invented the first vacuum sealer machine during WWII, intent on preserving food for military families and soldiers heading to war. In the 1950s, cutting-edge vacuum technology repurposed from the military helped end the need for vast quantities of ice to ship vegetables. The method more effectively cooled and preserved leafy greens, especially by removing air and slowing spoilage. This allowed fresh lettuce and other greens to be transported long distances to places far from the farm. In 1963, Busch’s redesign for an industrial-sized machine eventually led to the technology making its way to private homes, with the first modern home sealer invented in 1984.

Without vacuum technology from WWII, salads from fast-food restaurants would cost way too much for people to buy, a bit like now, but the consumer’s option to go home and make their own chicken Caesar salad would be impossible.[7]

3 Synthetic Rubber

Synthetic rubber is nearly everywhere in modern life. Cars, electronics, furniture, skateboards, footwear, dog toys, and so much more owe their current existence to the invention of synthetic rubber. The world of rubber surrounding people today may be much different if not for military-funded research. The necessity for rubber during WWII led directly to the invention of the modern wonder material.

When Japan seized its Pacific empire, much of the world’s supply of natural rubber was cut off. This left the United States scrambling for a way to produce rubber synthetically. With consumption rates already around 600,000 tons per year leading up to the war and little supply to match, President Roosevelt appointed a committee in 1942 to solve the rubber supply crisis. By April of that year, the first bale of synthetic rubber rolled off the manufacturing line at Firestone.

The rapid invention of synthetic rubber to fill wartime demand directly led to all of the various uses of rubber in the world today. Synthetic rubber is another example of technology developed to meet a military need that became a staple and growth multiplier in the civilian market.[8]

2 Virtual Reality

Countless games and multimedia experiences allow consumers to lose themselves in today’s virtual world, but all of these interactive programs trace their lineage back to military training systems first employed in the 1980s.

It begins with the construction of advanced flight training simulators and extends through modern simulated combat environments, vehicle trainers, team-building exercises, and mental health therapy. Modern military VR exists to supplement the training of personnel, not to replace real-world training; much in the way that a player’s skill on a VR surgery simulator does not mean they should try to remove an appendix.

The military likely does not train soldiers to walk across a beam 20 stories high, escape a virtual prison cell, or shoot aliens with giant lasers, but the training technology introduced in the 1980s helped bring about the growing virtual world.[9]

1 Roomba

Skynet has been slacking. The company iRobot kicked off the machine revolution nearly 20 years ago. Their robots are already among us: in our cities, on our streets, and in our homes. The short, mobile discs roaming floors while devouring Cheeto crumbs, dog hair, and cat litter are not only simple autonomous vacuums; Roombas contain the programming DNA of a bona fide warfighter.

According to iRobot co-founder Colin Angle, the technology that enables Roombas to thoroughly vacuum floors uses the same programming employed by military robotic minesweepers. The same breakthroughs that allow Roomba to navigate its way around furniture without getting stuck (usually) also help soldiers clear dangerous minefields using robotic minesweepers.

While civilian Roombas patrol kitchen floors for enemy crumbs and the occasional penny, their military cousins help human allies defeat hazardous explosives. It’s only a matter of time before the vacuums join with the voice assistants to film another Terminator sequel.[10]
