10 Examples of Vintage Computing Still in Wide Use Today
(Listorati, May 24, 2023)

When we think of modern computing, we inevitably think of how fast technology moves these days. It seems we can barely buy a new laptop or smartphone and get it out of the box before it starts to feel obsolete. New features, speed increases, more storage—it all seems to happen so fast.

That’s why it can be surprising when you stop to take a look at some of the tech we use every day and realize just how old some of it is. From ancient operating systems and programming languages to network protocols, some foundations of our tech-heavy world have been in place for decades, and they show no signs of going away anytime soon. Here’s a look at ten examples of very old computing technology that are still widely used today.

Related: Top 10 Most Catastrophic Computer Failures In History

10 The OS from Half a Century Ago

“It’s a Unix system! I know this!” says young Lex as she saves the day in the 1993 film adaptation of Jurassic Park. This line became one of the earliest internet memes, and it has endured; there’s even a whole subreddit devoted to it. The line resonated so strongly because many computer professionals can relate: If you know the Unix operating system, you can sit down at any Unix-like system made in the last 50-plus years and instantly feel at home.

Unix originated at AT&T’s Bell Labs in 1969. Designed from the ground up to be a multitasking and multiuser system (i.e., with the ability to do multiple things for multiple logged-in users all at once), Unix has long been hailed for its innovative design and rock-solid stability. But perhaps the biggest reason users are so loyal is the “Unix philosophy,” a guiding set of design principles that encourages the use of small, useful applications that can easily pipe data to other applications.
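The philosophy is easy to demonstrate. Below is a minimal sketch, in Python using its standard subprocess module, of the classic Unix pattern of chaining small tools with pipes; the specific tools (printf, grep, wc) and the sample text are arbitrary choices for illustration:

```python
import subprocess

# Three small tools, each doing one job, chained together.
# Equivalent shell pipeline: printf ... | grep error | wc -l
p1 = subprocess.Popen(["printf", "ok\nerror\nok\nerror\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["grep", "error"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early

# wc -l counts the lines that survived the grep filter
out = subprocess.run(["wc", "-l"], stdin=p2.stdout,
                     capture_output=True, text=True)
p2.stdout.close()
print(out.stdout.strip())  # 2
```

None of the three programs knows anything about the others; each just reads plain text from one end and writes it out the other, which is exactly the composability the Unix philosophy prizes.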

While AT&T sold Unix licenses for many years, the core concepts led to the development of many Unix-like systems over the years. Today, developers can submit their operating systems for certification as a “UNIX-Certified Product” to the current owner of the Unix trademark, the Open Group.

In the world of free and open-source software, the most popular operating systems are Linux distributions, with Linux being categorized as a Unix-like system. Linux powers many of the servers on the internet today and has made major inroads as a desktop OS too. Considering its age, it’s pretty incredible to consider how long Unix and Unix-like systems have been around—and how relevant they still are today.[1]

9 The Ancient Programming Language That Banks Still Run On

When it comes to programming these days, you’re likely to see a lot of references to languages like Go, Rust, and C#. But there’s a programming language that’s been in heavy usage since its debut in 1959 and continues to be the backbone for global finance.

COBOL came about when a group of businesses and the United States government saw the need for a common language that could run on the competing mainframes of the day, with an easily understood English-like syntax. Once the language was complete late in 1959, it was immediately embraced by banks, brokerages, and government agencies like the IRS.

Despite the tech industry’s tendency to embrace the “latest and greatest,” COBOL has remained the de facto standard in financial industries. At the same time, there have been shortages in COBOL programmers for years now, as young coders tend to learn and specialize in newer languages. Plans by banks and government agencies to migrate away from COBOL continue to be put on hold due to the cost and complexity involved in retiring legacy systems. That means our financial systems still run on a language over 60 years old now.[2]

8 The Very Popular and Very Old Coding Tool

While the average computer user writes text in a word processor, programmers work with a plain text editor. Put simply, plain text lacks the niceties we see in word processors, like multiple fonts, text justification, and other layout formatting. Since computers read code written in plain text, coders need a good editor that lets them write and edit plain text efficiently.

Many of the popular plain text editors today are actually IDEs (integrated development environments), which help you keep track of all the files in your codebase and revisions in that code. Microsoft’s Visual Studio Code is the most popular IDE today, as it routinely tops developer surveys. But the minimalist (yet powerful) editor Vim is still a popular choice today among coders, which is pretty incredible considering its age.

Vim itself was released in 1991, but its lineage goes back much further. The Unix app vi (short for “visual”) debuted in the late 1970s and was itself a newer version of older line editors. A dozen years after vi, Vim appeared, with its name originally meaning “Vi IMitation” but later changed to “Vi IMproved.”

A quick look at Vim may scare off newcomers, as it appears to be nothing but a display of text with no menus or controls. But what makes it so popular with programmers is its modes: Insert mode lets you type text, while Normal mode lets you run commands on that text. Normal mode is the secret weapon, enabling quick copying, pasting, and other text manipulation without your fingers ever leaving the keyboard. It’s this speed and power that has kept Vim popular, even though its lineage stretches all the way back to the earliest days of Unix.[3]

7 A Steve Jobs Failure, Reborn as a Success

In what has become the stuff of business legend, Steve Jobs was forced out of Apple in 1985 after a boardroom showdown with John Sculley. Jobs then took $12 million of his own money and founded a new computer company called NeXT. In 1988, NeXT unveiled its first product, the NeXT Computer, an immaculately designed yet very expensive cube-shaped workstation. Priced out of the range of most of the universities and researchers Jobs thought would be his target market, and far past the budget of the home computer user, NeXT would go down in history as one of the most high-profile failures in the industry.

However, those who did buy a NeXT computer during the company’s brief existence had nothing but great things to say about the company’s operating system, NeXTSTEP. Built on top of a Unix core, NeXTSTEP was powerful, flexible, and stable in a way that other operating systems of its day were not. When Apple found themselves needing a revamped operating system for their Mac line, they purchased NeXT in 1997 for $429 million. For that price, Apple got the rights to NeXTSTEP and brought Jobs back into the company.

Apple’s rise after that to become one of the most successful companies in the world is well-documented. But what is often overlooked is the role of NeXTSTEP in that success. It was first retooled as Mac OS X for desktops and laptops, but it is also the basis for iOS on iPhones, iPadOS, and even tvOS on Apple TV boxes. Although it goes by different names now, the 30-plus years of NeXTSTEP make it one of the oldest operating systems still in active development today.[4]

6 A Standard for Downloading and Sharing Files

If you’ve spent any time downloading files from the internet or sharing them with others, you’ve run across ZIP files. But what are they? In technical terms, ZIP is a compression format, which means it takes existing files and makes them smaller. Once the ZIP file reaches its destination, it can be decompressed to restore the files to their original state. This not only saves space but also helps files transfer across networks quicker, and ZIP archives include built-in checksums that can detect corruption along the way. There are many other compression formats, but ZIP has outlasted them all, which is pretty incredible considering that the format is in its fourth decade of use.

Created by programmer Phil Katz at his company PKware in 1989, the ZIP file format predates the modern internet. Given the very high price per megabyte of hard drives in the 1980s, ZIP was one of many compression tools to come out at the time. But its ease of use, and its later ubiquity across nearly all computing platforms, have made ZIP the standard for file compression since then.

Another aspect of ZIP’s long lifespan is its usefulness in general file handling. Microsoft’s standard Office file formats (for example, DOCX for Word and XLSX for Excel) are actually ZIP files under the hood. This lets Microsoft bundle several different files, XML documents, styles, and images, into what appears as one file, while keeping the format open enough for other office applications to read.[5]
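Both points, compression and bundling, are easy to see with Python’s standard zipfile module. This is a minimal sketch; the file names and contents are invented for the example:

```python
import io
import zipfile

# Build a ZIP archive in memory containing two "files", much as a DOCX
# bundles its XML parts into a single container.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("document.xml", "<doc>hello</doc>" * 100)  # repetitive, compresses well
    zf.writestr("styles.xml", "<styles/>")

# Read it back: multiple files live inside the one archive.
buf.seek(0)
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    info = zf.getinfo("document.xml")

print(names)                                 # ['document.xml', 'styles.xml']
print(info.compress_size < info.file_size)   # True
print(buf.getvalue()[:2])                    # b'PK'
```

Note the last line: every ZIP file begins with the bytes “PK,” Phil Katz’s initials, a signature that has survived in billions of files for over three decades.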

5 The Big Computers of Yesteryear

When thinking of the history of computers, it’s easy to assume that the giant mainframes that once filled whole rooms simply shrank away into desktop computers and handheld devices. But the truth is, mainframes are still with us today, performing critical business functions for companies all over the world. A 2021 survey showed that 67 of the Fortune 100 still use mainframes.

Mainframes got their name from the cabinets on large computers that held the CPU and main memory—the “main frame.” While today’s mainframes are still large cabinet-based machines in the tradition of their mid-20th-century counterparts, their computing power has increased enormously over time. What hasn’t changed is the reliability in processing many transactions per second that mainframes are known for.

And in today’s computing landscape, mainframes are learning new tricks. In addition to running legacy systems like COBOL applications, modern mainframes host cloud workloads and run many virtual machines simultaneously. Far from a relic of the past, mainframes are a key component of our tech-oriented world today.[6]

4 The Peripheral That Won’t Go Away

It’s hard to imagine computing without some sort of keyboard, whether for typing text or issuing instructions to the computer. And at this point, it should also be assumed that the computer mouse is here to stay too.

The first prototype mouse was created in 1964 by Douglas Engelbart, who was then a Director of the Augmentation Research Center at Stanford Research Institute in Menlo Park, California. But the mouse’s entry into mainstream culture can really be traced to 1979 when a group of Apple engineers and executives led by Steve Jobs visited the Xerox Palo Alto Research Center (PARC).

It was on this trip that Jobs first saw computers with icons, windows, a mouse, and other technologies that had been developed at PARC. Convinced (quite correctly) that this was the future of personal computing, Jobs took this information back to Apple. By 1983, Apple had shipped its first computer with a mouse, the Lisa, followed by the first Macintosh in 1984.

Since then, the mouse has become an essential part of personal computing. Not bad for a piece of hardware that was designed in the 1960s and, save for some technical improvements and ergonomic changes, remains more or less the same here in the 21st century.[7]

3 Modern Networking Is Really Old

Another innovation developed at Xerox PARC—which Steve Jobs admitted that he totally overlooked during his 1979 visit—was a workgroup of personal computers networked together, providing the ability to share files and resources like networked printers. It’s something we take for granted today, especially considering the giant worldwide network that is “The Internet.” But it was all made possible thanks to Ethernet.

Bob Metcalfe invented Ethernet in 1973 at Xerox PARC. The company patented it in 1975, and it was later made an open standard. For wired network connections, Ethernet is the undisputed standard, and not for lack of alternatives: Token Ring, FDDI, and Apple’s LocalTalk all competed with it at one point or another over the last five decades. Yet Ethernet has remained the standard.

But what about WiFi? It was conceived as a wireless complement to Ethernet and is standardized as IEEE 802.11, part of the same IEEE 802 family as wired Ethernet (802.3); it is often informally called “wireless Ethernet.” So while Ethernet has gotten faster and gone wireless over time, it’s essentially the same concept today as what Metcalfe came up with in the 1970s.[8]

2 The Internet Protocol Predates the Internet

You’ve probably seen your computer’s TCP/IP settings at some point, but what are they? Suffice it to say, the details are complicated and best left to qualified network engineers. But from a 20,000-foot view, TCP, the Transmission Control Protocol, regulates how data travels over the internet, while the Internet Protocol defines your address on the ’net and how data is routed to you. What may be most interesting about these two is that they were developed years in advance of the dawn of the public internet in the 1990s.

The precursor to the public internet was the ARPANET, a creation of the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. TCP/IP was developed and refined there over the 1970s and 1980s, with two men—Robert E. Kahn and Vinton Cerf—credited as forefathers of the protocols. It was very much a work in progress: TCP was first proposed in 1974 to replace earlier protocols, and Kahn and Cerf later realized the network could not keep growing without breaking addressing and routing out into a separate protocol. The Internet Protocol was split off in the late 1970s, and on January 1, 1983, the ARPANET switched over to TCP/IP. The stack as we know it today was now in place.
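To make TCP’s job concrete, here is a minimal sketch in Python of a client and server exchanging data over a TCP connection: a reliable, ordered byte stream between two endpoints identified by IP address and port. Both ends here live on the loopback address, and the message text is an arbitrary choice; the same calls work across the real internet:

```python
import socket
import threading

def echo_once(srv):
    # Accept one connection and echo back whatever arrives.
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Server side: bind to the loopback IP; port 0 lets the OS pick a free port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=echo_once, args=(srv,), daemon=True).start()

# Client side: TCP guarantees the bytes arrive intact and in order.
cli = socket.create_connection(srv.getsockname())
cli.sendall(b"hello, ARPANET")
echoed = cli.recv(1024)
print(echoed.decode())  # hello, ARPANET
cli.close()
srv.close()
```

Underneath, IP handles getting each packet to the right address, while TCP handles acknowledgment, ordering, and retransmission, the same division of labor Kahn and Cerf settled on.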

If it weren’t for the legwork done on the ARPANET, the internet would not have been ready for the public in the 1990s. The work of Kahn and Cerf may be positively ancient in technology terms today, but thankfully it has proven robust enough to scale up to the massive global network we enjoy today.[9]

1 Email Is as Old as Networked Computing

Even though many of us dread looking at our email inboxes these days, likely full of spam, promotional offers, and more work to do, email is still an essential part of daily computing. If you have no love for it now, try to imagine how exciting it must have been in its early days, when sending a message between computers seemed like something from the future.

Not surprisingly, ARPANET was the network that carried the first computer-to-computer message. On October 29, 1969, UCLA professor Leonard Kleinrock and his student and programmer Charley Kline sat down to send a message over ARPANET to another programmer, Bill Duvall, at Stanford Research Institute. The message was to be one word: “login.” And the system crashed right after the “o” was typed!

Thankfully, the full message went through about an hour later, marking the birth of networked messaging. Email proper followed shortly after: in 1971, ARPANET programmer Ray Tomlinson sent the first message between user accounts on different machines, picking the @ symbol to separate user from host. Was that a good thing? That’s for each person to decide, but considering the longevity of email and the many billions of messages sent since, it’s definitely worth noting that one of the oldest computer technologies is still with us as a part of everyday life.[10]

10 Jaw-Dropping Moments From Vintage Television
(Listorati, March 4, 2023)

With hundreds of shows a year being produced on Hulu, Netflix, and Amazon alone, it can seem as if older television can’t possibly be worth checking out. Visually, it’s almost certain to look staid beyond being grainy and black and white. The reference points of the comedy will likely be so dated as to be incomprehensible, the plots will have long since been ripped off to death or spoiled by the time you see them, and surely the censors removed everything halfway interesting with surgical precision.

But no. Entertainers had just as much desire to break out of creative molds decades ago as they do now. Censors could miss what would today be considered the most jaw-dropping content you could imagine. And when screwups happened back then, they could put a whole season’s worth of blooper reels to shame. All that and more is available below, thanks to the dedicated efforts of antiquarians who scoured through hours of television for its hidden novelties.

10. William Shatner’s Twilight Zone Slur

Everyone who knows Rod Serling’s 1959-64 sci-fi/fantasy classic remembers William Shatner’s struggle with his sanity, and against a gremlin on the wing of a plane, in “Nightmare at 20,000 Feet” from the final season. It turns out that three seasons earlier, he starred as Don Carter in a much less-remembered but also extremely good second-season episode called “Nick of Time.” It is probably less famous because in that episode he did battle with a little novelty fortune-teller in a diner that gave uncannily accurate answers, a premise less relatable than an embodiment of the commonplace fear of flying.

In both episodes, Shatner is accompanied by a beleaguered wife who doubts his sanity. In the middle of “Nick of Time,” after Don Carter has already had six straight fortunes confirmed by the toy in the diner, his obsession is clearly worrying her. While they’re crossing the street after leaving the diner, Shatner delivers the shocking line, “Oh, stop treating me like a retarded child.”

Despite what you might think, “retarded” was considered an insult even at the time. According to Unlocking the Door to a Television Classic by Martin Grams Jr., Serling’s office received a letter from the mother of a child with Down syndrome expressing her discomfort with the word’s use in the show. Serling wrote a letter of apology and told his staff never to use the word in such a context again.

9. William F. Buckley Calls Gore Vidal a Slur as He Threatens to Punch Him

In 1968, during the Democratic National Convention, ABC aired a series of debates between National Review founder William F. Buckley and incendiary author Gore Vidal. During a debate about whether protesters’ waving of Vietcong flags justified police use of tear gas and beatings, Vidal took the chance to call Buckley a crypto-Nazi for favoring those methods against protestors. Buckley replied, “Now listen, you queer, stop calling me a crypto-Nazi or I’ll sock you in your goddamn face.”

It became enough of a scandal that a year later Buckley called back to it on Firing Line while debating Noam Chomsky. Buckley came to regret it, writing several letters to Vidal questioning how he could have lost control like that. Indeed, in 2017, two Academy Award winners made a documentary Best of Enemies that focused in large part on how that loss of poise changed the tone for televised American debates for the coming decades.

8. Jackie Gleason’s Half-Hour Apology

Now, here’s something a little lighter in tone. If you think that today the media spends too much time apologizing for the slightest trespasses, you should see what the creator/star of The Honeymooners got up to in the early sixties. Admittedly, he was much more entertaining about it than most.

On January 20, 1961, the day Kennedy was inaugurated, Jackie Gleason played host to a program called You’re in the Picture, in which contestants stuck their heads through holes in pictures, such as one of a woman’s body in a yellow polka-dot bikini. It’s a common photo gag at vacation spots, which might be part of why the show got such a tepid reaction.

The next week, with just two commercial breaks, Gleason spent a half hour of airtime dissecting the terrible pilot, calling it the biggest bomb in entertainment history and explaining the creative process behind it. The half hour made such an impact that Johnny Carson brought it up on The Tonight Show when he interviewed Gleason in 1985. Though since Carson had been one of the contestants on You’re in the Picture, he probably only felt comfortable raising it because he’d finally gotten around to forgiving Gleason.

7. Lon Chaney as Frankenstein Doesn’t Even Try to Break Chairs

In 1952, Lon Chaney Jr. (son of the silent-film star who played the title role in The Phantom of the Opera, and himself best known as The Wolf Man) was hired to play Frankenstein’s Monster for a live broadcast of Tales of Tomorrow. Through some confusion, whether from fatigue after the hours it took to apply his makeup or from drunkenness, he thought much of the performance was a rehearsal rather than a live broadcast. This led to the hilarious sight of Chaney picking up chairs to smash them in a monstrous fury, only to gently set them down again.

Since at one point he was supposed to be leaving the set as he breaks a chair, he turns almost straight into the camera and in his regular, barely audible voice says “I’m saving the chair.” For the remainder of the program Chaney gave a perfectly competent and very physical performance, which put to rest the claims he’d been drunk. Still, it was hard for audiences to forget the Frankenstein Monster’s bewildering delicacy with furniture.

6. Dorothy Gray’s Cold Cream Campaign

You get some sense of how prevalent the fear of nuclear destruction was in 1950s America from the fact that “duck and cover,” as a means of attempting to survive an atomic strike, was taught in elementary schools. But then this commercial comes along and shows just how ambivalent feelings about radiation were at the time.

The Dorothy Gray cosmetics company had been founded in 1916, so it had been around 38 years and entered the cultural consciousness by the time it launched what would today be considered a truly shocking ad campaign, especially for a relatively benign skin treatment. In commercials that aired in 1954, to demonstrate the effectiveness of their cold cream, they spread radioactive dirt on the faces of their models, ran a Geiger counter over them, applied the cold cream, then used the Geiger counter again to show how effectively it removed the dangerous substance, all of it cheerfully explained by the narrator. Considering that the company lasted until it was bought out in 2008, it seems a safe bet that the models never sued them for all they were worth for reckless endangerment.

5. Andy Griffith Explains Gunless Law Enforcement

Among right-wing pundits, this sitcom that ran from 1960 to 1968 is particularly treasured for presenting a wholesome portrait of small town values. Even 50 years after it ended, the town of Mount Airy is kept afloat by tourism because it was Andy Griffith’s real hometown and preserved numerous locations that inspired stories set in the fictional town of Mayberry. This makes a monologue that Sheriff Andy Taylor gives particularly surprising.

In the 1965 episode “TV or Not TV,” Andy Taylor is asked why he doesn’t carry a gun. He answers, “When a man carries a gun all the time, the respect he thinks he’s gettin’ might really be fear. So I don’t carry a gun because I don’t want the people of Mayberry to fear a gun. I’d rather they would respect me.” It is, at the least, a critique of the notion that guns are necessary to keep the peace, and of the militarization of the police. This was also deep in the Civil Rights movement, when it might have been a more inflammatory statement than it seems today. True, his deputy Barney Fife carries a gun, but Fife was often shown as a combination of buffoonish and horribly dangerous with it, which seems like only a slightly subtler critique of gun enthusiasts than Griffith’s words.

4. I, Claudius Brings Graphic Violence and Nudity to Public Television

This 1976 12-part adaptation of Robert Graves’s epic story of the man who went from palace clown to emperor is still one of the most acclaimed BBC productions of all time. The obviously low budget and technical limitations (numerous cast members said early on that they didn’t expect the show to work) didn’t prevent audiences from appreciating the stellar performances and the riveting story. But it had a potentially much greater obstacle to mainstream appreciation, especially in America: a frank attitude toward graphic sex and violence, as could be expected of any show dealing with palace intrigue at the height of the Roman Empire.

When the Public Broadcasting Service agreed to air it in 1977, the political climate provided plenty that should have given them pause. ABC had recently weathered a concerted boycott campaign over the comedy series Soap, which had merely mentioned then-controversial topics such as transvestism, and the campaign cost the network considerable advertising revenue by the time the show ran its course. A publicly funded network was even more vulnerable to such pressures. And yet, beheadings, toplessness, and other provocative material and all, the series ran its course on the 270 local PBS affiliates.

3. Poor Devil: Sammy Davis Jr. is a Demon  

You might have heard that this member of the Rat Pack joined the Church of Satan for a few years beginning in 1968. Apparently he wasn’t just fine with people knowing but wanted to spread the word, because in 1972 he managed to sell NBC on a pilot in which he would play a demon who goes around convincing people to sell their souls to Satan (played by a perfectly cast Christopher Lee).

His primary target is played by Jack Klugman, then in the middle of his run as Oscar on The Odd Couple, which shows how much Klugman was putting on the line by attaching his name to a sitcom that portrayed a demon sympathetically. As surprising as it is that such a show got a green light in the 1970s and made it to air, it’s probably not so surprising that it never went to series. It probably didn’t help that instead of airing sensibly on Halloween, it premiered on Valentine’s Day.

2. Queen for a Day


The following program would sound like something out of an especially grim science fiction dystopia except that, as Stephanie Buck wrote for Timeline, it was a real program that ruled the airwaves from 1956 to 1964. It was a network game show hosted by television bit player Jack Bailey, though not so much a game in the Jeopardy! sense as in the Hunger Games sense.

The premise was that contestants would tell the audience their financial troubles in hopes of garnering enough sympathy to win, by written ballot, some prize from the show’s sponsors that would hopefully lift them out of poverty. These were not conventionally fabulous prizes: things like artificial legs, beauty school lessons, or a year’s supply of baby food. Each episode, several women flew out at their own expense, and only one of them went home with the prize. Even at the time it was well understood how emotionally exploitative this was, with the show and its knockoffs known as “misery shows” or “sob shows.”

1. Twilight Zone’s Pro-Child Marriage Episode


In the third-season episode “The Fugitive,” a girl named Jenny, who wears a leg brace and lives with her aunt in a small apartment, is best friends with an old man known as Old Ben. One day, after Ben and Jenny play a ball game with some friends, two agents show up looking for Ben. They use some kind of sci-fi/magic device to put Jenny into a coma. Ben then heals her, and heals her legs too (which involves removing her leg brace and socks), and when the agents catch up, it’s revealed that he is actually a runaway king they have come to bring back to his planet, which will mean he has to abandon Jenny.

After the agents give them a moment alone to say goodbye, they return to find that Ben has turned himself into a double of Jenny, meaning they have to bring both of them back to their planet. Serling then delivers the outro while sitting on Jenny’s bed, telling us that Jenny will become “an honest to goodness queen” and showing us a photo of what Ben supposedly really looks like: a handsome young man. It’s literally a happy ending in which a girl is taken from her home to marry a much older man who has known her since she was maybe eight years old.

You might think that, since this was way back in 1962, it might have just been the product of a more innocent time. But the age of consent had been a public controversy since the late 19th century and was still contentious in 1939, when the exploitation film Child Bride was released, so the episode was clearly made at a time when such portrayals were already decades out of fashion. It’s bewildering that censors, producers, and network executives all failed to see what a creatively blinkered episode this was.
