Revolutionary inventions that disappeared for dumb reasons

From time to time, everyone has a revolutionary idea, one that could change the world. But then something interrupts our thought process: it's sweeps week, the kids are late for school, the boss is harping at us to get off the stupid phone. And thanks to one stupid thing, it totally face-plants.

The Apple Newton falls on its head

Before the dawn of cheap personal computers, no one in their right mind believed humans would be walking around with Star Trekian PADDs in 20 years. Yet tablets—the bastard children of PDAs (Personal Digital Assistants, not public displays of affection)—not only cost a fraction of what yuppie fanny-packers once paid for a Palm Pilot, but have become nearly indispensable to contemporary culture, used in millions of homes and restaurants, and by upwardly mobile kitty cats. Nowadays, having a tablet amounts to breathing or wearing underpants (we are wearing underpants, right? Wait. Don't answer that).

In 1993, nearly a decade before flat screen tablets ran amok in the mass market, the innovators at Apple gave the world the Newton MessagePad. Hardwired for wireless (with infrared technology) and packing a whopping 4 megabytes of memory, the Newton ran on its own operating system, one that informed the iPhone's developers about what not to do. It also used an early touchscreen with a stylus to transfer handwritten notes directly into useful data, scheduling appointments and reminding you you're better off with an actual friggin' notebook.

And therein lay the problem: For a system marketed for its e-notepad applications, the handwriting recognition software was horrific. Garry Trudeau's Doonesbury was the first to jump on the scrawl-wagon, taking a week-long potshot at the PDA. During the strip, the titular character tries time and again to get the handwriting software to work, querying the device with "Catching on?" which it promptly converts to: "Egg Freckles." The satirical misnomer caught fire in the mass media, and the once-renowned company became a laughingstock.

Later models improved upon the scribble decoder, but the damage was done. It also didn't help that the project was championed by Steve Jobs's old rival, Apple CEO John Sculley. After Sculley left the company, it was only a matter of time before the Newton went the way of the dinosaurs. Yet, from the ashes of Apple's first PDA rose the iPad. Who's laughing now, huh?

The EV1 jumps the e-vehicle gun

Even beautifully conceptualized cars, whether boasting state-of-the-art advancements like the safety windshields on the short-lived Kaiser or revolutionary designs like the Studebaker's, often hit a wall of stupid—hopefully with crash test dummies and those newfangled seat belts installed.

The fully electric vehicle was one such dream. The idea had been muttered about in ecological circles and stuck in development for-freaking-ever before General Motors finally unveiled the first workable electric car, the EV1, in 1996. A stylish, zero-emission two-seater, the EV1 ran for roughly 100 miles between charges, which wasn't bad given the capabilities of the era. True, there were a few complaints about its awkward steering (due to extra heft in the rear) and its quickly discharging battery. Other than that, the car had a lot of potential, making it all the more curious that the manufacturer chose to scrap it.

When GM ended the EV1's run in 2003, it cited a lack of trained mechanics to safely repair the battery, the car's limited range (although that had improved over time), its inadequacy as a family car given the two-seat layout, and most of all, the cost of production. It also didn't help that hybrid gas-electric vehicles—the key word being gas—were pouring onto the market, more than enough fumes for a hint of conspiracy.

Of the more than 1,000 constructed, only about 40 EV1s survived, thanks to GM's affinity for obliterating nearly every last vehicle. The company's peculiar behavior, along with many lessees mourning the death of the zero-emission-mobile, prompted filmmaker Chris Paine to make his 2006 documentary Who Killed the Electric Car? Sadly, it seems the EV1 was a little too far ahead of its time to succeed.

Intellivision gets played…by itself

Everyone had that one friend whose parents either sat a little prettier in the bank account department or were simply willing to shell out beaucoup bucks on frivolous crap for their children's material enrichment. As a result, they always got the latest toys, designer clothes, and state-of-the-art video game systems. For those of us growing up in the early '80s, this meant playing the short-lived but surprisingly advanced Mattel Intellivision (Mattel wasn't just Hot Wheels and He-Man).

"Intelligent television" pulled out all the stops for its 1979 release, with the toymaker top-loading the game station with a whopping (for the time) 1456 bytes of RAM, a 2 megahertz processor, and 16-bit graphics way before Nintendo came on the scene. Users could even "download" games via the PlayCable service—sort of an early Xbox Live. Gamers stoked for Advanced Dungeons and Dragons or Q*bert and his filthy #@*$ mouth could then kick back and play their brand spankin' new game on a 12-button numeric keypad controller or an optional keyboard.

Intellivision was cursed with several flaws, most notably its highly non-ergonomic controllers with their unintuitive, frustrate-you-until-you-throw-it-against-the-wall button set. The system also suffered from a hastily vetted design, which, although updated, took too long to catch up with its competitors' hardware. The Intellivision III was a perfect example of too-little-too-late and failed to wow '80s gamers. It also didn't help that personal computers dropped drastically in price. And as the video game market plummeted, it pulled Intellivision down with it.

Despite its shortcomings, the system still influenced countless home gaming devices and enjoys a cult status among nostalgia gamers and cranky old guys who hate new stuff.

LaserDisc barely scratches the video tape market

The LaserDisc is the ultimate example of a would-be revolutionary format. Initially developed by MCA and Philips, the format made its US debut with Jaws in 1978 (dah-dum). The 12-inch silver platters offered superior resolution to standard VHS tapes and were initially much cheaper to manufacture. The medium also allowed for precise playback options, giving viewers crisp fast-forward and fast-reverse, fairly seamless pauses, and watchable slow-mo—likely leading to a conversation or two starting with "son, why does my Basic Instinct disc choke up when Sharon Stone is about to be interrogated?"

LaserDisc did have some serious drawbacks. For one thing, it cost a crapload just to buy the player, which first entered the market at around $800 (compared to $200 VCRs at the time), with higher-end models running into the thousands (although prices eventually fell to a couple hundred dollars). By the mid-'80s, later variations even rivaled modern DVDs in storage size, but most movies still forced viewers to flip the disc at least once during playback, if not swap out discs entirely. Early LDs also suffered from "laser rot," a condition that chewed up the reflective aluminum layer, causing skips and lags. It also didn't help that the discs were bulky, fragile, and looked like shiny vinyl records—so you know someone's farsighted grandpa accidentally popped one on the hi-fi, scratching someone's rare Ninja Scroll import all to heck.

The main problem with LaserDiscs was that US distributors never really embraced them, limiting the available titles. Most importantly, they couldn't record anything, so your creepy uncle couldn't crib skin flicks from Cinemax or Showtime the way he could with a VHS tape. Cinephiles and import collectors (the discs were better supported abroad) were about the only holdouts for LaserDiscs, while everyone else just said "meh."

The flight of the Concorde underwhelms

The Concorde was born during an era of flight-based optimism, when rockets were headed to the Moon and everyone felt a little groovier about supersonic flight. A product of a joint venture between Britain and France, the mach-capable commercial airliner first took to the skies in 1969, before taking on its first paying passengers in 1976. Throughout its roughly three-decade career, the Concorde was one of only two commercial airliners ever to carry passengers at supersonic speeds.

The final flight of the Concorde took place in 2003, the majestic bird's downfall arriving for several reasons. The biggest hitch: catching a lift on the sound barrier-breaking jet ran about $7,000, with round trips a bargain rate at 10 grand (house down payment or fast puddle hop … hmm?). Reaching those near-1,400 mph cruising speeds also caused wall-rattling sonic booms, which handily succeeded at pissing off every neighbor for at least a hundred miles. The Concorde was also highly inefficient, sucking down over 100 tons of fuel for a single flight between London and New York (versus the massive 777, which burns about half that per transatlantic flight). The final nails in the Concorde's coffin were a 2000 crash that killed everyone aboard and new fears of flying after 9/11, which combined to sap what little demand the plane had.

There's a light at the end of Concorde's tunnel, though. In 2015, Club Concorde purchased a working museum piece with plans to restore it and potentially relaunch it in 2019—50 years after the Concorde first took flight.

Going off the rails on a maglev train

Admittedly, it's difficult to imagine a single-railed train without humming the "Monorail" song from The Simpsons. Perhaps the long-running cartoon show and its fast-talking Phil Hartman-portrayed huckster wrecked maglev trains (trains levitated and propelled by magnets along an elevated guideway) for Americans. Perhaps not. The prospect of high-speed trains in the States' car culture seems about as likely as personal helicopters.

Unlike the US, some parts of the world have embraced the maglev. Cleaner, quieter, and more efficient, it truly is a superior form of railroading. High-speed rail systems in Europe and Asia routinely run at around 200 mph, and a Japanese maglev test train recently topped out at nearly 400 mph.

High-speed rail is a political sore spot in the States, though, with car culture, as well as sheer geographical distance, standing in the way of any larger-scale investment in maglev trains. America also has a longtime love-hate relationship with trains, and even if the US never embraces high-speed rail, the technology could still gain traction in developing nations.

Sadly, maglev lags far behind its own potential across the globe. However, repulsive transit might still have a lot of life left in it, even in the United States, where several urban areas are investigating the possibilities.

Imaginary pirates killed the DAT

Wedged between that dusty KC and the Sunshine Band 8-track in grandma's garage and the Moana soundtrack your kids forced you to download (on pain of tone-deaf singing) is the missing link: the digital audio tape.

First developed in the mid-'80s, digital audio tapes (or DATs) were a compact, high-fidelity alternative to the muddiness and single-headed, tape-chewing goodness of analog cassettes (because who doesn't love that regurgitated-by-chipmunks sound). As CDs were poised to take over for vinyl, DATs sat on the bench waiting for a shot at their analog predecessors. With longer recording times, digital quality, and the ability to erase and re-record—something CDs weren't capable of at the time—DATs offered a smaller package without the chipping, scratching, and shattering problems of similar formats of the era.

But there was one major hitch: The Recording Industry Association of America, worried that bootleggers would have a field day flying their skull and crossbones all over digital tapes, fought against them. The RIAA eventually backed down once the government introduced anti-piracy regulations that limited DATs' copying capabilities, but as a result, the tapes never made much of an impact outside of a few saturated markets like Japan and the recording industry itself, which used the format heavily until everyone decided to auto-tune their albums on Pro Tools.

Sony finally dropped the format in 2005, although DATs are still floating around in concert bootlegger and anachronistic collector circles.

Cinerama displays its flaws across a wide screen

As television began to time-suck the world in the 1950s, the slightly flummoxed film industry crafted new and often silly gimmicks to coax audiences back into theater seats: Smell-O-Vision, 3-D, Emergo, and Sensurround, among others. One of these tricks wasn't so much a cheap ploy as an early, flawed stab at the widescreen format.

Cinerama—which, "coincidentally," is an anagram of American—was developed in the early '50s. The first film released in the format was the self-reflexive This Is Cinerama from 1952, which impressed audiences enough for a return ticket. Cinerama made use of a deeply curved movie screen and three synchronized projectors to create an immersive viewing experience, much like IMAX does today. There were, however, a few drawbacks.

For instance, the beleaguered projection crew was forced to run three reels at once, perfectly in sync. If one strip of film broke, a portion of the screen went temporarily black—probably startling the audience with fears of going partially blind. Plus, images tended to look warped to anyone not sitting around the "sweet spot" in the cinema. Cinerama filmmakers also dealt with their fair share of hassles working in the medium, since the curved, three-panel format made it nearly impossible to shoot close-ups. Actors talking to one another appeared to be looking past each other when projected on screen, prompting later directors to film only one actor at a time.

Later iterations used 70-millimeter anamorphic cameras to better effect, but by that point, Cinerama was already fading into obscurity. Today, aside from a handful of theaters around the world, the format has largely died off. Its influence did pave the way for IMAX and other widescreen processes, though.

Pay By Touch strikes a nerve

Apparently, fingerprints weren't just a fun way for Sherlock Holmes to flip the worker's salute to Scotland Yard time and time again. Biometrics—identification based on our own unique physical characteristics, like the annoying fingerprint time clock that only lets you punch in a third of the time—was at one point the wave of the future for easy, secure access to our credit cards, bank accounts, air travel, and more. What's that about the ethical issues of coding and classifying an entire populace by its biological traits? Shhh.

After landing several hundred million dollars in backing, the Pay By Touch company developed biometric point-of-sale systems—the place you get stuck behind the ancient woman with the checkbook—in the late '90s. Paying by fingerprint offered several advantages over the credit card terminals of the day. Not only would it be extremely difficult for sketchy people to access your bank account without slicing off your finger (yikes), but forgetting or losing your credit card would be a thing of the past, unless you lost the digit itself (yikes again). By the mid-2000s, Pay By Touch's POS system was poised to change the way we shop. But trouble was brewing for the company.

Before PBT could slap biometric scanners in supermarkets across the country, CEO John Rogers ran afoul of several lawsuits alleging domestic abuse, sexual harassment, workplace bullying, drug binges, and mismanagement of funds. The claims, the ensuing public relations nightmare, and an internal power struggle forced the company to fold in 2008. Since then, other companies have noodled around with biometrics, but the technology has yet to regain the prominence of the Pay By Touch era. Somewhere, George Orwell is letting out a sigh of relief.

AT&T's Sceptre fails royally

Remember WebTV? We're not surprised if you don't. It was a stripped-down way to browse the web and check email from the comfort of your own television. AT&T's Sceptre was an early forerunner of it, connecting to this "web" through the cable box or via telephone lines. Of course, the web wasn't really the web as we know it, but "videotex," an early interactive system with only one centralized source of information, as opposed to the Internet, which lets thousands of cookies and spyware apps converge on your phone.

Sceptre initially came as a kit, equipped with a central processing unit that connected to the television, an infrared keyboard, and a 1200 baud modem. Once connected to the videotex server—either Knight-Ridder's Viewtron or Los Angeles Times Gateway—the interactive box delivered news, e-commerce, airline schedules, program downloads, and even a chat feature ("do you hate modem noises? Me too!"), all for the low, low price of just $39.95 a month.

Naturally, there was a catch: The system, designed for the mildly technophobic, required would-be browsers to drop $600 for the Sceptre (marked down from $900), which is a lot of simoleons for a Luddite looking to break into the proto-online world. On top of that, the TV-based unit only functioned with one of the videotex services, so you'd be shelling out a lot just to get started, plus the $40 subscription charge each month and connection fees for a handful of services—all without experiencing any of the additional perks that came along with early personal computers, like delightful monochrome views and complex text-based operating systems. And AT&T wondered why the Sceptre wasn't raking in the dough.

Eventually, the telecom giant caught wind of this problem, expanding the service to people rocking PCs with relatively inexpensive software. Sadly, it was too late for the Sceptre and other videotex companies, as the service was all but dead in the water, with AT&T taking a bath to the tune of $100 million. Fortunately for people starved for a venue to complain about movies, conjure up conspiracy theories, and search for porn, the Internet was only a few short years away.