Hollywood has never been a safe place for smart science fiction. Since square-jawed actors first stepped into soundstage rockets, geeks have watched countless classic sci-fi novels beaten into 90-minute pulp, often filled with useless comic relief, bizarre romantic subplots featuring implausibly attractive research types, and a bold corruption of core scientific principles. Either that, or the movies just didn't make any sense—the 1984 David Lynch film adaptation of Frank Herbert's classic Dune was so incomprehensible to the uninitiated that some theaters passed out a glossary of the film's lingo to moviegoers.
Even among sci-fi scripts that weren't pulled from the pages of classic novels, the influence of Flash Gordon was hard to shake. The “What if?” thought-experiment premises that were the hallmark of great sci-fi gave way to decisions over how much clothing a Martian lady should or shouldn’t wear ("less" usually won). But failure, like most things, occurs in degrees. There was a time when big-budget science fiction wasn’t completely brainless. In fact, some of it was good, even brilliant, and a source of inspiration for generations of researchers. From 2001: A Space Odyssey to The Terminator, the promise (and the apocalyptic threat) of technology was sometimes presented with a surprising level of authority and thoughtfulness. The plotlines still veered toward pulpy melodrama, but by respecting the science, these films helped shape the years that followed.
That was then. Now, big-budget science fiction is drifting toward the event horizon, as Hollywood’s addiction to remakes, reboots and franchise building continues to suck the creativity out of an occasionally proud genre. And the explosion of comic book movies, while not completely to blame, has become part of the problem.
The Comic Movie Conundrum
Despite the rise of the so-called graphic novel over the past few decades—self-contained, morally complex comic series like Alan Moore’s Watchmen and Frank Miller’s The Dark Knight Returns—comic books are still widely regarded as kiddie stuff. And when X-Men and Spider-Man recently proved that film adaptations of popular comics were a relatively untapped box-office goldmine, they did so without substantially updating the science behind the superheroes.
Mutants are explained away with passing references to evolution—usually failing to address its glacial pace, or the fact that there are no shared powers or traits among these genetic deviants (each one of them is a new subspecies). In the first X-Men movie, Magneto's doomsday weapon, designed to turn regular people into mutants, is a supremely magical and unscientific plot device. In Spider-Man, the original radioactive spider that bites Peter Parker was replaced with a genetically engineered one. It’s no less goofy a premise, and once the villainous Green Goblin starts surfing around on a hoverboard designed for the military, we’re firmly in the funny pages.
In the sequels for both flicks, things got even zanier. In Spider-Man 2, a scientist creates a limitless power source, then casually builds a bunch of self-aware robot tentacles that interface directly with the spinal cord. Why? Because that power source gets hot, and someone has to hit the buttons on the perilously close control panel. This guy has the answer to all of our energy problems, and he chooses to give himself a few extra limbs.
There’s a reason to pick on these movies. It’s not that they’re claiming any level of realism. But superhero movies, with their flagrant, built-in disinterest in getting any aspect of science or technology right, have taken a bite out of science fiction’s market share. Not only do they appeal to much of the same demographic, but they vie for the same studio funding, and the same creative personnel.
A quick comparison: In 1982, the year’s major sci-fi releases included Blade Runner, E.T. the Extra-Terrestrial, Star Trek II: The Wrath of Khan, The Thing and Tron. In 2007, we saw Aliens vs. Predator: Requiem, 28 Weeks Later, I Am Legend, Fantastic Four: Rise of the Silver Surfer, The Invasion, Resident Evil: Extinction, Spider-Man 3 and Transformers. In this glut of sequels, remakes and comic, cartoon and video-game adaptations, the closest thing to an original production was I Am Legend, based on a classic novel that had already been made into multiple movies. Unfortunately, the science in that movie is on par with Optimus Prime’s magic, robot-killing heart, or the completely brushed-aside explanation of the Silver Surfer’s cosmic abilities. In all of these movies—particularly the ones based on comics—technology is used to leap sudden chasms in the plot, then shoved quickly out of sight. They get away with this because we no longer expect it to make sense. After all, it’s a comic ... or a video game, or a cartoon, or a live-action movie that feels like a cartoon. So it’s supposed to be stupid, right?
The Uncelebrated Smartness of Science
From the late 1960s through the '80s, there was something awe-inspiring about science. We were sending people into orbit and to the moon. Robots seemed perpetually on the cusp of serving (or destroying) humanity. The national buzz that came from winning the moon race took decades to fade. This, you could argue, was a golden age for Hollywood science fiction. Alien, Outland and Saturn 3 weren’t trying to predict anything specific, but they built feasible visions of a space-faring future. And whether technology was used for murdering tourists in Westworld, limiting population growth in Logan’s Run or discovering advanced underwater civilizations in The Abyss, even the most ridiculous premises were often informed by sober speculation.
The ‘90s were more of a silver age for science-based sci-fi. Concepts like virtual reality and cyberspace, explored in the previous decade’s cyberpunk novels, finally trudged onto the screen. The results were mostly embarrassing: The Lawnmower Man, Johnny Mnemonic. And while The Matrix may have been the decade’s biggest genre hit, none of the movie’s technology holds up. Minor tech trends, like nanotech and genetic engineering, popped up onscreen in Terminator 2 and Jurassic Park, but by the ‘90s, the marriage of action movies and sci-fi films had been consummated. For every 12 Monkeys or Strange Days, there was an Armageddon or RoboCop 3. Technology was providing excuses for bigger explosions—and, in the case of Armageddon, moon buggies that were equipped with machine guns. No reason was given for those guns. And by then, no one was asking.
Which brings us to the Bronze Age, the era of The Day After Tomorrow and The Island, and a nonstop run of comic movies. This summer, science fiction amounts to The Incredible Hulk’s muscle-producing gamma rays and Iron Man’s inexplicably powerful body armor. As some geeks strain to provide real-world correlations to Tony Stark’s invention, they’re missing the point: There’s nothing like it in real life, and there never will be. Repulsor beams sound cool, so they made it into the comic. A big bull's-eye target on your chest looks cool, so that’s how Iron Man is drawn. And wings, stabilizing fins and bulky fuel tanks are not as slick as a gleaming, form-fitting metallic suit, so never mind the lack of flight control or an even marginally plausible power source. This is a comic book, not rocket science. And now it’s a movie, so trust that Tony Stark is a genius, and let the special effects do their thing.
Without diving into an even more complex issue, interest in science is at a low point in this country. Gadget and robot-related news might score high marks online, but there’s a difference between reading a blog and getting a doctorate. Granted, the country is still looking for ways to reenergize science education, but ask yourself this: Will Iron Man inspire anybody to build a better exoskeleton? Is The Incredible Hulk another way to encourage p