DYSON: Baby, this thing is going to blow 'em all away. It's a neural-net process --
TARISSA: I know. You told me. It's a neural-net processor. It thinks and learns like we do.
A few months ago, I argued that 1994 was a pivotal year when all kinds of current trends began. I don’t think I was wrong, but when I talked about movies it was hard to ignore the elephant in the room, because in my view the greatest film of the 1990s came out three years earlier. It was years ahead of its time and still looks brilliant, and more importantly I think it perfectly bridges two forms of paranoia. Come with me if you want to live …
Sitting where we are in 2025, Terminator 2 opening in a war-ravaged “Los Angeles, 2029 A.D.” maybe no longer seems as funny as it once did. There’s a fun game you can play looking back at future movie dystopias: Escape from New York happened in 1997, during the second Clinton administration, Blade Runner took place towards the end of Trump’s first term, and so on. But although the whole franchise plays with changing timelines, T2 is not a film about 2029; it’s very firmly rooted in 1991.
I’m not just talking about the mullets and the videogames, or Axl Rose on the soundtrack, though of course those are a factor. But in the very same month Terminator 2 was released, Bush and Gorbachev signed the START I nuclear weapons reduction treaty. By that stage we were already a little past the peak, but as a species we still had around 60,000 nuclear warheads pointed at one another.
For all that people talk about the effect of climate doom and virus paranoia now, it’s hard to over-emphasise the psychological effect that this nuclear Sword of Damocles had. You didn’t have to be a particularly over-imaginative child growing up in the 1980s to have some version of Sarah Connor’s dream one night or another, and we didn’t even know at the time about the Petrov Incident or Able Archer. In that sense, the idea of the world ending in nuclear flames was all too familiar, even if we’d never seen it filmed as well as this.
And of course, you can’t talk about Terminator 2 without talking about the effects. Watching it now in 2025, it still looks brilliant. We’re talking about a 34-year-old film - so it’s as old now as 12 Angry Men or 3:10 to Yuma were when it was released - and yet the effects don’t seem dated. Stan Winston and his team played an absolute blinder.
I think part of it was knowing when to use computers and when not to. Of course some of the liquid metal effects could only have come out of 1990s processing technology, but equally, much of the work (and of scenes like the Sarah Connor dream sequence) was done with old-fashioned models and animatronics.
I haven’t even mentioned the human actors. Schwarzenegger plays his role perfectly, always hovering right on the edge of self-parody. Linda Hamilton shows fantastic range, from placidly trying to fool the hospital right through to full-on ass-kicking mode. But in some ways the people don’t matter so much. Of course, to some extent it’s a film about family, about whether a robot can become a father figure (a theme taken to an extreme by Arnie’s turn as “Carl” the semi-retired Terminator in the not-very-good Terminator: Dark Fate).
However, I think there’s a second form of paranoia that feels much more up-to-date. It’s a film about humans and machines in opposition, where we’re at a huge disadvantage:
You are targeted for termination. The T-1000 will not stop until it completes its mission. Ever.
In that sense, while it was partly a film about the nuclear paranoia of the 1980s, it’s just as much about the AI paranoia of today. With constant talk of us approaching singularities and of our jobs being phased out by LLMs, Terminator 2 taps into our modern fears, more than thirty years ahead of the curve.
But I think what’s really interesting is how we get there. Because of a neat trick with the timelines that you shouldn’t think about too much¹, Skynet is developed by a guy working on neural networks in a regular kind of office building. He doesn’t seem so different to us, and I think that’s what’s really scary.
We’re used to the idea that bad things happen because of bad people. Take Mark Zuckerberg and Facebook. When he was building a way to rank girls in his Harvard dorm room, he probably didn’t foresee that it would one day be implicated in the Rohingya genocide, nor could he have been expected to. But on the other hand, if your corporate motto is to “move fast and break things”, you can’t suddenly act all innocent when things get broken.
Of course, the portrayal of Zuckerberg in The Social Network is fiction, but it’s fiction that rings very true, and bad outcomes are pretty much baked in from the start. As one of my favourite Substacks, The Metropolitan, puts it:
Sorkin proposes that Mark’s idea for Facebook is driven by a misogynist pathology, one that sees young women as fleshy symbols of male status, and infuriating sexual gatekeepers. Freshly dumped and furious, Mark spots the Web’s compulsive potential for widening sexual opportunities and exploiting sexual longing; he literally encodes the denaturing of human relationships.
Sorkin’s Zuckerberg does the things that he does because he only cares about status, about money, about not being invited to the right Harvard clubs. He wants to win more than he wants to have actual friends.
Miles Dyson, the inadvertent architect of the Terminators, is not like that. He’s a family guy. He wants to make the world better and safer:
Imagine a jetliner with a pilot that never makes a mistake, never gets tired, never shows up to work with a hangover. Meet the pilot.
The absolutely pivotal scene of the movie is the one where Sarah Connor shows up at Dyson’s house ready to kill him. She knows that Dyson’s work leads to the creation of Skynet, to Judgment Day, to the deaths of billions of people. She knows that killing him can stop all this - and yet when she gets inside and sees him and his family, she can’t do it. That’s the difference between human and machine. The T-1000 can never be stopped or diverted, but for all her planning and preparation, Connor is different.
So if there’s a deep message from Terminator 2 in 2025, I think it’s this. We could all be Miles Dyson. None of us get a handy moment of clarity where an enormous Austrian robot shows up at night like something out of A Christmas Carol:
Dyson listened while the Terminator laid it all down. Skynet. Judgment Day... the history of things to come. It's not every day you find out you're responsible for 3 billion deaths. He took it pretty well, considering...
It’s really up to us. We all have to navigate the world and make the best choices we can, without ever knowing for sure where the things we do will lead. Maybe by not acting like Sorkin’s Zuckerberg we can improve the odds, but because we don’t have a convenient time loop to tell us how things turn out, we all just have to do our best.
And if that isn’t at least as scary as an implacable liquid metal cyborg on our trail, then I don’t know what is.
¹ If Skynet is developed using technology from the original bad 2029 Terminator, how was the original bad Terminator developed if Skynet was never built? I can’t decide if this is a problem or not. But it’s a good example of T2 being more interesting than your average popcorn movie.