Director Ridley Scott is known for some of Hollywood’s biggest hits: Alien (1979), Thelma and Louise (1991), and Gladiator (2000) among them. But his name will forever be associated with one of the all-time great movies, Blade Runner (1982), starring a young Harrison Ford.
The cult classic explores themes that are even more relevant today than when it was released: the nature of humanity, the ethics of artificial intelligence, and the impact of environmental degradation. Above all, Scott’s musings and predictions on the development of artificial intelligence, which awed us as science fiction more than 40 years ago, are front and center in our concerns today.
In his dark, dystopian vision of the future (the year 2019 in the movie), he also predicted some aspects of today’s technology, such as biometric scanners, voice control, photo editing and video calls. In the movie, AI is used to create human-like robots called replicants, who rebel against their creators and try to escape. Does this sound familiar? It should, since one of the most discussed topics today among tech experts is the notion of “singularity,” the point at which artificial intelligence becomes self-directed and therefore a threat to humanity.
Blade Runner is a classic example of retro-futurism, where the future looks like the past with some twists. In this case the twists are terrifying, but unfortunately, also accurately predictive.
In an interview with Rolling Stone promoting his film “Napoleon,” Scott was asked if artificial intelligence worried him; he answered that it terrifies him. Not mincing words, he called AI a “technological hydrogen bomb.”
“We have to lock down AI. And I don’t know how you’re gonna lock it down,” he told the outlet. “They have these discussions in the government, ‘How are we gonna lock down AI?’ Are you f—ing kidding? You’re never gonna lock it down. Once it’s out, it’s out.”
Indirectly pinpointing the dangers of singularity, Scott explained: “If I’m designing AI, I’m going to design a computer whose first job is to design another computer that’s cleverer than the first one. And when they get together, then you’re in trouble, because then it can take over the whole electrical-monetary system in the world and switch it off. That’s your first disaster. It’s a technical hydrogen bomb. Think about what that would mean.”
Referring back to ideas explored in Blade Runner, the 85-year-old added, “I always thought the world would end up being run by two corporations, and I think we’re headed in that direction. Tyrell Corp in ‘Blade Runner’ probably owned 45-50% of the world, and one of his playthings was creating replication through DNA. Tyrell [played by Joe Turkel] thinks he’s god…”
In what has been called one of the most chilling scenes in science fiction, Tyrell-as-technological-god has a fatal showdown with Roy Batty, the super-replicant who wants his short life extended through technology.
Scott is also deeply worried about how AI will change the arts and, specifically, the movie industry. Indeed, this was a major sticking point in the recent WGA strike negotiations, as writers demanded that AI not replace human jobs or reduce wages, while the studios argued that AI can improve productivity and efficiency.
Scott has unequivocal beliefs on the subject: “They really have to not allow this, and I don’t know how you can control it,” he said.
Nor does he have any faith that AI could bring any improvements to creativity. He added, “There’s something non-creative about data. You’re gonna get a painting created by a computer, but I like to believe – and I’m saying this without confidence – it won’t work with anything particularly special that requires emotion or soul. With that said, I’m still worried about it.”