Tuesday, October 18, 2016

Don't Tech People Ever Encounter Dystopian Fiction?

The drones from Iron Man.

Often when I read the news these days, I think to myself: don't these people ever encounter dystopian futuristic books and movies?

To me, it seems like modern narratives are full of very plausible depictions of the awful and disastrous consequences of creating and adopting new technologies that are of dubious usefulness in the first place. Don't the people creating these technologies ever think to themselves, "Wow, I'm like the inadvertently evil person in a futuristic disaster movie"?


One obvious example is the Internet of Things. I'm not even a huge sci-fi fan, but even I know that many of the classics depict objects turning, or being turned, against us. It's not in the least far-fetched. In fact, just recently a successful DDoS attack was executed by a bunch of "innocuous things like digital video recorders and security cameras."

When I first read that, I felt like, "Well, duh." This is what novelists and artists have been telling us for years. Isn't one of the classic sci-fi moments when Dave says "Open the pod bay doors, HAL," and HAL says "I'm sorry, Dave. I'm afraid I can't do that"? Didn't Philip K. Dick write about a door that wouldn't open unless you paid it? Aren't these the logical extensions of having your fridge or your front door connected to the internet, and, by obvious extension, to mega-corporations and the NSA?

I don't even get why people want the Internet of Things. What's so tough about making a note to buy milk? If you forget and run out one day, it's not the end of the world. What's inadequate about the existing concept of, say, a key to get into your home? The electric grid is fragile from years of neglect. One good shot could knock out a communications satellite. If the power is out, do you really want to be unable to get into your own home? I picture the poor befuddled people of the future, thinking, "If only there were some simple technology where you could fashion a device, maybe out of metal, and it would just ... open the door." Sad!

I thought the same thing when I read about how Facebook wants to help banks evaluate your creditworthiness by looking at the creditworthiness of your friends. For fuck's sake, people. Isn't this well-worn territory? In just the latest incarnation I happen to know of, Gary Shteyngart's Super Sad True Love Story describes a system where people's scores are constantly broadcast, so everyone knows exactly how you stack up money-wise and prestige-wise. I'd tell you more about Super Sad True Love Story, but the truth is I haven't read it, partly because it seems too insanely depressing and I have other things to worry about.

And what about new and improved facial recognition technology? The dystopian possibilities of 100 percent surveillance are well-explored, and yet we keep marching forward. The always great MathBabe says that a new company headed by "two 20-something Russian tech dudes" is producing software that's pretty good at it. Faced with the obvious ethical questions, their response is along the lines of "It's too late to worry; we can distinguish the good guys from the bad guys; Luddites gonna be Luddites."

Finally, I'm sure you've read about Amazon testing drone delivery out in the back-wilds of the UK (and, I now learn, in Canada!). Drones? To bring consumer crap to your house? Don't these people go to the movies? You'd think the Iron Man franchise was some kind of indie cult film you could only get on Blu-ray.

So: what is the deal? Is it that the powers of capitalism are so intense that people forge ahead knowing that it will all end in tears? Is it some kind of cognitive bias for optimism, where people just think "this time it will be different"?

The popularity and style of modern dystopian narratives almost suggest to me a much darker and creepier possibility: that there is a desire for dystopia, a yearning for a crisis that will throw us out of our current state of moral complexity, our compromised ways of living, and our boredom. The problems of modern life are so complicated and unglamorous. It's hard to do a good thing without worrying that you're also doing bad. Solutions to problems like the refugee crisis, systemic injustice, and climate change are going to require hard thinking and dealing with laws, education, and bureaucracy.

Are people secretly longing for a new situation, one where some of us are heroes and some of us are vulture food? Where instead of dealing with difficult problems that we don't know how to solve, we'll be in a more Mad Max situation, where it's like "Weakness = bad! Protecting daughter by killing guy = good!"?

I don't know. But whenever I go along with this train of thought, I always end up in the same place. Should I give up this whole "philosophy professor" biz and learn how to repair low-tech kitchen appliances?

2 comments:

Vance Ricks said...

This is only partly a serious response to your serious questions.
I think that folks like danah boyd and Evgeny Morozov, to take two of the usual suspects, have regularly noted that various strains of Randianism/libertarianism have found extremely hospitable hosts in "the" tech communities in the US, especially those on the west coast. Maybe that would help to explain the kind of blitheness you're noticing (i.e., the sense that no matter what, "I" will be fine)? It certainly shows up regularly in discussions about surveillance and privacy, in the "well, *I* don't have anything to hide, and I have nothing to worry about by showing my ass to all and sundry, so, what's the problem?" responses that infest those discussions.
As I said, all of that is only partly serious.

Janet Vickers said...

It's research for when Homo sapiens has become extinct and the anthro-hyena can write papers on this phenomenon.