We are ruled by sociopaths

Here's a tip: Nobody wants to receive a birthday message from a fake you.

I don't intend this site to be a one-note "A.I." bash-fest, I really don't, but every damn day there's some new story that strongly hints that the people scrambling to invent uses for "A.I." are sociopathic freaks who love the current incarnations of "artificial intelligence" because they have so damn little intelligence that the computer version, spitting out pictures of three-kneed farmers and blindly regurgitating whatever the programming teams managed to slurp in from Reddit and StackExchange, looks positively brilliant to them.

Apple, Microsoft and Google are heralding a new era of what they describe as artificially intelligent smartphones and computers. The devices, they say, will automate tasks like editing photos and wishing a friend a happy birthday.

But to make that work, these companies need something from you: more data.

In this new paradigm, your Windows computer will take a screenshot of everything you do every few seconds. An iPhone will stitch together information across many apps you use. And an Android phone can listen to a call in real time to alert you to a scam.

Oh my God stop, stop, stop. First off, all of it sounds more like dystopian nightmare than anything else—and I cannot imagine how much of the remaining Amazon rain forest will be going bye-bye while Microsoft attempts to piece together what your life looks like by taking screenshots and running the screenshots through the bowels of an A.I. engine, which is about the most inefficient means of reconstructing user actions short of having interns build popsicle-stick representations of whatever they see on your screen and then burning their dioramas to analyze the smoke—but second of all I can't get over just how petty most of the supposed consumer uses of "A.I." sound.

First there was the Bumble executive who proposed that the future of dating consisted of everyone making fake AI personas of themselves and having the robot personas—sorry, your "dating concierges"—pretend to date each other and report back to you whether they fell in love. But I do want to meet whatever LinkedIn-huffing executive was willing to pipe up with "You know what consumers need? Consumers need an AI to wish their friends happy birthday for them so that they don't have to bother. Consumers want to send a fake-ass personal note to a supposed friend telling them they're supposedly thinking about them, but don't want to actually write a note or think about them because that's too much work and it would make people late for the only jobs that will be available in the future, hosing human intestines off the propeller blades of Amazon delivery drones that didn't land where they were supposed to."

You might think that this is just a made-up case, but it's not. If you go on Google right now, you will find a full damn page of companies offering up A.I.-generated birthday messages so that you, a vapid and emotionless social barnacle feigning friendship with other people, don't have to engage in those interactions yourself.

I don't think this is the fault of A.I., I really don't. I think this is the fault of LinkedIn brain. There's a particular sort of person who wants to become a Silicon Valley "entrepreneur," and that person can best be described as "ambitious sociopath who has little to no idea of how actual human interactions work." In their minds, we need A.I. to save us from a future in which human beings are writing poetry or painting pictures or doing absolutely anything at all that might bring a sense of fulfillment. We need A.I. to make fake versions of ourselves that look better in faked pictures of ourselves, faked pictures we send out using our "dating concierges" so that our fake selves can go on dates with other people's fake selves like a bizarre reality dating show that exists only inside the confines of one unknown datacenter.

The Silicon Valley version of "A.I." doesn't envision the technology as a way to supercharge scientific problem-solving or hash out trends or patterns that might be too nebulous for human observers to easily find. They see "A.I." as a way to free consumers from having to be human. If everyone could just talk to their friends or dates or family members using "A.I.", if those relationships themselves could be automated away, it would free up so much time that consumers could otherwise use to work more hours or consume more Content.

This is what I mean by LinkedIn brain, this omnipresent corporate notion that human beings themselves are utterly interchangeable cogs who only have societal use if they can be freed of their human traits and turned into good little automatons. And I'm convinced that much of the current chatbot fanclubbing comes from top-level managerial classes who are so socially incompetent, narcissistic, and sociopathic that they've never had a goddamn real relationship with anybody, so the idea of editing the emotions and humanity out of everyone else sounds like a utopian ideal on par with the colonization of Mars.
