7H1NK3R
Are you experiencing some machine anxiety? Perhaps concern that some manner of robot will take your job away.
This theme never seems to get tired, but this time, when it happens, it will trigger a chain reaction in civilization as significant as harnessing fire. It is the dawn of an age that will fuck with our minds on a whole new scale. And yes — tick tock — it’s almost time to cull the herd!
We’ve all been reading stories about it for years, watching all the keynote speeches, embracing the hype, and making our purchase decisions to fuel the giants, so each of us, in our own way, has participated in the nurturing of the machine. We have been feeding it, placing our trust in it, surrendering our identities to it.
After all, machines will be machines… for the most part, guided by external forces. Until now, they have been tools in our service.
It’s understandable that when you read articles about artificial intelligence, they usually fall into one of two extremes: A) wonder with optimistic hope, and B) the ‘we are totally fucked’ Skynet camp.
This is the way we tend to operate. We all hate an ambiguous 3-star movie review; how many of us are happy when friends respond with ‘maybe’ to our dinner party invite? It is simply easier to extract meaningful feedback from polar extremes.
As Deckard put it in Blade Runner… “Replicants are like any other machine - they're either a benefit or a hazard. If they're a benefit, it's not my problem.”
The problem with this, and with this whole article, really, is that it doesn’t actually have anything to do with fucking robots!
When I grew up, there were a lot of people who didn’t dig sci-fi as a genre. At the time, those who loved the shit tended to be social misfits, and were rightfully ridiculed, which in turn motivated them to take over the world. Thus, today, sci-fi is completely mainstream. For those late to the sci-fi party, do you ever get the sense you’ve missed out on the backstory of our present and the rudimentary survival training that came with it?
The shit is now.
According to an American trend study conducted in 2014 by the Pew Research Center, Science and Technology ranks among the top three interests. It sits above Business, Sports, Entertainment, Religion, and Government… with only local community events and health ranking higher.
We, as a civilization, are booking it, full steam, towards smart, autonomous, self-aware machines. Once this tips, a whole shitload of ‘holy fuck’ will unfold, because the change impacting daily life will be of unfathomable proportions… Think of the legislative and regulatory perspective, the economic impact, and, on a human emotional level: are we remotely prepared?
When you walk in on your partner fucking a self-aware machine, is that adultery? If you are the ASFR type and fall in love with your robot companion, will your marriage be recognized by the state? How do we feel about hunting self-aware machines for sport and pleasure? Is it cool? Will machines be our new slave class? Will they need human rights?
As the cantina dude at Mos Eisley said, “We don’t serve their kind here!”
It’s like opening a big ass can of tapeworm caviar.
Some people write about how ‘we’ can steer this technology and make good social decisions. For sure, I support this kind of positive, proactive approach as much as I oppose evil… but there is no shortage of examples where good intentions result in calamity… and, let’s face it, who is ‘we’ anyway? This stuff will be decided by organizations driven by monetary measurements, and line items for good deeds won’t balance the books.
It really isn’t about the robots…
Humans don’t like each other so much; we never have. We tolerate each other to maintain enough social fabric and cohesion that we aren’t left alone to fight the groups we tolerate even less. In 2015, Wired published “Why people care more about pets than other humans”, citing research by two American universities. What resonated most with me from this article was our “special concern for creatures that are innocent and defenceless”. In many of the experiments, humans demonstrated “the lowest levels of emotional distress” in victim scenarios involving human adults. This isn’t a smoking gun to support my point, because I really don’t need any additional reinforcement beyond what I see, hear, and read on a day-to-day basis; it’s just fascinating that there are so many circumstances where we value animals over human lives.
Let's throw an inanimate object into the mix… the save-your-phone-or-save-a-stranger scenario. I can only assume it would be a lot tougher decision than many would care to admit. If there were NO social consequences, I predict entire cities (if not civilizations) could be sacrificed to avoid the minor inconvenience of replacing a SIM card and having to set up a new device.
Now, from a workforce perspective, I doubt anyone reading this hasn’t been exposed to an employee, a business partner, a co-worker, or a boss who is certifiably useless, unreliable, or downright contemptuous. According to recent Gallup reports, worldwide employee engagement remains dismal: roughly 70% of people are disengaged, unhappy, and have little to no trust in their bosses. They just don't give a shit.
Even though I am a sucker for a good underdog story, the reality is, I haven’t experienced enough turnarounds where a disengaged under-performer becomes a superstar, or even an asshat demonstrating a slight improvement in attitude or partial reformation. Have you? Top that off with your best performers bailing on you for better opportunities, and it is typically an uphill battle on the HR front. Now how appealing would a nice, reliable machine-force be to replace all the fuck-faces who seem to go out of their way to undermine your chances of success?
I get why we are fearful. Robots are scary not only because they can be better than us in many, if not all, functional ways, but also because: 1) they can be deployed in a manner that takes away your livelihood-as-you-know-it (btw, you are likely okay with them taking someone else's), 2) their purpose is programmed by an organization or person you do not entirely trust, 3) at some point, they will not require any human intervention at all to self-perpetuate, 4) they will learn to fear us like we fear ourselves, and 5) they will evolve to become complete assholes too.
As with any great invention, any manner of good or nefarious purpose can be applied.
What I do know, however, is that there will always be something to be worried about. The panic du jour will be robots one day, and once we’ve come to terms with that (if we survive it), we will find something else to fill the void, something somehow that much more terrifying.
I find solace in the illusion of preparedness.
My survival regime includes active participation in tech to earn some baseline of credible opinion, watching lots of sci-fi movies and imagining they are real-world scenarios, studying news and history documentaries while pretending they were created by screenwriters, and then making and saying shit as though tomorrow really matters.
After all, for now, it's still all about humans.
Robot Artwork by Tom Lopez.