The disappearing computer and what it disappears

Walt Mossberg’s last column, The Disappearing Computer, is well worth reading and has certainly given me much to mull over these past few days. 

I think his analysis and predictions regarding the direction technology will take in the short-to-medium term are rather spot-on. Much of what’s going to happen feels inevitable. That doesn’t mean I have to like it or be okay with it. And for the most part I don’t and am not.

Mossberg writes:

All of the major tech players, companies from other industries, and startups whose names we don’t know yet are working away on some or all of the new major building blocks of the future. They are: artificial intelligence / machine learning, augmented reality, virtual reality, robotics and drones, smart homes, self-driving cars, and digital health / wearables.

The first thing that struck me upon reading this list was how little I’m interested in those things (save, perhaps, for wearable health monitors). Personal preferences aside, though, I can see how some of those fields could provide some degree of usefulness to people. What worries me, generally speaking, is the price we’ll have to pay in the process of bringing such technologies into the mainstream. Another thing that worries me is the state of our planet, and none of those technologies strikes me as Earth-saving. There are days when I feel particularly bitter, and the world looks like the Titanic, with technology as the orchestra that keeps playing sweet music in our ears while we all sink.

Whenever I voice my concerns about where technology is driving us, I am mistaken for a Luddite or a technophobe. I’m not. I simply refuse to drink the Silicon Valley Kool-Aid. Artificial intelligence and machine learning are great concepts with many useful applications, but to generate meaningful output, to produce a response that mimics intelligence, a machine needs a huge amount of data. A machine isn’t intelligent; it’s merely erudite. A lot of data is collected without enough transparency. A lot of data about us is collected without our explicit consent. A certain amount of personal data we tacitly give away ourselves, in exchange for some flavour of convenience. My biggest concern is that so much data is being collected by a few entities, a few ‘tech giants’, which are private corporations with little to no external oversight. And despite their public narratives, I seriously doubt their goal is to advance humanity and make our lives better in a disinterested fashion.

Mossberg again:

I expect that one end result of all this work will be that the technology, the computer inside all these things, will fade into the background. In some cases, it may entirely disappear, waiting to be activated by a voice command, a person entering the room, a change in blood chemistry, a shift in temperature, a motion. Maybe even just a thought.

Your whole home, office and car will be packed with these waiting computers and sensors. But they won’t be in your way, or perhaps even distinguishable as tech devices.

This is ambient computing, the transformation of the environment all around us with intelligence and capabilities that don’t seem to be there at all.

On the surface, this is all great and exciting. On the other hand, I don’t want technology to be too far out of the way. In other words, while I think it’s cool that tech becomes more ‘invisible’, I don’t want it to also become more opaque. I don’t want devices I can’t configure. I don’t want impenetrable black boxes in my daily life, no matter how much convenience they promise in return.

He goes on:

Google has changed its entire corporate mission to be “AI first” and, with Google Home and Google Assistant, to perform tasks via voice commands and eventually hold real, unstructured conversations.

Not long ago, I humorously remarked on Twitter: “Remember, it’s Google’s Assistant. Not yours.” Well, I wasn’t really joking. I’m utterly astonished at the number of people who don’t mind giving Google a great deal of personal information, merely for the convenience of having a device they can ask in natural language, e.g. “How long is it going to take me to get to my office if I leave by car in 15 minutes?”, and receive a meaningful response. To receive precise answers to trivial questions, people are willing to put devices in their homes that essentially monitor them 24/7 and send data to a big, powerful private corporation.

The problem is that a lot of non-tech-savvy people don’t care and don’t react until they see or feel the damage. In conversations, I often hear the implicit equivalent of this position: “Yeah, I’ve been giving Google/Facebook/etc. all kinds of personal information over the years, but none of them has harmed me in return; what’s the big deal?” The big deal is that someone else now owns and controls personal information about you, and you don’t know exactly how much data they have or how they’re using it. Just because they’re not harming you directly, or in ways that immediately and visibly affect you, doesn’t make the whole process excusable. A company may very well collect a huge amount of personal information and just sit on it for years until they figure out what to do with it. Then one day the company gets hacked, all the data is exposed, your accounts and information are compromised, and only at that point do people finally get angry and upset, blaming the hackers, when the hackers are just an effect, not the cause.

I urge you to read Maciej Cegłowski’s transcript of Notes from an Emergency, a talk he recently gave at the re:publica conference in Berlin. It’s difficult to extract quotes from it, precisely because it is entirely quotable. If you want to understand my general position on Silicon Valley, just read how he talks about it. This passage in particular has stuck with me ever since I read it, because it perfectly expresses something I think as well, but with a clarity and concision I couldn’t have managed:

But real problems are messy. Tech culture prefers to solve harder, more abstract problems that haven’t been sullied by contact with reality. So they worry about how to give Mars an earth-like climate, rather than how to give Earth an earth-like climate. They debate how to make a morally benevolent God-like AI, rather than figuring out how to put ethical guard rails around the more pedestrian AI they are introducing into every area of people’s lives.

Back to Mossberg:

Some of you who’ve gotten this far are already recoiling at the idea of ambient computing. You’re focused on the prospects for invasion of privacy, for monetizing even more of your life, for government snooping and for even worse hacking than exists today. If the FBI can threaten a huge company like Apple over an iPhone passcode, what are your odds of protecting your future tech-dependent environment from government intrusion? If British hospitals have to shut down due to a ransomware attack, can online crooks lock you out of your house, office or car?

Good questions.

My best answer is that, if we are really going to turn over our homes, our cars, our health and more to private tech companies, on a scale never imagined, we need much, much stronger standards for security and privacy than now exist. Especially in the U.S., it’s time to stop dancing around the privacy and security issues and pass real, binding laws.

From what I’ve seen so far, legal systems everywhere have been unable to keep up with the pace at which technology moves. It’s extremely unlikely that technology is going to slow down, and while I hope laws and regulations will be passed and enforced more swiftly, I’m not sure governments will be completely impartial about it, as knowing more about people seems to be an agenda governments share with the tech giants.

I still hope people themselves can fight back and regain control of their data, but this era of technological progress is also characterised by so much regression in other human behaviours: intolerance, racism, xenophobia, a rise in superstition and distrust of science, the tendency to believe whatever is on the Internet without displaying a sliver of critical thought…

And in our tech lives, I witness more and more frequently just how blinded by convenience we’re becoming. Never before have I seen so much aversion towards friction in our daily lives. And convenience is the siren song tech giants are constantly singing in our ears: “Put everything in the cloud, give us your data, your documents, your photos; it all gets out of the way, it’s all synced conveniently to all your connected devices; it’s all so much easier, and so inexpensive!” I agree that some friction is unnecessary, but there’s also a kind of friction that prevents our brains from working on auto-pilot all the time, that contributes to keeping our minds nimble, that keeps laziness and apathy at bay. Judging from the increasing number of people I see completely engrossed in their smartphones every time I’m out and about, the siren song of convenience is getting more and more intoxicating. Cegłowski is right: “The danger facing us is not Orwell, but Huxley. The combo of data collection and machine learning is too good at catering to human nature, seducing us and appealing to our worst instincts. We have to put controls on it. The algorithms are amoral; to make them behave morally will require active intervention.”

Perhaps my cynicism, and my lack of starry-eyed Silicon Valley visions of progress, stem from the bleak picture painted by what’s happening in the world on a daily basis. What is technological evolution without a corresponding human evolution? The first thing that comes to mind is something Lenny tells David Haller in the pilot of the TV series Legion: “Don’t give a newbie a bazooka and be surprised when [they] blow shit up.”
