My year in the rearview mirror

Tech Life

In 1993 I could have been involved in a terrible car accident. I was the second car in queue at a big intersection. When the light turned green, the car before me hurriedly crossed and cleared the intersection. I followed suit: it was one of those high-volume traffic spots in the city where you’re subtly (or not so subtly) encouraged to move quickly and not dawdle too much. The moment I cleared the intersection I heard the loudest screeching and banging behind me. It startled me so much I struggled to maintain control of my car. I looked in the rearview mirror and more or less realised what had happened: another car, coming from the road crossing the one I was on, didn’t stop at the red light, cut through the intersection at mad speed, and hit the car right behind me with such force as to push it against the external brick wall of a factory about 300 metres off the intersection. 

I saw that in my rearview mirror. But it took me about a minute to fully process what had just happened. That’s when I had to stop at a petrol station nearby because I was shaking so much I couldn’t safely drive anymore. It could have been me, the thought kept echoing and rippling in my head for a good while. I don’t remember where I was going — it’s been so long — but I’m pretty sure I forgot where I was going right then and there on that almost-fateful late afternoon. For months thereafter the sound of an ambulance siren gave me chills and flashbacks.

I know that, in the grand scheme of things, going from 31 December of one year to 1 January of the next is just a convention, following a calendar established centuries ago. Still, there have been years when this symbolic passage has reminded me of that near miss, 28 years ago. This time the analogy isn’t perfect, because it would be like saying I narrowly avoided a catastrophic year, when in fact 2021 did hit me rather forcefully. But it still sits there like a shocking image in my rearview mirror, a mess I’m thankfully leaving behind.

Explaining why, in detail, goes beyond the scope of a blog post. It would end up long-winded and uninteresting, and it would enter personal territory I don’t really feel like sharing with strangers on the Internet. As for what I can share, I’ll say that 2021 is starting to feel a lot like 2017. In 2017 I lost my father, rather unexpectedly, and that loss didn’t hit like a car — it hit like a bullet train. It destabilised me. It confused me. It crippled my creativity. It blocked me. And just as I was starting to recover, along came 2021, and I lost my mother, in different circumstances than my father but again unexpectedly.

When one of your parents passes away, you grieve, but you grieve together with the remaining parent, who also helps take care of all the bureaucracy and ‘business’ related to someone’s death. When you’re an only child and the remaining parent passes away, the weight of it all feels even heavier. Even though I wasn’t alone in facing this terrible event, and had the help and support of a few dear people, I still had to go through a self-discipline routine to focus and tell myself I needed to do whatever had to be done one thing at a time, one thing at a time, because you feel the sky falling down, falling down.

You can’t stay on top of everything

The fundamental thing that 2021 made me painfully aware of is that — unless you live a very simple life, or your life is made simple by the constant help of others — you can’t stay on top of everything.

I have resisted this for years. For years I’ve told myself I can take care of everything my work as a translator and localisation specialist throws at me; that I can stay up to date on the many things related to technology, photography, and the several other interests I have (whether for personal or work-related reasons); that I can carry out my day-to-day duties in the household; that I can stay creative and cultivate my fiction and literary projects; et cetera, et cetera, et cetera.

In order to try to do all this, to serve this illusion, I’ve sacrificed hours of sleep, reconfigured my priorities on a daily basis to be malleable and versatile and adaptable. I’ve tried to do this smartly, not with that mindless relentlessness that sometimes overcomes you when you want to achieve a goal at all costs. The only thing I have truly achieved is getting this close to burnout. Or maybe I actually entered burnout territory for a while without realising it. I don’t know. But this — this really feels like the flashback I told you about at the beginning. This feels like a near miss where I’m watching the aftermath unfolding in my rearview mirror.

Age is a factor you can’t just ignore

There’s one aspect that especially baffles me about the way the Internet and social media have impacted people’s behaviour. Today I increasingly feel I’m interacting with fewer and fewer adults and more and more eternal adolescents. It’s as if people have found this big, global toy in the Internet and related technologies, and don’t want to let it go because it gives them so many dopamine kicks; it gives them the illusion that they’re in full control of their lives and that they can stay on top of everything all the time; that they can hack the world and their own lives to fulfil a sense of forever-young-ness. It’s their own Matrix, their own blue pill.

But age is a factor you just can’t sweep under the rug because it’s not making you look cool. With age, the biggest thing you have to consider is the way you spend your time. Because sooner or later the thought will hit you, and hit you hard. Your time is finite, and when you stop and think of all the time you frittered away, you’ll tell yourself it felt like a good idea at the time, but that won’t really make the taste of regret fade away.

And here’s the thing: today, so many things are designed around us to keep us involved in this big global toy, in this time-wasting Matrix where a lot of people seem to live in a state of eternal adolescence, of forever-young-ness, of ‘it’s all a big exciting game’. Before you think I’m being a holier-than-thou Morpheus who tells Neo to ‘wake up’, know that I myself have fallen for this illusion for a relatively long time. And know that it’s not just an illusion that touches on silly things and gamifies our lives in the leisure department; it also (and more dangerously) warps our sense of productivity.

In other, simpler words: if you’re in your late forties you can’t expect to be productive and stay on top of everything like a twenty-something. You may be in good health, you may feel up to it, you may even manage to do it for a while, but the truth is it’s unhealthy. The whole culture around the idea of being a productivity rockstar, of being this relentless productivity machine fuelled by workaholism, is simply destructive, and not just in the long run.

Another area where the age factor is overlooked is the design of user interfaces. (You thought I couldn’t find a way to talk about user interfaces in this context, didn’t you?) I keep seeing it: user interface elements, targets, designs, paradigms that require users to have perfect vision, flawless reflexes, constant adaptability, and time to waste readjusting their workflows and relearning how to carry out the same tasks they used to carry out in an operating system, application, or device two or three iterations ago. Some designers keep making the same error I think Donald Norman talked about in his book The Design of Everyday Things — the mistake of thinking that their target users are like them. They’re not. Often they’re people who just need to get things done without losing an entire morning trying to figure out how and why the applications they rely on for work have changed after the last update. Or they’re people who really need tooltips in an app’s interface in order to understand what that control with the obscure icon does; who really need the obvious interface cues and affordances you’re desperately trying to spirit away in the pursuit of a cleaner, trendier, more minimalist look.

I’m not saying applications, interfaces, and operating systems shouldn’t change and evolve. Only that they should do so by actually taking into account that end users aren’t a homogeneous mass moving in perfect sync with your fancy designs and redesigns. But doing things right by many different types of end user involves more work and a generally slower pace of development, which is a big no-no given the stupid breakneck pace at which technology wants to move, and wants us to move, today. And this seamlessly brings me to my next point.

2021 left me weary again, tech-wise

I wanted to write more articles here in the past year. On several occasions life and work got in the way. But I’ve also realised that many of the articles I would have published would have ended up being rants about how disappointed and unenthusiastic I feel towards an increasing number of tech-related things. 

Lately I’ve been afraid I was losing my enthusiasm for technology altogether, but that’s not true. There are still things that excite me and pique my interest, so that’s a good self-check. It means I’m not becoming a bitter curmudgeon after all (though I know I sound like one every now and then). But today everything seems to revolve around Big Tech, and that’s wearing me down. I’m tired of Big Companies having ever deeper control over our lives, tired of people stupidly letting them have it, tired of arguing that no, Apple are not the good guys you think they are, only maybe the lesser evil.

And speaking of Apple, I’m tired of arguing with people who will defend the company whatever stunt they pull with their products or designs. Tired of being considered the crazy one because I think a computer display should not have a fucking notch in the middle of the top bezel. Tired of being considered averse to change only because I dare point out that a new design in an application just does away with years of usability research and tried-and-true practices that used to make the application very intuitive and self-evident in use.

This of course doesn’t mean I’m going to stop talking about these subjects and these issues. Only that I won’t pay too much attention to people whose interest clearly isn’t to engage open-mindedly in a conversation, but simply to waste my time. And time is getting ever so precious to me. It’s an age thing; one day you’ll understand.

New Year’s resolutions

I have none. Nothing specific. I just want to move at my own pace, not at the pace dictated by someone else, or by some vague notion of ideal lifestyle, or of how to be a rockstar of productivity. Have you looked at actual rockstars once they’re past their prime? Yeah, not a pretty picture, generally speaking.

When dropped support feels like sabotage

Software

If you know me, you know I’m a packrat when it comes to older devices. Sometimes it’s for sentimental reasons, as silly as that may look to some. Sometimes I keep and/or acquire vintage devices on purpose because there’s something about them that fascinates me, or because I feel they matter for computing-history reasons, or because I want to study their user interfaces. Whatever the case may be, one thing I always do with such devices is put them to good use, as I’m not the kind of collector who accumulates stuff to put it on display.

So, when it comes to Apple mobile devices specifically, I still have older iPhones, iPads, and a few iPod touch models going back to iOS 4 and iPhone OS 3. Naturally, the uses of a first-generation iPod touch (2007) or iPhone 3G (2008) are quite limited today. Many applications have features that no longer work, or rely on discontinued APIs to communicate with related services, and are therefore practically useless (though I still open them every now and then to be reminded of how great certain user interfaces were on older iOS versions). 

But there are devices on slightly more modern iOS versions that retain a certain degree of usefulness to me. In many respects, my first-generation iPad (2010) with iOS 5.1.1 is still a good device to use — remarkably more efficient and responsive than my third-generation iPad (2012) with iOS 9.3.5. And my fourth-generation iPod touch (2010) with iOS 6.1.6 is still very much in use: not only can I experience on a retina display the great user interface iOS 6 had, but the extremely compact size of this iPod touch makes it a fantastic music player when I’m out and about. When travelling, it’s also the main device I connect to a Libratone Bluetooth speaker, giving me a very portable yet good-quality setup for listening to music.

And there’s more: both the iPad 1 and the iPod touch 4 can still run certain apps that have been discontinued, or older versions of apps whose interfaces have worsened update after update, so those older versions are still ‘the good ones’ in my eyes. Like Snapseed and Penultimate, to give a couple of examples off the top of my head.

Something I’ve been doing in recent years has been to occasionally go back to my catalogue of purchased apps and try to install some of the older ones, hoping to trigger the Download last compatible version feature. It has worked well: I can still use Microsoft’s OneNote app, the official Gmail app, or the excellent x2y by Joe Cieplinski on the iPod touch under iOS 6. These things make me happy because I feel that both the hardware and the software are not being wasted. Sure, they’re ancient devices by current tech nerd standards, but I prefer having them working on my desk or in my backpack or in my pockets, rather than thrown in an e‑waste bin where I’m not even sure whether they’re going to be fully recycled or not.

Well, long story short: it appears that recently (I don’t know exactly when, I just found out the other day) devices running iOS 5 and iOS 6 have stopped connecting to the App Store. Which means that I can no longer install older versions of apps I paid to use. Yes, I can still use those apps (some of those, at least) on more recent devices, but I should be able to install them wherever the hell I want.

Speaking of managing apps: until recently — let’s say a year ago or even less — I used the last version of iTunes that lets you manage App Store apps. It’s version 12.6.5.3, and Apple still provides a separate download of it at this page. With this version I was able to keep purchasing iOS apps from the App Store, download them to my MacBook Pro, and install them on my iOS devices via a direct connection. This had the added benefit of giving me a local copy of each app’s .ipa file, so that I could quickly reinstall an app in case I deleted it from the device only to change my mind later. And whenever an app update was issued, I could copy the previous version of the app to a different folder, let iTunes download the update, and keep older versions backed up in case I didn’t like the changes (often UI-related), or in case the update stopped supporting a previous version of iOS.
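
Not that it matters much now, but for anyone curious, the routine amounted to little more than copying .ipa files around before letting iTunes touch them. Here’s a minimal sketch of it in Python, assuming the default Mobile Applications location used by iTunes 12.x on macOS; the backup destination is just an example, so adjust both paths to taste:

    # Snapshot the current .ipa files into a dated folder before updating anything,
    # so an older version of an app can always be restored later.
    import shutil
    from datetime import date
    from pathlib import Path

    SOURCE = Path.home() / "Music/iTunes/iTunes Media/Mobile Applications"
    BACKUP_ROOT = Path.home() / "iOS app backups"  # any folder outside the iTunes library

    def backup_ipas() -> None:
        destination = BACKUP_ROOT / date.today().isoformat()
        destination.mkdir(parents=True, exist_ok=True)
        for ipa in SOURCE.glob("*.ipa"):
            target = destination / ipa.name
            if not target.exists():  # skip files already backed up today
                shutil.copy2(ipa, target)
                print(f"Backed up {ipa.name}")

    if __name__ == "__main__":
        backup_ipas()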

This was a great plan, though admittedly I reserved this backup strategy for the apps I cared about and used the most; otherwise things would have got cumbersome pretty quickly. Anyway, this too doesn’t matter now, because for the past — uh, 10 months? — iTunes hasn’t been able to download anything from the iOS App Store. It still connects, I can still browse and even go through my list of purchased apps, but I can’t purchase, download, or install anything. Not even copies of purchased apps from iCloud.

When we talk about planned obsolescence we often refer to hardware: computers, devices, accessories. But what makes a device obsolete equally often is something that starts with the software. Features are dropped, certain operating system versions are no longer supported, certain functionalities are only recognised by newer versions (or even just the latest version) of the operating system. And while this may be justified in the case of smaller software companies and third-party indie developers who might not have the means to afford a deep level of backward compatibility, in my opinion it’s harder to excuse when bigger companies with many more resources are involved. And then there are very big companies making both hardware and software whose interest is for their customers to always — how can I put it? — be in the mood for upgrading their devices.

I can hear you loud and clear from here. But Rick, Apple is possibly the most user-friendly company when it comes to keeping old devices functional! You can install iOS 15 on a device as old as the iPhone 6S from 2015! And like you said before, until recently you were still able to download and install apps on your iOS 5 and iOS 6 devices!

Yes, that’s great and all. But that’s not the kind of support I’m talking about. 

Why prevent iOS 5 and iOS 6 devices from accessing the App Store? The most likely answer, Because the time has come to drop support for these old devices, leaves a lot to be desired. While I agree that more than 95% of iOS developers have long since moved on, and that finding apps that still run on iOS 5 and 6 mostly means stumbling on abandonware, one might still want to be able to access their purchased apps and download older versions that run on older versions of iOS. Why remove that capability?

If, for example, it’s a matter of updated Web security protocols that the vintage device cannot handle, then why not 1) let the user know, and 2) keep allowing iTunes 12.6.5.3 to manage iOS apps, so the user can still download and install their purchased apps from a Mac that can handle any updated Web security protocol that’s been put in place? 

Let’s take another example: why limit security-update support to just the two previous versions of Mac OS? There are still a lot of people running Mac OS 10.13 High Sierra and 10.14 Mojave. In a lot of professional environments, the actual Mac OS upgrade doesn’t happen when Apple releases a new version of Mac OS; it happens when the third-party companies making the applications or plug-ins a studio or a firm relies on release an update or fix any compatibility issues with the new version of Mac OS (or the new Apple Silicon architecture). In these environments, users are not willing to screw up their production setups just to try out the shiny, buggy new version of Mac OS. This particular issue is exacerbated by the fact that Apple is releasing incredibly powerful Apple Silicon Macs aimed at these professionals, among others. They would purchase such Macs in a heartbeat, but those Macs come with the latest version of Mac OS, and some of the applications these professionals depend upon don’t work well (or at all) with it. But I’m digressing slightly here. The question remains: why not extend security coverage to at least one more previous version of Mac OS? What is so technically insurmountable that prevents you from packaging for Mac OS 10.14 Mojave the same security patches you’re releasing for Mac OS 10.15 Catalina?

And another example: with iOS updates, why is the path always forward? Why not allow users to perform a clean, legitimate downgrade if they want to or need to? Back when iOS 9 came out, allowing the iPhone 4S and the third-generation iPad to update to it was a mistake, as iOS 9 impacted their performance noticeably. I was initially okay with iOS 9 on my iPad 3, but as time passed I regretted not staying on iOS 8.4.1. I would have loved to just be able to re-download iOS 8.4.1 and downgrade without hassle. And another thing: suppose you have an iPhone with iOS 12, you skip both iOS 13 and iOS 14, then iOS 15 comes out and nags you to update. If you do, your device will go from iOS 12 to the latest minor update of iOS 15. But what if for some reason you want to update from iOS 12 to iOS 14 instead? You can’t. Why? Because Apple. (You’re rolling your eyes and want a good reason why one would want that? How about to keep using a few great apps that you love and still want to use, but they got retired some time ago, and stopped working under iOS 15?)

Again, what is so technically insurmountable that prevents Apple from providing an easy way to reinstall an earlier version of iOS on a device that can run it?

I may be wrong, but the answer to all these questions is, Nothing, because it hardly looks like a technical issue. It’s a matter of policy. As I’ve often pointed out, Apple’s behaviour — at least to an outside observer — is to adopt whatever course is most convenient for them. The course that makes things easier for them to manage, streamline, deploy. It’s all very opinionated. It’s not a matter of costs or lack of resources; I don’t believe that for a second. Apple moves forward, doesn’t look back too much, and constantly nudges their users to do the same. So Mac OS and iOS updates move forward, and your only alternative — if you don’t want to switch to another platform altogether — is to pause everything and step down from the Apple treadmill, getting back on it when you’re ready. Non-Apple users often call Apple users sheep. It’s offensive, for sure, but increasingly often I’m left with the feeling that Apple treats their users just like that, behaving a bit like a shepherd dog.

Circling back to the title of this article, I’m aware that sabotage is a somewhat strong choice of word. I had no intention of writing a clickbaity title. It’s simply how I felt when I realised that my devices on iOS 5 and iOS 6 couldn’t access either the App Store or my Purchased apps; and how I felt a few months ago when I realised that I couldn’t use iTunes anymore to manage iOS apps, apart from the ones I’ve already downloaded (and thankfully stored) on my Macs over the years.

And I’m perfectly aware that some people will see this simply as a silly, whiny rant. They’ve perhaps joined the school of thought that considers software a short-lived, disposable thing that has value until it works — strike that — until it’s allowed to work. Then who cares if the apps you paid money for can’t be used anymore on an older device that cost a non-trivial amount of money when you purchased it. And you know, shrugging and telling me that I bought that stuff years ago and it’s not worth getting worked up about it is a reaction that would really open an interesting debate about what you value and what is worth something to you. I give tremendous value to software and to hardware that still works and still has a purpose. As I wrote in my previous piece On sideloading, I grew up in an era when software was just software and was valued very differently than it is today. It was software that cost more money but was also ‘allowed to work’ for longer. You were more in control of its lifespan, so to speak.

Well, my rant is over, make of it what you want. I needed to get this out of my system and I hope it’s been at least a little thought-provoking in the process.

On sideloading

Software

I usually take for granted that my audience is largely made of people who are tech-savvy enough to know what I’m talking about. But here’s the definition of sideloading taken from Wikipedia:

Sideloading describes the process of transferring files between two local devices, in particular between a personal computer and a mobile device such as a mobile phone, smartphone, PDA, tablet, portable media player or e‑reader.

Sideloading typically refers to media file transfer to a mobile device via USB, Bluetooth, WiFi or by writing to a memory card for insertion into the mobile device, but also applies to the transfer of apps from web sources that are not vendor-approved. 

The Epic vs Apple lawsuit has inspired a lot of points for debate regarding Apple’s App Store management and policies, and in general regarding Apple’s anticompetitive behaviours. The possibility that regulators, in Europe and elsewhere, could order Apple to allow any software application — not just those approved by Apple — to be installed on iOS devices has generated a rather polarised discussion. 

Apple, of course, is strongly against such a possibility, and has voiced its concerns several times, most recently by having Craig Federighi give a speech at Web Summit 2021 in Lisbon, Portugal. Federighi reiterated Apple’s angle, i.e. that allowing sideloading would be a catastrophic blow to customers’ security. Reading the afore-linked article by Ars Technica, I find Federighi’s framing of the issue more than a little over the top:

Sideloading is a cybercriminal’s best friend, and requiring that on the iPhone would be a gold rush for the malware industry… That one provision in the DMA [Digital Markets Act] could force every iPhone user into a landscape of professional con artists constantly trying to fool them. 

But of course you have to exaggerate the risks if you want to position yourself as the Guardian Angel of all your customers and users. You’ll never hear a longer list of threats to your life than when an insurance company is trying to sell you a life insurance policy.

I grew up in an era when software was just software, and you could simply start typing a BASIC program into the computer and execute it. Generally speaking, it was an era when tinkering — both in hardware and software terms — was unhampered and even encouraged. Philosophically, I can’t be against sideloading. I actually dislike how the term’s connotation has been hijacked towards negativity. On the contrary, one should think of it in terms of freedom to install any compatible software available for a certain platform. 

But what about malware? Yes, in a completely open scenario, malware can indeed be a risk. But the problem, in my opinion, lies elsewhere. It lies in the tradition of treating end users like ignorant idiots instead of training them to separate the wheat from the chaff.

A bit of a long-winded interpolation on how users’ base level of tech-savviness has developed and ultimately evolved over the past few decades

In the 1980s, when computers started entering people’s homes, the need for clear, simple, extremely usable interfaces was evident. Back then, the majority of people weren’t tech-savvy at all. They were literally ignorant when it came to using a computer. The Apple Lisa and Macintosh computers were revolutionary, UI-wise, because their interfaces were the result of painstaking research into how to present information to the user and the ways the user can interact with and manipulate it. The abundant, well-written documentation, the manuals and user’s guides accompanying those machines, taught people even the most basic operations — like using the mouse — because at the time these were completely new things to most people. What looks and feels ‘intuitive’ today was not back then.

What was great about those manuals and those first graphical user interfaces is that they truly educated people in the use of the personal computer without insulting their intelligence. And, at least in the beginning, people were also educated on the matters related to software: what software is and how to use it, how to deal with it. This unfortunately didn’t last long. First, the IBM PC and Microsoft Windows became the most widely used platform — and sadly this ‘winning’ platform was also less user-friendly. 

Then the software for this platform propagated at gold-rush levels, and soon people found themselves overwhelmed by the sheer quantity of Windows applications. Needless to say, with great quantity inevitably comes a varying degree of quality. At the same time, the march of progress brought increasing complexity to operating systems and related software, not to mention the great speed with which companies and businesses became computerised in the 1980s and 1990s. At this point a lot of people were also overwhelmed by having to learn badly designed, user-hostile software for work. I’m going from memory here, so this may be me injecting anecdotal evidence into the narrative, but I distinctly remember how shocking an experience this was for many people who at the time didn’t have a computer at home, had never used one before, and suddenly found themselves on work-mandated crash courses to quickly — and badly — learn to use one. Or rather, to use the two or three main applications the company required them to master.

No wonder a lot of folks became intimidated by technology, computer-averse, or even flat-out unwilling to become more informed on technology matters, even when it was clear that technology would become deeply embedded in people’s lives in the years to come. This was a terrible phase, one I remember too vividly, which coincided with my freelancing as a ‘tech support guy’. 98% of the people I helped out back then had a degree of computer literacy that, in a sense, was worse than complete ignorance: it was a patchwork of disparate notions haphazardly accumulated over time. A mixture of assimilated procedures and workflows without any knowledge of the principles behind them. People who didn’t understand basic concepts like the metaphor of files and folders, but were able to find, retrieve, and install a third-party utility whose features were often already built into the operating system; something these users didn’t realise because they had never really learnt how to use the system in the first place. People who didn’t know how to change their desktop wallpaper but knew how to open the command prompt and issue certain commands, “because my tech friend told me that if I do this and that I can free up more RAM” (then you asked them what RAM was, and they would often confuse it with disk storage).

It’s clear that, with this type of computer literacy, taking advantage of users isn’t such a hard task for malicious actors. Spreading malware or viruses masquerading as benign software (even as antivirus applications) is easy when users haven’t really been educated to spot the difference between good and bad software.

When the Macintosh was introduced in 1984, Apple had noble goals in mind. They wanted to empower people by giving them a tool that was friendly and intuitive to use, that could make their lives easier, and even spark creativity and ingenuity. The computer for the rest of us. With hindsight, it’s a pity that the Macintosh lost the war against the IBM PC and Windows and did not become the most widely used platform. Because at the time, the difference between someone who got into computers via the Mac platform versus someone who had to learn to be proficient with a PC, mostly for work-related reasons, was palpable. The typical Mac user was — how to put it? — more organically tech-savvy, more confident in their approach with the machine’s interface, and generally more knowledgeable about the machine as a whole. 

When Jobs returned to Apple in 1997, that noble goal of giving people friendly and powerful tools, both hardware and software, was strongly reiterated. First came the fun machine, the iMac; then, a few years later, came a more powerful, more stable, and in many ways more streamlined operating system, Mac OS X. In the 2000s many, many people switched to the Mac because they finally realised it was an equally powerful, versatile platform, but less messy and inconsistent than Windows. And I remember that, while some long-time Mac users were frustrated by Mac OS X and its initial incompatibility with applications and peripherals they used for work, others were happy to finally leave behind the increasingly arcane management of system and third-party extensions and control panels.

The 2000s were an important decade because Apple at this point was offering both solid hardware and good-quality software, and as more people switched to the Mac, I noticed fewer people being intimidated by computers, fewer people being tech-averse. I still freelanced as a tech support guy, but calls for assistance became less and less frequent. There was a brief surge when clients called me for help in transitioning from Mac OS 9 to Mac OS X, but after that things got mostly quiet. There were always exceptions, but I really started to notice that the average level of tech-savviness was finally increasing.

Back to iOS: when the App Store was introduced, users’ tech-savviness was generally mature enough to handle sideloading from the start, but Apple chose the overprotective path

When Apple introduced the iPhone and iPhone OS (later simply called iOS), not only did they present a compelling device from a hardware standpoint compared to what the competition was offering in the phone market, but the revolution was also happening from a software and user interface standpoint. The intuitiveness of the iPhone Multi-touch interface simply destroyed the convoluted and antiquated UIs of other mobile phones. Apple had introduced friendliness and ease of use in the mobile landscape as well.

Then Apple had yet another fit of We know what’s best for our users and, well, things could have been handled differently. 

If you remember, at first iPhone OS didn’t support native third-party applications. At the time Jobs infamously proposed a ‘sweet solution’ whereby developers could instead write Web apps for the iPhone that would ‘behave like native apps’. I remember thinking that this was surprisingly myopic of Jobs and Apple, and it really felt as if the message to developers was something like, Don’t screw up our newborn revolutionary platform with your mediocre stuff. Thankfully this stance didn’t last long, and in March 2008 Apple announced the iPhone SDK.

Back in 2007–2008, I assumed that Apple would approach third-party iPhone OS app development in much the same way they had approached third-party Mac OS app development. A sort of loose, ‘anything goes’ approach, that is. My mistaken and somewhat naïve assumption came after hearing Jobs speak of iPhone OS as being essentially OS X in mobile form. I also thought that, at this point, Apple knew their users (and developers) well enough to trust them and treat them like people who knew what to do with their devices. People who could decide for themselves what kind of experience to get from their devices. When the iPhone App Store was launched in mid-2008, I thought it would work a bit differently. The model I had in mind was more similar to how the Mac App Store would work later, in 2010. In other words, I thought the App Store would be a place where users could more easily find and install selected (and approved) apps for their mobile devices, but that they would be free to look elsewhere if they so chose.

Don’t make promises you can’t possibly keep, Apple

Instead of teaching users how to fish, Apple decided to position themselves as sole purveyors of the best selection of fish. Now, leave aside for a moment all the tech-oriented observations you could make here. Just stop and think about how arrogant and patronising this attitude is. Sure, I can believe the genuine concerns of providing users with the smoothest experience and protecting them from badly-written apps (or just straight malware) that could compromise the stability of their devices. But by not taking a more moderate approach (it’s either we lock down the platform or we’ll have the cyber equivalent of the Wild West!), you also deprive users of choice and responsibility.

The problem of appointing yourself as the sole guardian and gatekeeper of the software that should or should not reach your users is that you’re expected to be infallible, and rightly so. Especially if you are a tech giant which supposedly has enough money and resources to do such a splendid job that is virtually indistinguishable from infallibility. Instead we know well just how many untrustworthy and scammy apps have been and are plaguing the App Store, and how inconsistent and unpredictable the App Review process generally is.

That’s why the doom and gloom in Federighi’s speech sounds hilarious to me. His (and Apple’s) is textbook FUD spreading — this idea that, without Apple the paladin, all these poor users are left completely defenceless and at the mercy of the hordes of cybercriminals waiting outside the walls of the walled garden. The same paladin who rejects an app update from a well-known, trusted developer for ludicrous technicalities, but allows hundreds of subscription scams from pseudo-apps that are just empty containers to fool people into recurring payments. 

We’re not living in the 1980s and 1990s anymore. Today most people have a base level of tech-savviness that was almost unthinkable 20–30 years ago, and they’re much less intimidated by technology. But one source of regression is the constant convenience spoon-fed to users and the insistence on eliminating any kind of friction from the user experience — and I mean even that modicum of ‘good’ friction that makes a user more aware of what’s going on, more conscious of how a certain flow or interaction works. If you remove all cognitive load, users become lazy quickly, and even otherwise tech-savvy people can be lulled into a false sense of security, thus falling for App Store scams I’m sure they would recognise if carefully screenshotted and presented to them out of context.

Closing remarks: Sideloading should be seen as adulthood in our relationship with software. An occasion for being in control, making choices, and taking responsibility

Moving on, I think sideloading should regain a more neutral, or even positive connotation and should not be demonised. The term sideloading shouldn’t feel like the tech equivalent of moving contraband. It’s just the process of installing software, any kind of software, and not necessarily just the sneaky or malicious kind. Or the kind that weakens a device’s security. In fact it’s theoretically possible to offer software tools that increase security in certain parts of a system, exactly because they can access them. 

And on a more philosophical plane, sideloading ultimately means freedom of choice and giving a bit of agency and responsibility back to users. The way Mac software works could very well work for iOS, too. There wouldn’t be any need to dismantle the App Store as it is today. Keep it as the curated place that it is (or wants to be), but allow iOS software to be distributed and installed from other places as well, with sandboxing and notarisation requirements in place just like with Mac software. And just like on Mac OS, at the user interface level you could warn users that they’re about to install an app by an unidentified developer, outside of the App Store, and that if they choose to install it, it’s at their own risk. Let them make an informed decision. And let them set their preference in Settings, exactly like they do on the Mac in System Preferences → Security & Privacy.
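
To make the proposal a bit more concrete, here’s a purely hypothetical sketch of the decision logic I have in mind. Nothing like this exists in iOS today and all the names are invented; it simply mirrors the way Gatekeeper-style checks and the corresponding user preference work on the Mac:

    # Hypothetical install policy, modelled on Mac OS Gatekeeper. Not a real iOS API.
    from enum import Enum, auto

    class Source(Enum):
        APP_STORE = auto()               # reviewed and distributed by Apple
        NOTARISED_THIRD_PARTY = auto()   # signed and notarised, distributed elsewhere
        UNIDENTIFIED = auto()            # unsigned or unknown developer

    def install_decision(source: Source, allow_outside_apps: bool) -> str:
        if source is Source.APP_STORE:
            return "Install."
        if source is Source.NOTARISED_THIRD_PARTY and allow_outside_apps:
            return "Warn the user, then install at their own risk."
        return "Block, and point to the preference in Settings."

    # Example: a notarised app from a developer's own website, with the
    # 'allow apps from outside the App Store' preference switched on.
    print(install_decision(Source.NOTARISED_THIRD_PARTY, allow_outside_apps=True))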

Again, Apple should refrain from continuing to bang the security drum when discouraging sideloading. They could maintain such a stern stance if they were actually able to protect iOS users all the time, consistently and effectively. But they aren’t, and they simply cannot guarantee 100% efficacy, which again is the fundamental requirement when you position yourself as the sole gatekeeper and imply that people would be lost and clueless without the protection you provide. In such a context, you can’t provide merely ‘good enough’ protection. Here, good enough is simply not enough.

That notch on the new MacBook Pros, and thoughts on hardware design

Tech Life

Oh boy, where to begin?

As usual, when faced with new designs and solutions — and with pretty much anything else, really — we have an emotional response followed by a more rational assessment. Sometimes, things that initially don’t seem to make sense to our emotional part, or that rub us up the wrong way, are later rationalised and we begin to understand, even accept, why they’re there.

When I was following the Unleashed Apple event on 18 October, and Apple revealed the design for the new 14 and 16-inch MacBook Pro models, I was initially surprised to see certain details that hark back to the Titanium and Aluminium PowerBooks — details that admittedly struck the right nostalgia chord in me. But when I saw that their displays featured an iPhone-like notch right there at the top, I went into a fit of rage and punched my side desk so hard that my G4 Cube woke up from sleep.

For the 10–15 minutes after that moment, I tuned out everything that was said in the MacBook Pro introduction. I was in a state that could be described as a sort of shell shock. I know it sounds so dramatic, but that’s how I was feeling. Then I came back and started processing everything, waiting for my rational side to kick in and help me analyse and understand this new design choice on Apple’s part that, on the surface, makes absolutely no sense to me. It’s better I cool down and write about this in a few days, I said to myself.

Well, here we are. It’s Rick’s rational side speaking, and this notch on the Mac makes absolutely no sense to me either. It’s a stupid, unnecessary detail that doesn’t really solve any problem, but creates a few. And while I understand that a notch is a compromise on the iPhone because the front camera array is more sophisticated as it has to take care of FaceID authentication, on the Mac this was completely avoidable. The front camera is just a regular webcam, though at least it’s HD.

The most common reactions I’ve heard from people who don’t oppose the notch are:

  1. It’s not a big deal: After a while you won’t even notice it. / It doesn’t really get in your way anyway.
  2. It’s actually a good thing because you gain more screen real estate. This added real estate is basically the area that should have belonged to the bezel at the sides of the webcam and that is now recessed and part of the display. See this tweet from David Pogue to visualise it.

Objection to №1, After a while you won’t even notice it. / It doesn’t really get in your way anyway.

I don’t think this is going to work like with the iPhone. On the iPhone, the interaction with the notch area is minimal. Your eyes start filtering out the notch because when you use the phone they’re often focused elsewhere on the screen. On the iPhone, the notch may become noticeable again whenever some activity happening on the screen makes it stand out, e.g. when playing a fullscreen video in landscape mode. 

On the Mac it’s a different story, in my opinion. On the Mac, the notch visually splits the menu bar, a UI element you interact with all the time. The notch occupies a part of the menu bar that could be devoted to displaying menu items and menu extras. This isn’t a real problem with apps that have just a few menus. But with more sophisticated, professional apps, whose menus reach and even surpass the middle of the menu bar, then yes, the notch is definitely in your way, and you can’t tell me you’re not going to notice it. When you launch an app with lots of menus on one of the new MacBook Pros, all the ‘excess’ menus get moved to the right of the notch, which of course sits there as a sort of gap between them. So developers now need to take the notch into account when designing their apps (more unnecessary work for them, but who cares, right Apple?), and Linda Dong (Apple Design Evangelist) says:

Either way it’s still a great idea to keep menu bar titles short and consolidate menus when you can for usability’s sake! Hunting through a million menus is never fun even on pro software. 

And I say here what I said on Twitter: for usability’s sake there shouldn’t be a notch in the first place. Hunting through a million menus may not be fun, but it’s certainly better and clearer than deciphering tiny icons and controls in an app toolbar or panel. If you stop and think about it, it’s utterly ludicrous that a developer should alter their app design to accommodate an element which was arbitrarily put in place by Apple and that is so intrusive it can’t possibly help developers make their app better, UI-wise or usability-wise.

But the problems in the menu bar also come from the right: the increasing number of menu extras (icons). If my 13-inch retina MacBook Pro had a notch, things would already be problematic, and I would be forced to resort to third-party solutions like Bartender to hide most of the menu extras. Don’t get me wrong, Bartender is a great tool, but I want to see those menu extras all the time, because some of them indicate a state and don’t simply function as clickable elements for accessing application options.

Again, the notch is an unnecessary hindrance, because even in the best case scenario, it makes you reconsider the way you interact with menu bar elements.

Objection to №2, It’s actually a good thing because you gain more screen real estate.

I thought about this, and my answer is, You gain very little, and it’s not worth the hassle.

The added strip of pixels at the sides of the notch serves to accommodate the menu bar, so in normal use, and compared with a MacBook with a regular top bezel, what you gain vertically is just that, a bunch of pixels corresponding to the height of the menu bar. If you use an app in fullscreen mode, it won’t make use of the extra space on top. The app’s interface will be displayed in the ‘safe area’ below the notch. In other words, when fullscreen, you’ll have the same available space as on a Mac with a regular bezel.

So: you gain very little. This is the same misguided principle that drove the Safari 15 redesign, at least initially, when according to the genius designers at Apple having the address bar and the row of browser tabs on the same line was great because you would gain more vertical space to display a website. We are not living in the late 1990s anymore. We’re not dealing with screen resolutions of 640×480 or 800×600 pixels, where every trick to gain vertical space was more than welcome. These are dense retina displays with 3024×1964 and 3456×2234 pixels for the 14 and 16-inch MacBook Pros, respectively. The vertical space ‘gained’ amounts to what, 70-odd pixels, less than 40 points? Come on.
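
If you want to check that figure, here’s a quick back-of-the-envelope calculation in Python. It assumes (reasonably, I think, but it is my assumption) that the area below the notch keeps the usual 16:10 proportions and that both panels use 2× retina scaling:

    # Extra vertical pixels provided by the strip flanking the notch,
    # assuming the area below the notch is a regular 16:10 rectangle.
    panels = {"14-inch": (3024, 1964), "16-inch": (3456, 2234)}

    for name, (width, full_height) in panels.items():
        height_below_notch = width * 10 // 16   # 16:10 area under the notch
        extra_px = full_height - height_below_notch
        extra_pt = extra_px // 2                # 2x retina scaling: 2 px per point
        print(f"{name}: {extra_px} extra pixels (about {extra_pt} points) for the menu bar strip")

Both models work out to roughly 74 extra pixels, or about 37 points: the height of the menu bar strip, and nothing more.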

A few thoughts about Mac hardware design in recent years

By ‘hardware design’ here I’m not referring to the internals, but to the outer industrial design. A few days ago someone on Twitter said (or maybe referenced an article saying) that Mac hardware design has actually improved since Jonathan Ive’s departure. Someone else suggested that, since designing hardware is a time-consuming process that doesn’t happen over a few weeks, it was possible that the design process for these latest MacBook Pros started when Ive was still at Apple. I have no idea. I may not have liked every design decision made by Ive, and while he brought the notch to the iPhone, I seriously doubt he would have approved the same solution on the Mac.

Certain details and solutions in Ive’s designs may have been opinionated, but they at least reflected a strong personality with actual opinions shaping the design. The hardware design of recent Macs, instead, feels like the work of a committee… of design students. The M1 24-inch iMac looks like a design exercise where the assignment is Make the thinnest possible desktop Mac. Don’t question why it has to be the thinnest, just do it.

MacBook design is now at its most iterative and regurgitative. The current M1 MacBook Air perpetuates the same wedge-like profile as the late-2010 model, and the display assembly design is essentially the same as the 2015 12-inch retina MacBook. MacBook Pros have retained the same design since they went unibody in 2008. Over the years they’ve become thinner, their trackpads bigger and wider (too big and wide, if you ask me), and some models acquired a Touch Bar at the top of the keyboard.

While the design of the newest MacBook Pros finally breaks this decade-long iterative path, it can also be seen as a remix of previously executed design cues. The truly distinctive details are the visibly protruding feet and the notch on the display. I am obviously not a fan of either, but I understand that those taller feet are part of the MacBook Pro’s thermal design and will help keep the computer cooler under load. The notch is the truly gratuitous, unnecessary novelty, the one I sometimes think was put there by Apple’s design team as retribution for having to remove the Touch Bar.

Seriously now, and circling back to the notch: it was completely avoidable. You can justify it however you want, but it has the same fundamental characteristic as its iPhone counterpart — it’s just plain ugly. It is indeed a design compromise on the iPhone because on such a portable device on the one hand there’s the need to maximise screen real estate, and on the other there’s the simple fact that you have to provide a sophisticated front-facing camera with the necessary technology to enable FaceID. So you design a display with a screen that reaches the top where possible, i.e. the area surrounding the notch. You provide as many pixels as possible given the circumstances.

And yes, putting that notch on the MacBook Pros might have originated from the same impulse — maximising screen real estate. But while on the iPhone this was a need, on the Mac it’s just a want. Again, with displays as big and pixel-dense as those in the new 14 and 16-inch MacBook Pro models, there’s no need to maximise screen real estate. You don’t need to carve out a space up top to shoehorn the menu bar into, as if it were an annoying, restrictive UI element, splitting it in two in the process. To me, this makes no sense from a design-is-how-it-works standpoint. It looks like an urge to make a design statement for its own sake — as if Apple products needed some signature design quirk to be recognisable. This, among other things, makes me wonder whether there’s still a strong industrial design leader within Apple. Someone who looks at the final display design drafts, sees the notch, and utters, What the fuck is this?

As an outside observer and long-time Mac user, I feel a certain lack of direction and, dare I say, resolve in many areas of Apple’s hardware and software design. Look at the progression of desktop and laptop designs, and of port selection, under Jobs’s tenure. How many times did Jobs’s Apple make a hardware design decision that had to be overturned later because it went nowhere or was not well received? The only oddity that comes to mind (and it’s a rather mild one) is the late 2008 aluminium unibody MacBook (non Pro). When this MacBook was introduced, many thought Apple would bring aluminium and a premium finish to the consumer-oriented MacBook line as well, after years of polycarbonate iBooks and MacBooks. But then, in 2009, this 13-inch MacBook became the 13-inch MacBook Pro, joining the 15 and 17-inch models, and the humble MacBook went back to being made of durable white polycarbonate for two more iterations.

Now we see ports that were previously ‘courageously’ removed making their return, triumphantly announced as if they were a magnanimous concession on Apple’s part because “Apple has listened to the feedback from their pro users”. If you need to be told that removing MagSafe, the HDMI port, and the SD card slot is a bad idea; if you need to be told — and shown, many many times — that the butterfly mechanism in MacBook keyboards is a bad implementation, then you’re not doing a good job at designing hardware. You just make edgy design choices to ‘try new angles’ and hope that your reputation will validate them.

The Touch Bar is another odd case: I think the idea had potential, but it has felt like an unfinished project. It could have been iterated and improved upon in so many ways, but it’s like Apple gave up on it. Oh, you don’t like it much. Yeah, okay, we’re getting rid of it, whatever. Why not implement the Touch Bar as an additional strip placed at a slight angle above a full keyboard, instead of using it to replace the top row of keys? Heck, why not place the Touch Bar in the bezel area below the screen, making its customisable controls way more glanceable and operable?

I’ve said it too many times now: part of Apple’s software and hardware design today feels more random, haphazard, and trial-and-error than before. I know well that trial and error is an important part of the design process, but with today’s Apple it feels as if this part of the process isn’t happening internally enough, if you know what I mean. It feels as if we users (and developers) are subtly being roped into it. It feels like a public beta. Some actually like this — those who later write articles about how great it is that Apple listens to its users. What I would like from Apple is a more internally considered design process, one that leads to more thoughtful decisions, executed with the confidence that this is the path to follow and build upon. The notch is a quirk that goes nowhere.

Assorted musings on social media

Tech Life

While reading Mike Rockwell’s very good blog Initial Charge, I bookmarked a couple of link-posts he recently wrote, both about social media.

The first is from 9 September. The title is Reconsidering Your Relationship to Social Media ➝, and the post links to Scott Banwart’s The Inevitable Decline of Social Media. Mike quotes Scott’s introduction:

I have become disillusioned with the state of social media. At one time it was a fun way to connect with people I would otherwise not a have a chance to meet and talk about topics of mutual interest. Now it is largely a breeding ground for tribalism, intolerance, and general meanness. This is making me question why I would want to continue participating in this ecosystem. 

And at the end of his commentary, Mike writes:

Mastodon feels like the early days of Twitter to me — it feels new, fresh, and exciting. There’s no algorithmic timeline, boneheaded features designed to increase engagement, or “influencers” that are willing to say literally anything to get attention. It’s nice.

I’m not exactly a Twitter early adopter — I joined in March 2008 — but I’d say those were early-enough days that I know what Mike means. Those were the times when Twitter felt like leisure, not work. Like a public space where everybody hanging around was being personal and informal in a casual, fun way. It was ‘social-good’. You followed people because you wanted to know what they were up to, what project they were working on, how their day was going. You didn’t want them to be a surrogate for the daily news, or to remind you how shitty this world can be, retweet after retweet.

Over the years, Twitter expanded dramatically, and went from a peaceful town where most people knew one another and exchanged understanding nods to the urban equivalent of a chaotic, cynical, divided, post-industrial megalopolis. I’ve always been good at filtering out the most unpleasant aspects of Twitter, but I nevertheless felt a bit overwhelmed and saturated just as App.net (also called ADN, for ‘App Dot Net’) came around in late 2012. I knew people who saw ADN’s great potential and jumped ship, leaving Twitter behind altogether. I took a more moderate approach, and for as long as ADN lasted (until March 2017) I gave Twitter and ADN the same priority. But ADN felt better, and in my experience stayed better until the very end. The social environment didn’t really deteriorate over time.

When ADN shut down, many of its hardcore users were naturally upset, and this diaspora gave birth (or renewed impulse) to other smaller social networks/microblogging sites which, as far as I know, have successfully maintained ADN’s positive social atmosphere and environment. Wanting to stay in touch with as many ‘ADN expats’ as possible, I opened accounts on all of them — pnut, 10Centuries, then Mastodon and Micro.blog — but it soon became apparent that keeping up with Twitter and all these other networks was not feasible. Today, Twitter is still my main social place online. I check on pnut fairly regularly, and occasionally post on Mastodon.

But why has Twitter remained my №1, when it’s possibly the worst among the social networks I mentioned above? The most succinct and perhaps catchy answer I can think of is, Because while Twitter has changed over the years, I have remained the same. Meaning that I have essentially been using Twitter in more or less the same way as I was using it back in 2008. 

To continue with the urban metaphor, as far as I’m concerned, the small town Twitter was at the beginning has become my reference neighbourhood within the chaotic and often toxic mega-city Twitter is today. 

In a more recent link-post, Some thoughts on social media ➝, Mike links to Chris Hannah’s post with the same title. Chris writes:

We can all see the distinction between what happens in real life and what appears on social media.

I think that is where Micro.blog has felt different to platforms like Twitter for me. In a sense, it feels slower, but at the same time, it feels like you are connecting with real people. Whereas when I use Twitter, most of the time it feels like I’m interacting with an online account rather than the person behind it.

I’ve definitely fallen into the trap before, where I’ve used Twitter as a place to share perfect photos, links to my blog posts, and anything else that can bring external validation. But I think I’m going to try and just use it like a normal person for a while, and see how it goes. 

And Mike comments:

This matches my experiences perfectly and is part of the reason I mostly left Twitter. Everyone’s vying for attention and thinking too much about metrics rather than having genuine interactions with real people. That’s why everyone has the same opinion — if you don’t agree, you’re not part of the club, and therefore will lose followers. […]

Although I fall into the trap of sharing almost exclusively the best photos on Instagram and Pixelfed, I try to be a bit more real on Mastodon. That’s the place where I can just share my thoughts — whether it’s complaining about software updates, posting links to music I’m listening to, or anything in between. 

Over the years, I’ve heard and read similar arguments from people who were ‘fed up with Twitter’ and wanted to either take a break from it or leave it for good. Note that I’m putting ‘fed up with Twitter’ in quotes not because I’m belittling the sentiment — I’m simply reporting the words they’ve used over and over again. Of all the people I know who wanted to leave Twitter for good, only two have truly acted on their words and intentions. Twitter’s gravitational pull is strong, especially for those who joined many years ago and have formed a subnetwork of meaningful bonds with like-minded people and friends.

It’s what you make of it

I ultimately think that social media, social networks, and Twitter in particular, are really what you make of them. And what I want to make of my Twitter experience is something consistently positive, where I can share my views and have exchanges with followers and mutual acquaintances that remain non-escalating even when we disagree about something. I want my Twitter experience to be a place where I can share the occasional rant or bad joke, and know that my followers are listening to my rant or eyerolling at my joke. And I make sure to reciprocate: I listen to them when they rant, help them when they’re stuck and voice an issue, and so on.

This, of course, takes some work on my part. My Twitter experience isn’t something I’m exclusively, passively exposed to. It’s something I actively contribute to. This is something I fortunately understood at the beginning, after a few false steps where I just ‘didn’t get’ Twitter and thought about leaving myself.

This attitude of mine has been rather transparent from my early days on Twitter in 2008 onward. And I have without doubt reaped what I have sown, because I evidently attracted a lot of like-minded people and kindred spirits. And that’s why I don’t share Chris Hannah’s feelings when he writes that …when I use Twitter, most of the time it feels like I’m interacting with an online account rather than the person behind it.

The unspoken contract I’ve developed with anyone who interacts with me on Twitter is that what you see of me there is as real as if you met me in person. I’m honest, truthful, respectful of other people, and I ask for the same treatment. A lot of people I’ve interacted with over the years seem to get this immediately, and our exchanges and social relationships have stayed healthy over time. And whenever a misunderstanding has arisen, I’ve always tried to clarify things without letting the relationship go sour.

It takes work if you care about your experience

After thirteen or so years of using social media and Twitter, I’ll reiterate: I feel you need to be willing to do some work if you want Twitter (or your social network of choice) to be a pleasant, beneficial experience. You can’t expect the network to enjoy and entertain you without giving something back. I’ve often heard people complain about their timeline being toxic, but apart from sponsored tweets, Twitter doesn’t really push anything extraneous on you. If your timeline is toxic, it’s because you follow people who either post toxic content or are serial retweeters who routinely disseminate unbelievable amounts of crap. Or maybe your timeline is toxic because toxic people have started following you for some reason and tweet abusive things at you all the time. Or maybe your whole experience is toxic because you spend literal hours doomscrolling and pay attention to every single stupid tweet you see.

Twitter can deploy some tools to mitigate toxicity and, for example, reduce exposure to misinformation and fake news, but filtering toxicity is hard because the whole matter can be incredibly subjective and fine-grained. You are the best filter. Stop following people who flood your timeline with crap. Block people who tend to be abusive and gratuitous towards you. But also try to develop a way of approaching and using Twitter that keeps you from ending up with a miserable experience.

I’m sharing these observations and this advice with the average Twitter user in mind. I am sadly aware of many cases of abuse and bullying and doxxing where the targeted person is simply too overwhelmed to do anything except maybe leave the platform, which is the goal of the harassers. These are extreme cases, and no amount of personal work or personal filtering is enough to stop the hæmorrhage.

But back to more normal situations: I keep hearing people complain about their timeline as if it were some kind of demonic TV set that cannot be turned off and forces them to watch its programmes. Once again, my personal experience is that on Twitter, maybe more than anywhere else, you reap what you sow. Note that I’m not advising you to keep up appearances, behave in ways that make you likeable, or always be politically correct to avoid debates or conflicts.

I’m advising you to be yourself, to be genuine, but also to behave wisely. Be personal if you want, but don’t put yourself in situations that make you vulnerable. You can definitely participate in, or even start, a heated debate if you trust your followers and interlocutors to engage in something constructive. Don’t pick fights with people you barely know just because they said something you don’t like. There is often the urge to ‘right the wrong’ on Twitter, but even when you’re objectively right (because facts back you up) and the other person is clearly wrong, believes in horrible things, or spreads misguided notions, act wisely. Think before typing. Pick that fight, if you like, but prepare for any consequence and ask yourself whether the fleeting pleasure of calling a moron what they are is worth the grief that may follow.

And if you really must virtue-signal, I prefer the subtler, more intelligent approach of “show, don’t tell”. If your followers are well-adjusted, thoughtful, perceptive people, they’ll know that black lives matter to you even if you never use #BLM in your tweets. They’ll know whether you are pro-LGBT and pro-Trans rights even if you don’t put rainbow flags in every tweet. What you post, what you retweet, what you reply and react to: all these things in the end define you socially online.

You obviously can’t fully control your Twitter experience, and you may end up disappointed or dissatisfied with it no matter how hard you try to make it better, and therefore end up seeking out alternatives that may be more suitable for you. That is fair and understandable, and it’s the position Mike, Scott, and Chris seem to have found themselves in. In fact, I’m not criticising them (I used their quotes here as a starting point for my reflections, not to teach them a lesson). I am more critical of those who complain about how bad Twitter is, how dreadful their experience is, while just standing there with their arms crossed and a sense of entitlement, as if to say: Someone needs to fix this for me; Twitter has to do something, anything. As if they had no part in how things take shape socially online. In these cases, leaving the platform is just an empty, theatrical rage-quit. You’re going to have the same problem in whatever social network you dive into next.

Magnitude is relative

And speaking of alternatives, it’s always fascinating to me how the ‘best’ experience often seems tied to a social network’s actual or perceived scale. Twitter is huge, with millions and millions of users, so its scale, the reasoning goes, must be one of the causes of its degradation. It can be, of course, but I also feel that the true magnitude (and impact) of Twitter is only as big as your actual network of contacts/followers/people you follow within Twitter. After 13 years on Twitter, I still follow a reasonable, manageable number of people: I don’t feel overwhelmed and I don’t feel as if things are getting out of hand. My Twitter still feels like the small town of the early days. That’s also because my focus and priority is still personal interaction, not the “I’m a channel broadcasting my stuff and I seek constant growth” attitude other people have on Twitter and social media in general. It really boils down to what you want from social. If all your needs are egotistical in nature (you want to attract attention, ‘grow your audience’, be an ‘influencer’, etc.), then you’ll be loud and superficial, and the resulting experience will be chaotic. Maybe in a way that pleases you, maybe in a way that pisses you off, but in either case you asked for it.

I prioritise people. Dialogue. Exchanges. Sharing interesting stuff, facts, links, observations, photos, music suggestions, and so forth. I’m naturally curious, I celebrate differences, and I do my best to listen to what people tell me. I don’t care about metrics, I don’t crave attention, and I don’t want to ‘grow my audience’. I’m not a cult leader. It’s the same as with my books and writings: while I would certainly be flattered if my fiction sold well, for the time being I’m more interested in meaningful diffusion, in knowing that maybe this month I only sold 10 copies of one of my books, but then learning through feedback that those 10 readers, or 5 readers, appreciated my work. On social I very much prefer having 1,100 followers to 100,000 fans. I hope I’m making sense here.

Instagram, Glass, barriers to entry

By the way, during the years I was active on Instagram (its pre-Facebook era), I was doing exactly the same there. But Facebook did poison the well, weaponising and commercialising something that was fun, laid back, and casual. It has transformed a quiet place into something that flashes and autoplays and screams and shoves extraneous content down my throat every time I open the app. I still use Instagram to like and comment on other people’s photos and posts, but the experience of finding my contacts and exchanging comments with them feels like trying to find a friend at a huge rave party.

Instagram is pretty much unsalvageable unless someone else acquires it and does a gigantic, radical reboot. In the meantime there’s Glass, a photo community that is doing a lot of core things right, in my book. So far, I’m enjoying the relaxed atmosphere there, and I’m happy there are no ‘likes’ or metrics. Comments are the only way to tell someone you like their photos. They may be scarce, but (at least in my experience) they feel genuine and articulate. This going against the grain in Glass’s philosophy is admirable, and it’s evident that it comes from people who care about creating a product that succeeds in a quality-over-quantity way.

But one aspect worth mentioning is the barrier to entry, which in my opinion is fundamental in setting the tone from the start when you launch a social product. I’m generalising and there are always exceptions, but a free product, a free social space, will typically attract terrible people, chaos, and toxicity. Spraying graffiti over a building is fun. When the building is yours, even in a very small part, you’re more hesitant to deface it. At launch, ADN wasn’t entirely free to access. If I remember correctly, it was invitation-based, and the person you invited got a free trial period, but the backbone was made of paid accounts. I kept paying for my account monthly ($5) instead of yearly, even though a yearly subscription was less expensive, because I wanted to support the platform for as long as possible. Barriers to entry are a great first filter: they keep the cheapskates away, they keep advertising away, and they generally ensure that all participants (or at least the majority) are invested enough in the place to make it pleasant for themselves and everyone else. People who argue, for example, that Glass will never be as successful as Instagram because it lacks this and that are missing the point. There are many ways to measure success. Glass and Instagram are like apples and oranges.

There is no conclusion

There is no conclusion or moral of the story. These are notes, not a narrative. But since I have to end this article one way or another, I’ll share a note I jotted down in Notational Velocity a few years back, when I wanted to talk about social networks: Your social presence is your own radio show, but make sure you take your listeners’ calls while on the air.

Share critically.