On sideloading


I usually take for granted that my audience is largely made up of people who are tech-savvy enough to know what I’m talking about. But here’s the definition of sideloading taken from Wikipedia:

Sideloading describes the process of transferring files between two local devices, in particular between a personal computer and a mobile device such as a mobile phone, smartphone, PDA, tablet, portable media player or e‑reader.

Sideloading typically refers to media file transfer to a mobile device via USB, Bluetooth, WiFi or by writing to a memory card for insertion into the mobile device, but also applies to the transfer of apps from web sources that are not vendor-approved. 

The Epic vs Apple lawsuit has sparked a lot of debate about Apple’s App Store management and policies, and more generally about Apple’s anticompetitive behaviours. The possibility that regulators, in Europe and elsewhere, could order Apple to allow any software application — not just those approved by Apple — to be installed on iOS devices has generated a rather polarised discussion.

Apple, of course, is strongly against such a possibility, and has voiced its concerns several times, most recently by having Craig Federighi give a speech at Web Summit 2021 in Lisbon, Portugal. Federighi reiterated Apple’s angle, i.e. that allowing sideloading would be a catastrophic blow to customers’ security. Reading the afore-linked Ars Technica article, I find Federighi’s framing of the issue rather overblown:

Sideloading is a cybercriminal’s best friend, and requiring that on the iPhone would be a gold rush for the malware industry… That one provision in the DMA [Digital Markets Act] could force every iPhone user into a landscape of professional con artists constantly trying to fool them. 

But of course you have to exaggerate the risks if you want to position yourself as the Guardian Angel of all your customers and users. You’ll never hear a longer list of threats to your life than when an insurance company is trying to sell you a life insurance policy.

I grew up in an era when software was just software, and you could simply start typing a BASIC program into the computer and execute it. Generally speaking, it was an era when tinkering — both in hardware and software terms — was unhampered and even encouraged. Philosophically, I can’t be against sideloading. I actually dislike how the term’s connotation has been hijacked towards negativity. On the contrary, one should think of it in terms of freedom to install any compatible software available for a certain platform. 

But what about malware? Yes, in a completely open scenario, malware can indeed be a risk. But the problem, in my opinion, lies elsewhere. It lies in the tradition of treating end users like ignorant idiots instead of training them to separate the wheat from the chaff.

A bit of a long-winded digression on how users’ basic tech-savviness has evolved over the past few decades

In the 1980s, when computers started entering people’s homes, the need for clear, simple, extremely usable interfaces was evident. Back then, the majority of people weren’t tech-savvy at all. They were literally ignorant when it came to using a computer. The Apple Lisa and Macintosh computers were revolutionary, UI-wise, because their interfaces were the result of painstaking research into how to present information to the user and how the user can interact with and manipulate it. The abundant, well-written documentation, the manuals and user’s guides accompanying those machines, taught people even the most basic operations — like using the mouse — because at the time these were completely new things to most people. What looks and feels ‘intuitive’ today was not back then.

What was great about those manuals and those first graphical user interfaces is that they truly educated people in the use of the personal computer without insulting their intelligence. And, at least in the beginning, people were also educated on the matters related to software: what software is and how to use it, how to deal with it. This unfortunately didn’t last long. First, the IBM PC and Microsoft Windows became the most widely used platform — and sadly this ‘winning’ platform was also less user-friendly. 

Then the software for this platform propagated at gold-rush levels, and soon people found themselves overwhelmed by the sheer quantity of Windows applications. Needless to say, with great quantity inevitably comes a varying degree of quality. At the same time, the march of progress brought increasing complexity in operating systems and related software, not to mention the great speed with which companies and businesses became computerised in the 1980s and 1990s. At this point a lot of people were also overwhelmed by having to learn to use badly-designed, user-hostile software for work. I’m going from memory here, so this may be me injecting anecdotal evidence into the narrative, but I distinctly remember how shocking an experience this was for many people who at the time didn’t have a computer at home, had never used one before, and suddenly found themselves on work-mandated crash courses to quickly — and badly — learn to use one. Or rather, to use the two or three main applications the company required them to master.

No wonder that a lot of folks became intimidated by technology, computer-averse, or even flat-out unwilling to become more informed on technology matters, even when it was clear that technology would become deeply embedded in people’s lives in the years to come. This was a terrible phase, one I remember too vividly, which coincided with my freelancing as ‘tech support guy’. 98% of the people I helped out back then had a degree of computer literacy that, in a sense, was worse than being completely ignorant: it was a patchwork of disparate notions haphazardly accumulated over time, a mixture of assimilated procedures and workflows without knowledge of the principles behind them. People who didn’t understand basic concepts like the metaphor of files and folders, but were able to find, retrieve, and install a third-party utility which often had features already built into the operating system; something these users didn’t realise because they had never really learnt how to use the system in the first place. People who didn’t know how to change their desktop wallpaper but knew how to open the command prompt and issue certain commands, “because my tech friend told me that if I do this and that I can free up more RAM” (and when you asked them what RAM was, they often confused it with disk storage).

It’s clear that, with this kind of computer literacy, taking advantage of users isn’t such a hard task for malicious actors. Spreading malware or viruses masquerading as benign software (even as antivirus applications) is easy when users have never really been taught to spot the difference between good and bad software.

When the Macintosh was introduced in 1984, Apple had noble goals in mind. They wanted to empower people by giving them a tool that was friendly and intuitive to use, that could make their lives easier, and even spark creativity and ingenuity. The computer for the rest of us. With hindsight, it’s a pity that the Macintosh lost the war against the IBM PC and Windows and did not become the most widely used platform. Because at the time, the difference between someone who got into computers via the Mac platform versus someone who had to learn to be proficient with a PC, mostly for work-related reasons, was palpable. The typical Mac user was — how to put it? — more organically tech-savvy, more confident in their approach with the machine’s interface, and generally more knowledgeable about the machine as a whole. 

When Jobs returned to Apple in 1997, that noble goal of giving people friendly and powerful tools, both hardware and software, was strongly reiterated. First came the fun machine, the iMac; then, a few years later, came a more powerful, more stable, and in many ways more streamlined operating system, Mac OS X. In the 2000s many, many people switched to the Mac because they finally realised that it was an equally powerful and versatile platform, but less messy and inconsistent than Windows. And I remember that, while some long-time Mac users were frustrated by Mac OS X and its initial incompatibility with applications and peripherals they used for work, others were happy to finally leave behind the increasingly arcane management of system and third-party extensions and control panels.

The 2000s were an important decade because at this point Apple was offering both solid hardware and good-quality software, and as more people switched to the Mac, I noticed fewer people being intimidated by computers, fewer people being tech-averse. I still freelanced as a tech support guy, but calls for assistance became less and less frequent. There was a brief surge when clients called me for help in transitioning from Mac OS 9 to Mac OS X, but after that things got mostly quiet. There were always exceptions, but I really started to notice that the average level of tech-savviness was finally increasing.

Back to iOS: when the App Store was introduced, users’ tech-savviness was generally mature enough to handle sideloading from the start, but Apple chose the overprotective path

When Apple introduced the iPhone and iPhone OS (later simply called iOS), they presented a device that was compelling not only from a hardware standpoint, compared with what the competition was offering in the phone market, but also from a software and user-interface standpoint. The intuitiveness of the iPhone’s Multi-Touch interface simply destroyed the convoluted and antiquated UIs of other mobile phones. Apple had brought friendliness and ease of use to the mobile landscape as well.

Then Apple had yet another fit of ‘We know what’s best for our users’ and, well, things could have been handled differently.

If you remember, at first iPhone OS didn’t support native third-party applications. At the time Jobs infamously proposed a ‘sweet solution’ whereby developers could instead write Web apps for the iPhone that would ‘behave like native apps’. I remember thinking at the time that this was surprisingly myopic of Jobs and Apple, and it really felt as if the message to developers was something like, ‘Don’t screw up our newborn revolutionary platform with your mediocre stuff’. Thankfully this stance didn’t last long, and in March 2008 Apple announced the iPhone SDK.

Back in 2007–2008, I assumed that Apple would approach third-party iPhone OS app development in much the same way they did with third-party Mac OS app development. A sort of loose, ‘anything goes’ approach, that is. My mistaken and somewhat naïve assumption came after hearing Jobs speak of iPhone OS as being essentially OS X in mobile form. I also thought that Apple knew their users (and developers) enough at this point to trust them and treat them like people who knew what to do with their devices. People who could decide for themselves what kind of experience to get from their devices. When the iPhone App Store was launched in mid-2008, I thought it would work a bit differently. The model I had in mind was more similar to how the Mac App Store would work later in 2010. In other words, I thought the App Store would be a place where users could more easily find & install selected (and approved) apps for their mobile devices, but that they would be free to look elsewhere if they so chose. 

Don’t make promises you can’t possibly keep, Apple

Instead of teaching users how to fish, Apple decided to position themselves as sole purveyors of the best selection of fish. Now, leave aside for a moment all the tech-oriented observations you could make here. Just stop and think about how arrogant and patronising this attitude is. Sure, I can believe there are genuine concerns about providing users with the smoothest experience and protecting them from badly-written apps (or straight-up malware) that could compromise the stability of their devices. But by not taking a more moderate approach (it’s either we lock down the platform or we’ll have the cyber equivalent of the Wild West!), you also deprive users of choice and responsibility.

The problem with appointing yourself as the sole guardian and gatekeeper of the software that should or should not reach your users is that you’re expected to be infallible, and rightly so. Especially if you are a tech giant that supposedly has enough money and resources to do a job so splendid it is virtually indistinguishable from infallibility. Instead, we know well just how many untrustworthy and scammy apps have plagued, and still plague, the App Store, and how inconsistent and unpredictable the App Review process generally is.

That’s why the doom and gloom in Federighi’s speech sounds hilarious to me. His (and Apple’s) is textbook FUD-spreading — this idea that, without Apple the paladin, all these poor users are left completely defenceless and at the mercy of the hordes of cybercriminals waiting outside the walls of the walled garden. The same paladin who rejects an app update from a well-known, trusted developer over ludicrous technicalities, but allows hundreds of subscription scams from pseudo-apps that are just empty containers designed to fool people into recurring payments.

We’re not living in the 1980s and 1990s anymore. Today most people have a baseline level of tech-savviness that was almost unthinkable 20–30 years ago, and they’re much less intimidated by technology. But one source of regression is the constant convenience spoon-fed to users and the insistence on eliminating any kind of friction from the user experience — and I mean even that modicum of ‘good’ friction that makes a user more aware of what’s going on, more conscious of how a certain flow or interaction works. If you remove all cognitive load, users quickly become lazy, and even otherwise tech-savvy people can be lulled into a false sense of security, falling for App Store scams I’m sure they would recognise if those scams were carefully screenshotted and presented to them out of context.

Closing remarks: Sideloading should be seen as adulthood in our relationship with software. An occasion for being in control, making choices, and taking responsibility

Moving on, I think sideloading should regain a more neutral, or even positive, connotation and should not be demonised. The term sideloading shouldn’t feel like the tech equivalent of moving contraband. It’s just the process of installing software, any kind of software, and not necessarily the sneaky or malicious kind, or the kind that weakens a device’s security. In fact, it’s theoretically possible to offer software tools that increase security in certain parts of a system, precisely because they can access them.

And on a more philosophical plane, sideloading ultimately means freedom of choice and giving back a bit of agency and responsibility to users. The way Mac software works could very well work for iOS, too. There would be no need to dismantle the App Store as it is today. Keep it as the curated place that it is (or wants to be), but allow iOS software to be distributed and installed from other places as well, with sandboxing and notarisation requirements in place just as with Mac software. And just as on Mac OS, at the user-interface level you could warn users that they’re about to install an app from an unidentified developer, outside the App Store, and that if they choose to install it, it’s at their own risk. Let them make an informed decision. And let them set their preference in Settings, exactly as they do on the Mac in System Preferences → Security & Privacy.
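To make this concrete, here is a minimal sketch in Swift of the Gatekeeper-style policy described in the previous paragraph: a user-set preference, a notarisation check, and an explicit ‘install anyway’ decision left to the user. It is purely hypothetical; Apple publishes no such API, and all the names (InstallSource, InstallPolicy, evaluateInstall) are invented for illustration.

// A hypothetical sketch, not a real Apple API: it only models the policy
// described above. Sources and policies are deliberately simplified.

enum InstallSource {
    case appStore
    case identifiedDeveloper(notarised: Bool)
    case unidentifiedDeveloper
}

enum InstallPolicy {
    case appStoreOnly
    case appStoreAndIdentifiedDevelopers
}

enum InstallDecision {
    case allow
    case warnAndAskUser(message: String)
    case block(reason: String)
}

// Mirrors the Gatekeeper flow: App Store apps always pass, everything else
// depends on the user's chosen policy, and borderline cases are surfaced
// to the user as an explicit, informed choice rather than silently blocked.
func evaluateInstall(source: InstallSource, policy: InstallPolicy) -> InstallDecision {
    switch (policy, source) {
    case (_, .appStore):
        return .allow
    case (.appStoreOnly, _):
        return .block(reason: "Your settings allow apps from the App Store only.")
    case (.appStoreAndIdentifiedDevelopers, .identifiedDeveloper(let notarised)):
        return notarised
            ? .allow
            : .warnAndAskUser(message: "This app is not notarised. Install it at your own risk?")
    case (.appStoreAndIdentifiedDevelopers, .unidentifiedDeveloper):
        return .warnAndAskUser(message: "This app comes from an unidentified developer, outside the App Store. Install it at your own risk?")
    }
}

// Example: a permissive user installing a non-notarised app gets a warning,
// not a hard refusal.
let decision = evaluateInstall(
    source: .identifiedDeveloper(notarised: false),
    policy: .appStoreAndIdentifiedDevelopers
)
print(decision)

The point of the sketch is the last two cases: the system informs, and the user decides.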

Again, Apple should refrain from continuing to bang the security drum when discouraging sideloading. They could maintain such a stern stance if they were actually able to protect iOS users all the time, consistently and effectively. But they aren’t, and they simply cannot guarantee 100% efficacy, which is the fundamental requirement when you position yourself as the sole gatekeeper and imply that people would be lost and clueless without the protection you provide. In such a context, you can’t provide ‘good enough’ protection. Here, good enough is simply not enough.
