The 2020 iPhone SE and small phones

Tech Life

iPhone SE 2020

 

Lots to love, less to spend. For the first time in years, an Apple tagline fully resonates with me. And for the first time in years, I’m completely happy with an Apple product. Many know I’m typically a budget-conscious customer, but the pricing of the 2020 iPhone SE is almost the least of the things that make me happy about it. 

What truly makes me happy is that it exists. That it has sort of become a category. When the first iPhone SE was introduced in 2016, I really felt it was a one-off product, especially since it was labelled Special Edition. It felt like Apple’s farewell to a very successful (and very well executed) iPhone design, a last hurrah before retiring it. And I thought that combining the internals of an iPhone 6s with the design of the smaller iPhone 5/5s, all at a very reasonable price, was a great move that would lead to substantial demand from people who wanted a phone more powerful than the 5s, but with the same compactness.

And the 2016 iPhone SE was indeed successful, to a point that apparently surprised Apple itself. Anecdotally, many friends and acquaintances of mine coming from an iPhone 6 and even 6s were happy to go back to a more manageable device. I was still using an iPhone 5 at the time; I had been using it for barely a year and was still very happy with it. Upgrading to an SE would have been nice, but not strictly necessary, and at the time I was more interested in perhaps upgrading my iPad or my ageing MacBook Pro. But if I had been ready to upgrade, the iPhone SE would have been my first choice. (And when I later got one for my wife — also coming from an iPhone 5 — she was happy to have a better phone with the same design and handling as her 5.)

Many people were so happy with the iPhone SE that some hoped Apple could keep doing the magic of offering a 4‑inch iPhone with updated specs and at an affordable price as a recurring spring release. Back then I even heard from people who told me they loved the iPhone 5/5s design so much that they would have been fine with Apple keeping that design unchanged and just updating the internals. I sympathised, but I kept thinking that Special Edition meant exactly that — a one-time thing. 

When spring 2017 and then spring 2018 came and went without iPhone SE news, my belief in the uniqueness of the product was reinforced, despite the occasional rumours about a new SE coming sometime. And when, in early November 2018, it was finally time for me to upgrade from my iPhone 5, I was faced with a little dilemma: get an iPhone SE that was by then two and a half years old, but still a favourite design-wise, or go for something newer but also bigger and more uncomfortable to handle? Given my utter dislike for the iPhone X design, and the size of those models, I ruled out the XS and XR pretty much right away (although the XR’s colours briefly fascinated me). In the end, I chose the iPhone 8: it had that familiar design of iPhones past I loved so much (no notch, and a Home button with Touch ID); and its size, albeit not ideal, was manageable enough. I would have loved an iPhone SE, but its chip was by then three generations old, and even if I’m typically not a specs-obsessed guy, the 2016 iPhone SE was simply not a good investment overall. 

Going for an iPhone 8 in late 2018 was still not getting the latest and greatest, specs-wise, but it was such a great compromise for me that it really did not feel like a compromise at all. At the time, a friend commented on my choice by telling me, Aw, but the camera is nothing special. But I don’t really care about camera capabilities in a smartphone — I use it as a pocketable digital instant camera and nothing more. The photos I take that are meaningful to me and that I consider to be more artistic are almost always taken with a traditional camera. 

With time, I’ve grown accustomed to the bigger size of the iPhone 8, though it is the first iPhone that I dropped because of its size. Thankfully nothing happened to it, protected by the very same rubberised case whose excessive friction caused me to drop the phone in the first place, when I was trying to take it out of my front jeans pocket to answer a call.

I’ve written this perhaps long-winded introduction to give you a clear idea of where I am now with the iPhone, and to explain why, without any doubt, the 2020 iPhone SE is going to be my next iPhone. Even if I upgrade to it later this year, or a year from now, its specs will still be more than enough for how I use an iPhone.

About the ‘Special Edition’ moniker

While in recent times it had become clear that Apple was about to introduce an affordable phone much in the same spirit as the 2016 iPhone SE, and even though the numbering wouldn’t have made complete sense, I was really convinced Apple would name this new phone iPhone 9 or something along those lines. To me, Special Edition means… well, a special edition, something you only find once. I’ve seen it with vinyl records, watches, cars, and cameras, where the lines between ‘Special Edition’ and ‘Limited Edition’ are often blurred.

So when I saw that this new iPhone model was called, again, ‘Special Edition’, I was puzzled. There’s nothing ‘special’ about it, was my immediate reaction. But then I read John Gruber’s lengthy explanation and it started to make more sense; this sentence in particular sums it all up nicely:

What makes “special edition” apt for the two iPhones bearing the SE name is the way they differ, strategically, from regular edition iPhones. 

Part of me, however, still doesn’t feel that ‘Special Edition’ is a completely apt moniker for the whole concept that Gruber masterfully explains. Again, taking a cue from the car, watch, or camera world — even from Apple itself with the iPod line — perhaps Classic Edition would be more fitting. It is, after all, the celebration of a design that routinely becomes a classic in Apple’s iPhone line.

But this is idle nitpicking. As I said at the beginning, I frankly have nothing to nitpick about the 2020 iPhone SE. I just hope they keep offering it in PRODUCT(RED) when it’s time for me to upgrade.

There is still one aspect to talk about, though.

4.7‑inch — the new small?

iPhone SE 2020 and 2016

The 2020 iPhone SE, like the iPhone 8 before it, is still Apple’s smallest phone, but it’s not exactly a small phone. I’ve grown accustomed to my iPhone 8’s size over time, and learnt a few tricks to handle it with my relatively small hands, but when it comes to pockets, the iPhone 8’s form factor isn’t as friendly as the iPhone 5’s.

When the iPhone 5 was my daily driver, any pocket was fine. Though I usually prefer putting my iPhone in a jacket pocket, sliding the iPhone 5 into my front jeans pocket was comfortable, and the phone ‘disappeared’ after a short while. Since switching to the bigger iPhone 8, I have definitely had to change my carrying habits. Not only does putting it in any of my trouser pockets bring a certain amount of discomfort (the bulk of the phone makes its presence always felt; it never ‘disappears’ like the iPhone 5 did), but I also have to choose which jacket pocket is the best fit, because the iPhone 8’s bigger size has made certain zippered pockets difficult or impossible to zip up.

I do understand the frustration of those who had hoped for a new iPhone SE that retained, even if redesigned, the small footprint of the iPhone 5, 5s, and 2016 iPhone SE. In his more recent commentary piece, What Is the Market for Smaller Than 4.7‑Inch Phones?, John Gruber concludes:

But Apple has always done fanatically detailed market research. They don’t talk about it because by any company’s standards for trade secrecy, market research is a trade secret, and Apple is, we all know, more secretive than most companies. I think what makes truly small phones  —  let’s say iPhone 5S-sized phones  —  hard to gauge the demand for is that no one has made one since the original iPhone SE 4 years ago. 

I know my observations are based only on anecdotal data and not on more formal research or surveys but, save for a couple of cases, all the people I know who currently have a 2016 iPhone SE are disappointed that the new SE is bigger and retains the rounded-edge 6s design they always found too slippery for their taste. There are even a few of them, with an iPhone 7 or an iPhone X, who were eagerly awaiting a new iPhone SE to be able to go back to a more manageable size, and were let down by the new SE.

Another thing I used to do often before this Coronavirus quarantine was to visit my local Apple Store and other big stores with an Apple-dedicated area inside, and listen to the comments made by people browsing the iPhones and asking the Apple staff for advice. One of the most common questions posed by many women and girls when looking at the iPhone 11, XS, or XR was, Is there a smaller one? — and when shown the regular iPhone 8 they would often repeat the question. Others would remember the 2016 iPhone SE and ask about it, visibly disappointed to learn it had been discontinued. 

Again, this is based on first-hand observations accumulated over time. I’m not using them to prove anything. It’s simply a minor trend I’ve been noticing… but a trend nonetheless, and one I’ve personally found hard to dismiss. At least where I live, there are a lot of regular people who still prefer small physical size and handling comfort over sheer display real estate. I believe that if Apple introduced a hypothetical iPhone SE with the same footprint as the 2016 iPhone SE and the internals of, say, the iPhone 8, it would be very well received by a not-insignificant number of people.

Boot Camp on an external drive

Software

Intro: my path to Boot Camp

I tend to prefer separate devices for performing certain tasks, and when it came to reacquainting myself with the Windows platform after many years of ignoring it completely, instead of going the virtualisation route I opted to acquire a few vintage ThinkPad laptops: initially a T61, followed by a T400 (now a Windows 8.1 machine), and an X240, my Windows 10 machine. 

For those who have noticed my increasing criticism of Apple in recent years, my interest in Windows may seem connected to that dissatisfaction, but this is true only in part, and a very small part at that. I arrived at desktop Windows via mobile Windows: my surprisingly satisfactory explorations with Windows Phone 8.1 and Windows 10 Mobile in 2017–2018. But I’m also very interested in what Microsoft has in store for the Surface line, with the Neo and Duo devices coming later this year. Ultimately, as I’ve often reiterated of late, in tech today we should keep our options open, and being proficient in more than one platform can only be beneficial.

Then there’s gaming, another interest of mine that had lain dormant for a while, but was reawakened in recent times by the surprising number of good-quality triple‑A games. Unfortunately, that often means Windows-only games.

My most powerful Windows machine is the 2014 ThinkPad X240, which is roughly equivalent in specs to a MacBook Air of the same vintage. Which means it definitely isn’t a gaming machine. While several indie titles run surprisingly okay, the more demanding stuff is simply unplayable.

In the end it was either buying a dedicated gaming laptop, or turning my 2017 iMac 21.5‑inch 4K into a Windows gaming machine. And while it’s not as powerful as the most recent PC offerings, it still has respectable specs to run more demanding titles. So, for the first time since Apple introduced it in 2006, I took a good look at Boot Camp. 

I was disappointed to find that, 14 years on, Boot Camp still retains a certain inflexibility. I wanted to configure a separate Windows installation on an external SSD, but Boot Camp only lets you create a partition on the Mac’s internal drive. I didn’t want that because, a) it seemed a bit too disruptive an option, given that by now the internal drive of my iMac is 60% full; and b) my iMac’s internal drive isn’t an SSD or a Fusion Drive, just a regular 5400rpm hard drive, and I didn’t like the idea of stressing it too much with a Mac OS/Windows dual-boot setup.

So I started looking for solutions on the Web. And indeed, using an external drive with Boot Camp is possible, but all the methods I’ve found tend to be a bit convoluted.

The method I followed, with a few additional notes

The method that looked the most interesting to me, despite its length and many, many steps, is the one outlined by OWC in their blog: Tech Tip: How to Use Boot Camp on an External Drive.

The procedure can be broken down into a few macro steps:

  • Download the Boot Camp Windows Support Software on a USB flash drive.
  • Obtain a licensed copy of Windows 10 on an ISO image or an install DVD.
  • Create a Windows 10 virtual machine and, through a Terminal trick, link the virtual disk of said machine to the physical external SSD you want to install Windows on.
  • Start the Windows 10 installation in the virtual machine, then finish the installation by restarting the Mac from the external drive.
  • Complete the Boot Camp configuration from within Windows by installing the Windows Support Software you previously copied on the USB flash drive.

Understanding the general logic behind the whole procedure, and being patient, are two important factors that will greatly help you achieve the goal.

Now, the method described in that article is fairly well written; but it was also written in May 2017, and as you can see in the many comments that have been accumulating ever since, it may not work as seamlessly as expected, especially if you’re doing this on recent Mac models. 

After three unsuccessful attempts, this is what worked for me.

1. Eject, eject, eject

In step 4 of the Use VirtualBox to Install Windows on the External Drive section, the author writes:

Now that we know the disk identifier, we need to eject the disk so it’s no longer connected logically to the Mac (it will still be connected physically). 

Now, as you follow the procedure, there will be operations you do in the virtual machine that will trigger a remount of the external drive. Keep an eye on the Desktop, and every time you see the external drive appear, eject it again. From this point onward, the external drive must never be mounted on the Desktop, otherwise the installation won’t work.
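In practice, I found the Terminal more reliable than ejecting from the Finder for this. A minimal sketch, assuming the external SSD shows up as /dev/disk4 (check which identifier is yours first — picking the wrong disk is dangerous):

```shell
# List external drives to find the right disk identifier:
diskutil list external

# Unmount all of the disk's volumes; the drive stays physically
# attached, but it disappears from the Desktop:
diskutil unmountDisk /dev/disk4
```

Re-running that last command whenever the drive remounts has the same effect as ejecting it from the Desktop each time.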

2. Quotation marks

In the section titled Use VirtualBox to Map the External Drive to a VirtualBox Disk, the Terminal command you issue to create the virtual disk is written as follows:

sudo VBoxManage internalcommands createrawvmdk -filename “bootcamp.vmdk” -rawdisk /dev/disk4

You should remove the quotation marks around the file name and type the command this way:

sudo VBoxManage internalcommands createrawvmdk -filename bootcamp.vmdk -rawdisk /dev/disk4

3. Enable EFI

The suggestion mentioned in the comments, to enable EFI in the virtual machine settings, is crucial, especially if you’re doing this on a recent Mac model. I didn’t do it the first time, and when I got to the point where you reboot the Mac from the external drive, the drive was not detected at boot. To enable EFI, open the virtual machine’s Settings, click the System tab, and tick the Enable EFI checkbox:

Enable EFI

You must do this after the virtual machine is first created, and before you start it to begin the Windows installation from the ISO image.
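Incidentally, if you prefer the Terminal, VirtualBox exposes the same switch through VBoxManage. A sketch, with ‘Windows10’ standing in for whatever you named your virtual machine; it must be run while the VM is powered off:

```shell
# Switch the VM's firmware from the default BIOS to EFI:
VBoxManage modifyvm "Windows10" --firmware efi
```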

4. More than formatting

There is a point you reach during the installation where you’re still in the virtual machine, the Windows Installer has just begun, and it’ll ask you where you want to install Windows. In the reference article, this happens in the Install Windows on Your External Drive Via VirtualBox section. I got to these steps…

3) Select the “Custom: Install Windows only (advanced)” option.

4) The Windows setup will display the currently available drives and partitions that Windows can be installed on. Because the external hard drive has been assigned to the VirtualBox machine, it is the only drive that will be listed.

5) When we used Disk Utility to format the drive, we chose MS-DOS (FAT), which is incompatible with Windows 10. We did this because Disk Utility can’t format with NTFS, but the Windows installer would recognize MS-DOS. All we need to do now is change the drive’s format to NTFS.

6) Select the drive, and then click the Format button. 

…But after clicking the Format button, Windows was still telling me that it couldn’t install on the selected partition. After some trial and error, I found that the way to go was to select the (only) partition displayed, click Delete, then click New. Windows warned me that it might create additional partitions as needed by the system. I confirmed, and Windows then created four partitions: three rather small, and one that was essentially the remaining 98% of the drive. I selected this large one, and at that point Windows happily proceeded with the installation. 
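For what it’s worth, the same clean-slate result should be achievable from the command line: pressing Shift+F10 in the Windows installer opens a Command Prompt where diskpart can wipe and re-initialise the drive. I didn’t go this route myself, so take it as a sketch; the disk number comes from list disk (in the virtual machine, the raw-mapped external drive is usually disk 0, but double-check, because clean erases everything on the selected disk):

```shell
diskpart
rem The following commands are typed at the DISKPART> prompt:
list disk
select disk 0
clean
convert gpt
exit
```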

5. Driver check-up

This bit might be subjective, and your mileage may vary, but after booting into Windows from the external drive, while Windows was running the setup assistant, I noticed that my iMac’s wireless card was not detected. The only ‘network’ Windows was seeing was the Ethernet port. I told Windows to proceed ‘without Internet connection’ and finished the initial setup. 

Even after successfully installing the Boot Camp Windows Support Software from the USB flash drive, I still noticed I had no Internet connection and that the wireless card was not detected. When I opened the Device Manager on Windows to check what was up, in the devices tree I noticed a whole section of devices that had no driver installed (I’m going from memory here, as I didn’t take a screenshot, but I think Windows threw up an Error Code 28 when checking the device Properties). 

What I did was simply go into each problematic device, tell Windows to ‘Update the driver’, and point Windows to the directory WindowsSupport\$WinPEDriver$ on the USB flash drive with the Boot Camp Windows Support Software. Each time, Windows was intelligent enough to fetch the correct driver, and then everything worked as it should (Wi-Fi, Bluetooth, etc.). 
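If clicking through each device sounds tedious, Windows 10 also ships a command-line tool, pnputil, that can install every driver in a directory in one pass. A sketch, assuming the flash drive is mounted as drive D: (adjust the letter to your system), run from an elevated Command Prompt:

```shell
pnputil /add-driver "D:\WindowsSupport\$WinPEDriver$\*.inf" /subdirs /install
```

The /subdirs switch makes pnputil recurse into subfolders, which is handy here since the drivers are organised by vendor.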

6. Additional advice

I didn’t need to do this on my system, but I saw it mentioned a few times in the comments, and I think it’s worth noting:

  • If you have a Mac with the T2 Security Chip, you may need to enable External Boot and disable Secure Boot if it’s active, otherwise the Mac won’t detect the external Windows Boot Camp drive at startup. Follow the steps outlined in this Apple Support page to enter the Startup Security Utility, then select Allow booting from external media at the bottom of the window. If that’s not enough, you may also have to select No security under the Secure Boot section. In case you’re unsure whether your Mac has the T2 chip, check this other Apple Support page.
  • Some have reported that enabling Full Disk Access (a feature present in Mac OS 10.14 Mojave and higher) may help. I don’t know, since my iMac is still on Mac OS 10.13 High Sierra. But in case you have to do this, the procedure is neatly explained in this support page on the Avast website.

Final considerations

I’m aware that the guide on the OWC website plus my notes here above might make the whole process seem extremely complicated. And it certainly is convoluted. Maybe there’s even a simpler procedure I don’t know about. But for the most part, it’s the classic situation where the explanation is lengthier than what you actually have to do. But as I said above, if you break down the process into a few key parts, if you understand what each part does, and if you’re patient, everything should go well.

You don’t even need to do everything in a single session. After my initial, rushed attempts, I finally decided to slow down, and divided the work into three stages I then carried out in separate moments:

  1. Initial formatting of the external SSD. Installation of the Boot Camp Windows Support Software on the USB flash drive. Download of the Windows 10 ISO image from Microsoft’s Software Download page.
  2. Windows 10 virtual machine setup in VirtualBox and Windows installation up to the point where you have to shut down the VM and restart the Mac.
  3. Final Windows installation after rebooting the Mac in the Windows Boot Camp external SSD.

As a general observation, this is what annoyed me the most: a) Why doesn’t Windows simply allow you to install the operating system on an external drive? Even if you try from a PC, the only option you’re given is to install Windows on the internal drive. b) Why doesn’t Boot Camp allow you to perform a Windows installation on an external drive? It’s 2020, and we’re still stuck with byzantine procedures, rituals, and incantations whenever we want to do things slightly differently. Oh well.

As a final note, I’ll share a suggestion a dear friend of mine proposed as I was telling him about my Boot Camp frustrations. He said, Why don’t you first install Mac OS on the external SSD, reboot the Mac from the SSD, and then configure Boot Camp from there?

I don’t know if this works, I haven’t tried, and maybe it’s worth a shot if you’re looking for a simpler way to do this. If this works, I don’t know if you’ll have to reboot your Mac twice to be able to restart in Windows, or if by restarting the Mac holding down the Alt key, the Mac will be able to detect the Windows partition on the external drive and boot directly into it. 

Anyway, I hope all this will help. As I’m going to face a couple of very busy weeks, work-wise, I won’t be able to offer further assistance in case you encounter an issue with this procedure. Again, things should be fine if you’re patient and careful.

How I’d live this quarantine if it was 1990

Tech Life

Joshua Topolsky, from Thank god for the Internet:

But thank god for the internet. What the hell would we do right now without the internet? How would so many of us work, stay connected, stay informed, stay entertained? For all of its failings and flops, all of its breeches and blunders, the internet has become the digital town square that we always believed it could and should be. At a time when politicians and many corporations have exhibited the worst instincts, we’re seeing some of the best of what humanity has to offer — and we’re seeing it because the internet exists.
Now, I’m not letting Mark Zuckerberg or Jeff Bezos off the hook, but we also can’t deny that there is still good, still utility, still humanity present here — and it’s saving us in huge ways and little ones, too. In the shadow of the coronavirus, the sum of the “good” internet has dwarfed its bad parts. The din of a connected humanity that needs the internet has all but drowned out its worst parts. Oh, they’re still there, but it’s clear they aren’t what the internet is; they’re merely the runoff, the waste product. 

John Gruber, quoting this passage above, remarks:

So true. Feeling isolated? Cooped up? Me too. But imagine what this would’ve been like 30 years ago. This sort of crisis is what the internet was designed for, and it’s working. 

 


 

These two posts have inspired me to carry out a sort of thought experiment. Considering I started self-isolating a bit before the quarantine became mandatory in my country (Spain), I’m now on Day 33 of my stay-at-home life. So I’ve started wondering, What would I do if this was 1990 instead of 2020? What would my ‘quarantine lifestyle’ be like?

It’s easy and natural to say, Oh, it would be horrible without the Internet, but that’s because in our time-travelling musings we approach 1990 with a 2020 mindset. It’s the same as when we try to imagine or remember what life before mobile phones was like. I wrote a piece about this very subject back in 2011. Here’s a relevant quote, with emphasis added:

People who have never really lived in a world without mobile phones […] might think that daily life at that time was unnecessarily complicated and ‘harder’. Organising meetings, finding people, finding places around you, having to use paper maps instead of having a portable device with GPS functionalities built in, not being able to look things up in Google or Wikipedia at any time. The truth is, people knew how to organise themselves with the tools they had available. Daily life had a completely different pace and style, built around the tools available at that time. It really isn’t a matter of ‘worse’ or ‘better’ — life was just different. People were equally able to organise their meetings, to communicate with one another, to go to places never before visited by using a map or tourist guide, to search for information at public libraries, and so on and so forth. 

In 1990 I was still living in Italy, finishing high school. I already had a room full of books (I developed a passion for reading at a very early age, mostly thanks to my grandfather, an erudite man with a vast personal library). I was already fascinated by computers and technology, and kept up with the news by devouring several computer magazines (my dad had subscribed me to one of the best at the time, called Bit[1]).

In 1990 my home computer was a Commodore 64; by that time it was a pretty souped-up setup, with its dedicated monitor, tape drive, disk drive, and printer, and I had even expanded the C64’s RAM. I mostly used it for gaming, but also for more ‘serious’ stuff thanks to GEOS — an operating system with a graphical user interface which turned the humble C64 into a ‘poor man’s Macintosh’, at least visually. 

But my dad had also started bringing home discarded IBM and IBM-compatible PCs from the company where he worked, so I was using old word processors like WordStar to write school assignments and also my first short stories, which I would print using a loud IBM dot-matrix printer my dad had also procured.

If this was 1990, and a pandemic had struck and forced people to stay at home, the only thing I would miss would be going to the studio I was apprenticed to, where I was learning desktop publishing on a Macintosh SE + LaserWriter workstation. But for the rest, I’d say I would have a lot to keep myself entertained:

  • I’d have lots of books to read (or finish reading) at my disposal.
  • I’d have tools and materials I could use to write my fiction, from an Olivetti electric typewriter to an old IBM PC AT connected to a printer.
  • I could keep in touch with friends and relatives via landline telephone.
  • I could get the news and a bit of entertainment from TV, radio, and papers.
  • I could listen to vinyl records, CDs, and cassettes on the home Hi-Fi stereo, or in my room with my old Walkman. My parents owned a fair amount of records, there was always music in our home.
  • In 1990 I was still living with my parents, so if we wanted to spend time playing together, we would take out our boxes of board games and cards.

These are just a few things off the top of my head. The Internet has brought a lot of good and bad things into our lives, but it’s not that the world was hopelessly boring and grim before the Internet and social media as we know them today even existed. As for the distractions, thirty years ago they felt — how can I put it? — less wasteful. 

At least in my case, sometimes today there’s a certain depressing aftertaste after, say, spending a couple of hours deep down some rabbit hole in YouTube. The entertainment I may have felt during those two hours, or whatever I may have learnt during those two hours, quickly evaporates afterwards, and I’m left with the sinking feeling of having wasted two hours of my life, of having burnt a precious resource — time — that I’ll never get back. 

In 1990, an hour spent on the phone talking with my best friend felt enriching. An hour spent playing games on the Commodore 64 felt good because it was usually followed by several attempts at understanding how the game’s BASIC program worked. Music was less of a background presence, and it used to inspire a lot of my writing; often it immersed me in the perfect mood to jot down ideas for a story. Same with films. Perhaps the fact that in 1990 I didn’t have access to the staggering amount of information and choices I have today made me more focused on what was available to me. And while more limited in scope, my knowledge in those selected areas was deeper. In contrast, today I’m faced with such an amount of information and choices that I often feel like a lot of my time is spent ‘just browsing’, gathering crumbs of information, rather than forming deeper nodes of knowledge, if you know what I mean.

In case this is starting to feel like a nostalgia trip to you, the gist of all this, in the end, is what I wrote in that 2011 piece quoted above: It really isn’t a matter of ‘worse’ or ‘better’ — life was just different. If this was 1990, I don’t think we’d feel much more isolated than we’re feeling now. It’s all in what I call the ‘time-travelling bias’ — if we were transported back to 1990 with all our current baggage of habits and conveniences, then yes, it would probably be a dreadful experience for many. But if you, like me, were already alive in 1990 and remember what life, and your life, was like back then, then you probably realise we would have found plenty of ways to spend our time in isolation.

 


  • 1. You can see some of the issues on the Internet Archive; it was first inspired by BYTE Magazine, and was published between 1978 and 1997. ↩︎

 

Yes to everything — Addendum

Tech Life

My previous article about the iPad, Yes to everything, was difficult to write because, as I was drafting it, one observation led to another couple of thoughts, which in turn begot other thoughts… It was getting hard to provide a cohesive discourse. What I did was to gather as many thoughts as possible in a coherent whole, and leave additional stray observations as a coda. But that ultimately resulted in a very long piece. So, before hitting the Publish button, I decided to leave those stray observations out and write a standalone ‘addendum’ piece — the one you’re reading now. 

Meanwhile, I received the most varied feedback and remarks about Yes to everything, so in this piece I’ve also added my responses to a few remarks worthy of consideration.

1. The feedback

It went pretty much as expected: 1) The article was largely ignored by the higher circles of Apple and tech punditry. 2) I received positive feedback and praise from some people, a few of them catching exactly what I meant to say. 3) I received a considerable amount of private communications, the vast majority from incensed iPad fans, many of whom completely misunderstood every word I wrote. (How do you manage to do that, by the way?)

When I write something, I want to express my thoughts and observations as clearly as I possibly can. For me, it’s never a matter of ‘being right at all costs’. If I get some facts wrong, I have no problem admitting my mistake. Constructive feedback is always very welcome. What I do not tolerate are personal attacks, people who put words in my mouth that I haven’t said, and people who write me emails nerdsplaining things as if I haven’t used all kinds of computing devices since the early 1980s or seen a goddamn user interface for the past 35 years or so. 

What I do not tolerate is the utter toxicity some people display the second your views start differing from theirs even in the slightest. What’s even more tragicomic is the underlying misunderstanding and point-missing: writing me a message in all caps ‘shouting’, You don’t get it! You don’t know what you’re talking about! The iPad is awesome and I use it to do all the things I need — when in my piece I literally wrote:

Others mistake my criticism for the iPad at the conceptual level for criticism aimed at the device itself. Nothing could be further from the truth. I do think the iPad is an impressive device. I don’t deny it’s an engineering feat. I absolutely think you can do all kinds of serious work on it. And I’m happy for all those who are able to make the most of it. 

So hey, keyboard warriors, how about you re-read my articles more slowly before going for my throat with your nonsense? 

2. So, what kind of innovation would I like?

A few people asked me more or less the same question: Then what kind of technological innovation are you craving for? Well, apparently one I haven’t seen in a while: ideas, projects, designs, plans. One of the things that struck me most about computer scientists of decades past — one thing I was reminded of as I transcribed the interviews I’ve recently published here, especially the one with Alan Kay — is that their approach seemed to be something like ‘ideas first, technology later’. 

Some of them had complete visions of what they wanted computing to become, and then they started working on them to do everything they could to make those visions become true. Sometimes there were no detailed plans, but intuitions, insights, that were enough to point towards a direction. When a technological advancement was achieved, such as the microprocessor, it made previously-theorised designs and applications happen for real. 

What I’m seeing today is more like the opposite approach: a laser focus on technological advancements to hopefully extract some good ideas and use cases from. Where there are some ideas, or sparks, they seem hopelessly limited in scope or unimaginatively iterative, anchored to the previous incarnation or design — like, How can we make this better, sleeker, more polished? Whereas there seems to be a dearth of questions like, What’s next? Where do we go from here? How can we circumvent these interface limitations? How can we meaningfully change the way X is done? and so forth. General questions, larger in scope, not tied to a single product. Heck, not tied to the previous iteration of a product, even.

Today, both manufacturers and users have this fascination for the product, the gadget, the tool. People want the faster horse, tech companies give them faster horses and focus almost exclusively on how to make the next horses even faster. Perhaps I’m being hopelessly idealistic here, but I would like to see more fascination for the purpose, for the exploration of different ways to do things and achieve goals, for the end more than the mere means to an end. When Project Courier first emerged back in 2009, I remember Microsoft being harshly criticised for making concept videos instead of releasing an actual product. I still think that a ‘concept video’, when thoughtful, may have some value, in that it presents an idea, or even a complete design, of a possible alternative path or solution. And while it may still be unfeasible in the here and now (for lack of essential technologies, or simply because it breaks a lot of conventions), it remains an inspiration, a concept that is now out there and may turn out to be the decisive spark towards something truly innovative.

3. User interface comparisons: ‘But the Mac UI isn’t great either…’

A lot of people keep turning this matter into a Mac vs iPad shoot-out. It is not. Think about it this way: if one criticised the user interface of an MP3 player, would you respond by saying “But the user interface of a Hi-Fi stereo system isn’t great either”? Well, you could, but that would be missing the point, because the MP3 player’s whole reason for being is that it’s supposed to be an easier-to-use device than the stationary Hi-Fi stereo system you have at home. The way it is designed, the use cases it’s been designed for, demand more immediacy, simplicity, and friendliness. 

Similarly, if one of the iPad’s core reasons for being is to be a more immediate, friendlier, easier to use device than a traditional computer, its user interface must take all this into account. User interface complexities are expected and somewhat more forgivable on traditional computers — because of the very nature of the computer and the ways you use it. Despite some superficial interface similarities, these are different systems with different approaches and expectations. 

If in becoming the perfect laptop replacement, the iPad becomes effectively a laptop, with a user interface that is just as complex, then what’s the point? Having a touch interface as a differentiator? Other laptops have it. Having pen input? Other laptops have it.

So, I’m not criticising the user interface of the iPad by saying it’s ‘worse’ than the Mac’s. I’m criticising it because I think that a device like the iPad ought to have a better one, period. A device that is aimed at being the better alternative to a traditional computer ought to have the best user interface it can for such a role.

From what I’m seeing, though, iteration after iteration iOS on the iPad is incorporating an increasing number of features and concepts that come straight from traditional computing. Not a bad thing, per se, but it adds weight to an otherwise sleek user interface. And since familiarity is the easiest shortcut, these features are added in a way that makes the iPad’s user interface more and more similar to a traditional computer’s interface. The conceptual challenge would be: how to incorporate the functionality without adopting the same look and paradigm? How to incorporate the feature without adding weight and friction?

Soon you’ll be able to use Photoshop (or a similar app) on the iPad in laptop configuration, moving around the app’s interface with a mouse or trackpad, using it to draw and select stuff. And the experience will be strikingly similar to the one you had in the 1990s, using Photoshop on a PowerBook with System 7 or Mac OS 8. That’s why I keep wondering if maybe there’s another way to do things, one that’s not so riddled with old paradigms and déjà vu.

4. The ‘obsession’ with Jobs

Of course my mentioning Steve Jobs yet again caused controversy and triggered exasperated responses. I’m not ‘obsessed’ with Steve Jobs, as some of you wrote me. I know he’s not around anymore. I know that wondering what he would do if he were still around has no practical value. I was simply speculating on ‘the road not taken’. It may be useless at the pragmatic level, but I find it to be an important thought experiment. A way to zoom out of the specificity of the device and make more general observations. To use a well-worn metaphor, we’re so mesmerised by the trees today that we can’t seem to wonder about the forest anymore. 

Despite what you may think, I don’t idolise Jobs. He had his flaws and his blind spots, but he was a better thinker than many others in this industry. On the one hand, he had that ‘ideas first, technology later’ approach I was talking about in point №2 above; on the other, he had the ability to turn his vision into something commercially feasible (and often successful). Wondering what Jobs would do, ultimately, means wondering what someone with his perspective and mindset would do, not necessarily Steve the man himself. I miss figures who can think like him. Today we have a lot of business strategists in tech, but so very few ‘practical visionaries’. 

(By the way, I also wonder what Nikola Tesla would do if he were still alive. The use of clean energy would probably be more widespread. I want another Nikola Tesla, more than someone producing vehicles in his name.)

5. The nerve I struck

It seems that, in all this, the nerve I managed to hit can be summarised as follows: that while I consider the iPad a great device, I don’t think of it as being ‘special’. Or as special as a lot of iPad users seem to consider it.

I respect the fact that the iPad may be a revolutionary device for them personally. I very much agree that the iPad has been an important device in making a lot of people less averse to technology by being less intimidating than a computer. But as iPadOS develops, I’m wondering for how long the iPad can be held up as the best example of an intuitive, non-intimidating device.

I was in one of the stores in Spain the day the iPad first became available in 2010. I saw all kinds of people — small children, older folks, non-technical people — immediately knowing their way around it. In 2020, I’ve seen more first-timers struggle with it. Maybe they just tap on an app and fiddle with it. Or they swipe a bit around the interface, but there’s more hesitation. Some, once they’re in an app, find it difficult to get out of it. I know, for us geeks it’s hard to put ourselves in such inexperienced shoes. But not everyone shares our interest or involvement in these matters. 

6. Nothing personal

Understand this: I’m not against the iPad, nor am I against iPad users. I’ve been told that in my criticism towards the iPad I have also sounded judgemental and dismissive of those who have chosen it as their preferred solution. I’m not. But I certainly am judgemental and dismissive of more toxic iPad users whose attitude comes across as very smug, as if to say, We’re the enlightened, we’re living the future, and you don’t get it.

I certainly don’t think less of a person who has chosen the iPad as their main or sole environment. That would be quite immature on my part. But I’ve received some unpublishable feedback from people who made very clear they do think less of me because I have not chosen the iPad way. This is what happens when people get religious about their preferences.

And finally, speaking of preferences, if I’m staying on the Mac there’s nothing religious about it. In recent times I have been quite critical of Mac OS as well, in case you’ve missed it. It’s simply the environment in which I work best. I am waiting for the next best thing, and at the moment the iPad is just not it for me.

Yes to everything

Tech Life

Every time I gather observations and thoughts for a piece on the iPad, I feel I keep returning to the same old insights I’ve had for years. I knew Apple would complicate the iPad’s user interface this way. That many people are happy with it doesn’t mean it’s inherently a good idea. 

Anyway. The other day, Apple introduced new iPad Pros, and an updated MacBook Air line-up. Most notably on the iPad hardware front, along with improving whatever feature was improvable, Apple has presented a new accessory — the Magic Keyboard. It has a trackpad. And on the software front, the upcoming iPadOS 13.4 will offer full mouse and trackpad support. 

Trackpad support was of course well received by iPad fans and all the people using the iPad as a main (or sole) computing device for work and leisure. Some praised the innovation of the new cursor, which Apple in their marketing describe as being The biggest thing to happen to the cursor since point and click. (Let me pause and eyeroll for a moment here). It’s an interesting take and a good execution. It’s also the least Apple could do on such a device — devising a cursor that is more context-aware and responsive than the one you find in a traditional computer is frankly more consequential than innovative.

Just as consequential is the fact that the iPad now supports mouse/trackpad input. Some of the comments I saw floating around mentioned how Apple has finally given in to the pressing requests from the iPad community, from people who wanted a more ‘Surface-like’ approach for the iPad, so as to make it a more suitable device for productivity.

While that may also be true, what I think is that Apple has actually given in to adding mouse/trackpad support to the iPad because they were essentially out of options. And because for them it is a convenient problem solver. It’s Mr Wolf in Pulp Fiction: the one you call when you need a professional to clean up your mess.

And the iPad’s user interface still looks a bit messy. You may be accustomed to it. You may be so proficient at moving inside of it that you even love it. I’m not here to criticise your preferences or the iPad as a device. You wanted a ‘faster horse’ — enjoy your faster horse[1]. I’m simply speaking from a conceptual standpoint. And from that standpoint, what I see is that the iPad’s user interface is a patchwork. Features, gestures, combinations of gestures, user interface layers, all stitched together over the years. 

Steve Jobs was quoted as saying: “People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are. You have to pick carefully. I’m actually as proud of the things we haven’t done as the things I have done. Innovation is saying ‘no’ to 1,000 things.”

By contrast, it appears the iPad is increasingly saying yes to everything.

Those who have no problems with the poor discoverability of several gestures or features still see the iPad as a flexible device that adapts to the needs of its users. They say, “If you feel that the multitasking interface is opaque, it’s okay. You’re not accustomed to it, and you probably don’t need it. The iPad keeps being intuitive for those who only use it at a basic level.”

From a visual standpoint, there might be very little difference between a feature that is not visible and a feature that is out of the way. Conceptually, however, this is a big deal. A feature that is not visible, and that you can only find by reading about it somewhere or watching a video tutorial, is undiscoverable and poorly executed. A feature that is out of the way, but whose existence the system hints at, indicates at least a modicum of design-oriented thinking behind it. If the iPad’s user interface were truly well thought out, the so-called ‘pro’ features would be more discoverable. I wouldn’t get feedback messages from regular folks telling me, I didn’t know I could do this on my iPad, with some even adding that they discovered a gesture or feature while erroneously performing a known one.

The more layers of interaction you give to the device, the trickier things get. If the solution to a previously undiscoverable feature is to make the feature (more) discoverable through the use of a different input source, you may have found a way out of the dead end you got stuck in, but it’s not good design, strictly speaking. (I remember an exchange between a woman and an electronics shop’s employee: after buying a Windows laptop she returned to the shop to complain about the poor trackpad performance, and the employee told her to “just use a mouse”. Why not make a better trackpad, instead?) 

The comparison with Microsoft’s Surface

The iPad getting proper mouse input support, and the new Magic Keyboard for the iPad featuring a regular trackpad, have naturally invited people and reviewers to draw comparisons between the iPad and the Surface. But I don’t see it as Apple ‘catching up’ with Microsoft. I see it more as Apple bringing their racing car to a different kind of championship.

Microsoft’s Surface may have its flaws. Its user interface may have its inconsistencies and limitations, but it doesn’t bear the signs of the iPad’s long-standing identity crisis. The Surface and the iPad have different origin stories, and those are reflected in the way you approach and use these devices.

The Surface wasn’t really born as a pure tablet with a tailored mobile operating system on it. The concept Microsoft wanted to contribute was that of an ultracompact laptop first, with tablet functionalities added on as a convenient alternative for performing quick tasks as needed, without burdening the user with a device fixed in its laptop configuration and behaving like a laptop all the time. 

Still, all the devices in the different Surface product lines are essentially laptops (of different weights and capabilities) that can work as, or transform into, tablets when the need arises. Even the first generation of Surface devices back in 2012–2013 were hardly ever seen in the wild without their keyboard, despite it being ‘optional’. They’re very much touchscreen computers with a tablet mode, with productivity as their main purpose. Technically, their Apple counterpart would be something more akin to a ModBook than an iPad.

Their operating system, in one way or another, has always been some version of Windows with additional touch- and tablet-friendly features enabled, to make the Surface a more versatile device. 

The Surface knows what it is. And Surface users know what to expect from it, in terms of functionality and interface. The user interface could be improved here and there, but it’s not ambiguous. The levels of interaction comfort aren’t either. There is a distinctive best/good/okay comfort range as you go from operating a Surface like a Windows laptop, to using it as a tablet with pen input, to using it with touch input with just your fingers. But that feels fine because that’s the experience the Surface is supposed to provide. 

What Microsoft has strived to do over the past eight or so years has been to improve the Surface experience within that model, within that paradigm, and I’d say they’ve been rather successful at that. The next step is represented by devices like the Neo and the Duo, which introduce the new dual-screen idea in form and function. The aim is, again, to improve productivity by creating a literal dual space to multitask and facilitate interoperation between apps and tasks, if and when needed. 

The iPad, on the other hand, has had a more varied history, and has been more of a chameleon — with regard to both purpose and interface. It was born as a separate device with unique characteristics to fill the perceived void between a laptop and a smartphone. In 2010, when introducing the iPad, Steve Jobs said, In order to really create a new category of devices, those devices are going to have to be far better at doing some key tasks. They’re gonna have to be far better at doing some really important things: better than the laptop, better than the smartphone.

And in its first iterations, the iPad was exactly that; its identity was pretty clear — ‘a big iPhone’ that could be just as easy to use as an iPhone, but better at doing certain things due to its bigger display. And better than a laptop because certain basic tasks and operations were simply more intuitive to carry out thanks to the multi-touch interface. That really killed all the remaining netbooks still in use at the time, and many non-tech-savvy people were happy to use a small laptop-sized device that was much less intimidating than a traditional computer. All thanks to its user interface and its very operating system, which was not Mac OS X slapped on a touch-based device, but something that felt much more integrated and suitable for such a device. The learning curve was also low because people already knew iOS thanks to the iPhone’s success.

Then, unfortunately, Steve Jobs passed away.

I can see your eyes rolling from here, but bear with me. Although I’ve never denied my utter preference for Jobs’s leadership over Cook’s, I’m not trying to argue that the iPad would necessarily have had a better development and trajectory under Jobs, but it’s undeniable that the iPad is perhaps the device that has suffered the most from Jobs’s absence. Under his tenure, Apple released the first-generation iPad and the iPad 2. The iPad 2 was a first real improvement over the iPad 1: it was thinner, more powerful, and it had cameras. The iPads that came out afterwards, between 2012 and 2015, were essentially the same thing as the iPad 2, with obvious improvements in the hardware, and some improvements in the software. Conceptually, very little moved forward. The iPad Air 2, produced between 2014 and 2016, for all intents and purposes was just like the first iPad, just faster, better, and with more capable apps.

As for its conceptual evolution, as for changing the computing experience altogether, however, the iPad felt like a device stuck in stagnant waters. And it still felt pretty much like a device that didn’t know what it wanted to become. It was created as a consumption device first, with the ability to serve as an artistic tool for creation and to handle the occasional productivity task if you tried really hard, with the right apps, and jumping through the right hoops. Styluses and external keyboards have always been usable with it, but the iPad has always been a ‘touch-first’ device, meant to be used like a tablet, not like an ultraportable laptop. I can’t speak for Jobs here, but I’m pretty sure he would have said something like, If you need to use the iPad as a laptop replacement, maybe it’s better if you just use a real laptop.

But then an increasing number of people, especially tech nerds, started to demand from Apple something more akin to Microsoft’s Surface in features and functionality. And Apple, from 2015–2016 onwards, started to oblige, little by little. And so they have been repurposing the iPad as they go along, without really jettisoning anything. The process has been utterly additive. Employing Jobs’s famous analogy of trucks and cars, I’d say that from its origins as a sports car, the iPad has progressively become a sports car that can be retrofitted with a trailer, off-road tyres, a 4WD transmission, and so forth. 

Some look at the latest iPad Pro, at the full support for mouse input in iPadOS 13.4, at the new Magic Keyboard with trackpad, as a winning combination of tools that make the iPad a truly versatile device. And maybe it is so from a practical standpoint. Again, conceptually, I look at ten years of the iPad and I see its trajectory as going from being a ‘jack of some trades, master of some’ to being a ‘jack of all trades, still master of some, but not all’. 

The story and evolution of Microsoft’s Surface are perhaps simpler and less ambitious, but over the years have proceeded with a much clearer process, iterations, and intentions. Apple now probably aims for the iPad to be a sort of blank-slate device, so technically capable that it can do anything you want it to do. But all this retrofitting to make it also behave like a compact laptop has been — still is — a painful process to behold. I keep feeling the iPad could have been so much more in so many different, countercurrent ways, and all it has done in ten years is to become something more conventional.

Where the iPad is truly at the forefront today is hardware (industrial design + manufacturing + tech specs). But idea, concept, purpose? Not anymore. Others are trying to match the iPad in hardware, while Apple is borrowing ideas and purposes from others. If there’s combined progress in all this, it’s inertial.

Again, I can’t be sure, I don’t have the ability to see alternate timelines, but I truly wonder what Jobs’s ultimate idea for the iPad was. What direction he wanted to point it in. I’m not saying that things would have been better if Steve Jobs were still among us. But I’m sure we would have felt a stronger sense of direction for the iPad. A clearer vision, even if a more polarising one, perhaps. 

What I felt back in 2010–2011 was that Jobs’s plan could have been to gradually evolve the iPad into a unique computing device, using the tablet format and the multi-touch interface to effectively revolutionise what it meant to be productive using something that is not a traditional computer; to end up with a device that could go beyond the old and established paradigms and metaphors of traditional desktop computing. If he had wanted the iPad to progressively become a Surface-like device, he would have probably sherlocked the aforementioned ModBook and created a touch MacBook with Mac OS X.

Maybe this is the root of my general feeling of disappointment in the iPad — that Apple didn’t make enough efforts to come up with a transformative UI that could revolutionise how people can be productive on a tablet, without having to resort to traditional paradigms and input devices. Without reinventing the computing wheel for so many tasks just so they can be easily carried out on an iPad, even when it would make much more sense to just use a laptop.

Yes, maybe my expectations have always been high on this front. But not unreasonably so. Is it really too much to ask of a tablet today, after seeing how innovative certain parts of the Apple Newton’s user interface could be more than 20 years ago?

For some, having the iPad acquire more Surface-like capabilities may be a success, a much-awaited move that will solve so many things. For me, this move, which brings the iPad even closer to a Mac laptop in functionality, in turn makes the iPad even less compelling. 

The big picture

Judging by previous feedback I received after publishing other articles on the iPad and ranting about my disappointment, a lot of people think I’m still clinging to the past, to the Mac and traditional computers, that I’m averse to change, that I’m ‘old’ and not flexible enough to adapt to this bright future of computing spearheaded by this incredibly awesome and innovative device that is the iPad. 

Others mistake my criticism for the iPad at the conceptual level for criticism aimed at the device itself. Nothing could be further from the truth. I do think the iPad is an impressive device. I don’t deny it’s an engineering feat. I absolutely think you can do all kinds of serious work on it. And I’m happy for all those who are able to make the most of it. (Yes, whenever the iPad vs Mac debate rages on Twitter, I have indeed indulged in some sarcasm. But come on, who doesn’t on Twitter?)

However, as someone who for several years has cultivated a deep interest in the history of computing and the user interface, I simply can’t look at the iPad (or the Surface, for that matter) and see real progress. Again, I’m not talking about computing power and features. The iPad Pro today is so much more powerful than a supercomputer from the 1970s. I’m talking conceptually. The ideas that drove the computer scientists at the RAND Corporation to create the RAND tablet in the mid-1960s were more advanced in scope than the ideas behind any tablet available today. And in certain respects more daring, as that tablet was meant to be operated without any keyboard whatsoever: it had amazing handwriting recognition for the time, and all input came via its stylus. And some of the capabilities of Sketchpad, the groundbreaking program written by Ivan Sutherland in 1963, are still hard to beat in intuitiveness and execution, almost sixty years later.

So when I see a tablet device in 2020 become more usable thanks to it finally supporting mouse input of all things, and not because of some other advancement in touch technology, input method, user interaction or user interface design, forgive me if I feel underwhelmed and a bit disheartened. What we do with our devices today is something people like Alan Kay envisaged in the 1960s and 1970s. So no, I’m not clinging to the past or averse to change. I see where we are today and I’m baffled we haven’t advanced further. Or rather, the hardware has. But concepts, paradigms and metaphors are still the ones that have been circulating for more than sixty years. Today I see future-looking hardware marred by backward-looking software, interfaces, and interactions. In a sense, everyone’s clinging to the past, in one way or another.

Then why do I still choose the Mac over the iPad? Until I see real progress on the fronts I mentioned above, why should I waste time, money, and energy to do on an iPad the same things I can already do with ease, experience, and efficiency on a Mac? I would gladly undergo the re-learning process if it meant mastering a new device or interface concept that brought significant benefits over ‘the old ways’ in terms of interaction, productivity, fulfilment, and so forth — or even enabled something meaningfully new, something that was not possible before. But for now I keep seeing ‘the old ways’ re-emerge here and there behind the iPad’s external layer of coolness. I can’t be averse to change when I don’t really perceive change in the first place.

 


  • 1. A reference to the famous quote attributed to Henry Ford (used by Steve Jobs as well): “If I had asked people what they wanted, they would have said faster horses.” ↩︎