After WWDC 2020: bittersweet Mac

Tech Life

I won’t be going through everything that was announced at WWDC 2020. So many other sources have already written very comprehensive overviews; and I’m terrible at overviews anyway — I always end up forgetting something.

I must also confess that when I started watching the WWDC keynote, I was anxiously waiting for the Mac segment and the talk about the transition from Intel architecture to ARM (which Apple calls Apple Silicon, at least for now, in their characteristically generic nomenclature). So I paid very little attention to the Apple Watch news, and enough attention to the iOS/iPadOS part to be at least aware of the main changes.

Oh, and speaking of iOS, I’ve been a bit amused by how many pundits and commentators have talked about Widgets in iOS 14 as being this huge deal. Granted, it’s the first major change in iOS’s Springboard since Folders in iOS 4 ten years ago. And granted, the widgets they showed look cool and their implementation appears to be well executed. But to my eyes the mix of widgets and regular apps gives the Springboard a busier look and feel. On Android, widgets can be positioned on the screen with fewer constraints and can be kept visually separated from app icons. And although the behaviour of iOS’s widgets has been considered more akin to Windows Phone’s Live Tiles, Windows Phone still manages to look more elegant, because in that operating system there’s virtually no distinction between ‘app icon’ and ‘widget’ — everything displayed on the main screen is, well, a tile, with customisable sizes.

The transition to ARM: unexpected magnanimity

As you know, I was very worried about how Apple would handle the Intel-to-ARM transition. In my pre-WWDC post, I wrote:

The previous transition, from PowerPC to Intel chips, gave users an inordinately long grace period when it comes to software and backward compatibility. […] If you consider that the last minor release of Snow Leopard (10.6.8) was released in June 2011, this means that you could still run a PowerPC app on an Intel Mac as late as five years after the transition was complete, hardware-wise. 

That long grace period was in large part made possible by Apple releasing Rosetta, a dynamic binary translator included in the Intel versions of Mac OS X, allowing people to run PowerPC apps at almost native speeds.

For this next transition, I speculated that Apple wouldn’t bother investing time and resources in developing a similar software tool. I predicted more pragmatism on Apple’s part and said that the company would require developers to rewrite their apps to run on ARM Macs, and if you still needed to run Intel apps, well, you could keep your Intel Mac around until you were ready to make the jump to ARM. In other words, I predicted that Apple would largely place the burden of transitioning on developers and users, in an Either you follow us or you’ll be left behind fashion.

I was genuinely surprised, and relieved, when I saw the Universal 2 and Rosetta 2 icons appear on the screen. It seems that Apple is willing to stick to the same approach they chose for the PowerPC-to-Intel transition, and that’s a good thing in my book. They said it will take them two years to complete the transition, and that Macs with Apple Silicon will start to appear later this year. I particularly appreciated the way they delivered the message about leaving Intel behind: while internally they probably cannot wait to get rid of Intel chips inside Macs, their public-facing stance is much more nuanced: we’re not dropping Intel support overnight, there are still new Intel Macs in production, and, in Cook’s own words, We plan to continue to support and release new versions of Mac OS for Intel-based Macs for years to come. Again, quite a relief.
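Incidentally, the translation layer is easy to poke at from the software side. Here is a minimal Python sketch (mine, not anything from Apple’s tooling) that a script could use to tell whether it’s running natively on Apple Silicon or being translated by Rosetta 2; it assumes the sysctl.proc_translated flag that Apple exposes on Apple Silicon Macs, and on an Intel Mac it simply falls back to reporting the native architecture.

import platform
import subprocess


def rosetta_status() -> str:
    # platform.machine() reports the architecture the *running* Python binary
    # was built for: 'arm64' natively on Apple Silicon, 'x86_64' on Intel or
    # when the interpreter itself is being translated by Rosetta 2.
    machine = platform.machine()
    try:
        # Apple exposes sysctl.proc_translated on Apple Silicon Macs:
        # '1' means the current process is running under Rosetta 2 translation,
        # '0' means native; on Intel Macs the OID simply doesn't exist
        # (-i makes sysctl stay quiet about that).
        translated = subprocess.run(
            ["sysctl", "-i", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        translated = ""

    if translated == "1":
        return "Intel binary translated by Rosetta 2"
    if machine == "arm64":
        return "running natively on Apple Silicon"
    return "running natively on " + (machine or "an unknown architecture")


if __name__ == "__main__":
    print(rosetta_status())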

If you’ve been reading me for a while, you know I’ve spent these past years worrying about how much Apple really cares about the Mac, and those recent stretches with a lack of meaningful updates, the degrading software quality, and the way Apple handled the butterfly keyboard fiasco didn’t exactly give me hope. But I’ll be honest: after hearing how Apple plans to handle this next Mac transition, and especially after watching John Gruber’s discussion with Craig Federighi and Greg Joswiak, I feel more reassured about the future of the Mac.

Mac OS Big Sur

The snag is Mac OS’s new look. I’m not a fan. But you knew that. I’ve already mentioned this on Twitter: if you want to have a good review/recap of iOS 14 and Mac OS, watch this video by Quinn Nelson. When you get to the part where he criticises Mac OS Big Sur’s user interface, pay attention to his commentary, as we’re pretty much on the same page. He makes some funny remarks like, System Preferences look like Apple paid a guy on Dribbble $30 to make it in 2 days. Or, This looks like something that Xiaomi would call MiMac. Or, Looks like Apple tried to knock off their own OS.

I get that the major force driving this visual change is to make Mac OS look and feel more similar to iOS and especially iPadOS. Apple is decidedly marching towards a homogeneity across its major platforms that will soon embrace both software and hardware. But in everything I’ve seen of the user interface of Mac OS Big Sur so far, I’ve noticed how Apple seems to prioritise looks over function. In his video, Quinn Nelson also makes a more serious remark about the UI: It just seems too simple and yet too cluttered; which was exactly my very first impression as soon as I saw the demos.

In the WWDC keynote, there’s this bit from the short video with Alan Dye (Apple’s VP of Human Interface) where he says:

We’ve reduced visual complexity to keep the focus on users’ content. Buttons and controls appear when you need them, and they recede when you don’t.

And that’s one of the main things that bother me about Big Sur’s UI. I’m not a VP of Human Interface, but I’d say that a desktop operating system you interact with using complex and precise input methods and devices can in fact afford a certain visual complexity without getting in the user’s way. Which is what I (and I suspect many other people) have always loved about Mac OS: an operating system characterised by a user-friendly, easy-to-use, but not-dumbed-down interface. I’d hate to see a progressive oversimplification of the Mac’s UI that could potentially introduce the same discoverability issues that are still present in iPadOS.

I’ve always considered the look of an operating system to be a by-product of how it works, rather than a goal to achieve, if you know what I mean. If something is well-designed in the sense that it works well, provides little to no friction during use, and makes you work better, it’s very rare that it also ends up being something ugly or inelegant from a visual standpoint. How it works shapes how it looks. If you put the look before the how-it-works, you may end up with a gorgeous-looking interface that doesn’t work as well as it looks.

The renewed insistence on transparency and the alarming amount of reduced contrast in many parts of the UI make the experience look as if it were designed by twenty-somethings with perfect vision for twenty-somethings with perfect vision. The Accessibility preference pane looks more and more like a place devoted not to people with physical impairments, but to people who are not on Apple’s design team or who are not within the trendiest segment of the intended target audience.

It’s just the first beta, though. I hope things will improve as betas progress. I hadn’t felt this kind of visual-change shock since the introduction of Mac OS X 10.10 Yosemite, with all that UI flattening, bold colours, and poorly-chosen Helvetica Neue as system font.

Quietly optimistic

I usually avoid posting my impressions right after a big Apple event, especially one as filled with new stuff as this WWDC 2020. This time I’ve purposefully forced myself to wait, and take in other people’s impressions and observations first. My first reaction to Mac OS’s new look was one of shocked incredulity, some of my comments were bitter and destructive, and while it’s still hard to look at Big Sur without wincing, rage-quitting a platform after using it for 31 years isn’t really a thoughtful alternative. I’ve decided, now more than ever, on a wait-and-see approach. I’m still not upgrading my current Macs, but I’m considering getting another Mac that is modern enough to run Catalina and the Big Sur betas, and using it as a disposable test machine. I prefer sharing any detailed criticism of Big Sur’s UI after I’ve used it myself on a Mac, in real-life, real-production conditions.

The hardware part of this transition is admittedly what’s keeping me interested and downright excited at the moment. I’m very curious to see how having powerful and power-efficient processors will affect hardware design in future Macs. While I bet that Apple can’t wait to get back to designing thinner and thinner laptops, it would be interesting to see whether they release even slimmer desktop machines as well. I just hope ports won’t keep disappearing as devices get thinner. Anyway, it’s clear that Apple has plans for the Mac, and while I may not fully agree with the direction they want to push it in or how they want to transform it, it’s still better than not having the Mac around at all.

I think.

A few thoughts before WWDC 2020 on the next Mac transition to the ARM architecture

Tech Life

Like many, I have the feeling that this year’s WWDC is going to be a particularly meaty one. But frankly, of all the things Apple’s doing now, what concerns me the most is the rumoured Mac transition from Intel to ARM chips.

Looking back at my own blog, I’ve realised that the tech world has been musing about this for at least two years. But in 2018 it was mostly a ‘what if’ scenario. Today we know that something is really about to happen, one way or another.

Since the publication of Mark Gurman’s article, I’ve been reading a fair amount of commentary about this next transition. Most of the more balanced takes seem to share some optimism about how Apple will handle the transition. Since the company has already handled two chip architecture transitions in the past (Motorola 68K CISC → PowerPC RISC in 1994, and PowerPC → Intel in 2005–2006) — and things went rather smoothly in both cases — they guess that this third transition will play out just as smoothly. 

I don’t entirely share this confidence. The past two transitions were handled by two very different Apples, and this next one will be handled by yet another different Apple. Same company, but different times, different people, different leadership, different priorities.

Back in 2018, in Speculation and dread for the next transition, I wrote:

All these major transitions [In the article I also included the major operating system software transition, from Mac OS 9 to Mac OS X, that took place in 2001–2002] have common characteristics:

  • They were all rather user-friendly and customer-friendly.
  • They weren’t particularly rushed: there was both preparation and confidence on Apple’s part, and they unfolded over a long period of time and at an acceptable pace. Users had to update eventually, but they were given plenty of time to do so.
  • All these transitions were for the better. […]

Like others, I’m sure that the ARM architecture, coupled with Apple-designed custom chips, will benefit the Mac from a performance and power-consumption standpoint. But what still makes me apprehensive about the whole thing is how Apple — this Apple — will handle the software transition.

The previous transition, from PowerPC to Intel chips, gave users an inordinately long grace period when it comes to software and backward compatibility. Shortly after Jobs announced Apple’s plans at WWDC 2005, many developers started converting their PowerPC-only apps into Universal Binaries (apps that could run on both architectures). And for all those older PowerPC apps that were not converted into Universal Binaries or recompiled to run exclusively on Intel Macs, a dynamic binary translator called Rosetta — included in the Intel versions of Mac OS X Tiger and Leopard, and optionally available under Snow Leopard — allowed people to run PowerPC apps at almost native speeds. If you consider that the last minor release of Snow Leopard (10.6.8) was released in June 2011, this means that you could still run a PowerPC app on an Intel Mac as late as five years after the transition was complete, hardware-wise.

For this next transition, I really don’t expect such a generous grace period. When it comes to Mac OS, in recent years Apple’s attitude has been like, Let’s try and get rid of whatever we don’t have the time or the manpower to fix. To be fair, Apple’s willingness to drop as much baggage as possible whenever possible has always been one of its most characteristic traits for the past 20 years, but Cook’s Apple seems particularly interested in doing so considering just how many platforms they’ve chosen to juggle.

From what I understand, on a technical level, things stand like this:

  • Recompiling an Intel app to work under the ARM architecture should be a relatively easy task (with exceptions), compared to what recompiling a PowerPC app to work under the Intel architecture was 15 years ago. Given the appropriate tools, of course (see the small sketch after this list).
  • Emulating an Intel app so that it runs on an ARM-based Mac, while possible, doesn’t offer adequate or usable performance. One should not expect Rosetta-like results, in other words.
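To make the ‘fat binary’ idea a bit more concrete, here is a minimal Python sketch that lists the CPU architectures contained in a Mac executable, the same kind of check people ran on Universal Binaries fifteen years ago. It only shells out to the lipo tool that comes with Xcode’s command line tools, and the Safari path is just an example.

import subprocess


def architectures(binary_path):
    # 'lipo -archs' prints a space-separated list of the architectures
    # contained in a Mach-O file, e.g. 'x86_64' for a current Intel-only app,
    # or 'ppc i386' for an old Universal Binary of the PowerPC/Intel era.
    result = subprocess.run(
        ["lipo", "-archs", binary_path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.split()


if __name__ == "__main__":
    # Example path only; point it at any app's main executable.
    print(architectures("/Applications/Safari.app/Contents/MacOS/Safari"))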

Some have speculated that Apple, to ease the transition, could initially ship Macs that contain both an ARM processor and an Intel processor, the latter taking care of running x86 code at native speeds. Sort of a hardware Rosetta, if you like.

The scenario I’m fearing, however, sees a more pragmatic stance on Apple’s part, and it would play out like this:

  • Apple announces their transition plans at WWDC 2020, previewing the tools the company will make available to developers so that they can recompile/rework their apps to run under the new architecture.
  • Apple will offer new ARM-based Macs gradually, so that if you still need to run Intel-based apps that are either not updated anymore or whose redevelopment needs time, you can keep using your Intel Macs to run them; and if you still need such apps after the hardware transition is complete, well, you better keep that old Intel Mac close to your chest, because ARM-based Macs will only run ARM apps.

In other words, I think that Apple, once the wheels of this next transition are set in motion, will do the bare minimum to make it smooth for developers and end users. The refrain will be: Developers, here’s what’s new. Get to work, the sooner the better.

If this happens, it’ll certainly result in even more pruning of all the software that, for one reason or another, won’t be ported to the ARM architecture. And this, after the already bitter pill of Mac OS Catalina dropping support for 32-bit apps (a vast catalogue of perfectly good software), will be another hard pill to swallow.

The excuse will be that ARM Macs are going to be more efficient, more powerful, more secure, and with a fresh catalogue of optimised apps to run under the ARM architecture. Apps notarised by Apple, with the company’s seal of approval. Everyone wins! Well, everyone except those who would like for the Mac to keep being a versatile and ‘open’ platform, where you can install apps developed by anyone if you want; where you can use Boot Camp to reboot your Mac into Windows if you want (or — gasp — need to); where you can maybe enjoy all those nice vintage 32-bit Intel games every now and then.

Apple can put in place a lot of solutions for developers and users to ease the hassle of this new transition, and I really, really hope they’ll be gracious enough to do so. But when I start looking at what the incentive to do so could possibly be, I can only think of “To avoid further alienating Mac developers and users”, and I can’t help thinking that Apple is quite ready to take that risk.

This is, in a nutshell, the main reason for my current apprehension. I’m waiting for Monday with trepidation, hoping to be wrong — or to have been too pessimistic — about this.

Observations after an SSD failure

Tech Life

Intro: going SSD and enjoying the fast lane

Believe it or not, in late 2016 my 2009 MacBook Pro was still my primary machine. Evidently, my type of work doesn’t require cutting-edge performance; still, the Mac’s internal 5400rpm hard drive was by then more than 80% full, and every restart was a pain, especially considering the number of services and login items that get activated at boot. From the initial Apple logo to a responsive Desktop, the MacBook Pro was taking more than 4 minutes to boot. It was time to get a solid state drive.

I had heard from other people who had already upgraded their older Macs to SSD storage, and they were all astounded by the sheer increase in overall responsiveness. The consensus was that their Macs were getting a new lease of life, that the difference was so noticeable it was almost like having a new Mac.

Since the internal optical drive of the MacBook Pro had died a few years earlier, and the hard drive was still working well, the perfect upgrade solution was to purchase an OWC Data Doubler Kit so I could replace the dead optical drive with an SSD, while retaining the original hard drive. The new arrangement became a sort of poor man’s Fusion Drive: I would use the SSD for the system and applications, while leaving all space-consuming data (photos, videos, music, etc.) on the hard drive.

And what other people had said was true. After installing Mac OS X and rebooting the Mac, I couldn’t believe my eyes. The full boot process had gone from 4 minutes and 40 seconds down to about 35 seconds. Everything became incredibly responsive. Most applications opened instantly, and saving a few seconds for every little thing you do on your Mac while working means that ultimately you end up saving a considerable amount of time on the whole.

The SSD had also been a lifesaver for another reason. The time to upgrade my MacBook Pro was clearly drawing near anyway, but by then Apple had released the new MacBooks with the dreaded butterfly keyboard design, and that, combined with the much increased prices, didn’t feel appealing at all. The SSD, with the performance increase it had provided, had also bought me time to decide how to proceed with an eventual upgrade.

Anyway, to make a long story short, I was able to delay an upgrade until mid-2018; I got a 21.5‑inch 4K retina iMac instead of a laptop; and then in late 2018 I also purchased a used 2013 11-inch MacBook Air, thus splitting portability and sheer performance in two different setups. 

Warning signs?

By the end of 2018 and in the early months of 2019, the old MacBook Pro had begun displaying several worrying issues that made me think it was on its last legs: random shutdowns, inability to access the discrete graphics card, temperature sensors acting up or not displaying information at all, and general unreliability.

A couple of times, after waking the MacBook Pro from sleep, everything was frozen and I had to force a reboot. Then, on another two occasions, the SSD was not detected at boot and the Mac started from the old recovery partition in the hard drive. In both cases, shutting down the Mac and leaving it alone for a while was enough to fix things, and at the next reboot the SSD was detected as usual. When it happened another time, it appeared that only a reset of the SMC would take care of the issue.

Warning signs? I’d tend to agree, except that the SSD kept working fine for many months afterwards, with no strange behaviours or reduced performance. 

Just like that

Fast forward to March 2020, when I started noticing that the MacBook Pro’s trackpad wasn’t clicking properly and it took more effort to get a consistent click out of it. After close inspection, it was clear that the MacBook Pro’s battery, located underneath, had started to swell, pushing upwards against the trackpad.

Knowing the hazards of using an electronic device with a swollen Li-Ion battery, I decided to act promptly. I turned off the Mac, opened it, removed the battery very carefully, gave the fans and vents a quick clean, closed the Mac, and switched it back on. The SSD wasn’t being detected. 

I did what I had done the previous times. Turned off the Mac again and left it alone for a few hours. Nothing. Reset the SMC multiple times. Nothing. Swapped the SSD and hard drive (in case something was wrong with the SSD connector). Nothing: the hard drive was being detected as usual, but not the SSD. I removed the SSD and installed it in an external SATA enclosure with its own separate power supply. Nothing.

The SSD was gone. Just like that.

I didn’t have a backup of its contents but — before you start accusing me of carelessness — a backup wasn’t really necessary. When I purchased the iMac in mid-2018, I transferred the MacBook Pro’s data to it using Migration Assistant. In the end the actual data loss was fortunately limited to some stuff I had downloaded and/or archived in the Downloads folder. Overall, the failure of this SSD hasn’t been a catastrophic event.

Still, I found myself thinking about this a lot. Because it could have been. It could have been catastrophic. I thought about all those regular folks who are often told that SSDs are more reliable than hard drives; folks who are sometimes lulled into a false sense of security and think that backups with SSDs aren’t that urgent, after all.

In my 30+ years’ experience with computers, while I certainly have seen hard drives fail just as suddenly, in many cases their failure was progressive and gradual enough to allow the user to salvage at least some data. The original hard drive of my iMac G3 back in 2001 took almost three weeks to fail completely. I was able to salvage 95% of all the data I hadn’t backed up already. An SSD doesn’t give you a grace period; it’s like a lightbulb — one day you flip the switch and pop, it’s blown.
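Since an SSD rarely announces its intentions, the best a regular user can do is keep an eye on the little self-diagnostic data the drive reports, and back up accordingly. Here is a minimal sketch of the kind of periodic check I mean, in Python, using only macOS’s built-in diskutil; the disk identifier is an assumption, so adjust it to your machine.

import subprocess


def smart_status(disk="disk0"):
    # 'diskutil info' prints a 'SMART Status' line for internal drives:
    # usually 'Verified', or 'Failing' when the drive reports trouble.
    # External/USB enclosures often report 'Not Supported'.
    info = subprocess.run(
        ["diskutil", "info", disk],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in info.splitlines():
        if "SMART Status" in line:
            return line.split(":", 1)[1].strip()
    return "Not reported"


if __name__ == "__main__":
    status = smart_status("disk0")
    print("SMART status for disk0:", status)
    if status == "Failing":
        print("Back up now.")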

I don’t know enough about data retrieval when SSDs are involved. Maybe it’s easier to extract data from a dead SSD than it is from a dead hard drive. But Apple’s relatively recent practice of shipping computers with soldered flash storage just gives me cold sweats. Especially because if something else fails at the motherboard level, your data get compromised in the process. While you may get some assistance at an Apple Store for retrieving the data, the way Macs are designed internally today introduces — unnecessarily — new points of failure.

T2 chip and Catalina, your overprotectors

On this matter, I found a recent video from Louis Rossmann to be particularly illuminating. If you don’t know him, he runs a repair shop in New York City that specialises in Apple laptop repairs. On his YouTube channel he often posts videos showcasing specific repairs, and talks about the issues he encounters. He’s known to be opinionated, and he also uses his channel to talk about other things that aren’t strictly related to his job or to technology topics. He can be polarising, no doubt, but I don’t follow him for his opinions. I’m interested in his technical expertise.

Back to the video: it’s called An important message from Louis Rossmann, but in the thumbnail you can see the more specific message: T2 + Catalina = No data! The video is short, 6 minutes, so I suggest you watch it in its entirety. The gist of it is that sometimes the firmware in the T2 security chip (which, if I’m not mistaken, is basically inside every Mac produced today except the iMac line) gets corrupted, and that leads to corrupted data. Before Mac OS 10.15 Catalina shipped, Rossmann says, the problem was relatively easy to fix. Catalina, however, automatically opts you into enabling Secure Boot, a feature that, as the Apple Support page states, “make[s] sure that only a legitimate, trusted operating system loads at startup”.

“Now the problem here,” continues Rossmann, “is that if you enable Secure Boot, I can’t boot the machine into an external operating system in order to try and grab files off of your corrupted operating system the way that I used to, and I’m not able to access the drive as well via the ‘Lifeboat connector’ to get information off the soldered-on SSD because there is no more ‘Lifeboat connector’ after 2017.”

“Now, what’s really really bad here is that if you have Secure Boot enabled and your T2 firmware just decides [for] whatever reason it’s going to die, the only way that I can get the computer to work again is by… destroying all of your data. I need to erase it in order to get the computer to work again. But your data wouldn’t have been retrievable anyway, because the computer is dead. So, when the computer is dead your data is there, but to get the computer to not die, I need to erase your data. And what’s really bad with Catalina is that it seems to opt people into this by default. […] Every single customer that we have explained this to has said ‘I don’t remember opting into that, I don’t remember choosing that’.”

Rossmann’s suggestion to all people using Catalina is therefore to go and disable Secure Boot if they want to have a chance at recovering their data should a failure of the T2 chip occur.
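If, like me, you’re not sure which of your Macs are even affected, it’s easy to check for the presence of a T2 chip before deciding how worried to be about Secure Boot. A small Python sketch follows, assuming system_profiler lists the chip under the SPiBridgeDataType section, as recent macOS releases do.

import subprocess


def has_t2_chip():
    # system_profiler's 'iBridge' section describes the T1/T2 coprocessor;
    # on Macs without one, the section comes back essentially empty.
    output = subprocess.run(
        ["system_profiler", "SPiBridgeDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    return "T2" in output


if __name__ == "__main__":
    print("T2 chip present" if has_t2_chip() else "No T2 chip detected")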

I realise this doesn’t have much to do with my personal misadventure, but I thought it was a tangential subject worth mentioning. It’s another of those puzzling Apple decisions that make me less enthusiastic about a platform and an ecosystem I’ve been using and endorsing for decades. For now I’m pretty happy with my iMac + 11-inch MacBook Air setup (and the MacBook Pro is making a comeback with a new SSD and, soon, a new battery), and I feel I’ve dodged a bullet. But what about the next time, when a new upgrade becomes inevitable down the road?

As I’ve already said several times, I still think a good strategy (for myself) is to get a new, separate Mac to run Mac OS Catalina and higher, while keeping the tried-and-trusted machines on High Sierra or Mojave.

But another part of the strategy — in general, and especially when getting a Mac with a T2 security chip — is to back everything up as often and as paranoidly as possible. It’s infuriating to see that loss of data is a problem that doesn’t seem to go away as technology progresses (much like configuring printers, haha). What’s worse with SSDs is that loss of data is always sudden. Sure, you read that SSDs have a certain life expectancy and that their cells can be written a great many times before degradation and failure, but there are so many factors at play that SSDs are possibly more unpredictable than hard drives. I don’t have valid statistical data here to use for meaningful comparisons, but 3 years and 3 months of normal use seems an awfully short lifespan for an SSD. I have two hard drives in a Power Macintosh 9500 that are more than 20 years old and have been operational 16 hours a day for at least 10 years, and they’re still working to this day.

Backup solutions

As for backups, my preferred strategy is still relying on manual operations combined with Time Machine backups and having the most crucial, must-not-be-lost data redundantly copied to the cloud. I feel particularly lucky because I’ve been using Time Machine since Mac OS X Leopard and I never had a problem with it. (No, really.) But Time Machine has proven to be particularly buggy with APFS and Catalina, so I wouldn’t recommend relying entirely on it. Other trusted products I have used in the past and still use on specific occasions are SuperDuper! and Carbon Copy Cloner. Both their developers have made heroic efforts to fight against Catalina’s quirks and bugginess in order to ensure the compatibility and reliability of their software. I truly recommend these products without reservation.
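For the curious, this is roughly what the manual side of that routine looks like when scripted: mirror a few must-not-lose folders to an external volume with rsync, then ask Time Machine to start a backup with tmutil. A minimal Python sketch; the folder list and the destination volume name are mine, purely as examples, so adapt them to your own setup.

import subprocess
from pathlib import Path

# Folders I consider must-not-lose, and a hypothetical external volume.
CRUCIAL = [Path.home() / "Documents", Path.home() / "Pictures"]
DESTINATION = Path("/Volumes/BackupDisk/manual-mirror")


def mirror_crucial_folders():
    DESTINATION.mkdir(parents=True, exist_ok=True)
    for folder in CRUCIAL:
        # rsync -a preserves metadata; --delete keeps the mirror exact.
        subprocess.run(
            ["rsync", "-a", "--delete",
             str(folder) + "/", str(DESTINATION / folder.name)],
            check=True,
        )


def start_time_machine_backup():
    # tmutil ships with macOS; 'startbackup' kicks off a Time Machine run
    # without waiting for the next scheduled hourly backup.
    subprocess.run(["tmutil", "startbackup"], check=True)


if __name__ == "__main__":
    mirror_crucial_folders()
    start_time_machine_backup()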

This short, strange hiatus

Tech Life

I haven’t updated this blog in a month, and the truth is that there isn’t any particular reason why. I have indeed been feeling low-spirited and uninspired lately, but this lack of inspiration has felt different from other times in the past.

Often, what happened with my past (tech) writer’s blocks was that, while recognising there was some inspiring subject to talk about, I was simply stuck, unable to find a worthwhile connection to what I felt I was supposed to talk about. These days, at the risk of oversimplifying this, the lack of inspiration felt more like the tech world’s fault than a possible omission on my part. If that makes any sense.

What should we talk about? Let me check my feeds, see what others are talking about. 

That Siri is still pretty much a dumb assistant even after 9 years? Tell me something new.

 

Podcasts, whether free or paid, or Spotify-exclusive, and the like? Eh. I won’t go as far as saying they’re all a waste of time. It would be a lie. But my advice is that you should think carefully about how much of your listening time you’re willing to invest. In a recent chat with a friend, as we touched on the subject, he told me, You know, I do listen to a few podcasts, but the odd thing is that I always end up retaining very little of what was discussed, of what occurred. And not because I was distracted, as I actively listen to them. Some are just meant as entertainment, so I guess it’s okay if I just enjoy them in the here and now. But others… I don’t know, it feels like their impact should last longer than it does. I absolutely understand what he means. I noticed the same before deciding to unsubscribe from everything and only listen to the occasional episode here and there. This retention problem is also why I can’t for the life of me enjoy an audiobook. It’s a much more fleeting experience than reading the written page. But I guess this depends on the way each one of us is wired, and your mileage may definitely vary.

 

What else, then? The iPad? No, please, let’s skip this eagerly and completely. It’s a religious topic by now, and talking about it as objectively as possible is tricky. Not because it’s impossible, but because it’s a loaded debate. It’s like you’re supposed to take a side (pro iPad or against it), and even when you clearly aren’t, or your views are nuanced, someone out there will always believe that, deep down, you really really are an iPad fan or an iPad hater. Well, I am neither. But judging from the feedback I receive privately, I am officially an enemy of the iPad Cult.

 

Well, how about the Mac? It’s a subject that increasingly fills me with unease. While Apple has convinced me that, at least as far as hardware is concerned, they still care about it, I remain concerned about the general direction Mac OS is going. Buggier, unnecessarily strict, less flexible, and with an increasingly locked-down app ecosystem, with Apple as the gatekeeper, granting entry only to registered developers and their apps, which must be notarised. “Security reasons, y’see?”

By the way, I keep receiving feedback about Mac OS 10.15 Catalina via email. It’s still not good. After more than 200 emails, negative experiences still make up more than 95% of the messages. I’ve also recently realised just how disruptive the T2 chip + Catalina combo can be.

 

Augmented reality? Virtual reality? I wish I could find something interesting to say. If gaming ends up being the only meaningful implementation of VR, I’ll actually be fine with it. While an excellent game like Half-Life: Alyx is alone still not enough to convince me to get a VR headset, if more titles with this level of quality and immersion start appearing, I may change my mind, who knows. On the other hand, AR still feels like the kind of endeavour that keeps burning resources and has little to show for it, at least for now. Apple’s insistence on it makes me think one of two things:

  1. They’ve come up with a really useful and innovative way to implement it, and they’re working hard to ship their idea.
  2. Instead of coming up with an innovative solution to try to solve a real problem regular people have with technology, Apple is trying to find some kind of original idea in the AR field that others may have overlooked or dismissed, and they’re working hard simply to be the first at it, hoping the public can subsequently be convinced that it is indeed a cool idea.

For the record, I still haven’t made up my mind. One day I think it’s scenario 1, one day I think it’s scenario 2. Which means there’s probably a third scenario I haven’t thought about. Which is okay, as I’m pretty much indifferent to this whole AR business anyway.

Unusual things I’ve been doing these past days

Due to a surprisingly busy (but not entirely unwelcome) work schedule, I’ve had little time to tinker with my assorted collection of computers and devices, and little time to devote to experiments. But a couple of things are nonetheless worth mentioning:

1. Being a Windows user

For the past eight days or so, I’ve been using my iMac booted into Windows 10 in the Boot Camp installation I managed to perform on an external SSD. It’s been two years now since I decided to refamiliarise myself with Windows after many, many years, so the fact that this recent experience was not unpleasant or riddled with friction didn’t surprise me.

It was nonetheless striking to think that, hey, in case of a massive Mac OS platform catastrophe, I could manage the switch to Windows. Would it be ideal? Not fully. Would I love it? Not completely. But it could definitely be a bearable-to-pleasant arrangement. The truth is, this 21.5‑inch 4K retina iMac makes for a very nice hardware environment for Windows. The thin icons and system font are more readable, and certain UI details can be appreciated much more on a sharp display with rich colours. But more importantly, there are places where this Windows installation feels much faster and more responsive than Mac OS. Sure, Windows is installed on an external SSD and Mac OS on the internal spinning hard drive; and sure, I have more services and programs that load at boot on Mac OS, while on Windows I’ve enabled only OneDrive for now; as a consequence, in everyday use, Mac OS uses more RAM than Windows. 

Still, the overall feeling when using the iMac under Windows is that it’s a faster computer, with a user interface that often displays zero lag when doing what one would consider basic tasks, like opening apps, updating windows, reacting to user input, etc. Not that Mac OS is awfully sluggish in comparison, but when I’m in Mac OS certain basic operations seem to take just that tiny bit longer, enough to become noticeable. Finder windows that take several seconds before showing their contents. Apps that sometimes take an unusually long time to launch, as if the system were retrieving them from the dark, unfathomable depths of a networked volume instead of the internal drive. Spotlight search that sometimes gives you the feeling it’s waking up from some kind of slumber, instead of being ‘always ready’, and so on.

Again, I realise this is not an entirely fair comparison. I will soon move my Mac OS installation from the internal hard drive to an external Thunderbolt 3 SSD, and then I’ll be able to compare the two systems on more similar ground. But for now, this is what I have and how it feels.

Anyway, while I’m certainly not switching to Windows, it’s reassuring to know that, should the need arise, I could still function and be productive using a system that doesn’t feel alien or constantly in the way[1].

Last but not least, the fact that I can now play Windows-only triple‑A games that were previously unavailable to me as a Mac user is rather exhilarating. Just before the Coronavirus lockdown went into effect, I was about to purchase Dishonored (a game I’ve wanted to experience for years) for my PlayStation 3. Instead, I was able to take advantage of a discount and purchase the PC version online, which runs very, very smoothly on my iMac.

2. Back to a smaller phone and display

The second ‘why not’ experiment I carried out these past days was going back to a smaller smartphone display. For about two weeks, I’ve used an iPhone 4S as my daily driver, and it has been an interesting ride.

iPhone 4S in hand

As with my Windows experience detailed above, while using the iPhone 4S as my main phone I’ve had the same feeling over and over, i.e. I could still make this work if my iPhone 8 left me stranded for some reason. Of course in this case there would be real limitations, but they would involve app availability and functionality more than physical display size. In other words, if in theory I wanted to go back to using an iPhone 4S permanently, the main problem wouldn’t be the smaller display, but the fact that this phone can’t be updated past iOS 9. And if you, like me, want the smoothest iOS experience on an iPhone 4S, you’ll want to downgrade it to iOS 8.4.1 — which further restricts app availability.

And in fact the purpose behind my little experiment hasn’t been to see if I can go back to using iOS 8 instead of iOS 12 or 13, but to see if — in the age of smartphones with huge displays — I can go back to using a smaller phone with a (gasp) 3.5‑inch display.

The short answer is yes. The long answer: there are trade-offs, of course, and they’re all the obvious ones. If you watch a lot of videos, if you’re a heavy Instagram consumer, if you do a lot of photo editing on the device, a small display like this will definitely feel cramped. For other uses, it’s more a matter of habit than true discomfort. For example, while the idea of going from, say, a 5.5‑inch display to a 3.5‑inch display makes you think that reading text is going to be harder, it’s really not that big of a deal, especially in the case of an iPhone 4S, which sports a retina display. You’ll have to scroll more, sure, but on the flipside you’ll have in your hand a device which is so much more comfortable to hold that you can do practically everything with just one hand.

And for me, for my hands, comfort is important when handling a smartphone. When I look at certain apps on the bigger iPhone 8, and see that what you essentially get in their interface is more white space rather than more information, then the advantage of the bigger display becomes less significant. 

In the end, it all boils down to visual comfort versus operational comfort. Text, information, controls look nicer on a bigger device; interacting with them feels nicer on a smaller device. Oh, and app design definitely makes a difference. Whether you like Spotify’s UI or not, it scales well on a 3.5‑inch display:

Spotify - iPhone 4S

 

Watching YouTube videos, as I said, is obviously better on a bigger display. Funnily enough, however, when browsing videos inside the YouTube app, the amount of information you see at a time is essentially the same; here you can see two comparable screenshots taken from my iPhone 4S and iPhone 8.

YouTube comparison

 

As for editing photos, the well-designed interfaces of apps like Snapseed and Pixelmator made the process easier than expected.

Snapseed iPhone 4S
Snapseed’s main interface

 

Pixelmator iPhone 4S-1

 

Pixelmator iPhone 4S-2
Pixelmator’s interface

 

To conclude, I’m aware that this ‘back to a small screen’ experiment was easier for me to carry out also because this is a period of reduced mobility and so forth. And I’m also aware that for many people reverting to a display that’s almost half the size of what they’re currently using sounds like madness. But what I absolutely confirmed while using my iPhone 4S these days is that a smaller display is far from ‘unusable’. On the 4S I’ve done email, web browsing[2], read RSS feeds and articles saved on Instapaper, used Google Maps, listened to music, used Twitter; and I even took and edited photos, watched videos, and checked my friends’ photos on Instagram. All on a smartphone that fits in the palm of my hand, can be operated in total comfort, and disappears into my jeans pocket.

 


  • 1. The only notable exception is character/keyboard support. On Mac OS, I am accustomed to just typing certain key combinations to get diacritics or other commonly used typographic symbols (curly quotes, en dashes, em dashes, etc.). On Windows, these key combinations are simply not present. You have to either look for certain characters and symbols using the Character Map app, or remember the good old ALT+ASCII code combinations (e.g. ALT-0151 to type an em dash). This slows me down terribly as I type. I can add keyboard layouts and switch to them as needed to get additional diacritics and symbols when I’m writing in Italian or Spanish, but the hardware keyboard remains the same, with a British layout, and I don’t always remember which key to press to get à, è, ì, ò, ù, á, é, í, ó, ú, ñ, and the like. ↩︎
  • 2. I know what you’re thinking: how on earth can you browse today’s web on a 3.5‑inch display!? It sometimes had its challenges, like privacy- or cookie policy banners that completely took over the entire visible area. And ads are absolutely disruptive on a small display. I was lucky enough to download the Brave browser for iOS as soon as it was launched, and its first versions supported iOS 8 and 9, so that’s the browser I’ve been using the most these days on the iPhone 4S. Without it, I must say my web experience wouldn’t have been this tolerable overall. ↩︎

 

Traditional iPhone vs X-style iPhone interaction models

Software

While reading John Gruber’s review of the 2020 iPhone SE, this part caught my eye:

What this source told me is that while developing the iPhone X, members of the team would typically carry two phones with them: a prototype iPhone X they could use, but (of course) not while in the presence of anyone who wasn’t disclosed on the project, and an older iPhone they could use in front of anyone. These team members would spend time, every day, using both phones. They knew they were onto a winning idea with the new interaction design for the iPhone X when they started instinctively using the X‑style gestures on the older iPhone, and never vice versa. When a new design is clearly better than an old one, it’s a one-way street mentally.

I believed that then, but I believe it more now after spending the last week with the iPhone SE. I’ve used it exclusively for hours at a stretch and I never stopped expecting it to act like a post-iPhone‑X device. I swipe up from the bottom to go home or multitask. I expect it to wake up just by tapping anywhere on the display. I pull down from the top right corner expecting to see Control Center. I can’t stop doing any of these things unless I’m consciously thinking about the fact that I’m using an old-style iPhone. Even if I locked my personal iPhone 11 Pro in a drawer and touched no phone other than the new SE for a week or two, I still wouldn’t shake my iPhone X interaction habits unless I abandoned my iPad Pro too.

Once you get used to the post-iPhone‑X interaction model, there’s no going back. A week with the new SE has not shaken my belief that the X‑style interaction design is superior. Not one iota. 

I politely object.

When you get accustomed to a new way of doing something, going back to the old one is always an exercise in friction. Sometimes the new user interaction model is indeed an improvement over the old one. But once the new user interaction model is ingrained in your muscle memory, wanting to apply it even when you’re interacting with an older device, or with a device that still uses the older model, is only natural. But we shouldn’t confuse this naturalness with ‘better design’ — it’s a form of bias.

Even though the members of the iPhone team were using a prototype iPhone X together with a regular iPhone with a Home button, I’m willing to bet the prototype ultimately got more of their attention, since it was the one they were working on. So it’s understandable that “they started instinctively using the X‑style gestures on the older iPhone”. If you think about it, it’s more natural to want to apply a new swiping gesture to the screen of an older iPhone than to want to push a physical Home button on an X‑style iPhone that doesn’t have one.

When I was considering upgrading to a newer X‑style iPhone[1], I did a similar thing: I spent some time carrying both my iPhone 5 and an iPhone X that was kindly lent to me for research purposes. Interacting with both phones was generally a confusing experience, gesture-wise, but since I ultimately had to interact more with my own iPhone 5, you have no idea how many times I would try to swipe up from the bottom of the iPhone X to bring up Control Centre, only to either quit an application or go back to the Springboard. It was maddeningly frustrating. And when wanting to multitask, while I consciously processed the absence of a Home button in the iPhone X, I constantly had to pause for a fraction of a second, and recall the gesture in my mind. Much like when on the Mac you’re starting to forget a keyboard shortcut you don’t use that often.

Subjective experiences, however, shouldn’t be used to evaluate whether something is good design or not. The remark I’ve often heard from users who have been on X‑style iPhones since day one is that the newer interaction model ‘feels more natural’ than the older one. I believe it’s an illusion. Which doesn’t mean I believe they’re lying. It means that before the iPhone X existed, the older interaction model felt just as natural as the new one does today. This is what happens when your muscle or spatial memory is fully trained and accustomed to a way of doing things.

If we set aside ingrained habits, both interaction models have their merits, and neither is superior to the other as a whole.

  • Double-clicking the Home button to bring up the multitasking interface is both quicker and more precise than the mindful swipe from the bottom on X‑style iPhones. The gesture is unambiguous: double-clicking the Home button does only that; there is no overlap with another gesture.
  • The gesture of switching between open apps appears to be more efficient in the X‑style iPhone interaction model, as opposed to always double-clicking the Home button and then leafing through the app ‘cards’.
  • The gesture of invoking Control Centre is controversial. I can totally understand the dilemma here: a flick upwards from the bottom of the screen is by far the quickest gesture, therefore it should be associated with a high-priority/high-frequency task. In the absence of a Home button, multitasking and Control Centre became the two main contenders for that spot. I suppose a judgment call was made, and multitasking was deemed a higher priority. Quite understandable. But then where to move the Control Centre gesture? Swiping down from the top right corner feels like the kind of decision where someone in the room proposes it, there’s a long silence, and then they ask, Anyone have a better idea? And then there’s another long silence.
  • As a result, the new X‑style iPhone interaction model has two gestures that partially overlap when you start from the bottom of the screen (quitting the current app, invoking the multitasking interface), and two gestures that partially overlap when you start from the top of the screen (invoking Notification Centre and Control Centre). In the older interaction model, these four gestures/operations are more distinctly assigned. You may consider a single click of the Home button to exit an app and the double click to invoke the multitasking interface as a partial overlap, but the two gestures don’t require precision or mindfulness on the user’s part to be consistently executed all the time. They’re clearly distinct and can be executed mechanically.
  • Overall, the feature arrangement of the traditional iPhone interaction model seems to have a better degree of usability. Due to the increasing size of current flagship iPhones, the reachability of Control Centre in particular has unquestionably worsened.
  • Using Apple Pay in the new interaction model is also a bit more cumbersome than it is on iPhones with a Home button. Where you simply placed your finger on the Home button sensor to authenticate and pay, now you have to double-click the side button, then glance at the iPhone to authenticate with Face ID.
  • In theory, taking a screenshot has gone from a two-handed gesture on traditional iPhones (simultaneously press and then release the Home button and the Sleep/Wake button) to a potentially one-handed gesture on X‑style iPhones (simultaneously press and then release the side button and volume-up button). But this latter gesture makes you hold the phone awkwardly, and you can accidentally press the volume-down button and end up performing the Emergency SOS gesture instead.

I’d say that the X‑style iPhone interaction model is perhaps the best compromise Apple could come up with after removing the Home button. In the gesture reshuffling that followed, some gestures have turned out to be equally effective (exiting an app), and some even more effective (switching between open apps), but I don’t think we can conclude that it’s a better design as a whole. The two interaction models work well with the hardware they are implemented on, and work well when considered separately. They’re two different beasts.

Making comparisons is where things get tricky and deeply subjective. As a user interface enthusiast, I’ve tried my best to point out some merits and downsides of the main gestures of each interaction model. Personally, I’m more accustomed to the Home button design, and it’s hard for me to view the X‑style iPhone interaction model as more than a necessary workaround, a gestural chain reaction following one single design decision — the removal of the Home button. 

 


  • 1. While I dislike the notch and the buttonless design, I was briefly tempted to get an iPhone XR when it was time to upgrade from my iPhone 5. ↩︎