Autocorrect and predictive text have regressed since iOS 10

Software

Yesterday, John Gruber posted an intriguing piece called iOS 13 Autocorrect Is Drunk in which he argues that autocorrect on iOS 13 and 13.1 has got worse than before. This reminded me of something I started noticing last year when I upgraded to an iPhone 8 (with iOS 12) from an iPhone 5 (with iOS 10); something I had meant to investigate further but ended up postponing for the usual lack of time. What I had noticed is that autocorrect and predictive text appeared to be worse on iOS 12 than on iOS 10.

I wasn’t entirely sure about it, and for a while, a bit like Gruber, I thought that maybe it wasn’t a real issue but an ‘anti-placebo’ effect, as he called it. Perhaps it’s a matter of retraining the new iPhone to have suggestions and corrections more relevant to my writing style, I thought. But between yesterday and today I made some time to explore this thing in more detail, and after countless tries with my iPhone 5 and iPhone 8 side by side, I think I have pinpointed the main issue: while it seems that in iOS 12 autocorrect and predictive text have generally worsened compared with iOS 10, they have regressed specifically when using an iOS device with multilingual keyboards.

On my iPhones I usually keep three different keyboards: English, Italian, and Spanish. These are the languages I’m fluent in, and when using apps involving text entry (chat apps, social network apps, email clients, etc.), I often switch from one language to another. I communicate with my mother and closest friends in Italian; I exchange messages with my wife and her family in Spanish; I tweet, do most of my email, and generally communicate with the rest of my international contacts in English. What happens on a daily basis is that I launch, say, Telegram, start writing something to my wife, and realise mid-word that the keyboard is set to another language. And that’s when things get bad on iOS 12 (and, I presume, on iOS 13 as well): autocorrect and predictive text don’t consider the whole word I’m typing, but start making suggestions from the point I resume typing. This didn’t happen under iOS 10, which was smart enough to embrace the language change and adjust its suggestions accordingly.

Let me clarify this with an example. Follow this process closely:

1. I open Telegram. I want to write something in Spanish. I start typing Espero [meaning, “I hope”] but I realise I have the English keyboard selected. This is what I’m seeing on iOS 10 (left) and iOS 12 (right) after typing “Es”:

10v12 01

For starters, note how much more pertinent predictive text is under iOS 10, while under iOS 12 suggestions just look all over the place.

2. At this point, I tap the Globe button to switch keyboards. On both phones, the keyboard first switches to Italian, then to Spanish:

10v12 02

10v12 03

Note here how iOS 10 adapts to the new languages selected, while iOS 12 doesn’t appear to have a clue, at first.

3. Here’s the kicker now: when I resume typing with the newly selected keyboard, and proceed to type “pe” (the second syllable of espero, the Spanish word I meant to type from the start), iOS 12 starts suggesting words that begin with “pe-” instead of “espe-”. In other words, when in iOS 12 I start typing a word using one keyboard, and switch to another keyboard mid-word, the predictive engine starts making suggestions without considering the word I was writing as a whole, but from the point I switched keyboards (in this example, after typing “Es”). This makes no sense, and is the cause of constant friction, as I’ll show later with another example. Note instead how iOS 10 behaves more logically here:

10v12 04

4. Here’s another example: I wanted to type the English word “Estimation”. After typing “est”, I realised the keyboard was set to Spanish. I switched to English and continued typing “ima”. iOS 10 adjusted on the fly and correctly started suggesting English words beginning with ‘estima-’.

iOS 12, instead, started suggesting words beginning with ‘ima-’:

10v12 06

 

Above I said that this maddening behaviour under iOS 12 is a cause of constant friction. That’s because, when you switch keyboards mid-word, finish typing the word, and hit Space, autocorrect will often insert a suggestion based not on the word you meant to type, but on whatever you typed after switching keyboards. This happened earlier today when I wanted to send a tweet using Tweetbot; the tweet was meant to start with “This morning”, but I soon realised I had the keyboard set to Spanish, so I switched to English after typing “This mo”. Look what happened:

 

So, autocorrect starts suggesting English words when I continue typing “-rning”. Its best suggestion, as you can see, is toning. It completely ignores what I have typed before (“mo-”), therefore, when I tap Space to write the next word, it autocompletes to “motoning” instead of “morning”. When I backtrack with backspace, the replacements proposed again make no sense — or rather, they’re just useless in this case.
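The regression described above boils down to which prefix the predictive engine matches candidates against. Here’s a minimal toy model of the two behaviours as I observed them (the word list and matching logic are illustrative assumptions on my part, not Apple’s actual implementation): iOS 10 appears to match against the whole word typed so far, while iOS 12 matches only against what was typed after the keyboard switch.

```python
# Toy model of the two observed behaviours. The word list and the logic are
# illustrative assumptions, not Apple's actual predictive-text engine.

SPANISH_WORDS = ["espero", "esperanza", "especial", "pero", "pequeño", "pelo"]

def suggestions_ios10_style(typed_before_switch, typed_after_switch, lexicon):
    """Match against the whole word typed so far (pre- and post-switch)."""
    prefix = (typed_before_switch + typed_after_switch).lower()
    return [w for w in lexicon if w.startswith(prefix)]

def suggestions_ios12_style(typed_before_switch, typed_after_switch, lexicon):
    """Match only against what was typed after the keyboard switch."""
    prefix = typed_after_switch.lower()
    return [w for w in lexicon if w.startswith(prefix)]

# Typing "Es" on the English keyboard, switching to Spanish, then typing "pe":
print(suggestions_ios10_style("Es", "pe", SPANISH_WORDS))  # words starting with "espe-"
print(suggestions_ios12_style("Es", "pe", SPANISH_WORDS))  # words starting with "pe-"
```

With “Es” typed before the switch and “pe” after it, the first function only offers words beginning with “espe-”, while the second offers words beginning with “pe-”: exactly the kind of suggestions that produce “motoning” instead of “morning”.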

That’s why after switching from my iPhone 5 with iOS 10 to the new iPhone 8 with iOS 12, my initial impressions were that autocorrect was simply worse and felt ‘untrained’. I didn’t realise that what messed things up was the keyboard switching while writing a word.

But even within a single language, autocorrect under iOS 12 does indeed feel less smart than under iOS 10. Here’s another example: I wanted to write “through” but intentionally mistyped it, starting with “trhou-”. Both predictive engines suggest “through” as a possible word, but on iOS 10 it is the preferred choice, while iOS 12 thinks I want to write “Thou”, an archaic word I certainly don’t use on a regular basis, and one that’s definitely less probable than the far more common “through”:

10v12 05
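The “Thou” vs. “through” case suggests the two engines weigh word frequency differently when ranking candidates. As a hedged illustration (again a toy model, with made-up frequencies, not Apple’s actual engine), ranking by raw edit distance alone favours “thou”, while a crude noisy-channel-style score that also rewards common words favours “through”:

```python
# Toy illustration (not Apple's algorithm): ranking correction candidates for
# the mistyped "trhou" by edit distance alone vs. edit distance traded off
# against word frequency. The frequencies are made-up, order-of-magnitude values.

from math import log

FREQUENCY = {"through": 1_000_000, "thou": 1_000}  # hypothetical corpus counts

def levenshtein(a, b):
    """Plain edit distance (insert / delete / substitute, each costing 1)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def rank_by_distance(typo, candidates):
    return sorted(candidates, key=lambda w: levenshtein(typo, w))

def rank_by_distance_and_frequency(typo, candidates):
    # Lower is better: each edit costs 1, but every factor of ~e in word
    # frequency buys back one edit. A crude noisy-channel-style trade-off.
    return sorted(candidates, key=lambda w: levenshtein(typo, w) - log(FREQUENCY[w]))

print(rank_by_distance("trhou", ["through", "thou"]))                # 'thou' first
print(rank_by_distance_and_frequency("trhou", ["through", "thou"]))  # 'through' first
```

By pure edit distance, “trhou” is only one deletion away from “thou” but several edits away from “through”; factoring in how much more common “through” is flips the ranking, which is roughly what iOS 10 seems to do and iOS 12 seems not to.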

Final notes:

  • I don’t know if this behaviour changed with iOS 11, because I went straight from iOS 10 to iOS 12. Both iPhones are running their latest available updates — iOS 10.3.4 and 12.4.1.
  • I haven’t tested iOS 13, so I don’t know whether autocorrect still behaves as it does under iOS 12 (I assume it does, though).
  • If you think these examples are not exhaustive, you have to understand that I couldn’t possibly upload fifty different comparative screenshots. My results are consistent and, as far as I could observe, reproducible.

 

It’s all about the camera…

Tech Life

Part 1: Some geeks are going to have a fit

Local Apple Store. A tasty and telling exchange between a middle-aged man and one of the employees. The man is enquiring about the new iPhone 11 and 11 Pro models. By the looks of it, he appears to be a prospective iPhone upgrader. He also doesn’t seem like someone who keeps up with tech news much.

It’s not that I enjoy overhearing conversations, but I was just there trying out the devices, and this man and the store clerk were basically by my side the whole time — and they were talking loudly due to the noise and the crowd in the store. I’m quoting from memory (and translating from Spanish to English in the process), but I swear I’m not making any of this up.

– My daughter told me the new iPhones would be on display today…

– Yes, this here is the iPhone 11, and these two are the iPhone 11 Pro and Pro Max… [the clerk starts blurting out technical specifications, then instantly realises he’s going nowhere with that, so he changes approach]. What would you like to know about them? I can show you the differences between them, help you decide.

[The man reacts as if he were talking with a car salesman] Well, the first thing I’d like to know is: what’s the big deal about them?

– W‑What do you mean, sir?

[The man takes out his phone] You see, I have this. It works great.

– Oh, a 6s Plus, yes.

– …Battery was starting to show its age, so a few months ago I had it replaced by you people. It’s fine now. Like new. So… what do these new models have that’s worth putting my phone in the drawer and getting one of them?

– Ah! Well, the ‘big deal’ about these phones, as you can see [the clerk shows the back of the iPhones to the man] they have really great multiple-camera systems. The cameras and the way the iPhone processes photos and videos have improved so, so much. I think you’ll really notice the difference compared with the 6s Plus you currently own.

[The man’s reaction at this moment is so point-blank that I have to keep my cool to avoid bursting out laughing] Okay. But what if I don’t care about the cameras?

– Huh?

– What if I don’t care about the cameras? What else is new here?

[The clerk clearly didn’t see that coming, and it takes a brief moment for him to answer] Ah! Oh, well, these new iPhones have more powerful chips… Much more powerful than your iPhone. They’re four generations newer. You’ll certainly feel the jump in performance. [The clerk gives a brief demo, I can’t see exactly what he’s doing, but it looks as if he’s showing gestures, animations, applications…]

– That’s certainly nice, and I see this one has a bigger screen [the man points at the iPhone 11 Pro Max], and its display looks really great.

– Definitely. It’s 6.5 inches, while your 6s Plus is 5.5. It’s denser, more contrasty, it’s an OLED panel, has TrueTone [briefly explains what it is; the man nods in understanding], Wide Colour Display…

– Where’s the button?

– Uh, sorry?

[The man points at the Home button of his iPhone 6s Plus]

– Ah, no, that button is gone. Now you get more screen real estate, and you can quit applications and get back to the Home screen by swiping like this…

– Looks awfully awkward.

– Well, yes, at first I suppose it does. But really, after a few days you get accustomed to this.

– Hmm. So, no button, no fingerprint scanner either?

– No. Let me explain how Face ID works.

[Halfway into the explanation, the man stops the clerk]

– Ah yes, now I remember… My daughter showed it to me. I don’t like it. It really seems a bit cumbersome, less immediate, you know… especially for paying stuff.

– …

– Look, how about you show me the last iPhone that still has a button, a fingerprint scanner, and a screen at least as big as the iPhone I have? [The man now looks at me, and adds, “My ageing eyes, you know…”] I’ll take that.

– That would be the iPhone 8 Plus.

– Do you still sell it?

– We do, yes.

– So it’s not very old.

– It’s a model from two years ago. It’s not very different from yours, size-wise… Here, let me show you.

– Excellent.

At this point, the man and the clerk went to another table, and I didn’t follow them, although I admit that that exchange had left me very intrigued indeed. However, as I was moving to another table to take a look at the Apple Watch, I saw another clerk carrying a few packages (clearly what some of the customers had ordered) and there was an iPhone 8 Plus box among them.

Upgrading from an iPhone 6s Plus to an 8 Plus might seem utterly unthinkable to many of the folks who are probably reading this. And for the smug geeks out there who are perhaps laughing at that man, I’ll say that this is part of what happens when you descend from your ivory towers into the land of ‘regular people’. You may spend time gushing about new powerful smartphone camera arrays, improvements in display technology, the marvels of 5G networking. Then you hear people in stores asking questions like “Does it do WhatsApp?”, “Can I send photos to other people directly?”, “Do I have to recharge it every night?”, et cetera.

It’s not that people are ‘dumb’. A lot of them simply don’t care about certain technological aspects as much and as obsessively as you do. That’s why I felt an incredibly exhilarating sensation when that man in the store replied, matter-of-factly, But what if I don’t care about the cameras? These are simple questions or statements that just pull the rug from under your feet, so to speak.

Part 2: Some observations about smartphone photography

You see, in a sense, I don’t care about smartphone cameras all that much, either. My investment in smartphone photography has been on a declining curve for a while. Mind you, I find all these new advancements truly interesting from a theoretical standpoint. I love what Apple has been doing with image processing and camera technology in recent years. Computational photography is really something. Smartphone cameras have undoubtedly made noticeable progress with regard to image fidelity, and as I said in my previous article, soon we’ll reach a point where our smartphones achieve WYSIWYG — or rather, What You Get Is Exactly What You Saw — photography. But that’s not the photography I’m personally interested in.

I’ve read many reviews of the new iPhones, and watched as many video reviews on YouTube. I’ve seen many sample photos taken with the iPhone 11 and 11 Pro. And previously, the same with the Samsung Galaxy S10, the Google Pixel 3, the Nokia 9 PureView, and so on. What I’ve seen are often great snapshots, with great lighting, impressive dynamic range, and fine colour science. Every time a new iPhone comes out, I take a look at Austin Mann’s reviews to get an idea of how a professional photographer uses the iPhone as a tool. And, well, in the vast majority of cases, I find smartphone photography lacking. Sometimes it’s depth; often it’s that je ne sais quoi I could call ‘mood’. Photos taken with the best phones out there — and straight out of the phone’s camera — all tend to have (at least to my eyes) a sort of clinical, precise, documentary look. Or a glossy, slick look, if you prefer. (Many of the ‘Shot on iPhone’ photos I saw on Apple’s site last year after the contest was over looked like aspiring images for Apple posters and billboard ads.)

When they look punchier, when they have a touch of mood, when they start to really transmit something to me, it’s usually because they have been post-processed and ‘made imperfect’ through the use of filters and other retouching techniques (ah, the irony). Image fidelity is important, just as sound fidelity is. But just as a lot of people still prefer the sound of vinyl records to CDs and digital music files, despite the latter being technically superior, image fidelity is not necessarily — or not always — what makes great photography.

The technology behind Google’s Night Sight and Apple’s Night Mode is indeed impressive. Like me, you’ll surely have seen a lot of with/without Night Sight and with/without Night Mode photo comparisons by now. And when you’re truly interested in capturing a night scene in a way that you can actually show people what was there, instead of a mostly-dark frame, these Night modes are fantastic tools. But when your intent is less documentary and more… emotionally driven? artistic? — you may want to render a night scene in such a way that visual impact becomes more important than the minute portrayal of what was actually there. Just like when a grainy black & white photo just hinting at certain details turns out to be more gripping and memorable than a perfectly-rendered noise-free colour snapshot.

I’ve been a photo enthusiast since I was a teenager; over the years I’ve taken photos with the most varied gear — 35mm film cameras, medium format film cameras, point-and-shoot compact cameras, digital cameras of different sizes and formats, feature phones and smartphones — and again, I certainly appreciate all the technical advancements we’ve been witnessing in photography over the past couple of decades. In a sense, I even agree that the smartphone today is the best camera you have with you. It’s the first thing I reach for when I want to capture a certain moment. Today’s technology makes it the fastest, most accurate tool for this task. I have indeed captured many otherwise uncatchable moments, and accumulated a series of scrapbook memories. Sharing some of these has been easy and fun. But for me — for me — smartphone cameras are not enough when I want to capture something in a truly memorable way, when I try capturing something with any kind of artistic intent or purpose. This is why I very often carry a camera with me.

Additional closing notes:

  • While I can’t help raising an eyebrow every time Apple or other smartphone manufacturers talk about ‘Pro cameras’ in their phones, when it comes to video it’s a completely different story. I really think that the flagship smartphones released in the past few years can be used to shoot video at a professional level, and the iPhone 11 Pro demos at the latest Apple event have indeed impressed me.
  • So far, the only smartphone I’ve used whose camera experience and results have felt like handling a ‘real’ camera has been the Nokia Lumia 1020 (a phone from 2013). Photos taken with it have always had something about them: a distinctive quality, or look, or mood, straight out of the camera. Very rarely did I feel the need to adjust things afterwards.
  • I remember a time when I was definitely more attracted by the iPhone as a camera to try stuff and experiment with. It was roughly from the iPhone 3GS to the iPhone 5 era. I remember when folders in iOS still had an app limit, and I ended up filling three different folders with photo-taking and photo-editing apps. I kept wanting to take snapshots everywhere I went, to then import my snaps into this or that app to apply a vintage filter, or a ‘grunge’ effect, or convert them into fake polaroids. When I finally upgraded from my ageing iPhone 5 to my current iPhone 8 last year, I thought that the better camera and processing power of the new iPhone would rekindle this impulse to keep shooting and experimenting, but somehow it did not happen. I mean, I still use apps like Hipstamatic to experiment with different effects, but it’s just not like before. I’ve tried to rationalise this, and the only explanation I’ve come up with is that the iPhone 8 is so effortless in taking generally good shots that I don’t feel compelled to further bother with them. Previous iPhone cameras were good phone cameras for the time, but their limitations kind of urged me to be more creative after the fact. Another factor may be that some years ago there was a huge wave of great, diverse photo apps for iOS. While today there are indeed a few excellent apps worth having, the rush feels over and the landscape saturated.
  • We have amazing cameras in our smartphones today, but then I look at the photos regular folks show me every now and then and yes, they’re as crappy as ten years ago. But it’s fine — not everyone cares about the cameras…

 

A few brief observations after Apple’s “By Innovation Only” event

Tech Life
  • The introductory animated video was fun. I liked that it was just that, an intro video, without trying to drive home some grandiose stance. As time goes by, though, every time I watch the videos Apple makes for these keynotes, I’m left with the feeling that they’re 20–30 seconds longer than they should be.
  • There’s something Jon Prosser (of Front Page Tech) has been saying repeatedly in recent times: The market does not need innovation right now; it needs compromise. And I absolutely agree with this. Compromise means acknowledging that you can’t play the innovation card over and over, especially in times when there’s clearly more iteration than groundbreaking new technologies and devices. Like Prosser, I too think that Apple has finally got the memo about this, and it showed during the event.
  • In fact, in my opinion, the true highlight of the whole event was the prices of most of the offerings. Quite a lot of bang for relatively little buck. (‘Relatively’ because it’s always Apple we’re talking about.) I’m not interested in subscribing to Apple Arcade or Apple TV+, but pricing each subscription at $5/month is a very good deal. Offering one year free with the purchase of a new Apple product is another good deal. The new 7th-generation iPad, the regular entry-level model, features very nice improvements and retains its great $329 price tag. The new Apple Watch Series 5 again starts at $399, but the killer move is to keep selling the still very good Series 3 at the reduced price of $199. That is another great deal.
  • Similarly, the positioning of the new iPhones has finally struck the right tone. In 2017, the ‘cool new innovative’ iPhone was the iPhone X, and it came with a $1,000 premium price tag. But hey, you could have the regular, ‘boring’ iPhone design by getting the iPhone 8 and 8 Plus at more affordable prices. In 2018, the premium XS and XS Max were the iPhones to get, while the XR was positioned as the fun colourful cheap alternative ‘for the rest of us’, the ‘poor man’s iPhone’. Finally Apple has switched positions: the evolution of the XR is now the standard iPhone 11, a very capable, all-round iPhone. You start from there, and if you want to go deluxe, you can choose the more expensive and feature-packed iPhone 11 Pro and Pro Max. It’s subtle, but it sounds right this time.
  • During the event, I didn’t tweet as much as I usually do, and the usual snark I’ve been reserving for Tim Cook’s Apple in recent years was reduced as well. It’s not that I’ve changed my mind — Apple still deserves criticism and still deserves absolutely no slack cutting — but the thing is, I liked a lot of what they introduced the other day. Plain and simple.
  • What I didn’t like was how the event was organised and how products and services were presented. The pace was terrible. Each presenter talked about how excited they were, but this excitement was just words — it didn’t show at all. And all this passing the baton around… I understand that Apple wants to communicate that they’re serious about diversity, etc., but it’s starting to look a bit too on-the-nose. In any case: you want to fragment the event by putting a dozen people on stage? More power to you, but at least have them prepare and rehearse thoroughly, to avoid sleep-inducing technical explanations that distract people and make them lose interest. It feels amateurish.
  • (On stage, Steve Jobs embodied that famous intersection of technology and the liberal arts. He could get into technical details while never forgetting the theatrical part of a keynote. He communicated intent; his excitement could be very contagious, and it very often felt genuine. These keynotes feel designed by committee; Cook communicates via platitudes and slogan-sounding phrases. I have no doubt that he’s as excited as he says, but he sounds contrived nonetheless.)
  • The software side is getting increasingly messy. As usual, you’ll find the relevant links and contributions on Michael Tsai’s blog (Apple’s Fall Release Schedule). If Steve Jobs were still alive™, I imagine he would be incredibly pissed about all this. And sorry for sounding yet again like a broken record, but I still think Apple should hire more talent for their Software division, so that they can actually deliver on what they promise; or at least they should reconsider this self-imposed yearly schedule they clearly can’t cope with anymore. I know exactly zero people who would be bothered if Apple hypothetically decided to ship iOS 13 or Mac OS Catalina next spring, especially if the delay meant better, less rushed software. This obsession with ‘keeping the pace’ seems particularly absurd when the next major OS updates introduce drastic changes at a fundamental level, as in the case of Mac OS Catalina. You can’t plan and design an effectively disruptive new OS version and have it in such a beta stage this late in the schedule — in a schedule you arbitrarily made up (because tech has to move fast, or because market, or because investors, or because what-have-you).
  • Today, innovation in smartphones is 95% focused on camera technology, and every iPhone introduction has increasingly been about its camera(s). I understand and appreciate all the research, development, and innovation Apple has packed inside such a tiny space over the years; but I keep thinking these incredible devices are so much more than point-and-shoot cameras and selfie-takers. Even if computational photography is bringing us to a point where, with a smartphone, we basically achieve WYSIWYG photography, I feel these devices end up being under-utilised, especially by regular people. More and more people are upgrading their phones at a slower pace and holding on to previous models, not only because they’re more budget-conscious, but also because smartphones are getting so saturated with features and capabilities that people don’t really notice a significant performance or capability gap when they go to a store and try out the new phones.
  • And speaking of cameras… the camera arrays on the back of the new iPhone 11 and 11 Pro do look ungainly from an industrial design standpoint. If one wants to be pragmatic, sure, look at what these camera systems can do — who cares about the back of the phone? Well, lately I’ve been looking at a lot of Android phones with multiple rear cameras, and let’s say that Apple’s doesn’t strike me as the best design out there. Samsung’s S10 and Note10 lines have better-designed backs. The Nokia 9 PureView has five cameras, but they’re arranged in a more pleasing way and, most importantly, they don’t protrude. The new iPhones have camera arrays that look as if they had been hastily bolted onto the chassis of last year’s iPhones.
  • One More Thing — .

     

  • (I might add new observations in the coming days; if I do, I’ll announce any updates via Twitter.)

Post-holiday miscellanea

Tech Life

1.

I’m finding it a bit hard to write about technology, lately. My decreasing enthusiasm has to be age-related: I’ve been interested in technology for many years now, always nurturing a curious eye and attitude towards new hardware, new devices, new solutions, new software, and at the same time trying not to make such excitement linger on a specific innovation, but rather attempting to look at it against the big picture. The big picture being progress, technological advancement and how it affects humans and everyday life. My current general feeling of disappointment towards the tech world could perhaps be summarised this way: I’m liking that big picture less and less. Why? Just read the news, or open Twitter, to get a taste.

2.

In possibly the best article I’ve read this summer (or this year so far), Absolute scale corrupts absolutely, Avery Pennarun writes:

The Internet has gotten too big.

Growing up, I, like many computery people of my generation, was an idealist. I believed that better, faster communication would be an unmitigated improvement to society. “World peace through better communication,” I said to an older co-worker, once, as the millennium was coming to an end. “If people could just understand each others’ points of view, there would be no reason for them to fight. Government propaganda will never work if citizens of two warring countries can just talk to each other and realize that the other side is human, just like them, and teach each other what’s really true.”

[Wired.com has an excellent article about this sort of belief system.]

“You have a lot to learn about the world,” he said.

Or maybe he said, “That’s the most naive thing I’ve ever heard in my entire life.” I can’t remember exactly. Either or both would have been appropriate, as it turns out.

I used to share this sort of belief system. The disappointment that ensued came slowly but steadily. My exposure to Usenet in the late 1990s and early 2000s gave me a relatively hard lesson in how much people can communicate online and how little that helps them understand one another. But also in how people can weaponise the communication tools at their disposal to twist narratives, spread false notions, and hurt others. My idealism was curbed, but not completely defeated. I remember thinking that Usenet was just an environment, an enclosed (cyber)space people treated as a sort of communication playground, just like at school. Fast forward to now, however, and the playground is everywhere. The virus has breached containment.

3.

Every day I stumble onto some kind of obstacle or problem that makes me think, This could be easily solved with the level of technological progress we have today — but the tech world is still dominated by a certain strain of nerd mindset that makes too many people focus on less important stuff, like how many cameras and megapixels, and what kind of low-light performance, the next wave of smartphones is going to have; or how we can improve Siri so that it understands simple commands we’re still faster to execute ourselves in the first place; or when the stupid self-driving cars are finally coming. An immense amount of resources is spent on trying to solve the wrong problems, or non-problems, or the problems tech companies have created in the first place. We obsess over what the artwork on the Apple invitation for their September 10 event may imply, while the public building I’m in right now completely lacks any solution to help blind and otherwise disabled people navigate its various floors.

4.

I clearly have iPad fatigue. The other day I tweeted, I really can’t explain why, but my enthusiasm/excitement for the iPad has been decreasing as the device has actually, progressively become more powerful. I suspect this ‘iPad fatigue’ comes from too much exposure to stupid tech debates like “Can iPad become your only device?”

I knew from the start that the iPad wouldn’t stand a chance at becoming my only — or my main — computing device. When it comes to personal tech choices, I’m too intellectually curious to be a minimalist. The iPad for me has been, still is, a fantastic computing wingman; the perfect consumption solution; the perfect platform for quick checks and fast tasks. When it wants to act like a pro, it can, but there’s too much friction; so much as to make the device utterly unappealing. I’m faster on a Mac. I’m faster on a Windows 10 laptop. Or even on the older Windows 8.1.

And it took me less time to find my way around Android on the Xiaomi Mi A2 I recently purchased (for work and personal UI research) than to fully master the beta of iPadOS 13 on a borrowed iPad.

In 2012, I couldn’t wait to spend my hard-earned money on the upcoming third-generation iPad, the first with a retina display. I truly made the most out of it in the following years, and it’s still my main iPad, together with a first-generation iPad whose simplicity (and beloved pre-iOS 7 user interface design) still makes it a joy to use. 

In 2018–2019, I started considering the idea of getting a regular, entry-level iPad, to keep up with the latest iOS releases on a tablet, and maybe have a more powerful portable machine for when I’m out and about and don’t exactly need a workstation. But the purchase of a second-hand 11-inch MacBook Air made me realise that I didn’t really need a new iPad or iPad Pro. I had found the perfect power-plus-portability package I was looking for, for much less money. From then on, I had a really tough time justifying the purchase of an iPad, so I kept prioritising other things, like photographic equipment and an Android phone; and my next investment is very likely going to be a pair of Sony noise-cancelling Bluetooth headphones, which cost about as much as a regular, entry-level iPad. The five minutes I spent trying them out were comparatively more life-changing, I kid you not.

5.

So, rumour has it that the next iPhone line-up will have a naming scheme that’ll give us an “iPhone 11 Pro” model, and while Gruber keeps nodding and saying that it makes a lot of sense, I find that using the ‘Pro’ moniker on iPhones is just silly and ridiculous. I know that with Apple, ‘Pro’ doesn’t always necessarily mean ‘professional’; that in many cases it’s just a way to indicate a premium or deluxe device. But I think it’s a poor choice of label, one that makes ‘Pro’ become more and more meaningless. Imagine if Apple introduced an improved version of the AirPods and HomePod, kept selling the original models, but called the new ones “AirPods Pro” and “HomePod Pro” respectively. Yes, that’s how “iPhone Pro” sounds to my ears.

6.

This summer, my wife and I didn’t travel by plane or train, but by car. We went to Italy passing through France, and we made a couple of stops (Nice and Montpellier) on our way back. Our car is a bit old, and doesn’t have a GPS or an integrated satnav. On previous car trips we had navigated successfully by simply relying on our iPhones and Google Maps’ suggested routes and turn-by-turn navigation. This time things worked flawlessly… 96% of the time. And in the 4% of the time when they didn’t, we found ourselves in stressful situations for which we had to quickly improvise a way out. In crucial moments — like taking the right turn to properly get back to the autoroute — Google Maps on my iPhone 8 seemed to briefly lose or miscalculate our position; the arrow representing our car would start wandering off; the map would start drunkenly rotating; and Google Maps would start ‘rerouting’ just when we had to decide whether to turn right or proceed straight for another 200 metres.

We got lost a couple of times. And when you’re too focussed on, and too reliant on, a navigator, and it fails you, you get startled and feel you’re suddenly in the middle of a jump and the safety net has vanished beneath you. Once, Google Maps mistakenly thought we had arrived at our destination when we were actually driving a few parallel streets away, so it triumphantly displayed the ‘You’ve reached your destination’ screen and obviously stopped giving directions. And we had a brief, panicky What now!? What now?! moment. (You might find this funny, but when it’s late evening, you’re driving for the first time in a city you’ve never been to before, you don’t speak the local language fluently, you’re near the historic centre and you really need to find parking… Well, amused is definitely not what you feel.)

I still wonder what happened, though. Until then, Google Maps had truly been 100% reliable for me. My brother-in-law suggested it might have been something with my iPhone’s GPS or compass — but before trying the silly 8‑shaped calibrating gesture, I switched to Apple Maps, and it locked onto our position without straying unexpectedly. I still prefer Google Maps’ interface, but admittedly Apple Maps came to the rescue on at least one crucial occasion.

7.

A few days ago, I finally made up my mind and bought an Android smartphone. I’ve had this idea for months — not to switch from iOS to Android, but to get a modern Android device to give the platform another close look, among other things. The last time I tried Android was in late 2014, on a 2011 smartphone running version 4.0.4. Of course my experience was a bit meh. What can I say? User interfaces are probably one of the few areas of interest keeping my tech fireplace burning, and I’ve been following Android’s progress by watching a lot of Android smartphone reviews. Many phones have surprised me with their capabilities, and I would really be a foolish fanboy if I didn’t recognise that — when it comes to hardware design, features, and software polish — the gap between iPhones and Android phones has been drastically narrowing in the last few years.

For my purposes, investing in a flagship Android handset was frankly overkill. Especially now that brands like Nokia, Xiaomi, Oppo, OnePlus, and Huawei are making very good midrange products, with respectable tech specs and affordable prices. Additional preferences were: a notchless phone, a sufficient amount of RAM and horsepower (so that the phone doesn’t become obsolete too soon), a phone included in the Android One initiative (so that I could have the ‘pure Android’ experience), and finally a smartphone that wouldn’t hit my wallet too hard.

I found a good deal in the Xiaomi Mi A2. Yes, it’s a model from 2018, but it ticks all the items on my wishlist. After considering buying it second-hand in a local shop, I actually saved money by purchasing it directly online from Xiaomi’s website. I got the variant with 4 GB of RAM and 64 GB of storage, in black, for less than €150. I will share my observations in a dedicated post at a later date, but for now I’ll go ahead and say that after two days with the device, I had virtually no trouble getting used to Android. The Mi A2 is a nice, capable performer, with a very iPhonish design (with the screen turned off it could be mistaken for an iPhone 7/8 Plus when seen from across a room). For a phone with a 5.99-inch display, it’s surprisingly manageable, and the fingerprint sensor on the back is as fast as Touch ID on my iPhone 8.

My very first impression of modern Android is that it has matured a lot as a mobile operating system and, at least in its ‘pure’ form, it feels way more stable and much more visually consistent than it was just a few years back.

8.

Hotel Wi-Fi offerings are still mediocre. Case 1: Wi-Fi worked, but we had to create a free account to use it (and relinquish some personal data; nothing major, but it really felt unnecessary). Case 2: Wi-Fi worked, connection seemed reliable but not particularly speedy (let’s say acceptable for most tasks) — but WEP encryption!? Really!? Case 3: Wi-Fi didn’t work; credentials were entered correctly, devices connected to the network, but no traffic whatsoever. Maybe that router goes to sleep after 2 AM?

9.

Games I’ve recently played (and am still playing) that I recommend: Stories Untold, GRIS, and Kona. I’ve been enjoying them a lot. Your tastes and mileage may vary.

Speaking of games, there are two game-related YouTube channels I’m subscribed to, and both deserve many more subscribers than they currently have:

  • Nick930 — Nick produces terrific game reviews and game comparisons. He’s very honest and balanced, and keeps his reviews pleasantly short (8–12 minutes on average); but he also makes very nice documentaries about the histories of various game franchises. The material is well-researched and well-presented. I stumbled on his channel by chance while looking for videos about Tomb Raider, and found his great History of Tomb Raider (1996–2018). You’ll see that Nick currently has more than 146,000 subscribers, and while that’s a respectable audience, I still think his channel is very underrated and that he deserves at least 5 times that number.
  • Tench Froast — Now, here’s a truly criminally underrated channel. I already talked about Lady Tench Froast (no real name given) back in January, and I still stand by what I said then:

    She’s a smart, witty, genuine, sarcastic, entertaining woman. She loves trying out indie games for the most part, and when you follow her in her playthroughs, you have this feeling of really being there too. She’s engaging, completely direct and spontaneous, and whether you like the game she’s playing or not, you’ll certainly gain a few good laughs from the experience. But I also like the fact that she’s not just in for the jokes and the lulz, she also makes smart observations and criticism during the gameplay, explaining what she likes and why, and what she thinks doesn’t work and why. 

    I also like her “No Commentary” series, where she just shows gameplay without commentary, so that you can focus on a particular game; and the “Toast Bites” series, little compilations of ‘bites’ taken from 5–6 different indie games, with a brief review for each of them. This series is perfect to discover indie titles you may not have heard about, and which could be interesting to try out.

    At the moment, she only has 347 subscribers(!), and it’s a damn shame. She’s a fun, intelligent person producing good quality material only a few hundred people will see, while there are a bunch of idiot YouTubers doing idiotic things who have hundreds of thousands of subscribers. It just isn’t fair. She hopes to reach 1,000 subscribers by the end of the year. Really, check out her channel!

Security Monitor

Handpicked

This excellent article by Paulo Andrade (found via Michael Tsai, of course) got me thinking. It’s titled The Alert Hammer and discusses “the increasing number of security alerts Apple has been adding to macOS, both with Mojave (10.14) and the upcoming Catalina (10.15)”.

I’m still on Mac OS 10.13 High Sierra on both my main Macs, and the more I read about the annoying barrage of security prompts in Catalina, and the amount (and kind) of bugs still present in Mojave, the less I feel the urge to upgrade to either.

I fondly remember a time when I used to upgrade to a new version of Mac OS X as soon as it was released, due to the implicit trust I had in Apple to deliver a better, improved Mac OS X version over the previous one. This went on until 10.9 Mavericks. I skipped 10.10 Yosemite entirely (and I still feel I’ve dodged quite the bullet with that one). Then it was 10.11 El Capitan (but I still waited at least until 10.11.3 before upgrading). Then another jump to 10.13 High Sierra (here I skipped Sierra for technical reasons — my 2009 MacBook Pro didn’t support it, and the new iMac I purchased in 2018 came with 10.13 preinstalled). 

Now I’ve switched to ‘active distrust’ mode towards Apple. I don’t feel 10.14 Mojave brings anything particularly useful to me, and 10.15 Catalina even less so. Nothing really worth leaving High Sierra and its general stability behind. Everything I’m reading about Catalina — the experiences of those valiant people trying out the beta, and the technical observations of the more expert users and Mac developers — gives me the impression that Catalina is perhaps the first version of Mac OS that is more useful to Apple than to its users, if you get my drift.

But I’m digressing as usual. Back to Andrade’s post, I especially agree with him here:

Apple started adding user consent alerts way back in High Sierra. The first time an app would try to access your location, contacts, calendar, reminders or photos a system alert would prompt the user for consent. Mojave expanded these prompts to automation, camera and microphone. And now Catalina adds screen recording, keyboard input monitoring, access to folders such as Desktop, Documents and Downloads, user notifications and Safari downloads…

These alerts are just another step on a long path Apple has been taking to protect user’s data. Previous steps include code signing, sandbox, gatekeeper, the “curated” Mac App Store and notarization.

But security features are most useful when they’re invisible. All previous steps were mostly invisible. This last one… not so much.

[…]

Note how on one end of the spectrum alerts are useless for users that don’t understand the implications of allowing such access and on the other end experts want to turn them off.

So for the benefit of a few power users in the middle of the spectrum that feel more secure with these, every one else gets to be annoyed.

In short, alerts can be useful but they really must outweigh the cost of having them in the first place. And this is where I think Apple is failing badly. They are so excited with this new found hammer that they can’t help themselves but to hammer on.

This made me think about an alternative concept that could bring back some invisibility when it comes to security features. Before proceeding, a disclaimer: this is just an ‘off-the-top-of-my-head’ idea, and I don’t have enough programming expertise to claim that what I’m suggesting is feasible. At an empirical, logical level it should be. Still, I’m a terrible chess player, and perhaps the tool I’m suggesting could be fooled or circumvented by a malicious-enough software/attack.

Here’s my humble proposition: Security Monitor. It would be an application you find in your Utilities folder, and it would behave in a similar way to Activity Monitor. Maybe its interface could be made a bit more user-friendly, so that it would be readable by non-geek users as well. In its main window, you would see all active processes from a security perspective: what they are accessing in your system and, more importantly, whether their behaviour complies with the permissions they have been given — by the system and by the administrator user account.

The user would still receive alerts to allow apps to access basic sensitive stuff like location, contacts, photos, camera, microphone, etc., but the system would take a more ‘innocent until proven guilty’ approach with the installed software. In the main window of Security Monitor, a traffic-light colour code could flag problematic behaviour. You would see green dots next to apps and processes that are behaving as they should. A yellow dot could indicate an app or process that is trying to access parts of the system that are off-limits to it (the app is trying to do so without explicitly asking the user for permission, and the system is actively preventing access). A red dot would mean a security concern requiring additional action on the user’s part.

Of course, in case of a ‘red dot’ situation, Mac OS would alert the user in a very prominent way, with a persistent notification the user can’t just dismiss by clicking OK. A dialog box would appear saying, for example: Keylogging detected — The app ‘Awesome Markdown Editor’ is recording your keystrokes without your knowledge or permission. The only button the user could click is Open Security Monitor. From the app, the user could see additional clues like Awesome Markdown Editor’s attempts to use the network to contact an external server. Security Monitor could give the user the option to quarantine the app and its documents; to quarantine the app but keep its documents accessible; but also to allow the app to record keystrokes because, say, it’s necessary for a certain feature to work (e.g. the app offers a predictive typing option and needs to ‘see’ what you usually type, etc.). The latter would be a risky choice, and Security Monitor could provide an additional confirmation dialog informing the user about the risks involved. But it ultimately would be the user’s choice, and the user’s intelligence wouldn’t be insulted in the process.

As I said, this is a rough idea, and I’m sure there are all kinds of issues with it. My reasoning is: just like Activity Monitor constantly monitors CPU usage, memory usage, energy impact, etc. in a way that is invisible to the user and doesn’t burden them, while alerting them when something out of the ordinary happens, security in Mac OS could be treated the same way. Instead of the paranoid approach — All this area of the system is read-only! You have to authenticate every time an app wants to write something in a folder! App A can’t talk to App B! Alert, alert, alert! — we could have a more reasonable approach where everything is allowed to work normally (the user still needs to grant specific permissions manually, of course, especially when access to sensitive locations is involved) but is constantly monitored. The system could deal in the background with those apps and processes subtly trying to stray from expected behaviour (yellow alert), and only interrupt normal operations when blatant violations are detected (red alert). When the user opens Security Monitor, they could also have the opportunity to review previously handled yellow alerts.
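To make the idea a bit more concrete, here’s a toy sketch of the traffic-light classification at the heart of the concept. This is purely illustrative — all the names (the permission categories, the `classify` function, the `VIOLATIONS` set) are hypothetical, invented for this example, and a real implementation would obviously live at the system level, not in a script:

```python
from enum import Enum

class Status(Enum):
    GREEN = "behaving within granted permissions"
    YELLOW = "blocked attempt outside granted permissions"
    RED = "blatant violation; user action required"

# Hypothetical access categories considered blatant violations by definition.
VIOLATIONS = {"keylogging", "screen_recording"}

def classify(granted: set, attempted: set) -> Status:
    """Return the 'traffic light' status for one process.

    granted   -- permission categories the user/system has granted
    attempted -- access categories the process has actually tried
    """
    unauthorized = attempted - granted
    if not unauthorized:
        return Status.GREEN    # green dot: every access was permitted
    if unauthorized & VIOLATIONS:
        return Status.RED      # red dot: e.g. covert keystroke recording
    return Status.YELLOW       # yellow dot: blocked silently, logged for review

# Example: a Markdown editor granted only document access tries to log keystrokes.
print(classify({"documents"}, {"documents", "keylogging"}))  # Status.RED
```

The point of the sketch is that the yellow state never interrupts the user — the access is simply denied and recorded for later review in Security Monitor — while only the red state triggers the prominent, non-dismissable alert described above.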

I’d love to hear your thoughts about this.