EU mandates USB-C as standard for charging ports. Good.

Tech Life

Josh Centers at TidBITS:

It’s finally official. After years of discussion and failed attempts to get the industry to standardize, the European Union has mandated that new rechargeable electronic devices sold in the EU must have USB‑C charging ports by the end of 2024. The law applies to mobile phones, tablets, digital cameras, headphones and headsets, handheld videogame consoles, portable speakers, e‑readers, keyboards, mice, portable navigation systems, and earbuds, and it will extend to laptops in early 2026. The new law’s goal is to encourage more reuse of chargers and reduce electronic waste.

I have been loving the controversy about this in the tech sphere. If you follow me closely on Twitter, I apologise in advance for rehashing stuff I already wrote there. Sometimes I use this space to collect thoughts, also for the benefit of those who just read my blog and don’t care about following people on social media.

I find it sadly ironic that so many people seem to be just fine with whatever tech (and Big Tech) companies impose on them and on their experience as customers and users of their products, but the moment the EU mandates USB‑C as a charging standard, this becomes a scandal.

Of all the silly arguments I’ve heard against this mandate, the silliest is perhaps the one that goes like, This stifles innovation, implying that tech companies — and Apple specifically — should be left free to decide what’s best for their customers. 

Should we take a look at a few decisions Apple took in recent years to offer a ‘better’ experience in the name of innovation?

  • Starting with the release of the iPhone 7 in 2016, Apple arbitrarily decided to remove the headphone jack from iPhones, forcing people to either get wireless headsets or use wired headsets via a Lightning-to‑3.5mm jack adapter; or to resort to a different adapter if they wanted to use headphones while charging the iPhone. I’m sure Apple is, in any case, pleased with the success of their AirPods. The AirPods’ design doesn’t allow for the tiny internal batteries to be replaced, which means more e‑waste at the end of their relatively short life cycle.
  • In their effort to make thin laptops for thinness’s sake, Apple introduced a new type of keyboard with a redesigned key mechanism, the butterfly mechanism. The idea was to improve things, but it turned out to be a poorly-designed solution that resulted in a high rate of failing keyboards, with many customers having to bring in their MacBooks to get the keyboard replaced at least once — and I know of many instances where people had to have their MacBook’s keyboard replaced two or even three times, sometimes out of warranty. For customers, this of course meant additional expenses, not just headaches.
  • With the new MacBooks introduced in 2016, Apple dropped every port that wasn’t USB‑C/Thunderbolt, leading users to resort to USB adapters for everything — the infamous ‘dongle life’: want to connect a flash drive? Use an adapter. Want to read the SD or CF card of your camera? Use an adapter. Want to connect the MacBook via Ethernet to a wired network? Use an adapter. Want to connect a video projector for a presentation? Use an adapter. And so forth.

Some technophiles are quick to label EU politicians as idiots — ignorant bureaucrats who don’t know how technology works. Given the examples above, are we so sure tech companies really know what’s best for their customers?

And what benefits, exactly, would keeping Lightning around bring? What’s the ‘innovation’ there? In theory, the Lightning specification would allow for more uses than just charging, but even Apple itself has been under-utilising Lightning. So, if Lightning is essentially reduced to being just an alternative, proprietary charging solution, then I think it makes pragmatic sense to want to standardise charging solutions. Let’s not forget that mandating USB‑C will also, hopefully, make Micro-USB connectors and cables disappear — along with all those cheap AC adapters (Lightning or Micro-USB) that come with non-detachable cables.

But anyway, what kind of innovation in charging technology is this EU mandate impeding? The only bits of innovation I’ve seen in this field in recent times are wireless charging and fast charging. With fast charging, we’re at a point where a smartphone is mostly recharged in little more than half an hour. Wireless charging still has room for improvement, in my opinion, but mandating a USB‑C port on the device certainly won’t impede progress in perfecting how wireless charging is implemented.

In other words, I think charging isn’t exactly a fast-moving aspect of technology that warrants being immune from standardisation attempts. If it were up to the Silicon Valley types, people would have to change their power plugs and outlets every 5 years or so because ‘innovation’.

But even if we embrace the innovation argument, consider the following scenario: one year from now, Samsung comes out with a new charging technology and a new charging port. A proprietary port, of course. The astounding performance of this technology is touted as yet another feature to convince people to switch to Samsung devices. Now we have yet another charging port to deal with. Imagine those professionals who — either for personal or work reasons — are typically multi-device and multi-platform. When travelling they would have to deal with Lightning cables, USB‑C cables, different AC adapters, and the new Samsung cables on top of all that.

Having only USB‑C to deal with simplifies things a lot, and if some people bothered to look beyond their personal use cases, they would understand this. Having only USB‑C means that if you’re travelling with a MacBook, an iPhone, an iPad, and a modern camera, you’re most likely fine by just taking two USB‑C cables with you, and not even additional adapters other than the MacBook AC adapter, if you really want to travel light.

The reduction of e‑waste is admittedly something that isn’t apparent straight away; and that’s why, I assume, many people complain that this EU mandate solves next to nothing in this regard. The efficiency of having only USB‑C for charging starts being noticeable over time, though. When a device or appliance that uses Micro-USB, Lightning, or other proprietary cables for charging fails, or you otherwise get rid of it, you’ll have to throw away the corresponding cable or adapter as well (unless you find a way to reuse it with something else). If a device or appliance that charges via USB‑C fails or gets thrown away or sold, both the cable and the AC adapter can be reused with every other USB-C-powered device. You don’t have to throw away proprietary cables/chargers, and you don’t have to buy new ones either. This, over a certain amount of time, and at scale, could be compelling. Less wasteful. Responsible.

Perhaps even innovative, in a sense.

 


Not a great strategy

Handpicked

Via Nick Heer, I’ve learnt that a third-party Instagram client that launched as recently as a week ago was removed from the App Store.

TechCrunch:

Last week, a startup called Un1feed launched an Instagram client called The OG App, which promised an ad-free and suggestion-free home feed along with features like creating custom feeds like Twitter lists. The app racked up almost 10,000 downloads in a few days, but Apple removed the app from the App Store for violating its rules earlier this week.

Separately, Un1feed said that Meta disabled all team members’ personal Instagram and Facebook accounts.

[…]

“This app violates our policies and we’re taking all appropriate enforcement actions,” a Meta spokesperson told TechCrunch. The company also pointed to a blog post about clone sites.

Nick Heer:

Thereby illustrating the difference between what some users value about Instagram and what Meta values. Users want to view friends’ photos and videos on their own terms; Meta wants them to watch suggested Reels and shop. 

I’ve titled this brief post Not a great strategy because it’s what I would say to both the Un1feed guys and Facebook/Meta.

Launching a clean third-party Instagram client that actually makes the Instagram experience better is praiseworthy; but expecting Facebook/Meta to be okay with it is naïve. By the way, in an update to the story, TechCrunch adds:

Apple told TechCrunch that it removed The OG App as it was accessing Instagram’s service in an unauthorized manner, which violated the Meta-owned platform’s terms. The company cited section 5.2.2 of its App Store review guidelines, which states that if an app is displaying content from a third-party service, it should do it in accordance with the service’s terms of use. 

So yes, this client wasn’t bound to last for very long.

But what about Facebook/Meta’s strategy? They have been progressively morphing Instagram into something, some thing that wants to keep being relevant by mimicking what a more successful competitor — TikTok — has already nailed. 

It’s very unlikely that those who are already addicted to TikTok will drop it and switch to Instagram. Maybe they’ll watch a reel or two if it’s from one of their friends (which isn’t super-easy in itself, given just how much Instagram pushes content created by people you don’t know), but that’s it. And it’s very unlikely that someone wanting to express themselves and create stuff in TikTok’s format would favour Instagram over TikTok. It’s simply too late to think you can beat TikTok at what it does best.

Meanwhile Facebook/Meta is leaving behind what Instagram has done best for quite a long time: being a place to share photos and moments from everyday life, and even a place to showcase your work in a more professional and commercial manner.

No one among my friends and acquaintances likes Instagram now. And it’s not just early days nostalgia. It’s that the experience within the platform has become confusing and user-hostile. A friend commented that it’s like watching a TV channel where the contents are 5% movies and 95% TV commercials, and you never know when you’ll be able to watch the movies.

Sometimes I think that all the extraneous suggested reels and promoted content and the non-linear timeline are a way to keep users doomscrolling so that they spend much more time within the platform than they normally would. The problem is that the ratio is wrong — the extraneous content is simply too overwhelming, and as a consequence people get frustrated and exit the app.

Or stop using Instagram altogether. I used to be a heavy Instagram user until Facebook acquired it. At the time, I didn’t want to delete my account, but I stopped uploading photos and kept my account active so that I could continue to comment and connect on friends’ and followers’ photos/videos/stories. But even this kind of activity has become difficult and unpleasant, simply because Facebook/Meta have decided to throw unwanted content in my face as I scroll the fucked-up timeline in the hope of finding a friend’s photo or moment to react to. As a result, I find myself accessing Instagram less and less frequently. And I’m definitely not alone in this.

Tech companies today are obsessed with evolving, because the idea of keeping on doing what you do best doesn’t seem viable in the long run. But I disagree. Of course I’m not saying that one shouldn’t change anything at all and stay still, but deviating too much from the formula that made you extremely successful isn’t a great strategy either, as we can see in Instagram’s case. Despite its missteps and flaws, Twitter has done a better job at this. Twitter today is very different from what it was in 2006 — it has certainly become richer and more complex — but the core idea is the same. Twitter, too, has been adding intrusions to the timeline and has pushed for a non-linear timeline, but the non-linearity is fortunately still optional, and the intrusions aren’t overwhelming to the point that you stop seeing tweets from your friends and the people you follow.

Instagram, on the other hand, has turned its insecurity into instability and volatility.

Apple’s Far Out event: a few observations

Tech Life

1. Enrich people’s lives

Yet another presentation where Tim Cook has used the expression, Enrich people’s lives. I know it’s Apple’s mission, but the man is really starting to sound like a broken record in his introductory speeches.

2. Save people’s lives

The first segment was about the Apple Watch. Those testimonials with ‘regular people’ recounting how their Apple Watch ‘saved their lives’ felt so off to me, so contrived, and ultimately lacking taste. Yes, yes, Apple, you desperately want people to think of the Watch as a useful tool first, luxury gadget second, but sometimes it’s enough to let the device speak for itself. 

Those testimonials were meant to sound gripping and moving, but they had almost the opposite effect on me — they sounded artificial, more docu-drama than documentary, and in some cases borderline ridiculous: if my heart rate spiked to 187 beats per minute, I would notice something was wrong without having a smartwatch tell me I’d better call emergency services. In early 2004 I had a terrible, frightening panic attack that triggered an episode of tachycardia unlike anything I had experienced before or since. When the paramedics who came to my apartment checked my heart rate and told me it was 178, I was already feeling a little calmer thanks to their very presence, so my heart rate must have been even higher when I decided to call them earlier that night. I didn’t have a smartwatch telling me my heart was racing; I felt it myself.

3. Useful, boring, not for me

Keep in mind I’m not particularly interested in the Apple Watch as a product. I’m glad it exists and I’m glad many people love it and find it useful in their daily lives. It’s not a product for me, though. Despite my previous observation, I don’t question its usefulness for fitness and health. It’s simply a device that does too much, throws too much information at the user, has a complex user interaction design (too complex for what I want in a watch) and — last but not least — I just don’t like its visual design. 

Speaking of visual design, the main Watch Series keeps looking iterative, and I joked on Twitter, 2015–2022: seven years of Apple Watch looking essentially the same. The Apple Watch Ultra, on this front, finally feels fresher.

4. Push those boundaries harder

Someone high in Apple’s hierarchy must really love this stupid naming scheme based on Pro, Max, Ultra because they’re surely sprinkling these suffixes like stardust across their product lines. Apple Watch Ultra just sounds ridiculous to my ears. Anyway. 

Considering the Watch Ultra’s target audience (explorers, athletes, scuba divers, rugged outdoor adventurers), I’m a layperson, and as a layperson I initially found its feature set very cool and useful. But my friend Alex Roddie, an experienced outdoorsman, is not impressed. He shared his first impressions with me on Twitter as the event was unfolding:

As someone who has tested countless GPS watches actually designed for mountain use, I’m not impressed by it. 36 hours battery life is pathetic. 65 hours would be just about competitive these days. 

And by 65 hours I mean 65 hours of full-burn GPS tracking. I doubt that the Watch Ultra can cope with even a third of this, which makes it years behind the competition.

It’s so painfully obviously a device designed by urban people who want to ‘disrupt’ a market they don’t understand.

I look for a device that will last at least 13–14 hours of full-burn GPS tracking in a day, and then do this day after day without charging, offline, in sub-freezing or wet conditions, and with zero babysitting.

If I need to charge it more than twice a week, or go online more than occasionally, then I’m not interested! Just like every other Apple Watch, this requires too much babysitting for serious mountain/trail use.

Admittedly, for a watch that is designed to ‘push the boundaries’, 36 to maybe 60 hours of battery life doesn’t feel like a lot of pushing. Which makes me wonder: why not design the Ultra in a different way, removing every possible battery-draining feature in the first place — starting with that huge, bright OLED display? Maybe they wanted to guarantee maximum readability, but again Alex Roddie chimes in:

Almost every serious outdoor GPS watch has a transflective display, perfectly readable in sunlight, with a backlight that’s off by default and sips power. A power-hungry OLED is the wrong choice.

And adds:

I also saw nothing in that presentation about backcountry mapping software. Yes, you can install WorkOutDoors, but that’s a third-party app, depending on the work of a single developer. Where’s the first-party topo mapping support?

My guess is that the Apple Watch Ultra will be a success, overall, but its sales will be mostly driven by a less extreme audience — Sunday hikers, recreational divers, and people who want to look cool with the more ‘rugged’ watch. I’m not in the market for an Apple Watch, but if I were, I’d probably get an Ultra just because the regular Apple Watch design is so iterative and boring that the Ultra looks quite fresh in comparison.

I have the feeling that many of those people who really push the boundaries, the people Apple wants to market the Ultra to, have already realised that this watch is too limited — or simply inadequate — for their needs, and will keep relying on their Garmins, Suuntos, and Casios. Sure, the Ultra may have potential, but you don’t purchase a tool that must have your back in highly dangerous situations based on what it may be capable of in a future iteration or software update. 

5. Small earphones, short observation

New AirPods Pro. Hard pass. I’m sure they’re great at what they do, but I simply cannot use this type of in-ear earphones. I’ve tried several, from many brands, but they simply don’t stay put in my ear canals. If I had to choose an ideal model of true wireless earphones, the third-generation regular AirPods would be my pick. They’re not in-ear, and their stems are short enough not to look ridiculous like the first AirPods’. Though I’m not sure I’d spend €200 on a product whose life cycle coincides with that of its tiny, non-replaceable battery.

6. The fourteens

Ah, the new iPhone 14 line. As all the rumours anticipated, the iPhone mini form factor is no more. There is a regular iPhone 14 with a 6.1‑inch display. Then there’s a new, bigger regular model — the iPhone 14 Plus with a 6.7‑inch display. Both these models feature the same notch and the same A15 Bionic chip as last year’s iPhone 13 line. There are subtle differences if you read the tech specs carefully. Probably the most notable (i.e., the least insignificant) difference is that the A15 Bionic chip in these two iPhone 14 models has a 5‑core GPU, while the A15 in the iPhone 13 models has a 4‑core GPU. And of course the newer iPhones have the new emergency features touted at the event — Emergency SOS via satellite, and Crash Detection.

If you don’t like big phones, you’ll have to hold on to your iPhone 12 or 13 mini for a bit longer (or there’s always the SE). Pro or not, the new fourteens only come in two sizes — big (6.1″) and bigger (6.7″).

The iPhone 14 Pro and Pro Max are of course the more interesting devices. They do feature a new chip, the A16 Bionic, and a more sophisticated and capable camera array on the back. At this point, several professional photographers have already chimed in, explaining and showcasing what kind of improvements you should expect, and how everything compares to last year’s iPhone 13 Pro. I liked Ted Forbes’s first-impressions video and the obligatory annual in-depth feature by Austin Mann. Check them out; they can surely guide you through the details better than I could.

My takeaway is that if you’re a hardcore iPhone-only photographer, and you’re constantly looking for the best camera experience in an iPhone, you’ll probably want to upgrade from your 13 Pro. If you just use the iPhone as the quickest shortcut to take a photo, and want to take the occasional good-looking photo, I suspect a regular 14, and even the previous 13 and 12 models will be enough (13 mini and 12 mini if you, like me, prefer smaller phones).

And if you, like me, don’t really care about camera specs and performance in an iPhone, because you still prefer using traditional cameras, you’ll end up saving even more money. If you also hate big-ass iPhones with notches, then you’re welcome to do as I did — purchase a third-generation iPhone SE.

7. Dynamic Island, the place where fanboys get high

If you’ve been following me for a while, you know I’m really passionate about user interfaces, and in fact many readers have already contacted me and urged me to share my thoughts on that new combined hardware-and-software feature of the new iPhone 14 Pro models — the Dynamic Island.

I generally agree with everyone else: it’s a clever feature and an intriguing solution to an otherwise annoying design detail iPhones have had for 5 years now: the notch.

Despite the mantra You get used to the notch pretty quickly that everyone and their dog — and perhaps even Apple themselves — have been chanting since the iPhone X debuted, the unquestionable thing about the notch is that it was there, in all its ugliness, taking up most of the very top of the display, disrupting the status bar’s usefulness, and generally being an intrusive element and an æsthetic sore sight.

Physically, this new Dynamic Island is detached from the upper bezel and is smaller than the notch we’ve seen on iPhones since the X. And as I tweeted during the event, my first impression is that at least with the Dynamic Island, Apple has found a way to embrace this minor notch in such a manner that makes people look at it instead of making them try to ignore it.

Look at it and also actively interact with it, because it’s been transformed into something that’s indubitably useful, and with an interaction model that finally seems to have been designed by people who know something about what they’re doing.

Of course, now we have all the geeksphere and fanboyland cheering Apple as masters of genius design and interface innovators, while my eyes have been rolling so much they hurt. Even John Gruber dared to cast this fireball with a straight face (emphasis mine):

I don’t think an iPhone-style Dynamic Island will ever come to iPads, either. For one thing, I’m inclined to think iPad bezels will never shrink to the point where the sensor array won’t fit behind them. For another, iPads now have mouse pointer support when connected to a trackpad and the same illusion-ruining factor I mentioned about the Mac would apply. But here’s an idea: perhaps the Dynamic Island would come to the iPad purely in software. The iPad hardware sensor array would still be hidden in the bezel surrounding the display, but iPadOS could render a pure software Dynamic Island on screen. That, I think, would work completely. You could rotate the iPad and the Dynamic Island would always be at the top. The mouse pointer wouldn’t disappear under any actual hardware sensors. It’d just be a black stadium rendered entirely by software. It could actually be more elegant than the iPhone’s Dynamic Island because there’d be no sensors to disguise.

Yeah, let’s draw a persistent black spot in an otherwise clean user interface, because why not. Because now apparently there is no other way (more elegant, more device-appropriate) to replicate the functionality of the Dynamic Island. Instead of working towards eliminating all kinds of display intrusions, let’s literally go and draw these intrusions where there’s no reason to. By the way, I suspect that if Apple really did that to iPads, the Dynamic Island wouldn’t be “more elegant than the iPhone’s”, because it would probably have to be bigger for usability reasons — I doubt that the notifications and animations on a Dynamic Island kept at the same iPhone size on a 13-inch iPad Pro would be as useful, enjoyable, and readable.

So while I agree that the Dynamic Island is a clever bit of UI, it still remains a workaround that turns a hardware design weakness into a software and UI strength. And I’ll say it’s great work, indeed. No snark here.

But also…

8. Mac OS, the castaway on an island bereft of ideas

What are these same clever Apple designers doing on the Mac? As I watched the Dynamic Island being illustrated and demoed during the event, I kept thinking about how this design cleverness has been sorely lacking in Mac OS for years. And I am, once again, left with the impression that the software designers at Apple today generally have a better understanding of iOS than of Mac OS. New features that are iOS-first or iOS-only certainly feel more organic, more fitting, more ‘right’ for lack of a better term.

What they’ve been doing on Mac OS — or rather, to Mac OS — amounts to repeated attempts at a visual and functional iOS-ification that leave many long-time Mac (and computer) users baffled. And not because these users are “afraid of change”, or “don’t understand Apple’s innovation”, but because this general dumbing-down of Mac OS and the Mac’s UI shows the incompetence of UI designers who don’t get the basic UI principles of traditional computer operating systems, and who arrogantly keep applying their new coats of paint because “it’s time to touch things up otherwise they feel too stale”. They keep fixing what isn’t broken. The result is the needless and badly-executed redesign of System Preferences in Mac OS Ventura. The result is shoehorning in yet another multitasking interface layer — Stage Manager — that is entirely unneeded on Macs because what was already there worked well enough; Stage Manager looks and feels like a last-minute bolt-on that complicates the multitasking UI instead of making it more efficient and streamlined.

It’s as if Apple’s mission with the Mac’s UI has become to take by the hand all these poor users coming from iOS devices who might find the Mac soooo difficult, soooo complicated to use, and who need its UI to be as close as possible to the iPhone’s and iPad’s, otherwise they’re utterly lost. At times I even suspect that many of those interns at Apple working on Mac OS are iOS-devices-first people.

All the clarity about the direction iOS has to take along with the iPhone — the way iPhone/iOS features are thought out, developed, and implemented — appears almost nonexistent on Mac OS and the Mac. Where is, in Mac OS, a solution that is as clever as the Dynamic Island on the iPhone 14 Pro models? Where is, in Mac OS, that attention to detail, that innovative thing that makes you utter, Hah, they clearly know what they’re doing and where they want Mac OS to go — where is it?

The case for MacBooks without webcam

Tech Life

It’s a bit sad that, in all his decades-long career at Apple, Phil Schiller will probably be most remembered for his two infamous on-stage remarks, Can’t innovate anymore, my ass! (uttered during the presentation of the 2013 Mac Pro), and Courage (during the presentation of the iPhone 7 and 7 Plus, explaining the decision to remove the headphone jack — “the courage to move on, do something new, that betters all of us”).

Both remarks have, with time, basically become memes and — like the evergreen Think Different — are often used as retorts to criticise some decision or stance taken by Apple.

Well, here’s my idea to file under the Courage category; an idea that would solve both the questionable notch design of MacBooks’ displays and the not-so-great quality of the webcams they come equipped with: just remove the webcam altogether.

If you’re thinking I’m the only crazy one to have had this idea, I’m not. In fact, several people made this suggestion on Twitter and via private emails after reading my recent article on the terrible design detail that is the notch in MacBooks.

The idea came to me a bit earlier this year, when I was trying to remotely help an acquaintance set up their phone as a webcam for Zoom calls because their laptop’s built-in webcam had failed and wasn’t recognised by the computer anymore. When Craig Federighi introduced the Continuity Camera feature at WWDC 2022 — which lets you seamlessly use your iPhone as a webcam for your Mac — I started thinking that maybe this idea wasn’t as crazy as it sounded at the beginning, even to me.

It’s still a bold proposal, so of course it needs to be more detailed than, Just get rid of the webcam on all MacBooks, remove the notch in the process, and be merry.

As other people have suggested, I would restrict the webcam removal to the MacBook Pro models, while more entry-level machines like the MacBook Air would keep their webcams. The reasoning here is that the target audience of an all-purpose Mac like the Air is more likely to need a webcam on a frequent basis, and for them a laptop with an integrated webcam is the best solution. Pro users (at least those I’ve talked with) tend to use the webcam more sparingly, and they also tend to have good, up-to-date iPhone models; so, when they need to be on the occasional video call on their MacBook Pros, they wouldn’t have any problem taking advantage of the Continuity Camera feature.

Webcams in laptops are an ongoing technical challenge. The only sensible placement is at the top centre of the laptop’s lid, and today more than ever, laptop lids are thin. Too thin to accommodate high-quality photographic equipment. And so, compared to the very high-quality camera hardware in smartphones, when it comes to laptops we’re mostly stuck with sub-par webcams whose video quality can only be improved (a bit) via software. For a FaceTime or Zoom call, they’re probably enough, though sometimes a combination of poor webcam quality and not-optimal lighting conditions can give you an unflattering look when you’re broadcasting yourself.

When I think about a future iteration of webcam-less MacBook Pros, I don’t really see any major downsides. Yes, having to pull out your iPhone and secure it to the MacBook Pro’s lid makes things a bit less immediate, especially when the video call is not planned, and you’re the one being called, but if the Handoff/Continuity mechanics work well enough, you would get the video call on your iPhone then seamlessly continue on your MacBook Pro when you place the iPhone on top of it moments later.

An objection to consider is, But Rick, what about MacBook Pro users who don’t have an iPhone and use an Android phone? My snarky response would be, Why, do you know any? While my more serious response would be that there are software solutions — like Camo — that let you use any phone as a webcam for your Mac or PC.

I think the only people who would find webcam-less MacBook Pros cumbersome to use are those who need the power of a MacBook Pro and simultaneously have to use a webcam on a very frequent basis. Here I guess that, knowing beforehand that MacBook Pros come without webcams, they would organise and plan a workaround before purchase. If Apple really removed the webcam from future MacBook Pros, it wouldn’t be the first time, historically, that Apple removes something in a way that ends up annoying a segment of their user base — until people work around it and life goes on. I won’t even mention the removal of the floppy drive in the first iMac back in 1998; more recently, I’m thinking of the removal of the headphone jack in iPhones, or of leaving behind certain ports in MacBooks that are still relevant on a practical level, like USB‑A or Ethernet.

The only argument against removing the webcam from Apple laptops is that, if Apple is planning to bring FaceID to Macs, then the necessary camera array for FaceID must be present. Unless they find a way to implement it even when using an iPhone with the Mac via Continuity Camera.

From a design standpoint, removing the notch and the webcam would be a win both in the looks and functionality departments. MacBook Pros’ displays would have cleaner lines again; bezels could be made even thinner (you bezels-obsessed folks are already gasping in excitement, I know) and displays a bit larger without making the laptop physically bigger. I bet most MacBook Pro users would accept this kind of trade-off. Overall, I consider the idea of removing the webcam from MacBook Pros less crazy than slapping a notch in the top centre of the display. But let me know what you think, as usual via Twitter or by shooting me an email.

The notch is wrong: feedback and follow-up

Tech Life

Two weeks ago I published a piece that was essentially about something I needed to get out of my system, because I was starting to feel like I was the weird one for maintaining a strong negative stance on the subject. I’m talking about the so-called notch, a questionable design element that Apple, after featuring it on iPhones for many iterations, deemed worthy of applying to Mac laptops as well.

Unlike many people, whose reaction to the notch was just a shrug — both when it debuted on the iPhone X in 2017 and on the 14- and 16-inch MacBook Pros in 2021 — I was extremely put off by it. Especially when it appeared on Macs.

Back in October 2021, when reacting to the first notch appearance on the then-freshly introduced MacBook Pros, I wrote:

[B]ack to the notch: it was completely avoidable. You can justify it however you want, but it has the same fundamental characteristic as its iPhone counterpart — it’s just plain ugly. It is indeed a design compromise on the iPhone because on such a portable device on the one hand there’s the need to maximise screen real estate, and on the other there’s the simple fact that you have to provide a sophisticated front-facing camera with the necessary technology to enable FaceID. So you design a display with a screen that reaches the top where possible, i.e. the area surrounding the notch. You provide as many pixels as possible given the circumstances.

And yes, putting that notch on the MacBook Pros might have originated from the same impulse — maximising screen real estate. But while on the iPhone this was a need, on the Mac it’s just a want. Again, with displays as big and pixel-dense as those in the new 14 and 16-inch MacBook Pro models there’s no need to maximise screen real estate. You don’t need to carve a space up top where to shoehorn the menu bar, as if it were an annoying, restricting UI element, and splitting it up in the process. To me, this makes no sense from a design-is-how-it-works standpoint. It looks like an urge to make a design statement for design statement’s sake — as if Apple products needed some signature design quirk to be recognisable.

Ever since the notch’s introduction as a design element on Macs, every time I engaged in some discussion about it, other people made me feel as if I was the silly one for reacting so strongly about it. Why are you making such a big fuss about it? and What’s the big deal? were among the most typical responses I’d receive.

The feedback I got tells a different story

At the time of writing I’ve received a total of 61 email messages about my article The notch is wrong. Of these, 55 are from people who essentially wrote to thank me for writing that piece and almost every one of them added something along the lines of “I thought I was going crazy and was the only one who hated the notch that much”. I admit that this kind of feedback made me feel much better and even a bit vindicated.

Of the remaining 6 messages, 3 were kind of neutral about the notch (for example, J.A. wrote “I do get your criticism, I’m not a fan of the notch either, but I’ve got accustomed to it and when using my MacBook Pro it doesn’t really bother me.”), and 3 were instead quite supportive of the notch.

This is just anecdotal data, of course, but it’s interesting to see that these 61 emails came from all over the world (it’s a guess based on people’s names — I’ve recognised English, French, Italian, German, Dutch, Polish, Korean, Indian and Japanese names) and from people with varying degrees of tech-savviness. In other words this sample, however small, didn’t feel like it originated from the same ‘bubble’, so to speak.

Two important points I should have articulated better in my previous article

When I consider the remaining 6 emails, those with the neutral-to-positive stance towards the notch, in at least 4 of them my correspondents wrote something like, “Your piece sort of makes me feel judged for deciding to purchase a MacBook with a notch, almost as if I were told that I have bad taste when it comes to design”.

And the second thing common to many emails was something like “Yeah I don’t think the notch is ultimately that big of a deal; believe me, you stop noticing it after a few days, it really is unobtrusive”.

Responding to these remarks, I want two things to be especially clear in my harsh criticism of the notch:

  1. I’m definitely not passing judgement on those people who have purchased or thought about purchasing ‘notched’ MacBooks. The 14- and 16-inch MacBook Pros are exceptional machines, and the M2 MacBook Air is a capable all-purpose laptop. Save for the notch and a few other small details, I generally love the design of these Macs, and I’m really happy if they’re the solution that best fits your needs. What can you do about the notch? Nothing, really; it’s something Apple forces down your throat whether you like it or not. It’s not you, customer, who has bad taste in design here — it’s Apple.
  2. My criticism of the notch is purely design-oriented. The point I’m trying to make is that we shouldn’t think of the notch as good or bad design depending on if and how much we ‘notice’ it, but that the notch is bad design whether we notice it or not, whether it bothers us a little or very, very much.

Could it merely be an æsthetic concern?

F.W. writes me:

Don’t you think that your dislike for the notch is merely a matter of looks rather than functionality? If functionality isn’t really impacted, shouldn’t we conclude that the notch isn’t as deeply flawed as your critique would imply?

It’s a good question. While I think a great part of my aversion to the notch is indubitably tied to its visual ugliness, I don’t agree that the notch doesn’t really impact functionality. If Apple itself tells developers they need to take the notch into account when designing their apps, then Apple itself is recognising that the notch could potentially be an issue, functionality-wise. I keep quoting this tweet from Linda Dong (Apple Design Evangelist) because I think it’s very telling of the kind of approach Apple is taking here:

Either way it’s still a great idea to keep menu bar titles short and consolidate menus when you can for usability’s sake! Hunting through a million menus is never fun even on pro software.

She’s suggesting keeping menu bar titles short and consolidating menus because otherwise this happens:

[Image: effect of the notch on the menu bar — annotated screen capture]
In one of the feedback emails I received, one of my readers attached this screenshot taken from Marques Brownlee’s review of the M2 MacBook Air, where you can see Pixelmator Pro in use (a damn good app, by the way). The annotations are mine.

What she’s suggesting is actually not a good idea for usability’s sake. It’s just a suggestion to avoid making Apple look bad for having arbitrarily introduced a hardware detail that actively interferes with one of the most important UI elements in the whole operating system — the menu bar.

Apple introduces the notch, and then developers have to do unnecessary extra work on their apps to mitigate the potential interference of this element.

  • Keep menu bar titles short — this doesn’t take into account any language other than English, Chinese, or Japanese. Just take the Finder menu bar titles. In English, they are Finder, File, Edit, View, Go, Window, Help. All short words, most 4 characters long; the longest is 6 characters long. In German, the Finder menu bar titles become: Finder, Ablage, Bearbeiten, Darstellung, Gehe zu, Fenster, Hilfe. In Spanish we have Finder, Archivo, Edición, Visualización, Ir, Ventana, Ayuda. Not all languages can afford short words. But even if we just stick to English, menu titles should be as clear and descriptive as possible. They shouldn’t be kept artificially short to accommodate a questionable design compromise.
  • Consolidate menus — “Hunting through a million menus is never fun even on pro software”, Dong says. You know what’s not fun either? Scrolling unnecessarily long menus because you had to consolidate into one menu a series of commands that were previously spread across three menus — and it made sense that they were spread that way. Relocating commands because you need to consolidate menus and reduce the number of menu bar titles, since there’s a real possibility that they will collide with the notch, is the polar opposite of good usability. The same goes if you think you could transform a list of menu commands into a popover or drop-down menu hidden behind an icon on a toolbar.

Some wrote me that they still haven’t encountered applications with menus that get displaced and pushed to the right of the menu bar by the presence of the notch. I don’t have recent Adobe Creative apps (the last suite I used is CS3), so I can’t check, but historically apps like Photoshop and InDesign have had plenty of menu bar titles. Not long ago I also tried out Affinity Photo and Affinity Publisher, and I remember that the menu bar was pretty crowded. Between menu titles and all the menu extras I usually keep on my 13-inch MacBook Pro, I’m pretty sure that if my MacBook Pro had a notch, there would be disruption up there in the menu bar.

The screen real estate gains purportedly allowed by the notch

This is perhaps the strongest argument I’ve heard from people who don’t mind (or actually welcome) the notch. And while they’re not technically wrong, I still think that what the notch gives you, display-wise, is in most cases simply not enough to justify this kind of design compromise.

If we compare a 13.3‑inch M1 MacBook Air with a 13.6‑inch M2 MacBook Air, their screen resolution is horizontally identical (2560 pixels), while vertically the M2 Air is 64 pixels taller than the M1 Air (1664 vs 1600). Those 64 pixels are essentially the height of the menu bar (split by the notch in the middle). And that is the total of ‘new’, really additional space you have on an M2 Air compared with the M1 model. Yes, the M2 Air has a physically bigger display than the M1 Air, but since the resolution is essentially the same, you won’t see more stuff on the M2 Air’s display. For the most part, you’ll see the same stuff as on the M1 Air, but slightly enlarged. Again, the only real space you gain is 64 pixels vertically.

And while the M2 MacBook Air has physical dimensions that are impressively close to the M1 MacBook Air’s, the latter is still a tiny bit shorter in height. Just today I picked up both computers in an electronics store, and their overall mass feels essentially the same. The M2 Air weighs 50 grams less than the M1 Air, but when holding both Macs, I couldn’t really tell the difference. One is not dramatically lighter or more compact than the other. But the thinner bezels of the M2 Air really do the trick. Its display is bigger than the M1 Air’s by only 0.3 inches diagonally, but it feels bigger still. It’s a well-engineered deception (in the sense that, yes, it’s physically bigger, but the only added screen real estate is those 64 vertical pixels).

Another interesting comparison is between the older 2019 16-inch MacBook Pro and the 2021 M1 Pro/M1 Max 16-inch MacBook Pro that features the notch. Resolution-wise, the newer MacBook Pro clearly wins (3456×2234 vs 3072×1920), so here there is a substantial increase in screen real estate compared with the 2019 Intel 16-inch MacBook Pro; but the M‑class 16-inch MacBook Pro is actually thicker, taller, and heavier than the 2019 Intel model, while having essentially the same width (35.57 cm vs the Intel model’s 35.79 cm). So yes, here you indeed have a bigger screen in a Mac that is more or less the same size as the previous Intel model, but you don’t end up with a more compact form factor.

The ‘winner’ here is probably the 2021 14-inch MacBook Pro, but it’s also the hardest laptop to draw a fair comparison with. We could compare display size, resolution, and the machine’s physical dimensions with the short-lived 2019 15-inch MacBook Pro, but that comparison has already been won by the 2019 16-inch MacBook Pro itself, having a bigger display, better resolution, and surprisingly similar physical dimensions.

Maybe we could pit the 2021 14-inch MacBook Pro against the 2020 M1 or 2022 M2 13-inch MacBook Pro. And here, the former clearly wins on all counts: bigger display, better resolution, only roughly 1 cm taller and wider. It’s 200 grams heavier, but given the appreciable performance leap, you can forgive that. The only big difference here is price. If you have a relatively tight budget, there is no compactness or bigger-screen advantage for you: you’ll have to choose a smaller, more affordable MacBook. If you don’t mind the notch, the M2 Air may work for you. If the notch annoys you, you want the M2 chip, and don’t mind the Touch Bar, then it’s the base 13-inch MacBook Pro. Otherwise, the best option still remains the M1 Air, in my opinion.

The point of all this long-winded excursion about screen sizes, resolutions, and Mac laptops’ physical dimensions is that — except for the 14-inch M‑class MacBook Pro — the ‘notched’ display design doesn’t really give that substantial an advantage over a ‘non-notched’ display with a thicker top bezel. Especially in the M2 Air vs M1 Air comparison.

Again, the focus here remains on purely design-oriented speculation. Pragmatically, lots of customers will choose the ‘notched’ MacBooks because they offer many other tangible advantages: faster chips, more memory, qualitatively better display panel technology, more ports, etc. When you consider these specs and your needs, you clearly give them precedence over design considerations about a funny notch. Here I openly recognise I’m in a stark minority, since I’m not willing to give in, and I won’t purchase a Mac laptop that has a notch, no matter what. It bothers and upsets me too much on a conceptual level for me to ignore it. That’s me, and I’m perfectly aware of my principled stubbornness here.

But I’m very glad for all the feedback I’ve received so far. My sincere thanks to everyone who took the time to write me an email on the subject. At least I don’t feel alone or misunderstood in my strong aversion to the notch. I’m still hopeful that this design compromise will only last a few years and will be discarded in the next major MacBook redesign.