People and resources added to my reading list in 2024

Tech Life

Welcome to the twelfth instalment of my annual overview of the most interesting discoveries I made during the previous year. Traditionally, the structure of this kind of post includes different categories of resources: blogs, YouTube channels, cool stuff on the Web, and so forth. That structure isn’t going to change, but if my previous instalment was perhaps unusually brief, I’m afraid the current one is going to be even briefer. There are a few reasons why:

One. For more than half of 2024, my attention was primarily focused on personal matters. The whole process of finding a new place to live, purchasing it, moving, and finally settling into the new apartment was a time and energy sink for both my wife and me. My time online was mostly spent working and engaging in some light social media activity, and not much else.

Two. What I wrote last year about 2023 remained largely true in 2024: I’ve often mentioned this low tide brought on by a general feeling of ‘tech fatigue’; as a consequence, [during 2023] my interest in adding technology-related sources to my reads was rather low. I even neglected to stay up to date with the people and blogs I was already following. That feeling of tech fatigue started receding a bit towards the end of 2024, when I received a Nothing Phone 2a as a birthday gift — an event that gave me the final push to switch to Android as my primary phone platform, leaving my iPhone SE 3 as a secondary device.

Three. Another thing I wrote in the previous instalment of this series was this:

This exhaustion stage, this tech burnout, is necessary as well. I’m more and more convinced that more people ought to reach this stage, to then try to approach tech in a different — hopefully healthier — way. Because the next stage is to focus on whatever good remains out there after the squeeze. That’s why I’m trying to approach 2024 with the goal of finding out who and what’s really worth following, who and what is truly distinctive, who and what is ultimately worth my (and your) time. Mind you, it’s what I’ve always been trying to do when compiling these yearly overviews; the only little thing that has changed is that from now on I’ll try to be even more selective. 

You know what happens when you get even more selective? Maybe you follow a link to a blog article, and you like the article, but then you explore that blog further and realise that this article — and perhaps a couple more — is the only highlight of that blog, and you start wondering: Is this website worth adding to my RSS feeds, or should I just share the link to that specific article and let others decide?

In most cases, I’ve ended up bookmarking & sharing articles instead of adding blogs to my reading list. But what if it turns out to be a mistake and I miss out on some good writers/bloggers? Well, if I bookmark something, chances are I’ll return to that article and website at a later date, and if I find enough stuff I like on my subsequent visits, I may decide to recommend the whole package. Also, if the author keeps writing good stuff, it’s very likely I’ll get other recommendations about them, so I don’t really miss out on anybody. And even if I do — let’s be real for a second — time is a finite resource; I’ll never be able to read or watch everything from everyone I cross paths with. 

Another thing that happens when you get more selective is that you start looking harder and harder at the resources you’ve already discovered — all those RSS feeds, all those YouTube channels, etc. — and you reassess them with a fresh pair of eyes. This is why, during 2024, I’ve been subtracting rather than adding to my resources’ reservoir, so to speak. Interests change, people change (or don’t — and that, sometimes, can be a problem), the quality of a blog or YouTube creator’s output may become less consistent or patently decline… And so it’s time for some pruning and tidying up.

Blogs

Just two:

  • Passo Uno, by Fabrizio Ferri-Benedetti. Fabrizio is a technical, UX, and programmer writer, and that should give you an idea of the main topics he covers in his blog. I like his clear, efficient writing style, and even when he talks about stuff I’m not super familiar with, I often feel stimulated to learn more about it. As for the blog’s name, as Fabrizio states in the About page: ’Passo uno’ is Italian for ‘stop-motion’. It also means ‘step one’.
  • The blog of Vitor Zanetti. I discovered it when Vitor started following me on Mastodon, as I’m always curious to check out other people’s profiles and websites when they follow me on social media. Vitor’s blog doesn’t seem to have a main focus: he may talk about technology in one post, then muse about design in another, or share observations sparked after watching a particular film. Like me, he doesn’t post frequently, but I find his writings to be inspiring and thought-provoking, and perhaps you will too.

Newsletters

I’m not typically a fan of the newsletter format; I can’t exactly explain why. The fact that, once you subscribe, the newsletter is something that comes to you instead of you going to it should be a convenient and preferable dynamic. Instead, I often end up treating it like advertising email, and ultimately ignore it or just skim the part that’s visible in my email client. Over the years I’ve subscribed to many newsletters on a whim — they were genuinely interesting and well written — but I’ve also ultimately unsubscribed from most of them due to lack of time and engagement.

The sole exception I made in 2024 was for Ed Zitron’s Where’s Your Ed At?, which I basically treat as a long-form blog. I receive the email updates, but I’m also subscribed to the feed. If technology and the tech industry are your main interests, you should already know who Ed Zitron is. But if you don’t, well, it’s best if I link to the newsletter’s About section. You’ll find everything you need to know. I really, really recommend Ed’s newsletter. Each instalment is generally a long read, but very worth your time.

I started following Ed on pre-Musk Twitter years ago, and was reminded of his work again in recent times when I was looking for materials and information about ‘AI’. And I found out that Ed and I share basically the same (negative) views about it, only Ed has the know-how to talk about it with much more clarity and authority than I have on the subject. A lot of people have asked me to talk more often and more at length about ‘AI’, LLMs, the industry, and why I think it’s largely bullshit. My advice is to subscribe to Ed’s newsletter if that’s a subject of particular interest to you. You’ll find a lot of information, and you’ll know that Ed and I are on the same page.

YouTube channels

Around September 2024 I looked at my YouTube subscriptions list and was horrified to realise that I was following 136 channels. Yeah, things had got rather out of hand, and so I started unsubscribing from a lot of channels I had added simply after discovering a single video or following a recommendation for a single video. Even though YouTube is a mature platform, I’m routinely baffled by how rudimentary its tools for organising content are. For instance, I’d love to have the ability to categorise my subscriptions and put them in separate folders, like one does with RSS feeds, so that I can more easily get to those creators whose content could be filed under ‘photography’ or ‘tech’ or ‘gaming’ or ‘lifestyle’ or ‘cooking’ or ‘architecture’, and so forth. Instead, all YouTube offers is an unsorted list on the left sidebar of the home page, vaguely organised by creator activity/frequency of uploads. It gets messy, fast.
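Just to illustrate how conceptually simple the missing feature is, here’s a minimal Python sketch of the folder-like grouping I’d love YouTube to offer natively. It assumes you’ve exported your subscriptions to a CSV (Google Takeout can produce one); the column name and the folder mapping below are my own assumptions, not anything YouTube or Google provides.

```python
# A minimal sketch of folder-like grouping for YouTube subscriptions.
# Assumes a subscriptions CSV export (e.g. via Google Takeout) with a
# channel-title column; both the column name and FOLDERS are my assumptions.
import csv
from collections import defaultdict

FOLDERS = {
    # channel title -> folder (purely illustrative mapping)
    "Howtown": "journalism & science",
    # "Some Photography Channel": "photography",
}

def group_subscriptions(csv_path: str, title_column: str = "Channel Title") -> dict:
    """Bucket subscribed channels into user-defined folders."""
    groups = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get(title_column, "").strip()
            if title:
                groups[FOLDERS.get(title, "uncategorised")].append(title)
    return dict(groups)

if __name__ == "__main__":
    for folder, channels in group_subscriptions("subscriptions.csv").items():
        print(f"{folder}: {len(channels)} channels")
```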

After spending the best part of an afternoon reviewing my subscriptions and mercilessly removing a lot of unwanted or uninteresting ones, I ended up with half the initial amount — which is still a lot, but way more manageable. Again, following my self-imposed Be more selective guideline, the only discovery really worth sharing is, in my opinion, Howtown.

The channel description is perhaps a bit terse: The “How Do They Know” show from journalists Adam Cole and Joss Fong. So it’s better if you watch their short introduction video. Essentially, Cole and Fong create video essays on different subjects to answer the question How do they know or How do we know about this particular fact or topic? In their words:

We want to tell you our guiding principles so you can hold us to them. First, we approach our stories with curiosity above all. So this isn’t a commentary channel. We’re here to make sense of the evidence. We rely on primary sources and interviews, and we’ll share those sources with you with each video. If we make any factual errors, we will post corrections that explain exactly what we got wrong. Finally, we never take money in exchange for coverage. Our sponsors don’t have any control over what we make. 

I find Cole and Fong to be entertaining, personable, and likeable; their videos are well researched and produced, and the fact that they don’t upload content frequently is a good sign in my book, because it means they’re taking time to do their homework before presenting a new essay. If you’re an intellectually curious person as I am, I think you’ll like their channel.

Podcasts

Another year, another round of copying-and-pasting the same quote from a few years ago:

In 2019 I unsubscribed from all the podcasts I was following, and I haven’t looked back. I know and respect many people who use podcasts as their main medium for expression. My moving away from podcasts is simply a pragmatic decision — I just don’t have the time for everything. I still listen to the odd episode, especially if it comes recommended by people I trust. You can find a more articulate observation on podcasts in my People and resources added to my reading list in 2019.

If you’re wondering why I keep the Podcast section in these overviews when I clearly have nothing to talk about, it’s because to this day I receive emails from people un-ironically asking me for podcast recommendations.

My RSS management

Yet again, nothing new to report on this front. I’m still using the same apps I’ve been using on all my devices for the past several years, and I haven’t found better RSS management tools / apps / services worth switching to. In my previous overviews, I used to list here all the apps I typically use to read feeds on my numerous devices, but since I’ve broken my habit of obsessively reading feeds everywhere on whatever device, I’ll only list the apps on the devices I’ve actually used over the past year or so. If you’re curious to read the complete rundown, check past entries (see links at the bottom of this article):

  • On my M2 Pro Mac mini running Mac OS 13 Ventura: NetNewsWire.
  • On my 17-inch MacBook Pro running Mac OS 10.14 Mojave, and on my 13-inch retina MacBook Pro running Mac OS 11 Big Sur: NetNewsWire 5.0.4 — a slightly older version of this great RSS reader.
  • On my other Intel Macs running Mac OS 10.13 High Sierra: Reeder and ReadKit.
  • On my iPad 8: Unread, Reeder, NetNewsWire for iOS, and ReadKit.
  • On my Android phones — Nothing Phone 2a and Microsoft Surface Duo: the Feedly app.
  • On my iPhone SE 3, iPhone 8, iPhone 7 Plus, iPhone 5s, iPhone 5, iPad 3: Unread. (Though on the iPad 3 Reeder seems to be more stable and less resource-hungry).
  • On all my more recent Windows machines I use FeedLab. It’s not a bad app at all, but I’m still looking for something more elegant visually. Nextgen Reader used to be a great client, but its development appears to have been discontinued long ago.

Past articles

In reverse chronological order:

I hope this series and my observations can be useful to you. Also, keep in mind that some links in these past articles may now be broken. And as always, if you think I’m missing out on some good writing or other kind of resource you believe might be of interest to me, let me know via email, Mastodon, or Bluesky. Thanks for reading!

A few insights by Don Norman from 30 years ago that are still relevant today

Tech Life

I was perusing some past issues of ACM Interactions magazine, and I stumbled on an interview with Don Norman, a figure I’ve always admired and one of the main sources of inspiration for me to delve deeper into matters of usability, design, and human-machine interaction.

The interview, titled A conversation with Don Norman, appeared in Volume 2, Issue 2 of the magazine, published in April 1995. And of course it’s a very interesting conversation between Don Norman and John Rheinfrank, the magazine editor at the time. There’s really very little to add to the insights I’ve chosen to excerpt. While reading them, my two main reactions were either How things have changed in 30 years (especially when Norman talks about his work and experience at Apple), or 30 years have passed yet this is still true today. I’ll keep my observations to a minimum, because I want you to focus on Norman’s words more than mine.

1. Forces in design

Don Norman: […] John, you deserve much of the credit for making me try to understand that there are many forces that come to bear in designing. Now that I’ve been at Apple, I’ve changed my mind even more. There are no ‘dumb decisions.’ Everybody has a problem to solve. What makes for bad design is trying to solve problems in isolation, so that one particular force, like time or market or compatibility or usability, dominates. The Xerox Star is a good example of a product that was optimized based on intelligent, usability principles but was a failure for lots of reasons, one of which was it was so slow as to be barely functional. 

John Rheinfrank: Then your experience at Apple is giving you a chance to play out the full spectrum of actions needed to make something both good and successful? 

DN: […] At Apple Computer the merging of industrial design considerations with behavior design considerations is a very positive trend. In general, these two disciplines still tend to be somewhat separate and they talk different languages. When I was at the university, I assumed that design was essentially the behavioral analysis of tasks that people do and that was all that was required. Now that I’ve been at Apple, I’ve begun to realize how wrong that approach was. Design, even just the usability, let alone the aesthetics, requires a team of people with extremely different talents. You need somebody, for example, with good visual design abilities and skills and someone who understands behavior. You need somebody who’s a good prototyper and someone who knows how to test and observe behavior. All of these skills turn out to be very different and it’s a very rare individual who has more than one or two of them. I’ve really come to appreciate the need for this kind of interdisciplinary design team. And the design team has to work closely with the marketing and engineering teams. An important factor for all the teams is the increasing need for a new product to work across international boundaries. So the number of people that have to be involved in a design is amazing.

Observation: This was 1995, so before Steve Jobs returned to Apple. But Jobs’s Apple seemed to approach design with this mixture of forces. The results often showed the power of these synergies at play behind the scenes. Today’s Apple perhaps still works that way within the walls of Apple Park, but often the results don’t seem to reflect synergetic forces between teams or across one design team — it’s more like there were conflicts along the way, and an executive decision prevailed. (No, not like with Jobs, because he understood design and engineering better than current Apple executives do.)

2. Design can only improve with industry restructuring

JR: You just said that there may be some things about the computer industry, or any industry, that make it difficult to do good design. You said that design could only improve with industry restructuring. Can you say more? 

DN: Let’s look at the personal computer, which had gotten itself into a most amazing state, one of increasing and seemingly never-ending complexity. There’s no way of getting out. Today’s personal computer has an operating system that is more complex than any of the big mainframes of a few years ago. It is so complex that the companies making the operating systems are no longer capable of really understanding them themselves. I won’t single out any one company; I believe this is true of Hewlett-Packard, Silicon Graphics, Digital Equipment Corporation, IBM, Apple, Microsoft, name your company — these operating systems are so complex they defy convention and they defy description or understanding. The machines themselves fill your desk and occupy more and more territory in your office. The displays are ever bigger, the software is ever more complex. 

In addition, business has been pulled into the software subscription model. The way you make money in software is by getting people to buy the upgrade. You make more money in the upgrade than in the original item. Well, how do you sell somebody an upgrade? First, you have to convince them that it’s better than what they had before and better means it must do everything they had before plus more. That guarantees that it has to be more complicated, has to have more commands, have more instructions, be a bigger program, be more expensive, take up more memory — and probably be slower and less efficient.

3. Why changing is hard in the tech industry

DN: […] Now, how on earth do you move the software industry from here to there? The surety of the installed base really defeats us. For instance, Apple has 15,000,000 computers out there. We cannot bring out a product that would bring harm to those 15,000,000 customers. In addition, if we brought out a revolutionary new product, there’s the danger that people would say the old one is not being supported, so they’ll stop buying it. But they don’t trust this new one yet. “Apple might be right but meanwhile we better switch to a competitor.” This story is played out throughout the computer industry. It’s not just true of Apple. Look at Microsoft, which has an even worse problem, with a much larger installed base. It’s been a problem for many companies. I think the reason why a lot of companies don’t make the transition into new technologies is that they can’t get out of their installed base. 

Mind you, the installed base insists upon the current technology. There’s a wonderful Harvard Business Review article on just this: Why don’t companies see the new technology coming? The answer is, they do. The best companies often are developing new technology. But look at the 8‑inch disk drive which has replaced the 14-inch Winchester drives. It was developed and checked with the most forward-looking customers, who said, “That will never work for us.” So the 8‑inch drive wasn’t pushed. Despite everything being done to analyze the market, in retrospect, the wrong decision was made. At the time, by the way, it was thought to be the correct decision. 

It’s really hard to understand how you take a mature industry and change it. The model that seems to work is that young upstart companies do it. Change almost always seems to come from outside the circle of major players in the industry and not within. There are exceptions, of course, of which IBM is an interesting one. IBM was once the dominant force in mechanical calculating machines and young Thomas Watson, Jr., the upstart, thought that digital computers were the coming thing. Thomas Watson, Sr. thought this was an idiotic decision. But actually Junior managed to get the company to create the transformation. It’s one of the better examples of change in technological direction, and it also was successful.

About Norman’s last remarks, see Wikipedia: “Watson became president of IBM in 1952 and was named as the company’s CEO shortly before the death of his father, Watson Sr., in 1956. Up to this time IBM was dedicated to electromechanical punched card systems for its commercial products. Watson Sr. had repeatedly rejected electronic computers as overpriced and unreliable, except for one-of-a-kind projects such as the IBM SSEC. Tom Jr. took the company in a new direction, hiring electrical engineers by the hundreds and putting them to work designing mainframe computers. Many of IBM’s technical experts also did not think computer products were practical since there were only about a dozen computers in the entire world at the time.”

4. “Personal computers”

JR: So it looks as though we have another transition to manage. It’s very strange that they call these devices ‘personal computers.’ 

DN: Yes. First of all they’re not personal and second, we don’t use them for computing. We’re using these things to get information, to build documents, to exchange ideas with other people. The cellular phone is actually a pretty powerful computer that is used for communication and collaboration. 

Observation: This brief remark by Norman about mobile phones is rather amazing, considering that it was made back in 1995, when smartphones didn’t exist yet — the functions of what we now consider a smartphone were still split between mobile phones and Personal Digital Assistants (PDAs). Also, the remark that these devices (personal computers) are not really personal still sounds especially relevant today, for different reasons. See for example this recent piece by Benj Edwards: The PC is Dead: It’s Time to Make Computing Personal Again.

5. Interface design, interaction, and building a personality into a device

JR: So in what direction do you think computer-interface design should go? Many companies are making moves to simplify entry and interaction (Packard Bell’s Navigator and Microsoft’s BOB). In the short term, how does this fit your vision? 

DN: The question really is, in what direction do I see our future computers moving? Microsoft has introduced BOB as a social interface, which they think is an important new direction. Let me respond to the direction and I’ll comment later on BOB. As I’ve said before, I believe our machines have just become too complex. When one machine does everything, it in some sense does nothing especially well, although its complexity increases. My Swiss Army knife is an example: It is very valuable because it does so many things, but it does none of the single things as well as a specialized knife or a screwdriver or a scissors. My Swiss Army knife also has so many tools I don’t think I ever open the correct one first. Whenever I try to get the knife, I always get the nail file and whenever I try to get the scissors, I get the awl, etc. It’s not a big deal but it’s only about six parts. Imagine a computer with hundreds or thousands of ‘parts.’ I think the correct solution is to create devices that fit the needs of people better, so that the device ‘looks like’ the task. By this I just mean that, if we become expert in the task, then the device just feels natural to us. So my goal is to minimize the need for instruction and assistance and guidance. 

Microsoft had another problem. Their applications are indeed very complex and their model is based on the need to have multiple applications running to do, say, a person’s correspondence, communication, checkbook, finances. How did they deal with the complexity with which they were faced? There has been some very interesting social-science research done at Stanford University by Byron Reeves and Clifford Nass, which argues that people essentially treat anthropomorphically the objects with which they interact, that is they treat them as things with personalities. We kick our automobile and call it names. Responding to computers in fact has a tendency to go further because computers actually enter into dialogues with people, not very sociable dialogues, but dialogues nevertheless. So from their research, Reeves and Nass did some interesting analysis (somewhat controversial, by the way) in the social-science community about the social interactions between people and inanimate objects. That’s all very fine, and you can take that research and draw interesting conclusions from it. It’s a very big step, however, to take that research and say that, because people impart devices with personalities, you should therefore build a personality into a device. That was not supported by the research. There was no research, in fact, about how you should use these results in actual device construction. 

Observation: The bit I emphasised in Norman’s response made me wonder. And made me think that maybe this is one of the reasons why most automated ‘AI’ assistants — Alexa, Siri, etc. — remain ineffectual forms of human-machine interaction to this day. Perhaps it’s because we fundamentally want to always be the ones in charge in this kind of relationship, and do not like devices (or even abstract entities such as ‘AI’ chatbots) to radiate perceived personality traits that weren’t imparted by us. By the way, I hope we’ll keep holding on to that feeling, because, among other things, it’s at the root of a healthy distrust towards this overhyped ‘artificial intelligence’.

It’s very difficult to decide what is the very best way of building something which has not been studied very well. I think where Microsoft went wrong was that, first of all, they had this hard problem and they tried to solve it by what I consider a patch, that is, adding an intelligent assistant to the problem. I think the proper way would have been to make the problem less complex in the first place so the assistance wouldn’t be needed. I also think they may have misread some of the research and tried to create a character with an extra cute personality. 

In this part of his response, quoted above, Norman makes another interesting remark (emphasis mine, again). Despite referring to a product we now know did not succeed — Microsoft BOB — I think he manages to succinctly nail the problem with digital assistants and offer a possible, radical workaround; though I seriously doubt tech companies today would want to engage in this level of rethinking, preferring to keep shoving ‘AI’ and digital assistants down our throats.

6. Making devices that fit the task

JR: It seems as if substantial changes in design will take a long time to develop. Will we have something good enough for the ten-year-old with ‘Nintendo thumb’ before he or she grows up? 

DN: I think for a while things aren’t going to look very different. The personal computer paradigm could be with us another decade. Maybe in a decade it will be over with. I’d like to hope it will be. But as long as it’s with us, there aren’t too many alternatives. We really haven’t thought of any better ways of getting stuff in or out besides pushing buttons, sound, voice, and video. Certainly we could do more with recognition of simple gestures; that’s been done for a very long time, but we don’t use gestures yet in front of our machines. I mean gestures like lifting my hand up in the air. We could, of course, have pen-based gestures as well and we could have a pen and a mouse and a joystick and touch-sensitive screens. Then there is speech input, which will be a long time in coming. Simple command recognition can be done today but to understand, that’s a long time away. 

So in my opinion the real advance is going to be in making devices that fit the task. For instance, I really believe within five years most dictionaries will be electronic, within ten years even the pulp novel, the stuff you buy in the airport to read on the airplane, will have a reader. What you’ll do is go to the dispenser and instead of the best 25 best-selling books, it will have 1,000 or 2,000 books for browsing. When you find a book that you like, you’ll put in your credit card and the book will download to your book reader. The reader will be roughly the size of a paperback book today and look more like a book than a computer. The screen will be just as readable as a real book. Then look at any professional, say a design professional. You couldn’t really do your design without a pencil. Look how many pencils good artists will use. They may have 50 or 70 or 100 different kinds of drawing implements. We have to have at least that kind of fine-detail variation in the input style in the world of computers. I don’t think we’ll have the power that we have today with manual instruments until we reach that level. I think the only way to get that power, though, is to have task-specific devices. That’s the direction in which I see us moving.

Observation: There was, indeed, a time when tech seemed to move in the direction envisaged by Norman, with devices designed for specific tasks. When Steve Jobs illustrated the ‘digital hub’ in the first half of the 2000s, the Mac was the central hub where we would process and work with materials coming from different, specialised devices: the digital camera, the camcorder, the MP3 player, the audio CD, the DVD, the sound-recording equipment. At the time, all these devices were the best at their designed tasks.

But then the iPhone came (and all the competing smartphones based on its model), and it turned this ‘digital hub’ inside out. Now you had a single device taking up the tasks of all those separate devices. Convenient, but also a return to the Swiss Army knife metaphor Don Norman was mentioning earlier in what I indicated as section №5: “My Swiss Army knife […] is very valuable because it does so many things, but it does none of the single things as well as a specialized knife or a screwdriver or scissors.”

If you think about it, the Swiss Army knife is also a good metaphor to explain a big part of the iPad’s identity crisis. A big smartphone, a small laptop, a smarter and more versatile graphic tablet, among other things; and yet, it tends to do better at the task it ‘looks more like’: a tablet you use with a stylus to make digital artworks.

After years of smartphone (and similar ‘everything bucket’ devices) fatigue, it seems that we may be moving again towards task-specific devices, with people rediscovering digicam photography, or listening to music via specialised tools like old iPods and even portable CD and MiniDisc players. The e‑ink device market seems to be in good health, especially when it comes to e‑ink tablets for note-taking and drawing; products like the Supernote by Ratta or the BOOX line by Onyx; or the one that likely started the trend — the ReMarkable. I have recently purchased one of these tablets, the BOOX Go 10.3, and it’s way, way better than an iPad for taking notes, drawing, and of course reading books and documents for long stretches of time.

I hope we’ll keep moving in this direction, honestly, because this obsession with convenience, the insistence on eliminating any kind of friction and any little cognitive load, and the desire for single devices that ‘do everything’ are what makes interfaces more and more complex, and what pushes tech companies to come up with debatable solutions to make those interfaces less complex. See for instance how Apple’s operating systems have been simplified at the surface level to appear cleaner, but in doing so have removed a lot of UI affordances and discoverability, burying instead of solving all the complexity that these systems have inexorably accumulated over time.

Or see for example how digital assistants have entered the picture in exactly the same way Microsoft came up with the idea of BOB in the 1990s. As Norman says, an intelligent assistant was added to the problem, becoming part of the problem instead of solving it. So we have complex user interfaces, but instead of working on how to make these interfaces more accessible, less convoluted, more discoverable, intuitive, and user friendly, tech companies have come up with the idea of the digital assistant as a shortcut. Too bad digital assistants have introduced yet another interface layer riddled with the usability and human-machine interaction issues we all know and experience on a daily basis. Imagine if we could remove this layer of awkwardness from our devices and had better-designed user interfaces that completely removed the need for a digital assistant.

[The full magazine article is available here.]

Recreating Delicious Library

Handpicked

Wil Shipley (via Michael Tsai):

Amazon has shut off the feed that allowed Delicious Library to look up items, unfortunately limiting the app to what users already have (or enter manually).

I wasn’t contacted about this.

I’ve pulled it from the Mac App Store and shut down the website so nobody accidentally buys a non-functional app. 

I closely follow Michael Tsai’s blog (and you should too, go add it to your feeds) but this bit of news somehow flew under my radar. That Delicious Library has to be retired is indeed the end of an era. The app had been going strong (and was a great example of good UI) for 20 years, and it’s sad to see great apps die just because someone at a Big Tech company decides to flip a switch.

I remember downloading a trial version of Delicious Library in late 2004, and at the time I really thought it was a nice solution for cataloguing my stuff. I ended up not using it, but the problem was me, not the app. I simply had too many things to catalogue. I could have turned the data entry into a daily habit — you know, scanning a bunch of items every day when I had some free time, and seeing my digitised library slowly grow and mirror my physical library — but twenty years ago I was far more impatient than I am now. The task seemed too daunting and I simply chickened out.

Fast forward to a few days ago, and I receive an email from Ding Yu, a reader of my blog whom I also know via X/Twitter. He’s a software developer based in Tokyo. And he’s had an idea: 

I’m considering making a modern web version of the beloved Delicious Library, but I’m not sure this is something worth pursuing. I’ve put up my thoughts here: Recreating Delicious Library in 2025?

Prompted for feedback, I told him that I think it’s a very good idea. Despite not being a Delicious Library user myself, I’ve always thought it was a necessary application that would fit the cataloguing needs of a lot of people. 
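Purely as an illustration of what a rebuilt catalogue app could lean on now that Amazon’s product feed is gone, here’s a minimal Python sketch that looks up a book by ISBN against Open Library’s public API. The choice of data source and the fields printed below are my assumptions, not part of Ding’s plan.

```python
# Minimal sketch: fetching basic book metadata by ISBN from Open Library,
# one possible non-Amazon source for a Delicious Library-style catalogue.
# The API choice and the fields printed are my assumptions.
import json
from urllib.request import urlopen

def lookup_isbn(isbn: str) -> dict:
    """Return the Open Library edition record for a given ISBN."""
    with urlopen(f"https://openlibrary.org/isbn/{isbn}.json") as response:
        return json.load(response)

if __name__ == "__main__":
    book = lookup_isbn("9780140328721")  # example ISBN (a Roald Dahl paperback)
    print(book.get("title"), "-", book.get("publish_date"))
```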

In writing my response to Ding, I also remembered my experience with Shelfari: I discovered it at the peak of its popularity, and I decided to give it a try. Maybe this time I would be more patient, so I started cataloguing my extensive book library. Things were going well enough, but since on the Internet we can’t have good things for too long, Shelfari was acquired by Amazon, shut down, and subsequently merged with Goodreads. Imagine my disappointment, after patiently uploading and curating data about almost 400 books… 

In recalling that disappointment, the terrible feeling of having the rug pulled from under your feet after all that work, I think that a lot of Delicious Library users must feel the same right now, especially early adopters who have been using the app for twenty years, growing their extremely curated libraries of physical media. So I wrote back to Ding telling him that for these people, an alternative Web app/service that could replicate most (or all) of Delicious Library’s functions would be a welcome solution.

I also told Ding I would spread the word about his idea, so here I am. What do you think about it? If you’re interested, please go read his blog post, get in touch with him, and share your feedback. If you look at Ding’s past projects (outlined in the post), you’ll realise that he’s perfectly capable of coming up with a good product. And I understand his feelings and uncertainty when he writes, I really want to make this happen—it feels like something I’ve been working toward for years. But I’ve also built so many things that no one wanted before, so I’m not sure if this idea is worth pursuing. — I’ve been there. I know. So the more feedback he receives about this, the better.

Do principles always have to lose when it comes to tech?

Handpicked

In Principles vs Pragmatism, Pete Moore writes:

It’s a mistake to judge others for their software choices, while still making exceptions for ourselves. Our hands aren’t exactly always clean. It would be akin to me blasting someone who is using HEY or Kagi, while disregarding my use of Apple products or occasionally ordering from Amazon. Case in point: I’ve seen discourse and uproar about Tim Cook donating $1M to Trump’s inauguration fund, while simultaneously ignoring others who are doing the same thing. No doubt it’s cringeworthy beyond words, and symptomatic of the larger, more pernicious issue of political lobbying and capitalist corruption. 

Calling Cook’s donation cringeworthy is an understatement. I made my tiny contribution to that discourse and uproar by posting this on Mastodon a few days ago:

For me, this is the straw that broke the camel’s back. 

No more money spent for an Apple product until there’s some clear sign of a change in stance and direction with Cook’s successor. 

And I’m going to stand by that. I’m not ignoring the fact that Sam Altman, Jeff Bezos, and Mark Zuckerberg are doing the same thing. I am actually not surprised about that. But I also largely don’t care about their businesses or products. I don’t use any product by Meta. I stopped being active on Instagram the day after Facebook (now Meta) acquired it in April 2012. I’ve never had a Facebook account. I only order something from Amazon if there is no real alternative option. And so forth.

And for me, the issue here isn’t what others besides Cook have done. The issue is that Cook didn’t act differently. He’s the CEO of the most valuable company in the world, a company that supposedly has thinking different in its DNA and culture. A company that certainly has all the resources to shoulder the possible consequences of acting differently here.

Moore continues:

Are those who are lashing out at Tim because of their principles going to abandon Apple entirely, or does their pragmatism prevent them from doing so? I believe it would be naive and unwise to assume a change in leadership at Apple—or any of these other guilty parties—will prevent this from happening again. Spoiler alert: it won’t. This is the gaping wound that’s been allowed to fester and rot in our political systems. 

We have to clarify what ‘abandoning Apple’ means. In my case, it doesn’t mean rage-quitting; that would be silly. Between current, older, and vintage models, I own about 40 Apple devices (Macs, iPhones, iPods, iPads, Newtons), purchased or acquired over the past 30 years or so. I’m not going to put them all in a crate and bring them to the recycler. Some of these devices hold sentimental value, and others were bought when Apple was overall a better company, innovation-wise and culture-wise. Plus, I study user interfaces. I want to keep having access to older versions of Mac OS/iOS/iPadOS to compare and contrast with the newer ones and analyse how they’re evolving (or not). I also need access to Apple devices for work reasons, though this requirement, over time, has become more relaxed.

Getting rid of all traces of Apple in my household as a reaction to Tim Cook donating $1M to Trump makes little sense and doesn’t really ‘boycott’ Apple in any meaningful way. However, as I said more succinctly in my Mastodon post, this is the straw that broke the camel’s back; I have been increasingly frustrated with Apple, their products, their design decisions, their software, their attitude towards third-party developers, their App Store policies, their attitude towards EU legislation’s requests, and in general with Cook’s direction over the years. This last gesture by Cook is something I find especially shameful considering the recipient of the donation and the ulterior motives behind it. So, my abandoning Apple is a process that’s starting now, by refusing to invest a single cent in an Apple product from now on, unless things were to dramatically change. And since, according to Moore, it would be naïve and unwise to assume they will, then okay, I will live with my principled decision.

I found Moore’s piece via Eric Schwarz, who comments:

Over the weekend, there have been a lot of words written about Apple CEO Tim Cook’s $1 million personal donation to the incoming administration’s inauguration fund and I think Moore nails a lot of aspects of how I’ve felt about it. It’s disgusting and we shouldn’t even be in a spot where it’s a necessity. However, while it is something that will potentially benefit Apple and its shareholders, it also benefits employees and customers. Apple may get a special carve-out from the threat of tariffs and not have to raise prices. 

Or Apple may get absolutely nothing out of it. That wouldn’t surprise me in the least. Everyone I’ve talked with about this disgusting donation these past days has pretty much reacted the same way: Steve Jobs wouldn’t have donated anything and would have stood by his decision.

I’ve had plenty of instances where I swore off businesses for one reason or another, but if you keep writing them off for every little thing, you’ll run out of options. Bad customer service? Sure. Disrespect of your time and patronage? Fine. Institutional values that don’t align with your own? Okay. […] 

Except this isn’t ‘a little thing’, at least for me. It’s the cherry on top of the shit cake. 

I’ve stood by my principles and choices even when doing so made my tech life a bit more difficult or increased friction in my workflows. I’m not completely inflexible, and I’m the kind of person who very rarely makes rash decisions. I tend to give people a second chance. In a technology context, I similarly tend to give companies, products, and services a second chance, except in cases of major screw-ups that ended up impacting me severely. But when I’ve really had enough of something, it’s unlikely that I’ll reconsider.

I think it’s increasingly important to have principles and to stand by them in a tech landscape that has never been as insidious as it is now. The only way to stand up to big tech companies is to refuse to be complacent, to refuse to play their games. We always tend to focus on how tech has made a lot of things better or easier, but we never really stop and consider the hefty price we have been paying in return. 

Moore ends his piece with this:

The choice between principles and pragmatism often means being content living within the grey area between them. 

More and more often, I feel that for many people ‘pragmatism’ essentially means ‘convenience’. Why bother taking a stand? — they rationalise — It’s not going to change anything, and I will have made my life harder for nothing. Tech companies, and every other entity that wields some kind of power over us, know this well. That’s why today only legislation seems to have enough power to make certain things change in the tech world. A lot of people like to talk about voting with their wallets, but then they rarely act on their words.

Before you think I’m acting all holier-than-thou on this: I’m not judging anybody, and if you’re fine with what Cook did — or if you aren’t, but giving up Apple feels like too much or is unfeasible for a dozen reasons — I totally understand. The people I do have a problem with, however, are those who keep whining about how Big Tech is increasingly shaping and controlling our lives, but effectively do nothing to oppose that trend, and always choose convenience while telling themselves they’re being pragmatic. By constantly choosing the path of least resistance, doesn’t this ‘pragmatism’ eventually morph into acquiescence?

If I publicise my stance and my decisions, it’s not out of a desire to virtue-signal, but to manifest my unrest and disagreement, hopefully in a meaningful way. I’m still on team Principles even if it’s clear we’re losing the battle, and I really wish more people joined our team. But in the end I do what I do because it matters to me, because it means something to me and my conscience.

Eric Schwarz:

If you’re mad at Tim Cook, that’s also fine, but what’s the alternative? Microsoft or Google products? Building your own computer and phone with open source tools and hoping it works as well? 

Well, it can be done. It takes patience and a healthy amount of tech-savviness, but it’s not impossible. There was a time when I thought, I don’t think I’ll ever touch Linux — it’s too ‘this’, and not enough ‘that’, and so forth. There is some friction, there is some workflow re-evaluation, some habits may change, but ultimately it’s a bit like those games that hit you hard in their first levels: as you familiarise yourself with their mechanics, you get more proficient.

But to respond to Eric more directly: Google or Microsoft aren’t necessarily better alternatives, but again, the crux of the matter here isn’t whether these and other big tech companies have donated money to Trump and how much. The matter is, quite simply, that Tim Cook did. Am I naïve and an idealist in thinking that he could have acted differently? Perhaps. That doesn’t change how I feel about it — disgusted and disappointed. 

Schwarz closes his commentary with this:

Politics are a disgusting game, probably even more now, but anyone who is tasked with running one of the largest companies has to unfortunately play a little. 

Should they, though? It’s one thing to have to comply with, say, international laws and with what foreign governments require (e.g. China, Russia, the EU, etc.). It’s quite another to participate in this nauseating show of bringing offerings to the loose cannon who is now again U.S. president, in the hope that he shall be benevolent in return; all this while forgetting he is, indeed, a loose cannon.

Time to move on from bootable backups, whether you like it or not

Handpicked

Or: Another instalment of the series The more we progress, the more we regress

Adam Engst, writing at TidBITS:

The latest installment in the story of how bootable Mac backups will eventually disappear started with a blog post by Shirt Pocket Software’s Dave Nanian. In it, he explained why SuperDuper could no longer make bootable duplicates on M‑series Macs running under macOS 15.2 Sequoia, blaming Apple’s asr (Apple Software Restore) utility. This tool is the only way to create a bootable backup. […]

First, I confirmed that the problem was real but limited to M‑series Macs. On my Intel-based 27-inch iMac, SuperDuper had no problem completing a backup, and I was easily able to boot my iMac from that backup. 

He then tried two other similar tools, ChronoSync and Carbon Copy Cloner, to no avail.

Regardless of whether asr caused these problems, such uncertainty is problematic when it comes to backups. I feel terrible for Shirt Pocket Software, Econ Technologies, and Bombich Software because they’re trying to provide a longstanding feature that users want—bootable backups—and they’re entirely at the mercy of Apple’s asr tool to do so. As we’ll see, Apple has relatively little interest in supporting bootable backups. 

This gradual move away from bootable backups is part of what I like to call Apple’s Mac OS lockdown procedure. It’s all disguised as providing users with hardened security for their Macs, while effectively limiting their choices when it comes to managing machines they purchased and own.
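For context, this is roughly the kind of call a cloning utility ultimately delegates to. A minimal sketch, wrapping asr’s documented restore invocation in Python just to have a concrete example; the volume names are made up, and on an M‑series Mac running Sequoia 15.2 this is precisely the step that reportedly fails.

```python
# Minimal sketch of the asr (Apple Software Restore) invocation that tools
# like SuperDuper rely on for bootable duplicates. Volume names are made up;
# --erase wipes the target, so treat this as an illustration, not a recipe.
import subprocess

def clone_volume(source: str, target: str) -> None:
    """Ask asr to replicate `source` onto `target` (destructive for the target)."""
    subprocess.run(
        [
            "sudo", "asr", "restore",
            "--source", source,   # e.g. "/Volumes/Macintosh HD"
            "--target", target,   # e.g. "/Volumes/Backup"
            "--erase",            # reformat the target before copying
            "--noprompt",         # skip the interactive confirmation
        ],
        check=True,
    )

if __name__ == "__main__":
    clone_volume("/Volumes/Macintosh HD", "/Volumes/Backup")
```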

I’ll be quoting a lot in this piece, so bear with me.

From Apple’s perspective, allowing system files to be copied inherently introduces opportunities for attackers to modify system components. Since macOS 10.15 Catalina, the separate system volume is immutable, locked, and validated using cryptography—what Apple calls the “signed system volume.” Any method that allows it to be copied onto a bootable drive must preserve the same verification to ensure nothing has changed. 

As I was reading this paragraph, I was thinking of all the typical regular users of Apple computers who use their Macs at home or in their home office or studio, and how real, how reasonably likely, the threat of a hacker penetrating their Macs and modifying their systems actually is. But sure, I’ll concede that this security measure — locking and encrypting the system volume — is sound. It’s particularly useful against a type of computer user who invariably annoyed me back in a previous life when I used to freelance as IT support: the user who tinkered a bit too much with their production machine (or with their only machine) without really knowing what they were doing, but attempting it anyway because “my tech-savvy friend told me I could try this to optimise this stuff”, or because “I read on the Internet that I could speed up downloading files from websites with this [shady] utility”, or because “someone said in a forum that with this Terminal command you could double your free memory”. And so forth. You’ve certainly met this kind of user more than once in your life. They’re their computer’s worst enemy. Protecting all the critical components of the operating system against this type of user is a good idea. Their Macs will never get as messed up as some Macs I had to laboriously un-mess back in the era of Classic Mac OS and older Mac OS X versions.
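(If you’re curious whether this protection is active on your own Mac, it can be queried from a normal boot. Here is a small sketch, again in Python only to keep the examples in one language; csrutil is Apple’s own tool, and the exact wording of its output is an assumption that may vary between macOS versions.)

```python
# Quick sketch: asking macOS for the signed system volume ("authenticated
# root") status via Apple's csrutil tool. Intended for macOS 11 and later;
# the exact output string is an assumption and may differ between versions.
import subprocess

def authenticated_root_status() -> str:
    """Return csrutil's report on the signed/sealed system volume."""
    result = subprocess.run(
        ["csrutil", "authenticated-root", "status"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(authenticated_root_status())  # typically "Authenticated Root status: enabled"
```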

Back to Engst:

To mitigate this move away from easily making bootable backups, Apple has invested a lot of effort into macOS Recovery and Migration Assistant. It is now trivial and streamlined to boot a Mac into macOS Recovery, install macOS, and restore user files using Migration Assistant. With a separate system volume, a reinstallation just creates a new, secured, immutable volume and then copies your user files to the data volume. Because Apple controls every part of that process, there’s no worry about the security of the system being compromised. 

Uh, no, it’s not that trivial. I only have anecdata, but several people in my circle of friends and acquaintances have told me their experience with Migration Assistant — especially with recent Macs — hasn’t been smooth at all, citing freezes and failure to transfer all the expected data. And it’s not as fast as having a bootable cloned disk at hand in case of catastrophic failures. Well, in case of a catastrophic failure, like your Mac’s internal SSD dying, you obviously can’t transfer anything. Unless you have some backup lying around, you’re done.

Oh, and there’s another fun thing that happens when your Mac’s internal SSD is toast: you can’t boot from an external drive. I had completely forgotten about this. Engst references this great 2021 article by Glenn Fleishman: An M1 Mac Can’t Boot from an External Drive if its Internal Drive is Dead.

But why would Apple do this? — asks Fleishman in that article, and his answer is, To increase security. And, maybe, to reduce its tech support costs. “Security, again,” I repeat out loud, rolling my eyes.

Look, I’m not arguing against security; I’m not downplaying possible security risks, especially in today’s world, which is certainly worse than the world of two decades ago; I’m not even arguing that this is all security theatre, because it’s not. I’m simply arguing that this degree of security-driven Mac OS lockdown is overkill and it’s certainly been implemented by Apple to make their lives easier, not the end users’. 

There are many interesting comments on Engst’s article. An example of users having more limited choices is provided by reader Michael Schmitt:

But still… let’s say you have an Apple Silicon MacBook Pro which came with Ventura (like mine), and is currently on Sonoma (like mine). Your internal SSD dies, so you take it to the Apple Store and get it replaced.

A week later you have your computer back, but it is on Sequoia. You want it to be on Sonoma. What to do?

The problem is that macOS Recovery doesn’t let you pick which macOS version it will install. On Intel Macs you have limited options: macOS computer came with, macOS it is currently on, or most recent macOS. None of those will work.

On Apple Silicon, as far as I can tell, you have no choice at all. So if it installs Sequoia, you’re stuck, because the macOS installer won’t let you downgrade. You can’t even use it to install a lower version of macOS on an external drive(*). 

(*) Another reader, down in the thread, notes that it is technically possible to perform such a downgrade, but it’s not exactly an intuitive, ‘Mac-like’ procedure.

Note that this can happen (and has happened to a friend of mine) even if you take the Mac to the Apple Store for other motherboard issues and not just because the internal SSD has died. In my friend’s case, his MacBook had developed power issues. I don’t know whether it was an intermittent failure at powering up, a failure in detecting a connected power cable (so the battery couldn’t be charged either), or both, but they performed a motherboard replacement and he found himself with a fresh installation of what was the latest Mac OS version at the time. And he was, I think, two versions behind because a couple of software applications he relied upon either weren’t working well or at all under the latest Mac OS.

Back when my iMac G3 broke down in 2001 (analogue board failure), the repair shop told me I could have my Mac back in 2–3 weeks, a downtime I simply could not afford. So they put the iMac’s hard drive in an external FireWire enclosure, and I was able to continue working by connecting the drive to my iBook G3 SE straight away. My downtime that day was about 2 hours (the time it took me to bring the iMac to the shop and return home). 

Reader ‘trilo’ writes the comment that resonates with me the most:

The past few posts from Michael and Doug explain the issue perfectly. It has made what used to be quick and easy, extremely hard or impossible.

Having a securely locked OS is a great concept but it clearly comes with significant consequences. Bricking a machine is unacceptable for people who need their machines to make a living and where time is critical. There are dozens of times over the past 10 years where booting from a clone kept our production running and deadlines met, and there’s now circumstances where this can’t happen.

For mine the biggest concern of Michael Schmitt’s scenario is the statement “A week later you have your computer back”. From past experience I’d be very surprised if it only took a week.

As for OS versions, some people simply prefer to run older versions of an OS whether it be for practical reasons or personal choice. Forced upgrades aren’t cool.

Finally, I realise no amount of complaining or explaining will change Apple’s mind — but it doesn’t mean it’s not a bone-headed decision done for Apple’s convenience rather than the users’. 

In a reply to ‘trilo’, reader Doug Miller says:

My last ten to fifteen years of computer use on Macs have been the most stable of my life — they are the most reliable they have ever been for me… Generally the only times my Macs restart are when OS upgrades get delivered (there are also restarts of course for the desktops when we have power outages). I’m reminded a bit of the Louis CK “everything is amazing and nobody is happy” sometimes.

I’ll also note that I once did Mac cloned backups and I always found issues — every time I booted the clone to check if it was ok, things were just a bit messed up. The boot took longer; performance was poorer. Dropbox required authentication (that’s just the one app/service that I remember having issues — there may have been more.) It generally worked, but it didn’t “just work”. 

I’ll say this: ever since SSD technology matured, it has increased stability and reliability exponentially, both in my newer and older Macs. It’s too bad that this stability on the hardware side is paired with a worsened experience on the operating system software side. For a UI enthusiast and long-time Mac user such as myself, watching Mac OS gradually become a shell of its former self — more locked down, more simplified and iOS-ified — is a painful spectacle. Have I had any problem with my M2 Pro Mac mini running Ventura since I purchased it in June 2023? No. Not an issue, and not a crippling bug either. That’s great, don’t get me wrong. But also: am I happy every time I interact with this Mac OS? No. Not as happy as when I switch to another of my Macs running older Mac OS versions like High Sierra, Mojave, El Capitan, Snow Leopard, Tiger. I use this Mac mainly for work. But it feels just like when I used a Windows PC for work. I tolerate it, I can work with it; but the fun is elsewhere.

Oh, and unlike Miller I never had an issue with bootable cloned drives in the past. There was one occasion when SuperDuper threw an error when the cloning process finished, so I asked Dave Nanian for clarifications, but in practice everything went smoothly and the cloning was successful. All the contents of my 2017 iMac 4K were copied to an external SSD, and I’ve been using that SSD as the main volume ever since (that iMac still came with a spinning hard drive, and I didn’t want to open the computer to replace the HD with the SSD, preferring to leave the hard drive inside and use it as a data backup volume).

‘trilo’ replies to Doug Miller, and there’s another bit in their reply where we strongly agree, namely the last paragraph:

My work is deadline driven publishing and Apple has removed the safety net we enjoyed. Maybe the Apple market is now just Instagram and tiktok viewers but some of us still do real work where we can’t afford hours, days or weeks without a functional machine.

I’d like the choice to do it. I’m happy to shoulder the risks — just don’t prevent me from doing it. Some users don’t want to be dictated to by the lowest common denominator. 

I’m sure it’s technically possible to offer the option of making bootable cloned volumes in an easy, user-comprehensible fashion while preserving a layer of underlying security, but I’m also sure it would mean more work for Apple behind the scenes. It’s more cost-effective for Apple to follow the principle that the fewer the moving parts, the fewer the chances of a machine breaking down, to the point that Macs have basically become black boxes.

Whatever your opinion on this whole matter, there’s an inescapable fact: recovering from a serious hardware failure or data loss used to be faster and simpler than it is now. Did it involve a lesser degree of security? Theoretically, yes. In practice, we accepted the security trade-off of being able to use a quicker, more ‘open’ procedure to get back on track, instead of having to jump through largely overkill security hoops that ultimately create a lot of friction and encumbrance for the end user. A user who’s simply dealing with data loss or hardware failure, with a realistically near-zero risk that ‘some attacker’ may target their machine or information.

As a coda to all this, there’s one last observation I’d like to make. In Engst’s own remarks, in Fleishman’s afore-linked article, and in the comments to Engst’s piece, it is repeatedly pointed out that the internal SSDs in today’s Macs are extremely reliable, making the actual need for bootable backups largely redundant. And while I don’t necessarily disagree with this, such reliability has led to a fascinating side effect: people don’t make backups of their data like they used to.

Every once in a while, I conduct private surveys and polls with a fair number of volunteers. Statistically, the sample isn’t very large (we’re talking 100–120 people), but it’s diverse enough to have a modicum of relevance for me. My volunteers have varying degrees of tech expertise (from none to a lot), span different age ranges, hold different jobs and incomes, and hail from different countries within and outside the EU.

A few months ago, I had the idea of writing a piece about how we’re doing backups today, so I sent out a few questions via email to my volunteers. I wanted to know which platform they were using, which backup solutions they had in place (if any), and whether their backup strategy had significantly changed in recent times. 

I received 106 replies, 75 of them from Mac users. Of these Mac users, only 11 still actively and routinely back up their data. Of the remaining 64, 21 told me they’ve never backed up anything. In the remaining group of 43 users, a few relied solely on Time Machine backups (without even verifying them), but the majority were simply using some cloud service (Dropbox, iCloud, OneDrive, Google Drive) to save selected critical data and nothing else. After a few follow-up enquiries, an interesting trend emerged: not a single person in my sample who was using an Apple Silicon Mac bothered with any particular backup solution, and a lot of them specifically told me they had stopped bothering with backups once Apple stopped including spinning hard drives in its computers, and especially after the transition to Apple Silicon. They told me the reliable hardware makes them feel secure enough to skip backups altogether. Some of them keep a few important documents in iCloud, but they haven’t even bothered purchasing more iCloud storage for that.
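If you prefer the breakdown at a glance, here’s a quick tally written as a small Python sketch. It merely restates the figures above; the variable names are mine, purely for illustration, not from any actual dataset:

```python
# A quick tally of the survey figures quoted above; purely illustrative.
# The numbers come from the text, the variable names are my own.
total_replies = 106
mac_users     = 75
windows_users = total_replies - mac_users  # 31

mac_backing_up_routinely = 11
mac_never_backed_up      = 21
mac_partial_backups      = mac_users - mac_backing_up_routinely - mac_never_backed_up  # 43
# The 43 'partial' users relied on unverified Time Machine backups, or used a
# cloud service (Dropbox, iCloud, OneDrive, Google Drive) for selected files only.

without_solid_routine = mac_never_backed_up + mac_partial_backups
print(f"Mac users without a solid backup routine: {without_solid_routine} of {mac_users}")  # 64 of 75
```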

A couple of responses were fascinating, along the lines of, “My Mac feels like an iPad now, and I certainly don’t spend time backing up my iPad. If something happens, I just do a restore”. I don’t know what kind of ‘restore’ they’re thinking of, but I perfectly understood the overall attitude.

(By the way, of the 31 Windows users who submitted their replies, the vast majority used OneDrive as their main backup solution, while 5 told me they relied on local NAS setups to preserve their data. Even among them, having SSDs inside their main computers translated into a general sense of increased reliability and security.)

I ended up not writing that article about backup strategies, but the information I collected with my survey got me thinking. Now, maybe these results don’t align with your personal experience, but I’m curious to know whether you, too, have relaxed or entirely neglected your backup practices since switching to SSD-powered machines, and specifically to Apple Silicon Macs.

All this feels like a double-edged sword to me. On the one hand, having faster and more reliable storage technologies is very welcome, as catastrophic data losses become less frequent and less likely. On the other hand, people getting progressively careless about backup strategies, to the point of ditching them entirely, is a bit worrying. Sure, disasters are less likely to strike, but when they do, they hit harder than before. SSDs are not infallible, nor are they everlasting. Also, in my experience, SSD failures can happen without warning and be immediately, entirely devastating, whereas hard drive failures can be gradual and not utterly destructive straight away. A hard drive can start failing but still remain operational long enough to let you make an emergency backup if you’re caught unawares (as happened to me in 2006 with my 12-inch PowerBook G4’s drive: I was able to copy everything to a second drive, with only 0.3% data corruption, before the drive failed completely). An SSD just fails, and there’s basically nothing you can do about it.

So, while SSD failures are still far less common than hard drive failures, I’d still say that this almost unconditional reliance on them creates a false sense of security. And no, of course I’m not saying it’s Apple’s fault — I’ve been criticising the company more and more often, but I’m not a moron. Yet it’s somewhat ironic to see a more secure, locked-down Mac OS, and users feeling so much safer that they’re willing to forgo backup solutions almost entirely. Thank goodness I’m not doing IT support anymore.

As for software and security, thankfully it’s still possible to run any application you want on Mac OS, but it’s increasingly clear to me that Apple would prefer an iOS/iPadOS scenario, where the only apps you can install and execute come from the App Store, and only from the App Store. For now, we simply have to deal with additional mouse clicks and with granting permissions to apps that aren’t from the App Store or from identified developers. But I routinely find myself wondering how long this software freedom will last before Apple initiates another lockdown.

I assume it’s because, at the moment, Apple still fears the inevitable backlash from users (and especially power users), but I’m starting to wonder how much of a backlash it would really be after a few more Mac OS cycles. Judging by regular users’ utter lack of interest in UI-related matters (something I’ve noticed every time I’ve raised issues regarding Mac OS’s worsened user interface and first-party apps), and by the fact that an increasing number of Apple users are utterly unfazed by atrocious design choices like putting notches on iPhone and MacBook displays, or by Apple’s almost complete disregard of its own Human Interface Guidelines in its own operating system, I’m afraid that when Apple decides to play the ‘App Store only’ card for Mac OS apps, most users will just accept it with a shrug and move on. If something like this eventually materialises, my hope is that the European Commission will regulate against such a practice and save Mac OS from its dumbed-down, locked-down fate.