Tim Cook steps down as CEO of Apple. Several observations on his tenure from a long-term Apple customer and observer.

Tech Life

Among the working titles for this post were Good riddance and Stop the praises, and I didn’t choose them, not because I thought they were somehow mean-spirited, but because they sounded like they came from a place of deep care. They sounded like the reaction of someone with deep emotional investment in the whole thing. But over the past few years I — a long-time enthusiastic Apple user and customer — have become desensitised towards most of what Apple does and what Apple has become. And I have to thank Tim Cook for that.

I was told on Mastodon that I shouldn’t judge Tim Cook only by the standard that was Apple leadership before him, because that would be missing the point. That “dismissing an era that delivered custom silicon, record financial performance, and a strengthened privacy among many other things feels less like critique and more like nostalgia frustration.” This was because my initial reaction after learning that Cook would step down as CEO of Apple was indeed along the lines of Fuck off and Good riddance, and indeed came from a place of bottled frustration. From my corner of the Internet, I climbed on top of my metaphorical tower and yelled at the clouds like the Simpsonian old man. It was great. I needed to get that fuck off out of my system.

But now is the time to have a more articulate conversation. 

Now, most people will look at Cook’s tenure as CEO of Apple and talk about the incredible financial success it brought to the company. Cook took an already healthy Apple in 2011 and made it thrive exponentially over the following fifteen years. This is about as easy to dismiss as an elephant in the room. I never doubted Cook would be a good ‘maintainer’. It’s like letting a much-decorated admiral take control of your ship — of course they’ll do a good job. Before taking the role of CEO, Cook was the company’s COO, and in these past fifteen years he has always been a COO in a CEO’s clothes. And I guess Steve Jobs knew that when he appointed Cook as his successor. Jobs didn’t want the Apple ship to sink, and Cook was the man to ensure something like that wouldn’t happen. As for vision and direction, well, that’s a whole other matter.

I’m clearly speculating here as an external observer, so take this with the usual grain of salt, but I suspect Jobs thought that the other men in executive positions at the time of his passing would help Cook when it came to envisaging products and avenues of technological research, and when it came to giving Apple a direction and personality. Or rather, Jobs perhaps thought that Cook would ask his other lieutenants for help with these things. What happened, instead, is that Cook essentially did away with the figures who had been closest to Jobs (in methods and mindset), in favour of people he felt were more collaborative, more ‘team players’ than the annoyingly charismatic and (I suppose) confrontational ones he had to argue with about pretty much everything. Too bad a lot of those people were also pretty competent at their jobs.

Not knowing what to do with Jonathan Ive

Whether some like it or not, Apple historically owes a lot of its success to the design of its products, and therefore to Jonathan Ive. His role under Jobs’s tenure was clear: Senior Vice President of Industrial Design. Designing hardware. Jobs also described Ive as his ‘spiritual partner at Apple’ because he felt they both consistently operated on the same wavelength. Jobs told his biographer Walter Isaacson:

If I had a spiritual partner at Apple, it’s Jony. Jony and I think up most of the products together and then pull others in and say, ‘Hey, what do you think about this?’ He gets the picture as well as the most infinitesimal details about each product. And he understands that Apple is a product company. He’s not just a designer. That’s why he works directly for me. He has more operational power than anyone else at Apple except me. 

Under Cook, after the ousting of Scott Forstall, Ive was also tasked with providing “leadership and direction for Human Interface (HI) across the company in addition to his role as the leader of Industrial Design” (Source: Apple press release, 29 October 2012). This turned out to be an unfortunate decision, not because I think Ive was bad at this new role, but because his hardware design aesthetic didn’t translate as well or as efficaciously to software and operating systems. This resulted in years of flat design in iOS and Mac OS, and in an increased focus on how a user interface looks versus how it should work.

In 2015 Ive was promoted to Chief Design Officer, which, again as an external observer, felt like the kind of title you give to someone when you simultaneously want to recognise their expertise and dump everything related to their field squarely on their shoulders. Ive was ‘the design guy’, so let him ‘design stuff’ — it could be hardware, software, services, product packaging, architectural design for new company headquarters… You know, an Everything-Design Bucket, with Ive responsible for this gigantic, undifferentiated bucket. Then in 2017 Ive was told to leave the big bucket alone and go back to managing just the product design team. Then in 2019 Ive was gone for good.

I don’t want to exclusively defend Ive here — I’m not a fan of certain design decisions and directions he took, under Jobs but especially under Cook — but I don’t feel he was properly managed and utilised as the valuable asset he had been before Cook became CEO. After Jobs’s passing, Ive lost an important sounding board, someone he could really brainstorm with during the creation process, someone who had the ability to look at things from the end user’s perspective and who could make intelligent suggestions regarding the ‘design is how it works’ part of the equation. I absolutely don’t have a clue about the atmosphere at Apple Park with Cook as CEO, but my feeling is that that kind of constructive brainstorming was replaced by stuff like, “Show me the designs for the new MacBook Pro by next Monday. Okay, have a nice weekend, see you.” Probably the only sounding board left was the rest of the Design Team, but they were basically all Ive’s subordinates. You can still have a productive discussion, but it’s just not going to be the same.

The power of iteration

Again, not to downplay Apple’s financial success under Cook, but let’s be real for a moment: do you honestly think people would have stopped buying Apple products after Steve Jobs’s untimely death? All major product lines were enjoying an enviable momentum at the end of 2011: the iPhone 4S was a great successor and upgrade to the iPhone 4. Mac laptops were in a good place, especially after a great 2010: MacBook Pros in three sizes (13‑, 15‑, and 17-inch); a redesigned and much improved MacBook Air; a healthy desktop offering with iMacs, Mac minis, a Mac Pro that was still going strong, and a newly-introduced 27-inch Thunderbolt Display that would stay in production until 2016. And last but not least, the introduction of the iPad. On the software side, iOS was at version 5, showing lots of improvements and a mature visual design that worked well even on that ‘big iPhone’ that was the iPad. iCloud had just been announced, and promised to be a much more reliable solution than MobileMe. Mac OS X was at its peak with Snow Leopard, and while Lion didn’t really feel like a worthy successor, it was still in a good place compared with what was to come in the following years.

With all this food on the plate, do you really think people would stop coming to the restaurant? Of course they wouldn’t. When you can’t innovate, you iterate. And that pretty much sums up Apple’s hardware design under Cook’s tenure. Sure, there have been new products and product lines (Apple Watch, Apple Vision Pro, AirPods), and sure, design consistency isn’t necessarily a bad thing, especially if we follow the don’t fix what is not broken adage; but show a MacBook Pro or a regular Apple Watch to a layperson, and at first glance they won’t be able to tell whether it’s a recent model or not. The first time I brought the 17-inch 2010 MacBook Pro I acquired two years ago to a library, I was approached by a university student asking me if it was a new, bigger laptop from Apple. This happened in early 2025. When I told him it was a 15-year-old machine, he was genuinely astounded.

If you know by heart the recipe for a good product, a product that was already successful, you’ll keep making that product. Every now and then you’ll adjust the recipe slightly, so that you don’t deviate from the product too much while still managing to keep it feeling fresh, and voilà, the power of iteration will keep you afloat.

It’s worth noting that, whenever Apple deviated from the tried-and-true formula, the results were questionable at best. Examples include, in no particular order:

  • Putting a notch on the iPhone. This was done as a design compromise: you want to provide more screen real estate, but the technology to put a front-facing camera beneath the display panel is still unsatisfactory, plus you need space for additional modules since you’re also debuting a new authentication method based on face recognition. The notch on the iPhone was a necessary evil. I despised it from day one, but I understood the compromise. I kept buying iPhones with a Home button and TouchID because I still believe to this day it’s less intrusive and less awkward from a UX standpoint than FaceID (especially when you pay with your phone). I also think that iOS had better, easier-to-memorise gestures on iPhones with a physical Home button than on notched, FaceID-based iPhones.
  • The Touch Bar. A potentially interesting idea, poorly developed, badly executed, and abandoned much later than it should have been.
  • Thinness for thinness’ sake. The 2015 12-inch retina MacBook was not a worthy successor to the MacBook Air. The only improvement was the display. It failed in every other respect. Yes, it was even thinner and lighter, but the difference was not that significant, and that thinness came at a cost: underpowered CPUs, a severe lack of ports, and…
  • The butterfly keyboard. Another design blunder that Apple was too proud to promptly back-pedal from. This fiasco dragged on for far too many years and cost actual money to a lot of people whose faulty keyboards failed multiple times.
  • Putting a notch on MacBooks. This is just inexcusable, stupid design, and a less justifiable decision than putting a notch on iPhones. Here, space is not at a premium. Putting a centimetre-high black spot in the top centre of a display because you don’t want a centimetre-thicker bezel is a very questionable trade-off. Other manufacturers have either opted for slightly thicker bezels on their laptops, or gone in the opposite direction of a notched display by placing the webcam on a sort of ridge while keeping the display bezels thin. Said ridge is also useful as a place where you can put your finger to lift the laptop lid open.
  • Radical design departures that have ultimately been detrimental to an entire product line. It happened with the ‘trash can’ Mac Pro in 2013. It happened with the 24-inch M1 iMac in 2021. We can argue the fine points, but my general takeaway in both cases is that Apple profoundly misunderstood the needs of the users these machines are supposed to cater to. Professionals who favoured the Mac Pro liked its internal expandability and didn’t really care about the sheer size of the machine, which was always purchased for its versatility and not for its looks. iMac customers appreciated having a thin-enough all-in-one machine with a good display and a good array of ports. The redesigned iMac brought back colours but took away other things. Apple needlessly doubled down on thinness, which led to an awkward external power supply featuring an awkwardly-positioned Ethernet port. That thinness also led to internal design flaws, such as a display cable that can’t withstand the internal heat and breaks down over time.

All in all, the most impressive display of the power of iteration is that Apple, under Cook, has managed to iterate on its very success. It has been the industry equivalent of autophagy. The company has eaten from its own already consolidated brand and reputation and benefitted from it. That is significant, especially if you like to limit your perspective to the financial and growth aspects of this success.

As I wrote in my 2024 piece Jobs’s ‘quirky Apple’ (something worth re-reading, if it’s not too much to ask):

The ‘utterly consistent’ excellence of Cook’s Apple is achieved through masterful levels of iteration. We’re seeing, for the most part, the same computers, devices, peripherals we’ve been seeing since they were introduced under Jobs, but continually refined and perfected. The brand and related recognition must be maintained. And before you jump at me and tell me that iteration in tech isn’t necessarily a bad thing, I’ll tell you that you’re right, it’s not. But when it patently goes on for this long and for every product line, I’m starting to question Apple’s ability to come up with something truly original and groundbreaking (and sorry, but the goggles are not that — they are stereoscopic iPads with iPadOS floating in 3D). 

Thinking indifferently

But as a long-time Apple user, and someone deeply interested in technological research and advancement, I didn’t want Apple to become ‘the top dog’ after years of being the underdog. I wanted Apple to stay true to its culture. Not necessarily by remaining an underdog in the tech industry, but by remaining the alternative choice, the different choice, the entity that does not align with the rest of the industry but stays in its unique sphere, in its out-of-the-box approach. Apple under Cook went in the opposite direction, and now — as a big tech company — it’s not really different from Google or Microsoft. As a product company, it’s just another Sony, the Sony of its peak years.

One thing I appreciated about the Jobs era was that for Jobs the products — and therefore the customers — came first. The logic was simple and effective: if we build great products for our customers (and we do, because we care to understand their needs), then people will come to us, they’ll buy our products, they’ll be satisfied with them, and they’ll become returning customers. And money will be the natural outcome.

It’s clear that under Cook money and revenue and the bloody ROI have always been the priority, and the rest has been the process of putting in motion various plans to achieve that goal. Under Cook, every Apple product seems the result of some corporate strategy rather than the result of a thoughtful investigation into customers’ needs and ways to actually improve their lives and work. These computers and devices somehow feel more generic and impersonal, produced so as to appeal to as many customer segments as possible, with an approach that reminds me more of a car manufacturer or a fast-fashion company than a tech company making supposedly personal computing devices.

Among other things, the failure to keep the Mac Pro relevant is a testament to this progressive detachment Apple has shown towards its customers, a detachment that comes from putting Apple’s own needs before everything else. It’s also a clear consequence of another issue that only got more and more evident under Cook’s tenure: spreading Apple’s resources too thin.

One sin of Cook’s Apple I’ll never forgive is wanting to be everywhere and to keep adding platforms and services. Apple went from being excellent at a select few things to being mediocre at many things. When Apple was excellent at a select few things, they had razor-sharp focus and were very receptive to the needs of their customers. The pre-2013 Mac Pros were great because Apple knew and cared about what professionals wanted from them. But Apple today just wants to sell as much stuff to as many people as possible. There is almost no ‘tailoring’ left in their products. There is a very much consumer-first mentality (consumer as opposed to pro), so their offerings are more ‘general purpose’ than they used to be. And it’s all made with one priority: it has to be as low-maintenance as possible (from Apple’s viewpoint). So we have increasingly closed, un-expandable Macs. Here are some more ports if you need to connect something.

In this scenario, a machine like the good old expandable Mac Pro is viewed as a high-maintenance one. If you make one (or more) slot for custom graphics cards, you have to work with graphics card makers to provide always up-to-date support for past, present, and future cards. And so forth. And today’s Apple doesn’t care. They still have amazing hardware prowess, but they don’t care. It’s too much work. These ‘professionals’ are a niche (hint: they’re really not), so it’s not worth it. The 2023 Apple Silicon Mac Pro is standing evidence that today’s Apple does not understand an important segment of its audience. It’s a machine that looks like the result of someone at Apple asking ChatGPT how to make a new Mac Pro. “But look, it’s still expandable!”, they say, pointing at proprietary slots, while the motherboard sports an SoC with integrated CPU, GPU, and memory, plus soldered-on storage. The tagline for that Mac Pro should have been “This is really it. You can’t make this shit up”.

Not knowing what to do with the iPad

I’ve written so many things about the iPad’s identity crisis over the years that it has become a topic I loathe revisiting. I’ll be as brief as possible.

We don’t know how Steve Jobs wanted the iPad to grow. He passed away too soon for that. But this is how he introduced the iPad in 2010:

…And so all of us use laptops and smartphones now. Everybody uses a laptop and/or a smartphone. And a question has arisen lately: is there room for a third category of device in the middle? Something that’s between a laptop and a smartphone? And of course we pondered this question for years as well. The bar is pretty high. In order to really create a new category of devices, those devices are going to have to be far better at doing some key tasks. They’re gonna have to be far better at doing some really important things: better than the laptop, better than the smartphone. 

What kind of tasks? Well, things like browsing the Web. That’s a pretty tall order; something that’s better at browsing the Web than a laptop? Okay. Doing email. Enjoying and sharing photographs. Watching videos. Enjoying your music collection. Playing games. Reading eBooks. If there’s going to be a third category of device, it’s going to have to be better at these kinds of tasks than a laptop or a smartphone, otherwise it has no reason for being. 

Now, some people have thought that that’s a netbook. The problem is netbooks aren’t better at anything. They’re slow, they have low-quality displays, and they run clunky old PC software. So they’re not better than a laptop at anything, they’re just cheaper; they’re just cheap laptops. And we don’t think they’re a third category device. But we think we’ve got something that is. 

Quoting from my 2019 piece My kind of tablet:

…what I want to emphasise in this quote is this part: In order to really create a new category of devices, those devices are going to have to be far better at doing some key tasks. They’re gonna have to be far better at doing some really important things: better than the laptop, better than the smartphone.

Far better at doing some key tasks. Better than the laptop (but let’s just say better than a Mac or any other traditional computer), and better than the smartphone. Think about that.

For the first few iterations of its existence, the iPad and iOS delivered on that mission. In 2010 I had a brand-new MacBook Pro and was still making the most of my iPhone 3G, but I couldn’t wait to get an iPad. I wanted to use it especially for reading, so I waited very patiently for an iPad with a retina display. And in 2012, with iOS 5, the iPad was still a great device for doing everything it was designed for. A fast device with an intuitive operating system and an extremely low learning curve. Some apps for more creative tasks had appeared, and with the addition of a Wacom stylus I had fun drawing and painting some stuff.

Then some people got very excited about the iPad, and another question arose: why can’t we use the iPad for all kinds of tasks?

That’s when things started to go awry, in my opinion. 

Simplifying a lot, the iPad could have taken three different paths:

  1. Continue being a device mostly designed for consumption: very practical and portable for everyday tasks, for checking information, enjoying audio & video content, doing light work on the go; and, with the right app, being a good-enough device for digital art.
  2. Become a true tablet in the way you interact with it: an interface tailored for stylus-based input and activities, backed by an operating system that could take the best of iOS and NewtonOS and fuse them together. Essentially turning the iPad into a creation-first device.
  3. Become Apple’s bold response to Microsoft’s Surface. A device that could be considered more like an ultraportable laptop with mixed input (traditional + pen and touch), but running an operating system that could guarantee desktop-class applications in a compact and portable format. iOS would have been too simple for such a task. A version of Mac OS adapted to such a device would have been ideal.

And Apple never really made a definitive decision about the iPad, so they kept changing course and approach. They kept throwing stuff at it, at this iPad that kept becoming a jack of an increasing number of trades while remaining, comparatively, a master of very few of them. They built an ever-higher tower of ‘stuff the iPad can potentially do’ on the inadequate foundation that iOS/iPadOS was and is. They thought the problem was solvable by throwing faster and faster CPUs at it, while the actual work should have been done on the operating system front. There are still things a Newton MessagePad 2100 with a 162MHz ARM processor can do better than an M5 iPad Pro, because NewtonOS is a better-designed OS for the device it runs on than iPadOS is for the iPad.

They also thought that remaining vague enough about the iPad’s core purpose was a good strategy, perhaps to buy time or to avoid taking a defining direction for the iPad that couldn’t be easily reversed. Apple’s way of remaining vague was perfectly epitomised by the phrase, We can’t wait to see what you’ll do with it. Like, here’s this obscenely powerful slab of glass and aluminium, do with it whatever you wish. Wow.

The thoughtless neutering of Mac OS

Ah, Mac OS. An operating system that kept getting better and better until the moment it was no longer the only (or main) operating system being developed at Apple. We love the Mac, Cook and other executives have been repeating for the past 15 years, while doing very little to actually prove they meant it. Mac OS has lost a lot of functionality over the years through the removal of features, services, and extensions (someone on the TidBITS forum even tried to compile a full list of what has been lost); through the removal or obfuscation of user-interface affordances; and through a general reduction in user-interface cleverness, clarity, intentionality, and polish.

The moment it was decided that Mac OS and iOS had to converge was sort of the beginning of the end for Mac OS. This convergence is demonstrably unnecessary from a user’s standpoint. Even new users of Apple platforms had no real issues getting accustomed to Mac OS and iOS when the two operating systems were visually and functionally more distinct. (I know firsthand, because at the time I was still doing a lot of freelance Mac consulting and tech support.) From a user interface and user interaction standpoint, it makes more sense to have distinct operating systems, each designed to make the most of the device it runs on — even visually, because the way you do your computing on a Mac desktop or laptop, with big displays and mouse and keyboard, is spatially different from how you do things on an iPad or iPhone.

First, Apple tried to make iOS and iPadOS more complex, because the iPad needed to be a more sophisticated device than an iPhone. But apparently there’s a ceiling beyond which complicating iPadOS makes the whole system unintuitive, with increasing discoverability issues, plus the insurmountable obstacle that is a touch interface, which can only do so much.

So, when complicating iPadOS didn’t go very far, the natural next step for Apple was to simplify Mac OS. Why? Because ‘convergence’; because they’ve been homogenising the hardware architecture for years, and mirroring that by homogenising the software as well is the easiest thing to do. Less stuff to maintain under the bonnet, while giving the illusion of things moving forward by working at surface level. From a management perspective it’s an efficient strategy, a good plan. But you should hire competent people to help you accomplish it. Or, if you did indeed hire competent people, you should listen to what they have to say.

What I can see from here, as a tech observer, end user, and UI and UX enthusiast, is that you could make Mac OS more ‘friendly’ for someone coming from iOS without butchering Mac OS with the hammer of UI regression. People aren’t as tech-illiterate as they were when the first Macintosh came out. Tech literacy has dramatically improved in the years following the introduction of the first iPhone. I still remember with wonder and surprise how quickly regular people got accustomed to interacting with the iPhone’s OS back in 2007, and later with the first iPad in 2010. When you have two separate but well-designed operating systems, people quickly pick up both on their respective devices. They know that things may work slightly differently on a traditional computer with mouse, keyboard, menus, sophisticated window management, keyboard shortcuts, etc., than on a multi-touch device with an operating system designed for touch- and gesture-based input. People know, people learn. Especially young people. These are not the things that confuse them.

Instead, the Mac OS Simplification Initiative is apparently in the hands of UI designers who don’t seem to know much about the evolution of the Mac OS UI. They seemingly poke around Mac OS’s interface, pruning and grafting haphazardly so that the resulting FrankenMacOS can look and feel as iOS‑y as possible. But it’s like dismantling an automaton and rebuilding it while leaving out some springs and screws and levers and buttons; when you interact with the rebuilt mechanical puppet, you keep finding glitches: you turn the wind-up key, and instead of the puppet moving its right arm, an eye pops out of its socket. You try to move its head, but it remains stiff. You expect it to walk, but only one leg moves correctly. You get the idea. You end up with a crippled toy. You end up with an operating system that is a shell of its former self.

Once again, I don’t know the truth. It could be the fault of a design team that isn’t competent enough for the task at hand. It could be that the executives don’t care that much, or don’t have time to care, because Apple has a bad case of FOMO (Fear Of Missing Out) and must be in every tech market imaginable to stay relevant (hint: not true). Or there is no time in general, because Apple executives decided that every OS must follow a strict yearly upgrade cycle, and this is what we get in return. The reasons could be many. I, in my observer’s innocence, have repeatedly suggested that maybe this yearly upgrade cycle is a bad idea, and that maybe leaving alone certain mature and established areas of Mac OS’s interface and underpinnings was a better solution than this thoughtless trimming, which keeps going on because — much like the first time you decide to cut your own hair, and you keep trimming the sides because they never seem ‘right’ — the new UI adjustments and rearrangements never seem to bring things to a stable and balanced state.

TL;DR — Fucking up Mac OS was not necessary and was completely avoidable, but it has become collateral damage under a direction that has always clearly favoured the iOS platform. Everything that has been done to Mac OS under Tim Cook has been done in the service of iOS. iOS is ‘the future’ for these people, who fail to recognise that a crippled, iOS-ified Mac OS is a terrible operating system for very powerful computers that are supposed to carry out much more complex, fine-grained tasks than mobile devices.

Not knowing what to do with developers

How Apple has treated developers over the past decade has generally been awful. Third-party developers are fundamentally responsible for improving the whole Apple ecosystem, and yet they’ve been increasingly treated as an annoyance, or taken for granted as if they were mere suppliers that Apple — the Big Tech behemoth it now is — can just bully around. The App Store review process remains, after all these years, an inscrutable mechanism in the hands of capricious entities, entities that may decide the fate of even long-standing applications overnight. Developers aren’t considered valued collaborators, but rather resources to take advantage of. The fact that a trillion-dollar company still takes the same 30% cut of a developer’s revenue it took back in 2008 is ridiculous and insulting. Yes, I know this can become 15% under the App Store Small Business Program. It’s still too much for small businesses (as some of them have told me in private correspondence).

I really liked Jeff Johnson’s piece Small ways the App Store could be improved for developers, which makes so many good points on the matter.

I won’t say more on this. The subject is as old as the App Store itself. Much has already been said, and very few (if any) steps or improvements have been made on Apple’s part.

Macs are simultaneously better and worse than before

Macs have certainly come a long way with the transition from Intel to Apple Silicon that started taking place six years ago. I waited until I felt the platform was mature, though I was very impressed by the performance of the first M1 chip from the beginning. I purchased my M2 Pro Mac mini in mid-2023, and from a hardware standpoint I really have no complaints. It works as well and as responsively today as it did on day one. Performance-wise, current Macs are the most powerful and power-efficient Macs in the entire history of the Macintosh, no doubt about that.

But the way Apple Silicon SoCs are engineered also means that these Macs are the most closed-down, un-upgradeable Macs we’ve ever had. CPU, GPU, and RAM are all fused together in the same package, and storage is soldered to the logic board. Upgrading RAM and storage down the road used to be a great way to extend the lifespan of an entry-level machine. Alternatively, it was a way to have a Mac adapt to your needs as time passed. Maybe you were fine with a machine that had 4GB of RAM and a 128GB SSD initially. Then, as you needed more storage, or as new software updates demanded more RAM, you could swap the old RAM modules and have 8 or even 16GB of RAM. If I remember correctly, there were even Macs with a socketed Intel CPU, so in theory you could upgrade that too with a more powerful but still compatible chip.

This kind of flexibility also helped when your hard drive or SSD failed. You could just swap it for a new one, restore from a backup, and be back on track in a matter of hours. I hear that the storage chips in Apple Silicon Macs are very reliable. Still, in the event of a hardware failure you practically lose everything if you don’t have backups, and you have to bring your Mac to an Apple Store, as there’s nothing you can do yourself. With a fixed amount of RAM and storage, you also have to pay a premium at the time of purchase if you want to better future-proof your Mac. Something not everybody is ready to do, budget-wise.

The general level of repairability of these Macs is poor, something that clearly suits Apple, as it can exert more control over spare parts and over who can repair what. And of course, since you can’t upgrade a Mac anymore, Apple makes more money: either you pay more upfront to future-proof your machine, or you replace it sooner than you used to once you’ve outgrown its specs. You’re also more likely to pay for AppleCare, because who knows what may happen with these black boxes.

Again, yes, Apple Silicon is good technology and genuine hardware innovation. But it also looks like a very convenient way to squeeze money out of customers.

Lastly, in a piece I wrote at the end of 2020 about the then-new M1 Macs, I said:

They’re unbelievably good machines, and everything that is genuinely good about them and future Apple Silicon-based Macs — sheer performance, astounding power-efficiency, and great backward compatibility with Intel software thanks to Rosetta 2 — will also allow Apple to get away with a lot of things with regard to platform control, design decisions, software quality, and so forth. Who cares that a pill tastes bitter, if it makes you feel good, right? 

Which is exactly what has been happening over and over again. Power users lament the ever-worsening quality of Apple’s software, while Apple and its fanboys divert everyone’s attention by extolling the sheer performance of Macs (and iPads, and iPhones). It has just happened again with the new MacBook Neo: lots of people and pundits marvelling at its performance and specs-to-price ratio, everyone seemingly forgetting it comes with the disgraceful Liquid-Glass-infused Mac OS 26 Tahoe preinstalled.

I have to conclude, but really, I could go on.

As someone who first got his hands on an Apple computer in 1982, and who finally became a regular Mac user in 1989, my huge disappointment in Apple under Tim Cook doesn’t really come from a place of nostalgia. It comes from having watched a company that has had an immense impact on my life progressively deviate from directions and practices I supported and identified with. It comes from a place of unplanned detachment, like when you have to end a relationship with someone because their values have shifted, the things they believe in have changed, and you can’t see eye to eye anymore.

In the pursuit of that record financial performance, in putting financial engineering over software engineering, in transforming Apple from a company that did things in a different, sometimes even special, way into yet another money-making big-tech behemoth, Tim Cook has lost at least one customer, me, and probably many more, judging from the numerous emails I still get from readers of my blog.

At the memorial for Steve Jobs, Cook reportedly said, “Among the last advice he had for me, and for all of you, was to never ask what he would do. ‘Just do what’s right’ [Jobs said].”

Nevertheless, when Cook began his tenure as CEO, I hoped, expected, wanted an Apple that could treasure all the best lessons from Steve Jobs and build on them. What I got is a very different Apple. In some ways, an unrecognisable Apple. An Apple that valued its past only as a treasure trove of reputation to ransack and cannibalise in order to keep going, and as a shield to get through its various blunders (what do you mean, “which ones”? Have you been reading this piece at all?) and keep thriving in spite of them.

Was all this right, Tim? Did you do the right thing? You’ll probably think so. Many others will think so. Everyone who thinks in terms of, If the money keeps coming, we must be on the right track, will think so. I do not. I just do not.


Will things change significantly under the new CEO, John Ternus? I like the guy. I also like Stephen Lemay, who has taken over the role of Alan Dye, the former VP of Human Interface. They’re competent figures, and I certainly hope their competence will shine through in this new chapter. I’m not feeling particularly invested in Apple at present, just vaguely curious to see where things go from here. But frankly, I don’t expect significant deviations from the status quo.
