The excessive reliance on technological crutches

Tech Life

The items I carry (1996 version)

Pen, paper, mind maps

In this picture you can see a rather faithful reconstruction of the items I used to carry every day in my backpack during my university years, around 1996. I was quite fond of ink and ink-related writing tools. In that ring binder I used to take notes on white sheets of paper (I hated ruled or squared paper), and draw during the most boring classes. The same binder also held an agenda/planner section I created, tailored to my needs. The calendar and organiser features provided by the Sharp IQ-7300 M Multilingual databank were nice, but inserting and updating items like appointments and reminders was a bit awkward and impractical for my tastes and habits (the fact that the device had an ABCDEF keyboard layout didn’t help in the least).

Pen and paper have always had an important, deep-seated presence in my life. The earliest childhood memories I can recall involve writing and drawing tools: pens, pencils, crayons, papers and notebooks. These tools have brought me two invaluable gifts (for lack of a better word): the passion for writing, and a good memory.

For me, writing things down (on paper, not typing them into a computer) has always been the key to organising stuff in my head and remembering bits of information. It has wired my brain in such a way that, for simple-to-medium memory and mind-mapping activities, I can usually do without pen and paper. This is hardly rocket science, and I guess it’s a common learning technique. Still, I have often surprised people with my ability to remember dozens of different phone numbers, addresses, or the names of all the new people I met at parties and other gatherings. “No, I’m neither autistic nor an idiot savant,” I used to joke, “I just have a good memory.”

For more complex tasks, I still like writing things down and drawing mind maps; my faithful Newton MessagePad is a nice alternative when I don’t have pen and paper at hand, or when a particular ‘map’ has to be preserved electronically rather than discarded once I’ve acted on it. This physical, low-tech approach is still the most effective for me, especially when I’m writing creatively. Some of my story plots can be convoluted, and sometimes I need to keep a bird’s-eye view of all the relationships among the characters in a novel. None of the software applications I’ve tried has the same immediacy as drawing a map on an A4 or A3 sheet. I’ve also tried the iPad + stylus approach: it’s a bit better, but still not optimal.

There’s an app for that, but I don’t really need it

But let’s go back to my good memory. Sometimes people ask me what my favourite calendar app is (on the Mac, on iOS), and when I say that I don’t use calendar apps, some of my interlocutors think I’m pulling their leg; others react with comments along the lines of “Oh well, evidently you’re not that busy.” (Quite the contrary, actually.) Same goes for to-do apps. What’s my favourite? None, really. Not because the existing to-do apps are not good enough for me, but because I don’t need a to-do app in the first place. At this point people tease me: “Sure, you remember everything you have to do on any given day, don’t you?” Well, yes, I do. I also remember all the passwords tied to my most important accounts and services. I have developed a system for creating passwords that are both memorable and strong enough. (I have also written them down as a failsafe, of course.) So I don’t really need applications like 1Password.
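To give an idea of what such a system might look like (without revealing my actual scheme, of course), here is a purely hypothetical sketch, in Python, of the kind of approach I mean: derive each password from the initials of a sentence you’ll never forget, plus a token based on the service’s name. Every rule and name in this sketch is made up for illustration.

    # A purely illustrative sketch of a memorable-but-strong password scheme.
    # This is NOT my actual system; every rule here is hypothetical.
    def mnemonic_password(sentence: str, service: str) -> str:
        # Take the initials of a sentence you'll never forget...
        initials = "".join(word[0] for word in sentence.split())
        # ...alternate their case for a bit of extra entropy...
        mixed = "".join(
            ch.upper() if i % 2 else ch.lower()
            for i, ch in enumerate(initials)
        )
        # ...and append a token derived from the service's name.
        token = f"{service[0]}{service[-1]}{len(service)}"
        return f"{mixed}!{token}"

    # Example with a made-up sentence and service name:
    print(mnemonic_password("my first mac was an lc 475 bought in 1994",
                            "example"))
    # Prints: mFmWaL4Bi1!ee7

The point is that the sentence lives in your head, while the per-service token keeps each password unique; nothing needs to be stored in an app.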

Save it somewhere and forget about it

The point of all this is obviously not to show off my abilities. It’s to emphasise the importance of keeping one’s memory well trained, and of organising information (and its intake) in a way that doesn’t make one too dependent on devices and machines. When I bring up this subject, I’m either labelled a Luddite or met with the objection that “devices and machines are meant to relieve our minds of boring tasks and the burden of remembering dull information, et cetera, so that our minds can focus on more important, interesting stuff.”

The fact is, I don’t think this “save it somewhere and forget about it” approach is all that beneficial to our minds. It’s the ‘forget about it’ part that bothers me, of course, because this approach doesn’t encourage either the retention or the organisation of information into something systematic that eventually becomes knowledge. Instead, it encourages forgetfulness and delegation. When something goes wrong, e.g. you’re unable to retrieve the information — or sometimes even the ability[1] — you delegated to a device or application, you get stuck, and at times you even ‘short-circuit.’ The other night I was talking with a friend, and he told me how a mutual acquaintance had faced a bit of an emergency: his phone had died and he had to use another phone to call his sister and brother-in-law, but he blanked out when he realised he didn’t remember their phone number. Thankfully their landline number was listed in the telephone directory, and he eventually got to call them. (Though admittedly he wasn’t sure about their exact street number, either.) We’re talking about a 25-year-old who couldn’t remember fairly important information about close relatives.

If you think this is an isolated case, it’s not. At least not according to the (anecdotal, I know) evidence I’ve been collecting lately. In an informal poll via email, I asked my younger contacts, acquaintances and friends to answer a few questions, such as:

  • Do you own a smartphone and make extensive use of it?
  • Do you use to-do apps and reminder apps on a regular basis?
  • Can you share some examples of the most common reminders and to-dos?


The demographic is people in the 19–28 age range. So far, 25 people have replied with useful data. The results are interesting and seem to prove my point:

  • 23 out of 25 people own a smartphone. The other two just own a feature phone.
  • 22 out of the 23 smartphone owners use to-do apps and reminder apps regularly.
  • The vast majority of reminders and to-dos involve extremely simple and mundane tasks like “Remember to buy water, milk and bread on the way home,” “Remember to ask X about her exam,” “Go to driving school at 6 PM,” “Phone dad,” “Pick up Y at school” [where Y is the little brother], “Need haircut,” “Plan weekend trip,” and so on.


When I say that the results seem to prove my point, I mean that these people, young people, seem to rely heavily on their devices to be reminded to carry out even the most trivial stuff — things I find hard to believe one could otherwise forget. I can understand entering dentist’s appointments in a calendar, or other events scheduled to happen in the relatively distant future. One may indeed forget the exact date and hour of a medical exam that’s six or seven weeks away. But setting up a reminder to call your dad, or to go to your little brother’s elementary school to pick him up? Really? Are people this detached from the things and people they should care about? And what about those reminders to get a haircut or to plan the weekend trip? I wonder what would happen without such reminders: would these people realise with horror, on Friday night, that they have to go on a trip the following morning and don’t know what to do or where to go? I would be less surprised if the demographic were people in their sixties.

Among other things pertaining to the upcoming Windows Phone 8.1 update, in this video Joe Belfiore (Corporate VP of Microsoft’s Windows Phone division) talks about Cortana, Microsoft’s new personal digital assistant. At about 5:12, Belfiore mentions a feature called ‘people reminders,’ where he instructs Cortana as follows: “Next time I speak with my sister, remind me to ask her about her new dog.” Now, on the one hand I’d be lying if I said this isn’t a cool feature; on the other, I can’t help but hope that only impossibly busy or really forgetful people will use it… or perhaps people who (to use this very same example) don’t really care much about their sister, since they don’t even bother to keep in mind that she got a new dog.

Technological crutches

I’m not saying that all these applications, services, personal digital assistants and the like — which are designed to assist us in various ways and capacities — are useless or should be avoided. I’m merely pointing out that I find it a bit alarming that young people, with supposedly healthy and highly functioning brains, seem to rely on such technological crutches a little too much, and probably more often than they should. To drop the metaphor: you use real crutches after you’ve suffered an injury, not pre-emptively just because you want to prevent one.

Again, I appreciate the technology, and it’s great that it exists, because there are people out there who really need the assistance it can provide. But I fear that a lot of other people are using these technological crutches just because they exist, thereby developing a certain ‘mental laziness’ that can’t be good in the long term. One could argue that, since the technology is here to stay, these ‘mentally lazy’ people will just keep taking advantage of such crutches and that’s the way it’s going to be, with humans becoming increasingly dependent on devices, machines and technology (smart cars, smart homes, wearable devices, etc.) for all kinds of things — some of which, come on, have always been perfectly manageable on our own.

Yes, we definitely live in interesting times. My hope is that all this technology that’s supposed to make our lives easier won’t end up making our minds dumber in the process.



  • 1. I know people who have apparently placed their entire sense of direction in the hands of sat-nav systems and GPS-based apps, judging by how helpless they are without such devices.


Mavericks and display management — a headache

Software

It’s been more than ten years since I last used a desktop Mac as my main machine. After the iMac G3, I spent a year with the clamshell iBook G3/466 Special Edition as my sole work machine. (At the time it was rather powerful for my needs.) Then, when I upgraded to a PowerBook G4, the era of dual-display configurations and ‘extended desktops’ began for me — and since 2004 my main setup has been a Mac laptop connected to a bigger external display. For me, 95% of the action happens on the external display, while the laptop screen is relegated to a minor role (I typically keep Finder windows from external drives and servers open on the MacBook Pro’s display, and very little more).

Therefore, my typical dual-display configuration looks like this:

[Image: my typical display arrangement, annotated]


Now, I’ve never found the previous Spaces-based virtual desktop and display management particularly problematic. There was the occasional nuisance, but nothing terrible. (At least for how I use my Mac.) Things started to get annoying under Lion and Mountain Lion with Mission Control, especially with different Finder windows open in different Spaces: I’m sure you got as mad as I did when going back to a specific Finder window accidentally opened in another Space meant visually jumping from Space to Space — Whoosh… to Space No. 1! Then whoosh… back to Space No. 4! — and so on.

It was annoying, yes, but predictable. Since upgrading to OS X Mavericks, that back-and-forth desktop-jumping is gone, but overall display management has been rather erratic and unreliable for me. As you can see in the figure above, Mavericks now puts a menubar on both displays, to (supposedly) facilitate handling different applications, windows and interface elements on different displays. As the Multiple Displays blurb on Apple’s OS X webpage says: “There’s no longer a primary or secondary display — now each has its own menu bar, and the Dock is available on whichever screen you’re working on. You can run a full-screen app on one display and have multiple windows on another display, or run a full‑screen app independently on each display.”

Which is great and all, but sometimes things don’t behave as you’d expect. When you’re working on one display, you’ll usually have the active menubar there, and the Dock will be placed on that same display. Yet — and I haven’t been able to reliably reproduce this behaviour — sometimes you quit a full-screen app and find the Dock on the display you’re not actively using. Or you return to the Finder, quickly open a new Finder window with ⌘-N, only to find that the active menubar is on the other screen and the window has been created there. Or the Finder’s ‘Copy’ dialog that appears when you’re copying files and folders starts showing up on the ‘wrong’ screen.

And then the other day, after watching a video in full-screen mode, this happened:

[Image: UI elements split between the two displays, annotated]

On my ‘main’ display I was left with the active menubar, but the application switcher (the bezel with the array of open apps that appears when you press ⌘-Tab) and the Dock had been moved to the ‘secondary’ display. You may say: “Fixing that is easy: just click on the inactive menubar on the MacBook Pro’s display to make it active, then go back to the external display, make the menubar active there again, and all the UI elements will return to the external display where they were before.” But no. Clicking on the inactive menubar didn’t make it active. Everything remained stuck in the arrangement depicted above until I repeatedly pressed ⌥-⌘-D to hide and show the Dock. After many attempts, the Dock finally reappeared on the main screen and everything went back to normal.
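Incidentally, there’s a more brute-force route I could have taken from the Terminal: forcibly quitting the Dock process, which OS X relaunches automatically, usually resetting its position in the process. A minimal sketch (wrapped in Python here; I can’t guarantee it would have cured this specific glitch):

    # Force the Dock process to quit; launchd relaunches it automatically,
    # which often resets misplaced Dock/UI positioning. A sketch, not a
    # guaranteed fix for this particular glitch.
    import subprocess

    subprocess.run(["killall", "Dock"], check=False)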

These may be considered minor annoyances, I know. But what’s maddening, in my opinion, is their utter unpredictability. They get in the way of what you’re doing for no apparent reason (you’re not doing anything ‘wrong,’ often you’re just switching from one application to another) and they certainly don’t make for smooth or seamless multiple-display management.

Adding DuckDuckGo as a search service in Sleipnir

Briefly

Since Sleipnir is absent from DuckDuckGo’s list of browsers that include it as a search option, I wanted to share how to add DuckDuckGo to Sleipnir’s Search/Address bar as a new search service. It’s quite fast and easy.

  1. Open Sleipnir’s Preferences and go to the Search tab.
  2. Click Add.
  3. A pane will drop down. In the Search Service text field, enter DuckDuckGo. In the Address field, enter https://duckduckgo.com/?q=%@. In the Search Shortcut field, enter the shortcut you prefer (I used d). It should appear as follows:

[Image: adding DuckDuckGo as a search service in Sleipnir’s preferences]


Unfortunately, DuckDuckGo can’t be added to the options available in the Web Search drop-down menu (which are, as usual, Google, Yahoo and Bing), but once you add DuckDuckGo as a search service, performing a search from the browser’s Search/Address bar is fast anyway — you just enter the shortcut followed by a space and the search keyword(s). (For example: d apple)

[Image: a DuckDuckGo search performed via the d shortcut in Sleipnir’s Search/Address bar]

Even better, once you add a search service in Sleipnir, the application assigns a keyboard shortcut to it (⌘-4 in my case, as you can see in the figure above), so when you press that keyboard shortcut, Sleipnir will open the DuckDuckGo main page in a new tab.
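In case you’re wondering what that Address template actually does: the %@ placeholder simply stands in for your (URL-encoded) search terms. Here’s a rough illustration in Python of the general mechanism; this is my assumption about how such templates work, not Sleipnir’s actual code:

    # A rough illustration of how a search template such as
    # https://duckduckgo.com/?q=%@ gets expanded: the %@ placeholder is
    # replaced with the URL-encoded query. (Not Sleipnir's actual code.)
    from urllib.parse import quote_plus

    TEMPLATE = "https://duckduckgo.com/?q=%@"

    def expand(template: str, query: str) -> str:
        return template.replace("%@", quote_plus(query))

    print(expand(TEMPLATE, "apple"))     # https://duckduckgo.com/?q=apple
    print(expand(TEMPLATE, "mac mini"))  # https://duckduckgo.com/?q=mac+mini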

The need for new voices

Tech Life

Lately I haven’t been able to shake this feeling — that the voices that matter in the tech writing sphere are too few; that the meaningful debate is confined to a sort of elite circle of pundits who don’t seem particularly interested in seeking out the point of view of people who don’t belong to their circle or are outside the reach of their RSS readers.

Today everybody can easily publish their opinions for everyone to read, and the Web has become an immense space where it’s very difficult to be noticed and gain a significant audience. In a certain way, it’s like looking at a pyramid scheme, where the only ones who really ‘earn’ something are those at the top of it — the aforementioned elite circle of pundits. What they earn is credibility, authority, and also money. Now, I’m not really questioning how and why they got where they are. (I personally believe that some of them deserve it, others less so. But that’s another story.) 

No, what’s beginning to make me feel restless is the scenario that has been taking shape. These ‘top pundits’ now hold a seemingly unchallengeable position. If you’re a lesser-known writer and criticise the quality of what they’ve been publishing on their sites, or what they’ve been broadcasting on their podcasts, you’re the jealous loser who just writes out of frustration, while they remain untouchable because they’re the good guys. Maybe you do have a point in your criticism, but it will rarely get through. If you’re a lesser-known writer and try to add your opinion to the mix, you can do so, of course (the Web’s democracy, remember?), but it will rarely matter — in other words, it will rarely get the same exposure and attention as the opinions of the ‘top pundits.’

The infuriating thing is that some analysts or tech journalists who write atrocious, ill-informed, intellectually dishonest, or just flat-out dumb pieces are likely to get more attention than certain brilliant voices that struggle to reach a wider audience simply because they’re not on the radar of anyone in a significant position to amplify them.

I think that those ‘top pundits’ should really make an effort to expand their horizons and their readings to include other people deserving of attention, instead of keeping on quoting big tech news sites and their friends 90% of the time. They don’t have a real incentive to do so — perhaps some of them are afraid of losing some people in their audience — but I think it’s the most responsible attitude for people in their position.

But it’s also something we should do from the bottom, so to speak. We can expand the debate in the tech sphere by openly recommending our findings: websites with great content and intelligent contributions, from insightful people focussed on quality over quantity; people who deliver consistently, offering opinions and perspectives that deserve a larger audience. We keep suggesting great new apps and gadgets on social networks. We retweet, repost and reblog the stupidest things and the silliest memes. It’s time we all started to suggest great sources and writers worth following as well — and possibly more than once, with more than a passing mention. I’ve previously dedicated a couple of pieces to people and resources I discovered and added to my daily reads in 2012 (this article) and 2013 (this article), and I plan to write more often about the new voices I stumble upon every now and then. It’s a good, responsible practice I’d like to see happen more often.

The Mac can’t do it all — really?

Handpicked

I realise I may have fallen into the linkbait trap. I’m usually good at avoiding this kind of stuff, but this morning, when I spotted on Flipboard How I Moved Away From The Mac After Leaving Apple, a piece written by David Sobotta for ReadWrite, I kept reading because the article presented itself as a long-form writeup of what sounded like an interesting perspective on Apple — that of a former employee of the company and long-time Mac user.

Instead, I ended up rather astounded by what reads like a shallow, subjective and somewhat embittered report from someone who — despite writing “Take it from someone who knows Apple inside and out…” in the very subheading of the piece — doesn’t ultimately seem to understand Apple all that much. I agree with Grant Hutchinson when he says that the article “just highlights the backwardness of entrenched industries and legacy systems… not issues with the Mac platform.”

I know: the article is about a subjective experience, and its subjective stance is clearly worded in the title itself. Still, I expected something with more depth than what can be summarised as “I had to enter more Windows-friendly environments for work reasons, plus I had bad luck with the Apple hardware I bought, plus I didn’t like how Apple addressed my hardware problems, plus I never really had issues when I started using Windows, so yay Windows/Microsoft and boo Apple.” I mean, this is the kind of user-experience tale I heard many times back when I used to frequent online forums, mailing lists and the like. Back when the Mac vs PC wars were raging and the majority of opinions and experiences came from users who treated their very personal anecdotes as a measure of a platform’s worth.

From Sobotta’s piece I expected, for example, a more balanced and in-depth analysis of the factors that may drive a user away from the Mac. The article seems to present itself as a sort of cautionary tale but, in my opinion, it is too strongly tied to “specific and circumstantial” issues (to use Sobotta’s words) to have the exemplary value it probably seeks. It also fails to convincingly demonstrate what it announces in its premise — i.e. that the Mac “can’t do it all.”

Another minor thing I take issue with, but still worth mentioning, is the way the author uses links to his personal blog at certain points in the article. There are parts where Sobotta is vague, or makes statements that require further explanation, and he uses links to his personal blog as a way to elaborate on the issue he’s discussing; but often those blog posts don’t really explain matters. Furthermore, they are long posts which the average reader will probably skip. The result is that the issues remain vague and certain statements remain largely unexplained.

For example, at a certain point Sobotta explains that, after years of using (and liking) iPhoto, “in summer 2011, problems with iPhoto caused me to pull the plug on my favorite Macintosh application, iPhoto, altogether.” Now, ‘problems with iPhoto’ is vague, but there’s a link to his blog (to be honest, I expected a link to some other site or discussion forum explaining what exactly those problems were, but I digress), so maybe I’ll find an explanation there, I thought. After reading that long aside, however, the only thing I understood is that Sobotta had a library-related issue after updating to iPhoto 9.1.5. Even in that blog post, things are described rather vaguely:

The last week of July 2011, I did a routine software update on the MacBook. It included iPhoto 9.1.5.

When I launched iPhoto after the software update, it told me my library needed to be updated. It started and never finished. Relaunching iPhoto got me nowhere so I searched the web for some solutions. I tried a few which did not seem to work. I even tried creating a new library, and iPhoto 9.1.5 still did not work.

Then I dug out my DVD and re-installed iPhoto 9 from scratch. I then applied the software updates and got the same non-functioning results. 

The feeling I have after reading this is that Sobotta gave up rather quickly. And that he’s also quick to pass judgment, with considerations like: “What bothers me about the iPhoto problem is that it is just another glitch that is making me wonder if Apple is stretched a little too thin.” (Just a few paragraphs after the quoted bit.) By the way, in the same post he gives up on iMovie in much the same rushed way: “I wasn’t one of the folks who hated the big change in iMovie. I got so that I liked it, but recently after several unsuccessful tries at uploading movies to YouTube from iMovie, I gave up on iMovie.”

Back to the ReadWrite article: here are a few highlights that made me raise an eyebrow, to use a euphemism:

Fast forward to late 2012 — my office gets its latest technology refresh. The first product I buy is a first-generation Lenovo Yoga, the second is an I5 Lenovo desktop, and the third is a Mac mini, which is really more of a token Mac than anything else. 

This belittling mention of the Mac mini is a bit unwarranted, I think. It may have been an underpowered machine back in its first PowerPC and Intel Core Solo iterations, but by the end of 2012 the Mac mini was already a rather powerful desktop Mac. Don’t believe me? Ask the guys at Macminicolo, or Brett Terpstra, to name a couple of examples off the top of my head.

I ordered an Intel MacBook in 2006 and a 26″ I5 iMac in 2010.

I hope that 26″ is a typo…

By early 2010, my wife’s 12″ G4 PowerBook was so slow that even the Washington Post’s minimalist webpage wouldn’t load.

Sorry, sorry, but this is where I call bullshit, loud and clear. As someone who still uses the 12-inch PowerBook G4 he purchased in 2004, I have to say that, sure, by today’s standards it is certainly a slow machine and I’d never use it for video conversion or other CPU-intensive tasks, but in 10 years of use I have never had issues with… websites. There may be the occasional rendering glitch due to Safari not being updated past version 5.0.6 on Leopard, and YouTube videos stutter (less so if you specify Safari on iPad as the user agent), but “so slow” that a webpage “wouldn’t load”? Give me a break.

I had no intention of buying my wife a premium-priced Mac with an outdated processor (the Intel Core 2 Duo) but around that time I saw a special at Staples for HP laptops with the new Intel processors. […] The two HP laptops together cost less than $1,500 and both computers showed up in a few days, even though Apple folks maintained the processors weren’t shipping in any products any time soon. It would take a few months before Apple could announce similar products — which, of course, were also priced much higher.

The Intel Core 2 Duo processor may very well be outdated today, in 2014 (although my 15″ MacBook Pro with a 2.66 GHz Core 2 Duo CPU still handles many tasks, simple or demanding, quite well), but in 2010 it certainly wasn’t. I understand the need for some future-proofing when buying a new computer, but if Sobotta’s wife was still using a PowerBook G4 in 2010, I deduce that CPU performance was not at the top of her list when choosing a new computer.

Then there’s that remark about Apple laptops being “priced much higher,” which is really becoming rather old. 

The Apple addict I am, I eventually relapsed in the fall of 2010 and ordered an I5 iMac […] but that particular computer is when the wheels really started coming off the Apple wagon.

The iMac and I never hit it off. I had to buy the huge 26″ model to get an I5 processor and I hated the positioning of the SD slot right under the DVD slot.

Again with that 26″. I’m starting to believe it’s not a typo after all… Also, the fact that Sobotta hated the positioning of the SD slot doesn’t make the iMac a lesser machine, and it isn’t an inherent shortcoming; it’s obviously just a personal peeve.

When I went to buy a travel laptop in late 2012, I could not find a Mac that had an integrated SD card for under $1,000. So, I bought an I5 Lenovo Yoga for $999 (which comes with a bonus—a touchscreen), as well as a $479 Lenovo desktop to run all of my photo editing tools and applications like Lightroom and Picasa.

What Sobotta did there is something I’ve seen other people do. They don’t want to spend, say, $1,000–1,200 on a Mac, but they’ll spend more on two cheap Windows PCs (or a PC and an Android tablet) and still feel they’ve made a cost-effective purchase. Sobotta didn’t want to spend more than $1,000 on a Mac, but ended up spending $1,478 on two PCs, when probably the more cost-effective solution was to purchase a 13-inch MacBook Air for $1,199 — the price of the entry-level 1.8 GHz Core i5 mid-2012 model with 128 GB of flash storage and an SD slot. It didn’t have a fancy touchscreen display (to do what, exactly?), but it could have run applications like Lightroom and Picasa equally well.

But if you’re looking for a Mac for $1,000 and then end up spending $1,478, I guess money isn’t really the issue here. It would have been a different story if Sobotta had been looking to purchase two computers on a $1,500 budget, and had concluded that he couldn’t buy both a laptop and a desktop Mac for that price. That would have been more understandable.

I still use the Macintosh for certain things but I have to admit being a Mac user has become too much trouble.

Again, a bold remark, linked to a post on Sobotta’s personal blog. It’s another long post that ultimately doesn’t explain why being a Mac user has become too much trouble. What he says there basically revolves around this statement: “Even with a history of good experiences with Apple’s high end systems, my experience with the iMac and MacMini leaves me a little skeptical of the new Mac Pro.” Here and there you can find mysterious remarks such as: “I have not been happy with OS X and its default world of iCloud for a while.” (What does “its default world of iCloud” mean, exactly? He paints a picture of OS X as if it couldn’t be used without iCloud, which is simply not true.)

My most recent Kindle book, 100 Pictures, 1000 Words, A Crystal Coast Year, was written and compiled in Microsoft Word on my Lenovo desktop running Windows 8.1. The images were all catalogued and edited using Lightroom on my Windows desktop. I still needed my Mac for a few things — I resized all my images on Pixelmator and edited the filtered HTML for the Kindle using TextWrangler — but many of these things could have been easily done on Windows. 

Actually, I’d like to point out that this workflow could have been carried out more efficiently on a Mac in its entirety. A Mac can indeed run applications like Microsoft Word and Lightroom, in case you were wondering.

Towards the end of the article, Sobotta admits that

Many of the issues I’ve experienced are specific and circumstantial…

He then proceeds to make a sweeping generalisation:

The other issue with Apple, to me, is its attitude. I would’ve felt better about my failing products if Apple was willing to repair the problems. […] What’s worse is that Apple’s poor attitude towards hardware issues rubs off on its customers.

Again, since he had hardware issues with his Macs and a problem with Apple’s attitude (it would have been interesting to know more about this: maybe I missed it, but why was Apple unwilling to repair the problems? What happened, specifically?) — he talks about this ‘poor attitude towards hardware issues’ as if Apple displayed such an attitude by default, everywhere, with everybody. Which is not the case.

Amidst the series of problems that ensued with nearly every Mac I purchased over the years, I still hung to Apple’s platform. But for some reason, there are a number of Mac users out there that will blame you for the problem, regardless what it is, and heap shame upon you for suggesting the world’s richest company should solve a hardware/software problem that you caused. It is radically different mindset from the worlds of Windows or Linux, where most people tend to relate to your problems and end up blaming Microsoft, or perhaps the hardware manufacturer.

This is another generalisation and — again — it has nothing to do with Apple as a platform. It’s neither a software-related issue nor a hardware-related issue, not even an ecosystem-related issue. Honestly, the perceived mentality of a platform’s users shouldn’t be a factor in deciding whether to remain on that platform or to leave it behind.

Maybe I just know too much about Apple and its products to be able to enjoy the taste these days.

If I have to be blunt, judging by what I’ve read in this article and the linked blog posts, the impression I’m left with is that Sobotta doesn’t know that much about Apple, doesn’t really get how Apple operates, and fails to explain why “the Mac can’t do it all” or why today “being a Mac user has become too much trouble.” All this in a piece filed under the Infrastructure category. It boggles the mind.