“If it has to be jailbroken and side-loaded in order to run unapproved code, and then disposed of when it’s reached its pre-determined end of life, what you’ve got there isn’t a computer, that’s a Smart Device.”

When Steve Jobs debuted the original iPhone it was clearly a very small computer.  There was a CPU, a screen, memory, input, output.  Almost every device we use these days, from televisions to watches to vacuum cleaners, is some sort of computer.  I have smart light bulbs that are technically computers.  I say this because I want to make a distinction between a Personal Computer (PC) and a Smart Device (SD).

A Smart Device is a computer by any technical definition, but there are some things that differentiate an SD from a PC, mainly based on how the technology is intended to be used and maintained by the purchaser. To illustrate the differences, I will compare and contrast my television, my phone, and my laptop.

My television is an LG Smart TV.  It has apps and an app store and it has some sort of operating system on it that is referred to as “firmware”.  Maybe it’s Android, maybe it’s WebOS, maybe it’s something else; honestly, I don’t know, I don’t care, and I’m not supposed to care.  There is a Netflix app and a Hulu app and the like, so it can do television things.  While it is technically a computer, I don’t think of it as a computer, I don’t use it as a general purpose computer, and neither would any other normal person.  Other than my ability to choose which channels I watch or which apps I install to tailor the television to my needs, I would never be expected to open the hardware or alter the underlying firmware; I would never “hack” my television.  The product is defined, designed, and controlled by LG and sold as a single-purpose Smart Device with a curated and controlled experience, and when it no longer works as you desire, you are intended to dispose of it and buy another one.

Almost everything I just said about my television also applies to my phone.  My current phone is also an LG, coincidentally, but I’ve also owned iPhones and Android-powered phones from other manufacturers.  All of these phones are Smart Devices and, like the television, they have a proprietary hardware design with embedded firmware that is controlled by the manufacturer.  They feature some sort of app store that allows for the installation of new capabilities, but they are sold as dedicated devices, not general purpose computers.  You cannot easily install alternate firmware or execute code that is not distributed via the app store.  You cannot replace or upgrade the internal hardware.  When they fail you are expected to recycle them and buy new ones.

Now, let’s compare that to my laptop.  In this case, it’s a Lenovo Yoga 920, but I also have an older Apple MacBook sitting nearby.  In both cases, the machine comes with an operating system, similar to the firmware on the television or phone, but with one key difference: I can pick which operating system I would like to use and even install more than one.  My Lenovo is currently running Microsoft Windows 10 but can also boot up into Linux.  The MacBook defaults to MacOS but can also boot up to either Windows or Linux.  If I don’t want to configure multiple boot setups, I can run “virtual machines” within host operating systems.  For example, I routinely run an Android virtual machine on my Windows laptop (using BlueStacks) so that I can use certain Android apps that aren’t available on Windows.  There are no fundamental obstacles in place barring me from doing any of these things.  I bought the hardware, I choose what I want to do with it, and I do not have to pay Lenovo or Apple for the privilege of using the hardware I own for general purpose computing, whatever that may be.  I bought it once, I own it, and I’m free to alter its behavior.  It’s more than just operating systems, though.  Let’s talk about software and data.

“Jailbreaking was never a concept in the world of personal computers and this illustrates a fundamental difference: you don’t need to break out of a jail that isn’t there.”

In the Smart Device space it is common to hear the term “side loading”.  Side loading is when you put content that is not approved by the SD manufacturer onto the device.  Take the Amazon Kindle: if a friend sends you an ePub or a PDF and says “you should read this”, you can do that on your PC but not on that Kindle, because the book didn’t come from Amazon.  This is true for software running on Smart Devices as well.  The only way to run an application or a game on a Smart Device is via the approved channel.  Any other code that you attempt to run is side loaded and, depending on the device manufacturer, can even void your warranty and lead to your device being rendered non-functional.  This is why we have the concept of “jailbreaking” Smart Devices.  Hackers around the world have found the imposed limits of handset and tablet manufacturers to be frustrating and arbitrary and have therefore collaborated to free the devices from those constraints and allow them to run unapproved code.  Jailbreaking was never a concept in the world of personal computers and this illustrates a fundamental difference: you don’t need to break out of a jail that isn’t there.  Personal computers have, until the M1 Mac, never been designed to need jailbreaking.  They have never before come with baked-in limits on what code you could run on the hardware you purchased.  Such limits have been built into almost all the Smart Devices ever sold, but never into computers.

The final point is about the hardware itself and whether or not you are intended to be able to upgrade or repair it.  If my TV or phone dies, I will recycle it and buy a new one, but if the PC in my basement refuses to boot up someday, I will repair it.  Every component in the box is replaceable or upgrade-able: the power supply, the video card, the processor, the motherboard itself.  The entire hardware configuration is modular and component based.  This used to be true of virtually every computer built or sold, but in recent years laptops have become less upgrade-able and repairable, a trend led by Apple.  It was commonplace to have a replaceable battery pack until Apple’s quest to make thinner and thinner machines put a stop to that.  Replaceable hard drives and memory were done away with, again by Apple, a few years ago, when they started gluing or soldering the components in.  This trend towards laptops that cannot be altered from their purchased state is a choice by Apple to drive sales of new machines rather than allow users to update older machines: a sales decision, not an engineering one. Other manufacturers still offer machines that can be upgraded or fixed when there are hardware failures, but Apple has made the modern MacBook an entirely disposable product and many other manufacturers have followed suit.  Consumers haven’t generally complained too much since most of them didn’t really upgrade, repair, or replace things, but this freedom to alter the configuration of the hardware you own has nonetheless long been one of the defining characteristics of a Personal Computer as opposed to a Smart Device.

This brings us to the new M1 Macs and the final step in the Apple plan to transform the Mac from a Personal Computer into a Smart Device.

A lot of people seem to forget that when the iPhone was originally announced, there was a ton of skepticism.  Nobody thought that people would shell out the money Apple was asking, and Apple themselves were not entirely sure how they would fare in the market.  The most glaring omission from the original iPhone was the App Store.  Apple had no idea how insanely profitable it would be to get a cut of all that sweet app revenue and had actually planned for the iPhone to work entirely with mobile web applications in Safari.  They did not allow for third-party app developers.

When they launched the App Store it was a big deal.  I was one of the early sign-ups, having been developing software professionally for 13 years.  I really loved learning to code for the iPhone.  The code-signing and App Store signup process and all that was a pain in the ass, but I did it.  It was an exciting time, but it was by no means without controversy.  Developers HATED the constraints of the App Store model.  If I wanted to write a game and give it to a friend to play on their PC (Windows, Mac or Linux), I could do it, but suddenly there was no way to write software for this new phone without paying Apple a hundred bucks a year, filing a bunch of paperwork with them, and getting their approval of my app.  This was unprecedented.  The first app I ever submitted to the App Store, Virtual Bacon, was rejected by Apple because it didn’t have enough practical use.  Of course it didn’t; it was silly, an app to virtually fry bacon on your phone, but I was not allowed to share it with the world because Apple didn’t like it.  I couldn’t even put it on the web and let people install it themselves because Apple wouldn’t allow side-loaded apps to run.

Despite all the developers who were rankled by this new way of doing things, Apple counted on the fact that end users wouldn’t care about developers’ feelings, only that their phone was sexy, and Apple was right.  End users didn’t care.  They loved the closed device with the curated experience and made Apple the richest company in tech.  The developers mostly got over the initial shock of learning to develop for such a draconian platform.  The ones who truly wanted freedom just went to Android or the web instead, where there was more of an open road and one didn’t need to pay to play.  Apple, in the meantime, started to see the beauty of raking in 30% of every App Store sale for software they didn’t have to code themselves.  That was straight to the bottom line, with only the overhead of the approval process, which was partially offset by collecting annual developer dues.  Apple learned that they could get tens of thousands of software developers to pay Apple for the privilege of selling their apps to Apple customers while simultaneously giving up 30% of their own sales revenue to Apple.  Never in the history of computing has a company done so little to earn so much revenue as Apple did with this model.  From a stockholder’s perspective, this was beautiful.  From a small developer’s perspective it was highway robbery.
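To put rough numbers on that split, here is a minimal sketch of the economics described above; the $0.99 price point and the unit count are invented illustration values, and only the 30% commission and the roughly hundred-dollar annual developer fee come from the figures mentioned in this post.

```python
# Hypothetical App Store economics; illustration values only.
# Only the 30% commission and the ~$100/year developer fee come from the text;
# the price and sales volume are invented for the example.
app_price = 0.99            # price per copy
units_sold = 10_000         # copies sold in a year (hypothetical)
annual_developer_fee = 99   # roughly "a hundred bucks a year"

gross = app_price * units_sold                             # $9,900.00
apple_cut = 0.30 * gross                                   # $2,970.00
developer_net = gross - apple_cut - annual_developer_fee   # $6,831.00

print(f"Gross sales:     ${gross:,.2f}")
print(f"Apple's 30% cut: ${apple_cut:,.2f}")
print(f"Developer keeps: ${developer_net:,.2f}")
```

Nearly a third of the gross goes straight to Apple before the developer sees anything, which is exactly the complaint about the model.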

There was one fly in the ointment for Apple, however.  The Mac.  The Mac had been around since 1984 but had never managed to garner more than about 10% of the PC market, no matter what.  It took very little time for the iDevices and the almighty App Store to overtake the Mac on the Apple balance sheet when it came to the core business of making money for Apple.  Apple launched a Mac App Store, but it wasn’t the same.  The Mac App Store was an option, but not the only one.  Software could still be purchased, sold, downloaded, distributed, installed, and executed on the Mac without Apple seeing a dime in revenue.  This had always been the case and it seemed it always would be.  If Apple wanted to make the Mac more profitable, they needed to close that third-party software gap, at least for the vast majority of consumers.

They also needed to sell more Macs, and there they faced a second problem.  If they couldn’t gain market share, if 10% was the cap, they needed to sell more Macs to those 10% of users.  Study after study showed that Mac users tended to use their computers for much longer than Windows users.  This was trumpeted by Apple as proof that while the upfront cost of purchasing a Mac was higher, the overall Total Cost of Ownership (TCO) was lower.  If I spend 50% more for a piece of hardware up-front but its usable life is three times longer than the competition’s, my TCO for the more expensive machine is actually lower.  These TCO arguments were great for Mac users arguing with PC users in internet forums, but they didn’t seem to really drive sales.  The same could be said for the “halo effect”, a term that referred to the idea that consumers would buy iDevices and, in turn, decide to replace their Windows machines with Macs.  Remember the old “I’m a Mac/I’m a PC” commercials?  The Switch Campaign?  Apple tried, repeatedly, to expand the Mac user-base, but they could never quite get there.  So, they fell back on plan B.  Make Macs disposable.
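For a back-of-the-envelope version of that TCO argument, here is a minimal sketch; the dollar figures and lifespans are assumptions, and only the “50% more up-front, three times the usable life” ratio comes from the paragraph above.

```python
# Hypothetical Total Cost of Ownership comparison; illustration values only.
# Only the ratios (50% higher price, 3x the usable life) come from the text.
pc_price, pc_years = 1000, 2      # baseline Windows laptop (assumed figures)
mac_price, mac_years = 1500, 6    # 50% more up-front, three times the lifespan

print(f"PC:  ${pc_price / pc_years:.0f} per year of use")    # $500 per year
print(f"Mac: ${mac_price / mac_years:.0f} per year of use")  # $250 per year
```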

The process of altering the Mac laptops to make them harder to upgrade started as early as 2012, but the final straw was in 2016 (https://www.vice.com/en/article/xygmyq/new-macbook-pros-mark-the-end-of-upgradeable-apple-computers) with that year’s MacBook Pro.  The desktop iMac was also redesigned along similar lines to make it nearly impossible for a normal person to do so much as add RAM or replace a failed drive.  The entire Apple computer lineup, with the exception of the insanely expensive Mac Pro desktop machine, was designed to be thrown away.  The one exception, the only remaining modular machine in the Apple lineup, currently has a STARTING price of $6000.  So, while it is technically true that Apple still sells an upgrade-able machine, the vast majority of users, 99% of the Apple consumer base, will never even touch one.

This strategy allowed Apple to stimulate new Mac sales to the same people who already owned Macs, but not as fast as Apple would like.  It became important to establish a schedule for obsolescence for the Macs just as they had for the iDevices.  Users needed to hit a point at which they had to buy new hardware to run the latest software (even if that point had no real technical rationale).  Apple decided to end the separate Mac OS X development group and instead converge the iOS and Mac OS dev efforts.  They even renamed OS X to macOS to be more like iOS and watchOS.  Apple expects iPhone users to buy a new device every two years, iPad users more like three, but Mac users were holding on to machines for five to seven years.  That wouldn’t do.  The solution? Keep pushing out updates to macOS and cutting older machines off the supported hardware list.  Again, this strategy worked to drive adoption of newer hardware and stimulate Mac sales, up to a point, but it also had the internal benefit of freeing Apple from any responsibility for backwards compatibility with older hardware, saving internal development costs.

There was one final piece to the Mac strategy that is worth noting.  It was certainly going to be controversial when they made the Mac hardware disposable, but they powered through.  It was the same logic by which they removed the headphone jack on their phones, closing another gap in their ultimate control of the user experience.  Both decisions met with initial user resistance but were ultimately copied by competitors.  The final piece was not a case of removing something; the final decision was the choice not to add something: a touchscreen.  Apple was the major innovator and pioneer in the development of touch-friendly computing via iOS.  iOS is, at its heart, nothing more than a touch-optimized version of Mac OS X.  It is striking, then, that Apple is the only major computer company that does not offer a touchscreen laptop or desktop and has no plans to ever do so.  The Lenovo I am using to write this post can convert into a tablet by folding in half and has a very nice touchscreen.  My phone and my iPad are both touchscreen enabled.  My Kobo e-book reader?  Touchscreen.  My work PC?  Touchscreen.  But the Mac?  Never.  Why?

The obvious reason, again, is revenue.  Simply put, a touchscreen Mac would cannibalize iPad sales.  Rather than do that, Apple opted to develop the iPad into a laptop replacement, even going so far as to recently market the iPad under the tagline “your next computer is not a computer”.  The App Store revenue on the iPad alone probably dwarfs revenue for the entire Mac product line.  Apple figured they didn’t need a touchscreen Mac, they just needed people to replace their Macs with iPads.  For many consumers, this is enough, but there is still this stubborn group of users who want an actual computer.  They still buy MacBook Pros, iMacs, and Mac Minis, and those who are really rich might even buy that $6k machine.  These users balk at the idea of attaching a keyboard to an iPad and pretending it is a general purpose computer.

I get it.  I have an iPad Pro with a keyboard sitting here and a MacBook Pro.  They run almost the same software, they are almost the same machine, but the MacBook can simply do a lot more.  If there were only one litmus test needed to highlight the difference, it would be this: I cannot write software for the iPad by using the iPad.  Let me repeat that.  I cannot create software for an iPad by using an iPad.  In order to create software for an iPad, I need a Mac.  This stands in stark contrast to every personal computer ever made.  Personal computers have always allowed the user to create and compile software on the machine itself.  The freedom to code on a self-contained machine.  The iPad fails that test and the Mac passes it, and for many, myself included, this makes the iPad a Smart Device and the Mac a Personal Computer, even setting aside all the other differences.
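As a trivial illustration of that litmus test (generic Python here purely for brevity, not Apple tooling): on a Personal Computer nothing stops you from writing a program, building it, and running it on the very same machine, with no outside approval step.

```python
# A minimal sketch of self-hosted development: write source code, compile it,
# and run it, all on the same machine.  Generic Python, chosen only for brevity.
source = 'print("written, compiled, and run on the same machine")'

with open("hello.py", "w") as f:                # write the program to disk
    f.write(source)

program = compile(source, "hello.py", "exec")   # compile it locally
exec(program)                                   # and run it
```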

So an iPad with a keyboard isn’t a full replacement for a MacBook Pro and can’t be unless iPad users (and iPhone users) can code on their own devices, which they can’t.  If Apple made a touchscreen Mac, enough of their users would prefer it to the iPad-plus-keyboard option, so they won’t offer one.  How do they resolve this?  By closing the final gap on the Mac.

I’ve already discussed how the Mac hardware refresh cycle was shortened by a move to disposable hardware and aggressive software updates, how Apple has consistently avoided adding the now industry-standard touchscreen to the Mac to avoid harming iPad sales, and how they have managed to reap massive revenues from the App Store model on iDevices but generally failed to see the same results on the Mac.  The remaining pivot in strategy, after trying all these other avenues, was pretty obvious, and I am certainly not the first person who saw it coming.  The final step was to put the Mac in the same “jail” as the iDevices and thereby force all Mac software revenue to come through the Mac App Store.  There was only one problem.  Intel hardware.

The MacBook Pro and the Lenovo Yoga I keep referring to are, in almost every meaningful sense, the exact same machine.  Both are light, modern laptops with metal shells, similar sizes, similar keyboards, solid-state drives, and Intel processors inside.  They can both run Windows or Linux, and, although Apple has made it difficult to run macOS on non-Apple hardware, both machines can technically run that as well.  There are a few differences.  The Lenovo has a touchscreen and a fingerprint sensor and can convert into a tablet.  It also cost much less than the Mac did.  Both machines, however, are fundamentally the same computing architecture.  Both machines allow me to write code that I can run on them.  Both let me explore and hack and use the computer however I wish.  As long as Apple machines are based on standard Intel hardware, there is little that can be done to change this fact.  It was therefore not surprising when Apple announced their intention to take the final step and make their own proprietary processor, the M1.

This was a long time coming.  Apple had to build up expertise in chip design and the supply chain to produce their own silicon.  They began with the A-series chips that have powered the iDevices.  With complete control over both the hardware and the software, and free from the hassles of making upgrade-able hardware, Apple could alter and develop the A-series processors in any way they desired with absolutely no consumer impact beyond the usual “buy a new device every couple of years” thing.  Apple could, and often did, even cause apps you already owned and purchased to cease functioning with no warning when it would have been extra work to maintain backwards compatibility with those apps.  They tested the waters and found that consumers got used to several things that were once unimaginable: daily software updates pushed out to devices, loss of the ability to downgrade, and the straight-up deletion of apps that the user had bought and paid for without any sort of refund or credit.  This slow boiling of consumers allowed Apple to streamline their hardware development and maintain the 18-month cycle and healthy profit margins.  Eventually they needed to move the Mac to this more profitable business model in order for it to continue to be worth it for them, and after years of experience with the A-series processors, they finally reached the promised land with the M1.  They finally have a proprietary processor that they believe they can sell.

The rationale Apple has pushed for all of these moves?  Ease of use, security, thinner and lighter, faster, freedom to innovate: all of these reasons for moving the Mac in this direction might be good marketing PR, but they are fundamentally bullshit.  There are ways to develop products that are secure, thin and light, fast, innovative, and all the rest without also being locked down, proprietary, closed, or disposable.  Other manufacturers recognize this and the Apple of old did as well.  But the iDevice paradigm is such a good business model, and you wouldn’t want a free computing experience to get in the way of a good business model, would you?

The M1 Mac will not have a single upgrade-able component and it will be incapable of executing unsigned code.  An individual developer will be capable of self-signing code on their own machine for development purposes, but any distribution of software to anybody else will require that they pay Apple for the privilege, unless they distribute their software as source code and the other user signs and compiles a copy for themselves.  The M1 Mac will be unable to run other operating systems.  Linux and Windows will not be options.  Even if they were somehow tricked into running on the M1, they would not be usable; the machine would be a jailbroken device at that point and could be bricked.  An M1 Mac will be a disposable, fully controlled device, not a general purpose computer, no matter how many apps you might have available to run on it.  For most people, this is irrelevant and Apple is counting on that.  Very few people think about the developer community that creates the software they consume, or the issues of right to repair and hardware and software platform openness that those developers are passionate about.  Most consumers just want their computer to be a TV with a keyboard and the internet, and for these consumers an M1 Mac will be indistinguishable from whatever other Apple stuff they use today.  Apple will make a mint.
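For context, here is what “self-signing code on their own machine” looks like in practice with the codesign tool that ships with macOS today, wrapped in Python only to keep the examples in one language; the binary path is hypothetical, and this is the current workflow, not a documented M1 procedure.

```python
# Ad-hoc signing a locally built binary with macOS's codesign tool.
# The "-" identity means an ad-hoc signature: good enough to run your own
# code on your own machine, but not for distributing it to anyone else.
# "./my_local_tool" is a hypothetical path to a binary you built yourself.
import subprocess

subprocess.run(
    ["codesign", "--force", "--sign", "-", "./my_local_tool"],
    check=True,
)
```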

Apple has every right to move the Mac to this closed model, but I, as a consumer, have every right to reject the model and opt for freer, more flexible, and less limiting options.  The Mac was once the most powerful Personal Computer on the market, capable of running any OS and almost any code you could imagine, with a long life, high-end engineering, and upgrade-able components that justified the higher upfront costs.  With the M1, Apple has taken the final step in the gradual and intentional transformation of the Mac into just another Smart Device, that one oddball member of the iPhone family that happens to have a physical keyboard stuck to it.  It has ceased to be a PC; it’s now the macDevice.

Ironically, I believe this may ultimately lead to touchscreen MacBooks.  Once all sales on the Mac platform are forced through the lucrative App Store gates and third-party innovation on the Mac is effectively quashed, Apple may feel freer to allow the Mac to be more iPad-esque, just as they have allowed the iPad to get more Mac-ish.  With the macDevice as just another form-factor variant of the same basic product, the freedom to blur the lines between iPhone/iPad/macDevice without harming the revenues of other product lines may let the basic MacBook hardware design evolve for the first time in a decade.  Who knows?

I know one thing: I won’t be along for the ride.  The Apple strategy to achieve the macDevice without market rejection has been well-executed and blindingly obvious for many years.  Each step has followed logically from the one before it, and I jumped ship several years ago, one of the earlier Mac faithful who Apple failed to lead down this particular garden path.  I wish to own the things I own, maintain the right to repair and alter those things, and retain the freedom to use them as I see fit for as long as I wish.  If I purchase a product, I do not wish the manufacturer to then dictate the terms under which I can use it or maintain control over how long I can use it.  On more than one occasion, Apple has disabled and deleted software I depended on or enjoyed, while providing no rollback plan and no financial credit, with little or no notification.  For a decade they have removed jacks and ports and hardware options to suit themselves and their business model with every new generation of their products, providing less and less in the way of choice while increasing their grip on users who are too invested in the Apple ecosystem to ever be willing to spend the time and effort to escape.  An Apple consumer can do anything they like with an Apple product unless Apple doesn’t like it, and they can only move to another platform by repurchasing music, apps, devices, cables, videos, and peripherals.  The macDevice belongs in this iteration of Apple.  It is the embodiment of Tim Cook’s Apple.  But don’t call it a Personal Computer.  If it has to be jailbroken and side-loaded in order to run unapproved code, and then disposed of when it’s reached its pre-determined end of life, what you’ve got there isn’t a computer, that’s a Smart Device.
