I keep seeing advertisements online for lite/small/basic/dumb phones.  These usually promise to break the user away from the mind-numbing addiction to the doomscroll and allow them to once again see the world around them.  I am guessing that none of these products has much chance of succeeding in the marketplace because, at the end of the day, they are the electronic equivalent of a healthy diet and we all want pizza.

But I get the appeal.  I have gone to great lengths to simplify and cut back and escape the ultra-intrusive and soul-crushing miasma that is the modern internet, social media, news, hell, even the gas pumps have Maria Menounos talking your ear off when you just want to fill ’er up.  The world is loud.  Everybody everywhere wants a piece of everyone else, everybody wants to be viral and sticky, and every available niche is being filled with noise.  It’s awful.  No wonder we tell ourselves that a simpler phone will save the day.  It seems like such an easy solution, but that’s an illusion.  The phone isn’t the problem.  The phone is just a delivery device for the poison of our modern culture.  Sure, the smartphone is to psychological poison what the cigarette is to carcinogens, but the real problem is the fascination with, and addiction to, the gazillion small hits of dopamine we get from ingesting the latest stupid headline, the latest trivial status update, the latest tweet, the latest TikTok video, the latest, the latest, the endless content ocean.

I put it to you that the mindless consumption of endless hours of low-value content and ephemeral news (always mostly bad) has never, in the history of humanity, been a healthy activity.  It was a little harder to do, back in the day, I’ll give you that, but only just.  You know what the Fox News MAGA Boomers have in common with their Zoomer grand-kids?  The former keep a television on during all waking hours, feeding themselves an endless stream of targeted information chosen by an editorial staff in the service of advertisers, and the latter stare at a phone during all waking hours, feeding themselves an endless stream of targeted information chosen by an algorithm in the service of advertisers.  The Venn diagram is a circle.  Only the content differs.  The narrowcast, tailored, corporatized “social” web and app ecosystem is no more diverse, empowering, educational, or conducive to free thought than the old broadcast radio and television it has superseded.  At least there were three major networks broadcasting television to our parents’ generation (and, you know, PBS).  For us it’s one bubble, crafted by tracking cookies, collaborative filters, and virality, an echo chamber at the personal level that gives Fox News programming a run for its money in its extreme lack of variety.

Reality has been so curated for us, our ideas and desires and personal situations, our friendships and family connections have been productized, monetized, and exploited so heavily, that we find ourselves in an almost absurd predicament as a society.  We technically have more access to all of the information in the world than any population in mankind’s history and yet on a daily basis we have to make such a violent and intentional effort to encounter it that it might as well not be there.  We are the least informed consumers, the least enlightened populace, and the most radically misinformed bunch of sad sacks that the modern post-enlightenment world has ever seen.

Of course, this is all in the service of scratching the itch of boredom.  We work at our jobs all day and we crave something interesting and corporations are really really incredibly good at giving us diversions.  Allegedly we want to know what’s happening in the world, connect with our friends, laugh at something silly, but really, it’s just that we are bored and don’t know what to do with that novel feeling in a world so filled with stimulation.  In fact, I would go so far as to say that we don’t even have a chance to get legitimately bored.  We simply find ourselves lacking a diversion, which is not the same thing.  We have forgotten how to just exist to such a level that we equate being alive with boredom.  We get an idle minute and we have to decide to be unconscious (sleepy time!) or to seek out something diverting.  Diversion wins.  We happily step into the most convenient available trap.  The Phone.  The TV.  Potayto.  Potahto.  So, you see, this isn’t a new problem and a simpler phone isn’t much of a solution.  What we need to do is learn to do nothing and have it be enough.  Allow inaction to occur.  Don’t call it boredom.  And don’t seek a diversion.  Here are some exercises you can try.

Exercise: Turn off all electronics.  Put them in a totally separate room.  Make a meal.  Eat it and give it your full attention.  Don’t shovel it in your mouth while scrolling Twitter.  Taste it.

Exercise: Switch out some piece of media consumption that you currently use a device for with its “obsolete” equivalent.  For example, you like your e-reader?  Read a print book for a change.  You love Spotify?  Dig out those old tapes or records or CDs from the closet and play one.  Experience the difference between streaming “media” into your bubble and the tactile act of interacting with a physical piece of media.  Read a paper newspaper.

Exercise: Remember back to things you used to do to entertain yourself before you had a smartphone that you don’t do anymore.  Do that for a day.  See how it feels.

Exercise: Schedule times to be online for a week but otherwise be offline by default.  For virtually all of human history, up to as recently as 10 years ago, most people were not carrying a phone around with them 24/7 and could not be pinged, messaged, rung up, or tweeted at, and somehow, somehow, these brave ancestors survived.  Imagine a world in which your time was respected, in which nobody expected you to be waiting by the phone 24/7 and nobody panicked if you went a day or two between texts.  How much pressure would that take off your shoulders?  How much relief would you feel?

Exercise: Find a news outlet that is honest, reliable, and without partisan bent and (if you must consume current events) make that your first stop of the day.  Before you encounter memes, spin, or your own bubble, try to be aware of a neutral reporting of facts, sans opinions.  Then, for bonus points, form your own opinions.

Exercise: Track the trackers.  Add an extension to your browser that alerts you to how many organizations track your every move online, and block them.  Observe the changes in your online experience.  Opt for media interactions that don’t track you and, even more to the point, don’t monetize your activity.  Buy products, not access; copies, not subscriptions.  Companies don’t track you if you aren’t being monetized.  When is the last time you actually owned a copy of a new album rather than just streaming it?

Look, I get it, we aren’t ever getting rid of this technology.  You’re not going to live this way all the time.  These are exercises intended to make you think about the choices you’re making on a daily basis.  Practices to gain some perspective.  Things you can try doing to make yourself more aware of the ways you are being catered to, manipulated, handled, exploited, and sold.  We aren’t going back to the “good old days”.  There aren’t any.  We are, however, going to wind up in Idiocracy if enough of us don’t get out of the bubbles and into reality.  So, you know, stop reading this.  I’m not tracking you or monetizing your eyeballs but still, get offline.  Paint something.  Play that xylophone you got at the yard sale.  Read a physical book.  Sit quietly in a room and listen to your environment.  This whole online thing is a fiction and you know it.  Shoo.

Hold tight, this one’s gonna get nerdy.

Let’s take a little trip in the way-back machine to the dawn of the desktop computing era, that time period that we seem to be incapable of escaping: 1984.  It was in 1984 that Steve Jobs famously unveiled the first “modern” computer, the prototype forerunner of all we use today, the original Macintosh.  It wasn’t the first GUI, but it was the first commercially successful application of the idea that a computer has a mouse and a desktop and icons and a What You See Is What You Get user experience.

The original operating system for this original machine was primitive on every level.  It ran on a 400 kilobyte floppy disk and that included the ability to do text-to-speech.  A marvel of engineering and design, yes.  Influential like The Beatles? Yes.  But a strong foundation for future computer operating systems?  Hell no.

The original Macintosh System Software had ground-breaking interface design and clever engineering to do a lot with a little, but ten years after that original launch the world had changed a lot.  Now it was 1994, and in the decade following Hello, Microsoft had turned Windows into a Mac competitor on IBM PC compatibles while Apple had squandered its first-mover advantage and become something of an afterthought.

One lesser-known, but massively important, thing happened during the decade of Windows’ rise to prominence: Apple fired Steve Jobs, shortly after the Mac launched.  Jobs then started a new company called NeXT (he also bought Pixar from Lucasfilm and created that whole company, which is, like, the hugest footnote to a career ever, but I digress) and NeXT needed an operating system for their cool new computer, the Cube.  Jobs didn’t want a clever bit of under-engineering (à la the Mac System Software) for his Cube.  He wanted what the big boys had been using since the late 1960s: Unix.  So, he built an operating system on an open-source Unix variant called BSD Unix (the Berkeley Software Distribution).  The resulting operating system, NeXTStep, was not a commercial success, and neither was the company he founded.

Though not commercially successful in the way Jobs intended, two big things can be laid at the doorstep of NeXT and Jobs in this time period.  First: Sir Tim Berners-Lee single-handedly invented the World Wide Web on a NeXT computer.  So, by 1994, when we suddenly had the blossoming Web, when the internet started becoming a part of the culture, it had been born on NeXT.  If that had been the only contribution of NeXT to the world, it would have been enough, but another thing happened.  Apple, struggling to avoid bankruptcy, bought NeXT in 1996, which brought Steve Jobs back, soon as interim CEO, and NeXTStep was turned into….  tada!  Mac OS X.  Which then begat iOS and all the other flavors of Mac OS X.

If you own a Mac, if you own an iPhone, if you own an iPad, if you own an Apple Watch, or an Apple TV, or any Apple product made in the last 15 years except a clickwheel iPod, you have used the current iteration of the NeXT platform that Jobs launched after being fired by Apple back in 1985.  This also means that you have been using BSD Unix under the covers, whether you realize it or not.

Jump ahead another ten years, to 2004, and this was all obvious.  The WWW had taken over the world, Apple was back, Mac OS X had launched and was headed towards success, and Jobs was plotting the iPhone and iPad.  This is all history.  Back in 1994, however, the big news was actually coming from Microsoft and the impending launch of Windows 95.  So let’s talk about that for a minute.

Windows was second to market with a GUI, and not technically superior to the original Macintosh operating system, but unlike Apple, Microsoft was determined to do something about it.  The previous version of Windows (3.1) was a 16-bit shell that ran on top of DOS.  Windows 95 was going to be a full 32-bit operating system like OS/2 Warp (Google it, it was a thing at the time) and, what’s more, it was going to include a new Microsoft dial-up service called MSN that would compete with AOL and CompuServe (no, there was no internet access yet, they missed that one).  In 1994 we all knew it was coming, but we didn’t get it until August 1995.  I should know, I bought a copy on launch day.

Windows 95 did change the world.  The user interface conventions we take for granted on modern Windows computers all started on Win95.  It is to today’s Windows what the original Macintosh is to the modern Mac, from a user-interface convention perspective.  What it lacked, just like the Mac of the era, was stability.  And that’s where the real story begins, not with Windows 95, but with the REAL progenitor of the modern Windows computer, a totally different thing called Windows NT.

Now, you might be thinking, I have heard of Windows 95, and Windows 98, Windows XP, etc., but what is Windows NT?  I’ll tell ya.  Windows NT, initially released in mid-1993, was a version of Windows that was designed around a new operating system kernel, the “New Technology” kernel.  A kernel, BTW, is like the heart of an operating system.  It controls the reading and writing of data, the execution of programs, communication with devices, all that stuff.  It is not the part you see on screen, with the windows and icons and stuff.  All of that is just the graphics.  So, back to NT.  The first few releases were intended for servers, not desktops, where they wouldn’t be asked to run games or general productivity applications and would also be expected not to crash.  During the second half of the ’90s, Windows lived two lives.  There was the one that normal users had (95/98/ME) and there was NT (which most users never even heard of).
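Since we’re being nerdy anyway, here’s a concrete taste of what “talking to the kernel” actually looks like.  This is just a minimal C sketch (the file name is a made-up example): every call in it crosses the line from your program into the kernel, which does the real work of touching the disk.

```c
/* A minimal sketch of asking the kernel to do things for us. */
#include <fcntl.h>   /* open() */
#include <unistd.h>  /* write(), close() */

int main(void) {
    /* Ask the kernel to create and open a file for writing
       (the name is just an example). */
    int fd = open("hello.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0)
        return 1;

    /* Ask the kernel to push these bytes toward the disk. */
    write(fd, "hello, kernel\n", 14);

    /* Ask the kernel to close the file. */
    close(fd);
    return 0;
}
```

Everything with windows and icons sits layers above calls like these; the kernel is the thing that actually answers them.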

The first time most normal users got their hands on Windows NT, it was going by a new name: Windows XP.  The years leading up to Windows XP had allowed Microsoft to develop a strategy (“compatibility mode”) so that apps written for non-NT versions of Windows could run on NT, thereby allowing them to migrate to a more robust core for their operating system, the exact same thing Apple was trying to do with OS X.  In the case of Windows, the core was built around the NT kernel; in the case of the Mac, the core was built around the BSD Unix kernel; in both cases the goal was to get users off the crappy ’80s foundation and onto something reliable.  You can have religious wars about the NT kernel vs. BSD and the user-interface choices made by Apple and Microsoft, but in general, these two platforms making these major shifts created the operating environments for most of the devices we all use today, including laptops, smartphones, and tablets.

Now, I’ve intentionally left something out of this picture and it’s a doozy.  Way back in 1991, a student at the University of Helsinki, Linus Torvalds, was learning operating system design with a Unix clone called MINIX and was annoyed by its limitations.  So, he made his own kernel, he shared it on the fledgling internet, and the snowball was pushed from the top of the mountain.  His creation, eventually dubbed Linux (named for Linus himself), has steadily grown and improved and spread throughout the known computing universe.  By some estimates, over 90% of the servers on the internet run Linux.  In the world of servers and other computers that normal users don’t touch, Linux is the king.  You use it every day that you go online, and you probably don’t even know it.

And for Android users, this is even more true.  Do you have an Android phone, tablet, or watch?  Guess what….  NT kernel -> Windows.  BSD kernel -> macOS/iOS.  Linux kernel -> Android.
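Don’t take my word for it, you can ask the machine yourself.  Here’s a small sketch using the POSIX uname(2) call, which Linux, Android, and Apple’s BSD-derived systems all share; the same program compiled on each platform confesses a different heritage (and it won’t build against the native Windows API at all, because NT isn’t Unix):

```c
/* Print which kernel this program is actually standing on. */
#include <stdio.h>
#include <sys/utsname.h>

int main(void) {
    struct utsname u;
    if (uname(&u) != 0)
        return 1;
    /* Prints "Linux" on Linux and Android, "Darwin" (the BSD-derived
       core) on macOS and iOS. */
    printf("%s %s\n", u.sysname, u.release);
    return 0;
}
```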

So, NT was designed so Microsoft could compete with Unix in the server business, but instead it became XP and (eventually) Windows 10.  BSD Unix was used by Steve Jobs to make NeXT, which became Mac OS X.  And Linux, a clone of Unix, took over the server business instead of OS X or NT, and now it’s at the core of almost every mobile device not sold by Apple.

At the end of the day, Unix-style operating systems are OWNING, and even Microsoft has figured this out.  Microsoft came out as a huge proponent of Linux several years ago, principally spearheaded by the head of their Azure division.  If you don’t know what Azure is, that just means you aren’t a professional software developer.  It’s not a consumer product, it’s a server thing for people to run their apps on the internet, hosted by Microsoft, and it’s extremely Linux-friendly.  Likely, the folks at Microsoft realized they would have no choice but to support Linux if they wanted to have a cloud-server product, since almost the entire server side of the internet is based on Linux.  And they were right.  The guy at Microsoft who ran the Azure division was a fella named Satya Nadella, and if that name rings a bell, it’s because he is now the CEO of Microsoft, having replaced Steve Ballmer, who replaced Bill Gates.

OK, so, the guy who brought Linux into Microsoft is now running Microsoft, so what?  Where are you going with this, Sutter?  Well, remember how NT was a server thing and then became the new kernel for desktop Windows a few years later with XP?  Well, there is increasing reason to believe that NT might be heading towards being replaced with, you guessed it, Linux.

A couple of years ago, Microsoft introduced a new Windows feature called Windows Subsystem for Linux, or WSL.  WSL allowed a user to run a Linux environment within their Windows environment instead of dual-booting.  I tried it out and quite honestly I couldn’t see a use for it.  If I wanted to run Linux, I could run a full Linux environment.  If I wanted to make Windows more Unix-like, there were a number of ways to do that.  WSL seemed like a solution in search of a problem.  But then they came out with WSL 2 (aka: Electric Boogaloo) and things got more interesting.  To radically over-simplify: version one created an environment where the system calls of native Linux applications got translated to the Windows core.  In version 2, Windows basically runs the actual Linux kernel itself, inside a lightweight virtual machine of its own, no translation.  They have even announced support for Linux graphical applications (WSL was only a command-line thing before).
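If you want to poke at this yourself, here’s a hedged little sketch you can compile inside a WSL shell.  It leans on a commonly used heuristic rather than any official API: the kernel version string WSL reports in /proc/version happens to contain “Microsoft” (WSL 1) or “microsoft” (WSL 2).

```c
/* Check whether this Linux environment is being hosted by Windows.
   Relies on the informal convention that WSL kernels include
   "Microsoft" or "microsoft" in their version string. */
#include <stdio.h>
#include <string.h>

int main(void) {
    char buf[512];
    FILE *f = fopen("/proc/version", "r");
    if (!f)
        return 1;
    if (fgets(buf, sizeof buf, f)) {
        printf("%s", buf);
        if (strstr(buf, "icrosoft"))  /* matches either capitalization */
            printf("-> this Linux kernel is running under WSL\n");
    }
    fclose(f);
    return 0;
}
```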

This is starting to sound familiar…  Mac OS X anyone?  Or perhaps Android?  Mac OS X is a graphical shell designed by Apple that happens to run on top of the BSD Unix core.  Android is a graphical shell designed by Google that happens to run on top of the Linux core.  It isn’t an insane leap of logic to envision a world in which Windows becomes a graphical shell designed by Microsoft that happens to run on top of the Linux core rather than the existing NT core.  The current evolution of the Windows Subsystem for Linux, the entire Azure cloud offering, and the fact that Satya is the CEO all point, potentially, in this direction.

What this would mean is that we would have reached a point at which every major operating system is some variant of Unix.  Mac, Windows, Android, iOS, watchOS.  Microsoft already made the startling decision to give up on developing Internet Explorer and the original Edge and instead make a new version of “Edge” that is, at its core, Google Chrome (which itself is based on Blink, a fork of WebKit, the open-source HTML rendering engine that began life on Linux as KDE’s KHTML and was adopted by Apple for Safari, then by Google, and now by Microsoft).  They’ve learned that nobody cares about the engine under the hood; they care about the look and feel and the apps they can run.  A Linux-powered iteration of Windows might seem like a leap, but frankly, it’s not.  People made the leap from Windows 98 to Windows XP via emulation and compatibility layers.  The same transformation today could be done less painfully thanks to existing open-source compatibility layers like WINE (software that lets you run Windows apps on Linux without Windows).  The merger of Win and Lin seems almost inevitable.

And I’m not the only one saying it.  Open source pioneer Eric S. Raymond has recently posited the same idea (http://esr.ibiblio.org/?p=8764).  I, for one, hope this trend continues.  With the release of macOS Catalina, Apple has taken some previously unprecedented actions towards making macOS into the most closed, most proprietary, least free computing platform ever built, a platform in which, by default, no software that is not blessed by Apple will even be able to execute.  Their pending move to making their own processors will make them an even more radically closed platform, as their hardware too will be strictly proprietary.  As a proponent of open source, the right to repair, and the like, I can’t condone the purchase of such disposable and proprietary technology.  The Windows NT kernel has never been my favorite, I’ve always been a Unix guy at heart, so the idea that Windows might finally transform into Yet Another Unix Variant is one of the better possibilities I’ve run into lately.  Bring it on, Microsoft.  If this is how the Unix desktop finally conquers the market, as bizarre as the road traveled may have been, I’m ready for it.