Personal Computers Are No Longer Personal
Year Zero: 1975
In 1975, the “Personal Computer Revolution” was ushered in by Popular Electronics in a cover article on the MITS Altair 8800 - a “personal” microcomputer you could build and operate on your own.
You didn't have to be a government, corporation, or university to have a computer any longer.
Two hackers calling themselves “Micro-Soft” created a 4K BASIC language interpreter for the 8800, which gave people something to “do” with their new “personal” computer.
And that task was “command it.” The revolution was on.
In a sense, every hacker alive in 1980 who saw The Empire Strikes Back FELT like the Emperor, with their personal computer kneeling in front of them, Darth Vader-like, asking “What is thy bidding, my master?” - a moment TRON brilliantly lampshaded as Tron's commune with his user, Alan One.
When the Color Computer's OK prompt, or later the Commodore 64's READY prompt, appeared, these burgeoning hackers felt the same rush the finest artists feel when faced with a blank canvas, holding a loaded palette.
Rise of the Legions of Hackers
The early microcomputer platforms were not “open” in the sense that we think of open-source.
Intellectual property control was already fiercely contested territory by 1982; the industry had already seen Bill Gates' famous open letter to the Altair's software “pirates” and his epic struggle with Jack Tramiel of Commodore.
But these early platforms were transparent.
The “insides” were completely open to the owner of the computer, inviting the owner to learn, explore, or tinker with them. Most “getting to know your personal computer” books included with the machines themselves were written with the same theme and tone: “YOU have a powerful magic in front of you, which you can EASILY master …”
These books often included tantalizing hints that there was “more” just around the corner. The Commodore 64 books, for example, frequently started with BASIC programs; then, along the way, more and more magical recipes of “POKE” statements would crop up, doing things the ordinary BASIC statements couldn't. This made the reader/student WANT to learn more - all it took was a spark of curiosity to ask why THAT new POKE statement did what it did.
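The flavor of those recipes is easy to reconstruct. On the Commodore 64, for instance, a single POKE into the VIC-II video chip's well-documented color registers did something no ordinary BASIC statement offered:

```basic
10 REM CHANGE SCREEN COLORS DIRECTLY VIA THE VIC-II CHIP
20 POKE 53280,0 : REM BORDER COLOR REGISTER AT 53280 -> BLACK
30 POKE 53281,0 : REM BACKGROUND COLOR REGISTER AT 53281 -> BLACK
```

Two lines of magic, and the whole screen changed. The natural next question - “what ELSE lives at those addresses?” - is exactly the spark the books counted on.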
If you went looking, it was very easy to figure out because the DETAILS and the IMPLEMENTATION were known and you were ENCOURAGED to find them out.
I am stunned, again and again, at the level of technical detail that was readily available to the end user of these early machines.
Case in point: the SAMS Commodore 64 Programmer's Reference Guide (PRG). Every detail of the machine, hardware and software, in one reference book. Included were an electronic schematic, breakouts of the pin assignments of all of the major chips, and every table of constants needed to understand the video, sound, and text editing. Discussions of TWO programming languages, BASIC and Assembly, were clearly detailed, command by command.
While the SAMS C64 PRG was not shipped WITH the machine, it was easily available. Even if the book that came with the machine only covered setup operations and BASIC, most microcomputers of the era had similar levels of detail available to anyone who even casually looked. In many cases, other users of the machines would encourage new users to GET into that level of depth very quickly.
Machines sold with a BASIC interpreter encouraged casual users to casually code. More interested users could step beyond their BASICs and readily enter the world of Assembly, often by using BASIC programs to POKE values into memory and then transfer control away from the BASIC program into the machine language.
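A minimal sketch of that pattern, again on the Commodore 64: a BASIC loader READs machine-code bytes from DATA statements, POKEs them into the free RAM block at 49152 ($C000), and then SYS transfers control to them. The four bytes here are 6502 machine code for INC $D020 / RTS, which steps the border color and returns to BASIC:

```basic
10 REM POKE A TINY MACHINE LANGUAGE ROUTINE INTO MEMORY AT 49152
20 FOR I=0 TO 3 : READ B : POKE 49152+I,B : NEXT I
30 SYS 49152 : REM TRANSFER CONTROL TO THE MACHINE CODE
40 END
50 DATA 238,32,208,96 : REM INC $D020 / RTS
```

Type that in, RUN it, and you have left BASIC behind: the machine is executing code you assembled by hand. Countless “BASIC” programs of the era were really loaders exactly like this.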
These were not applications as we know them now. These were tools that solved problems, tricked the machine into doing more, AND deftly encouraged their users to become an expert on the system in front of them.
The Seismic Fault Line
As computer power scaled exponentially year after year, secret agreements and intellectual property wars created walled gardens, each conspiring further and further to isolate users from the machine.
First, the hardware details became obscured behind layers of “specifications” and “intellectual property” which discouraged most people from digging into the details.
Two prime examples are CPU Microcode Patching and BIOS.
Initially, the CPU would just work, but as CPUs became more complex, Intel switched to internal microcode engines that let it change the CPU's internals using a specialized patch language. Some CPUs could not even start working until patched with aftermarket code. But the only people who got that code, and the details of how to access the CPU to use it, were the few BIOS makers who signed the agreements to keep those magic recipes secret.
As more complex hardware was added to motherboards, video boards, and drive controllers, each hardware manufacturer created its own walled garden of secrets around how to set up and use its piece of hardware, which ultimately locked users out of hardware exploration entirely.
Each new generation of computer layered additional, increasingly complex “standards” on top of the “backward compatible” previous generations, resulting in an incomprehensible, towering monument of specifications which no single person can understand.
Example: Nowhere is this more apparent and glaring than in the specification for ACPI. Where the Commodore 64 Programmer's Reference Guide for the entire machine weighs in at 486 pages, the ACPI specification runs over 900 pages by itself, covering just one smallish aspect of the overall system: power management and device configuration.
Hardware information disappeared like early morning mist from the “technical” books about the platform, as fewer and fewer machines were built with openly-documented or transparently-designed hardware.
An open-source BIOS project like Coreboot was never necessary in the original personal computer era. Non-Disclosure Agreements (NDAs) were never necessary just to know how the device you had already purchased actually worked.
As this sea change slowly crept in and BIOS became the fault line where the seismic shift occurred, the encouragement to explore, learn, program, and push the limits of the computer was slowly strangled out of the formerly “personal” computer.
ROM is expensive and was, at the time, impossible to patch, so as much of it as possible was removed. PCs stopped shipping a BASIC interpreter in ROM. PCs began shipping with something called the Disk Operating System (DOS), which initially included a BASIC interpreter as a command file.
So, rather than BASIC being the default programmer-ready environment on the system, something called DOS became a requirement. At least, through the DOS rabbit hole, you could still get to Wonderland.
And now we see revealed the key that closed the lock of the User Jail Cell.
The ultimate “permission” and “control” over the computer was demoted from Operating System status to simply an Application. This left the illusion of control, but even that faded away as BASIC itself was dropped from future versions of DOS.
Next, Microsoft substituted something called QBASIC for the previous BASIC/BASICA/GW-BASIC era, rendering MOST previously written BASIC software, and the mountains of books about BASIC, useless - a final step in weaning “users” away from their addiction to creating their own software, rather than paying Microsoft to write it for them and sell it to them.
With the Wild West shoot-out days of BASIC behind them, Microsoft was finally free of the legacy they accidentally created in 1975. They were free of the meddling programmers, and able to replace the Operating System in fundamental ways.
Now, if you wanted to program, you had to buy a costly “development environment” fraught with many intellectual property issues around which parts of the Application Programming Interface (API) you were allowed to know, license, or use.
The final inversion was to isolate users away from the low-level controls and internals by putting a wall of graphics in front of them. Take away the keyboard and commands and replace it with yet-another-PARC User Interface. And in a final act of perversion, Microsoft actually inverted the natural order and turned their user interface itself into the control software.
BASIC was again morphed away from a beginner language that easily led to direct mastery of hardware, into a Visual Basic tool that let you conveniently arrange user interfaces for limited Applications.
The programmer-friendly operating system with hardware control was gone - replaced with increasingly complex layer after layer of “drivers” running under the control of a “user interface” which you could tinker with to make nice forms and visual push buttons that did very little.
So, in one generation, we went from creating the conditions for a rapid increase in the creation of future “computer scientists” to bursting that same bubble in the name of “commercialization” and “progress”.
The boom cycle became a bust as the “operating system” faded into obscurity, replaced without fanfare by an “application platform”, mistakenly (we hope) labeled as an Operating System.
The Operating System used to mean “the thing that let the User Operate the System” - every part of it - in any way the user cared to.
With these layers of complex isolation, the Operating System became “the thing that lets Corporate Programmer Teams Operate the User's System.” Microsoft's increasing “simplification” and “isolation” of the user from knowing anything about how their increasingly complex system works on any level caused a dumbing-down of the computer user population which I have called “the dreaded Microsoft effect”.
And nowhere is this more apparent than in the rise of Apple's iOS - wherein Apple beautifully exploited the “dumbing down” trend line created by Microsoft in such a masterful fashion that P.T. Barnum himself would have blushed with embarrassment and given up the game.
Apple demonstrates what is worst about the jailed design. If you want help, you must supplicate yourself at the insufferably named “Genius Bar”. Well, if you have to go see the Genius to get your PC fixed, what does that make you? Less than a Genius? You can almost HEAR the bellowing: “I am the Great and Mighty Jobs! Ignore the reality behind the curtain!”
Apple definitely is the Emerald City - beyond being shiny and made of pure green, the first thing you get is a curmudgeon at the door telling you to get lost if you want to “get inside”. A very sad state of affairs for a company founded by nailing a 6502 to a piece of plywood and tossing together an Integer BASIC.
Each smartphone or tablet sold today is a tiny, amazingly powerful device even by last year's standards, much less by the standards of Year Zero. And yet you are actively discouraged from doing ANYTHING to or with it that is against the Expressed Wishes of the Builders.
The early personal computers did not need “jail breaking” because they were not jails to begin with.
The era of the mainstream hacker was over, replaced with a legion of iThing-fondling hipsters who ultimately had zero clue what was going on behind the curtain, but were happy to spend dollar after dollar to buy “apps” - so many of which are such trivial nonsense you wonder why they took any programming at all. Slap an icon on an existing API library call to read content from an existing website - voila! Your magic “app” is created.
Well, no. You pay for the “privilege” of having someone else, who climbed the wall into the secret garden, write that simple thing for you and sell it to you, again and again.
Now, over a generation later, government, corporation, and university all lament the scarcity of “computer scientists.”
So now we know the answer to the question: “where have the computer scientists of the future gone?”
Why, they've gone to the Apple Store. And all the cool kids know the “PC” is something your dumb old dad rambles about when you're forced to look up from your “smart” phone.
Where have the rabid hackers and dreamers gone?
Back to personal computers, that's where.
Linux. Android. Arduino. Raspberry Pi. Each name a magical incantation of imagination and possibility that the hypercomplex-specification driven, DRM-crippled “App Platform” can no longer conjure.
A constellation of other systems have appeared to let people take on, build, program, operate, and relate to computers personally again.
These people relegate their once “personal” computers to the bench, using them only to run a handful of apps like web browsing, messaging, or… maybe email - if they haven't simply delegated those mundane tasks to their opaque smartphone bricks.
Their focus and passion is elsewhere now - the PC, and the smart phone have fallen into second-class status due to the piled-high layers which distance the person from personal control of the device.
Parents, do your kids a real favor. Take away their walled-garden console and smartphone systems, and give them an Arduino kit, a Raspberry Pi, or an Android device with an SDK, to encourage them to tinker with their futures again, rather than passively accepting whatever Corporation 2015 mandates they must use.