Sunday, May 2, 2021

Things are more like they are now than they have ever been

I hadn't noticed until I looked at the list, but it looks like this is post 100 for this blog.  As with the other blog, I didn't start out with a goal of writing any particular number of posts, or even on any particular schedule.  I can clearly remember browsing through a couple dozen posts early on and feeling like a hundred would be a lot.  Maybe I'd get there some day or maybe I wouldn't.  In any case, it's a nice round number, in base 10 anyway, so I thought I'd take that as an excuse to go off in a different direction from some of the recent themes like math, cognition and language.


The other day, a colleague pointed me at Josh Bloch's A Brief, Opinionated History of the API (disclaimer: Josh Bloch worked at Google for several years, and while he was no longer at Google when he made the video, it does support Google's position in the Google v. Oracle suit).  What jumped out at me, probably because Bloch spends a good portion of the talk on it, was just how much of modern computing the developers of EDSAC, generally considered "the second electronic digital stored-program computer to go into regular service", anticipated back in 1949.

Bloch argues that its subroutine library -- literally a file cabinet full of punched paper tapes containing instructions for performing various common tasks -- could be considered the first API (Application Programming Interface), but the team involved also developed several other building blocks of computing, including a form of mnemonic assembler (a notation for machine instructions designed for people to read and write without having to deal with raw numbers) and a boot loader (a small program whose purpose is to load larger programs into the computer's memory).  For many years, their book on the subject, The Preparation of Programs for an Electronic Digital Computer, was required reading for anyone working with computers.

This isn't the first "Wow, they really thought of everything" moment I've had in my field of computing.  Another favorite is Ivan Sutherland's Sketchpad (which I really thought I'd already blogged about, but apparently not), generally considered the first fully-developed example of a graphical user interface.  It also laid foundations for object-oriented programming and offers an early example of constraint-solving as a way of interacting with computers.  Sutherland wrote it in 1963 as part of his PhD work.

These two pioneering achievements lie on either side of the 1950s, a decade Americans often regard as a period of rigid conformity and Cold War paranoia in the aftermath of World War II (as always, I can't speak for the rest of the world, and even when it comes to my own part, my perspective is limited). Nonetheless, it was also a decade of great innovation, both technical and cultural.  The Lincoln TX-2 computer that Sketchpad ran on, completed in 1958, had over 200 times the memory EDSAC had in 1949 (it must also have run considerably faster, but I haven't found the precise numbers).  This development happened in the midst of a major burst of progress throughout computing.  To pick a few milestones:

  • In 1950, Alan Turing wrote the paper that described the Imitation Game, now generally referred to as the Turing test.
  • In 1951, Remington Rand released the UNIVAC I, the first general-purpose production computer in the US.  The transition from one-offs to full production is a key development in any technology.
  • In 1951, Bell Labs built the first working junction transistor, the practical successor to the point-contact transistor demonstrated there in 1947.
  • In 1952, Grace Hopper published her first paper on compilers. The terminology of the time is confusing, but she was specifically talking about translating human-readable notation, at a higher level than just mnemonics for machine instructions, into machine code, exactly what the compilers I use on a daily basis do.  Her first compiler implementation was also in 1952.
  • In 1953, the University of Manchester prototyped its Transistor Computer, the world's first transistorized computer, beginning a line of development that includes all commercial computers running today (as of this writing ... I'm counting current quantum computers as experimental).
  • In 1956, IBM prototyped the first hard drive, a technology still in use (though it's on the way out now that SSDs are widely available).
  • In 1957, the first FORTRAN compiler appeared.  In college, we loved to trash FORTRAN (in fact "FORTRASH" was the preferred name), but FORTRAN played a huge role in the development of scientific computing, and is still in use to this day.
  • In 1957, the first COMIT compiler appeared, developed by Victor Yngve et al.  While the language itself is quite obscure, it begins a line of development in natural-language processing, one branch of which eventually led to everyone's favorite write-only language, Perl.
  • In 1958, John McCarthy developed the first LISP implementation.  LISP is based on Alonzo Church's lambda calculus, a computing model equivalent in power to the Turing/Von Neumann model that CPU designs are based on, but much more amenable to mathematical reasoning.  LISP was the workhorse of much early research in AI, and its fundamental constructs, particularly lists, trees and closures, are still in wide use today (Java officially introduced lambda expressions in 2014; there's a small sketch of a closure just after this list).  Its explicit treatment of programs as data is foundational to computer language research.  Its automatic memory management, colloquially known as garbage collection, came along a bit later, but is a key feature of several currently popular languages (and explicitly not a key feature of some others). For my money, LISP is one of the two most influential programming languages, ever.
  • Also in 1958, the ZMMD group gave the name ALGOL to the programming language they were working on.  The 1958 version included "block statements", which supported what at the time was known as structured programming and is now so ubiquitous no one even notices there's anything special about it.  The shift from "do this action, now do this calculation and go to this step in the instructions if the result is zero (or negative, etc.)" to "do these things as long as this condition is true" was a major step in moving from a notation for what the computer was doing to a notation specifically designed for humans to work with algorithms (there's a sketch just after this list contrasting the two styles).  Two years later, ALGOL 60 codified several more significant developments from the late 50s, resulting in a language famously described as "an improvement on its predecessors and many of its successors".  Most if not all widely-used languages -- Java, C/C++/C#, Python, JavaScript/ECMAScript, Ruby and so on -- can trace their control structures and various other features directly back to ALGOL, making it, for my money, the other of the two most influential programming languages, ever.
  • In 1959, the CODASYL committee published the specification for COBOL, based on Hopper's work on FLOW-MATIC from 1955 to 1959.  As with FORTRAN, COBOL is now the target for widespread derision, and its PICTURE clauses turned out to be a major issue in the notorious Y2K panic.  Nonetheless, it has been hugely influential in business and government computing, and until not too long ago more lines of code were written in COBOL than in anything else (partly because COBOL infamously requires more lines of code than most languages to do the same thing).
  • In 1959, Tony Hoare wrote Quicksort, still one of the fastest ways to sort a list of items, the subject of much deep analysis and arguably one of the most widely-implemented and influential algorithms ever written.
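
Since "closure" can sound like jargon if you haven't run into it, here is a minimal sketch of the idea from the LISP item, in Python rather than LISP (the idea carries over directly; the function names are mine, made up purely for illustration):

    # A closure: a function that carries its enclosing environment with it.
    def make_counter(start=0):
        count = start
        def next_value():
            nonlocal count      # next_value "closes over" count
            count += 1
            return count
        return next_value

    counter = make_counter()
    print(counter(), counter(), counter())   # prints: 1 2 3

Each call to make_counter hands back a fresh function with its own private count; that bundling of code plus environment is the construct LISP pioneered, and it's the same flavor of thing Java's lambda expressions finally made convenient in 2014.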
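
To make the ALGOL point a bit more concrete, here is a rough sketch, again in Python, of the same tiny computation written both ways.  The step-numbered version simulates the older jump-based style with an explicit step counter (Python deliberately has no goto); the example itself, Euclid's algorithm, is just mine for illustration:

    def gcd_step_by_step(a, b):
        # The older style: numbered steps and explicit jumps,
        # simulated here with a step counter.
        step = 1
        while True:
            if step == 1:        # step 1: if b is zero, go to step 4
                step = 4 if b == 0 else 2
            elif step == 2:      # step 2: replace (a, b) with (b, a mod b)
                a, b = b, a % b
                step = 3
            elif step == 3:      # step 3: go back to step 1
                step = 1
            elif step == 4:      # step 4: the answer is a
                return a

    def gcd_structured(a, b):
        # The structured style: "do this as long as this condition is true".
        while b != 0:
            a, b = b, a % b
        return a

    assert gcd_step_by_step(48, 36) == gcd_structured(48, 36) == 12

Both functions compute the same thing, but the second reads as a description of the algorithm rather than as a trace of what the machine is doing.
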
This is just scratching the surface of developments in computing, and I've left off one of the great and needless tragedies of the field, Alan Turing's suicide in 1954.  On a different note, in 1958, the National Advisory Committee on Aeronautics became the National Aeronautics and Space Administration and disbanded its pool of computers, that is, people who performed computations, and Katherine Johnson began her career in aerospace technology in earnest.

It wasn't just a productive decade in computing.  Originally, I tried to list some of the major developments elsewhere in the sciences, and in art and culture in general in 1950s America, but I eventually realized that there was no way to do it without sounding like one of those news-TV specials and also leaving out significant people and accomplishments through sheer ignorance.  Even in the list above, in a field I know something about, I'm sure I've left out a lot, and someone else might come up with a completely different list of significant developments.


As I was thinking through this, though, I realized that I could write much the same post about any of a number of times and places.  The 1940s and 1960s were hardly quiet.  The 1930s saw huge economic upheaval in much of the world.  The Victorian era, likewise often portrayed as a period of stifling conformity, not to mention one of the starkest examples of rampant imperialism, was also a time of great technical innovation and cultural change.  The idea of the Dark Ages, when supposedly nothing of note happened between the fall of Rome and the Renaissance, has largely been debunked, and so on and on.

All of the above is heavily tilted toward "Western" history, not because the West has any monopoly on innovation, but simply because I'm slightly less ignorant of it.  My default assumption now is that there has pretty much always been major innovation affecting large portions of the world's population, often in several places at once, and the main difference is how aware of it we are.


While Bloch's lecture was the jumping-off point for this post, it didn't take long for me to realize that the real driver was one of the recurring themes from the other blog: not-so-disruptive technology.  That in turn comes from my nearly instinctive tendency to push back against "it's all different now" narratives, and particularly against the sort of breathless hype that, for better or worse, the Valley has excelled in for generations.

It may seem odd for someone to be both a technologist by trade and a skeptical pourer-of-cold-water by nature, but in my experience it's actually not that rare.  I know geeks who are eager early adopters of new shiny things, but I think there are at least as many who make a point of never getting version 1.0 of anything.  I may or may not be more behind-the-times than most, but the principle is widely understood: Version 1.0 is almost always buggier and generally harder to use than what will come along once the team has had a chance to catch its breath and respond to feedback from early adopters.  Don't get me wrong: if there weren't early adopters, hardly anything would get released at all.  It's just not in my nature to be one.

There are good reasons to put out a version 1.0 that doesn't do everything you'd want it to and doesn't run as reliably as you'd like.  The whole "launch and iterate" philosophy is based on the idea that you're not actually that good at predicting what people will like or dislike, so you shouldn't waste a lot of time building something based on your speculation.  Just get the basic idea out and be ready to run with whatever aspect of it people respond to.

Equally, a startup, or a new team within an established company, will typically only command a certain amount of resources (and giving a new team or company carte blanche often doesn't end well).  At some point you have to get more resources in, either from paying customers or from whoever you can convince that yes, this is really a thing.  Having a shippable if imperfect product makes that case much better than having a bunch of nice-looking presentations and well-polished sales pitches.  Especially when dealing with paying customers.

But there's probably another reason to put things out in a hurry.  Everyone knows that tech, and software in particular, moves fast (whether or not it also breaks stuff).  In other words, there's a built-in cultural bias toward doing things quickly whether it makes sense or not, and then loudly proclaiming how fast you're moving and, therefore, how innovative and influential you must be.  I think this is the part I tend to react to.  It's easy to confuse activity with progress, and after seeing the same avoidable mistakes made a few times in the name of velocity, the eye can become somewhat jaundiced.

As much as I may tend toward that sort of reaction, I don't particularly enjoy it.  A different angle is to go back and study, appreciate, even celebrate, the accomplishments of people who came before.  The developments I mentioned above are all significant advances.  They didn't appear fully-formed out of a vacuum.  Each of them builds on previous developments, many just as significant but not as widely known.

Looking back and focusing on achievements, one doesn't see the many false starts and oversold inventions that went nowhere, just the good bits, the same way that we remember and cherish great music from previous eras and leave aside the much larger volume of the unremarkable or outright bad.

Years from now, people will most likely look back on the present era in much the same way and pick out the developments that really mattered, leaving aside much of the commotion surrounding them.  It's not that the breathless hype is all wrong, much less that everything important has already happened, just that from the middle of it all it's harder to pick out what's what.  Not that there's a lack of opinions on the matter.



The quote in the title has been attributed to several people, but no one seems to know who really said it first.
