A Brief History of “Everyone is a Programmer Now”

Nvidia CEO Jensen Huang’s speeches about the future of AI this week in Taiwan got a lot of buzz, and justifiably so: there were definitely a lot of gems in there. (Go read up on the whole thing.) The one I’d like to focus on today is the notion that “everyone is a programmer now”.

Where have we heard that before? And is it more or less true than it ever has been thanks to AI? Let’s take a quick historical tour.

1970’s: BASIC

The BASIC programming language was arguably the first attempt to bring programming to the masses. And its straightforward syntax and reasonably friendly environment allowed it to enjoy a modest bit of success in this respect. In fact, my own introduction to programming came through IBM (actually Microsoft) Advanced BASIC.

As it turned out, the complexity of the applications people needed to build quickly outgrew that simple syntax, and BASIC evolved to keep pace, most notably as Microsoft QuickBasic and later Visual Basic. And if you look at what Visual Basic in particular eventually became, it’s pretty clear that the capacity to “make everyone a programmer” was lost along the way.

1990’s: CASE Tools

The 1990’s run at making everyone a programmer came in the form of Computer-Aided Software Engineering tools, or CASE. There was also an alternative marketing formulation, Rapid Application Development (RAD). These tools usually presented as visual form builders on the Windows desktop where users could “draw” the user interface and connect form fields to some data sources.

And again, these worked ok…if all your application required was a forms-over-data metaphor. What really changed the nature of forms-over-data applications — thereby largely ending the rise of CASE — was the ascent of the Internet. The CASE tool approach worked considerably less well when applications were distributed, and also when (at the time anyway) the client was less rich.

2010’s: Low-Code/No-Code

Fast-forward a couple of decades and not only have web interfaces gotten considerably more powerful, but lots of higher-level functionality has been packaged into “services” and made available for integration using APIs. These trends led to another attempt at making everyone a programmer, this time as low-code/no-code tools.

Low-code/no-code tools take a lot of different forms, but perhaps the most interesting ones involve declaratively wiring together services as building blocks and putting either CASE-style forms or an analytical query UI on top.
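To make the “wiring services together as building blocks” idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical and vendor-neutral — the block names, the step schema, and the runner are invented for illustration — but it captures the core mechanic: a low-code tool stores a declarative pipeline like this (usually built in a visual editor) and simply interprets it.

```python
# Hypothetical declarative pipeline: each step names a prebuilt "service
# block" and maps its inputs to fields produced by earlier steps.
pipeline = [
    {"id": "form",   "block": "collect_form", "inputs": {}},
    {"id": "lookup", "block": "crm_lookup",   "inputs": {"email": "form.email"}},
    {"id": "notify", "block": "send_email",   "inputs": {"to": "lookup.owner"}},
]

# Toy stand-ins for the packaged services the pipeline refers to.
BLOCKS = {
    "collect_form": lambda: {"email": "ada@example.com"},
    "crm_lookup":   lambda email: {"owner": "sales@example.com"},
    "send_email":   lambda to: {"status": f"sent to {to}"},
}

def run(pipeline):
    """Interpret the pipeline: resolve each "step.field" input reference
    against earlier results, then invoke the named block."""
    results = {}
    for step in pipeline:
        kwargs = {}
        for name, ref in step["inputs"].items():
            src, field = ref.split(".")
            kwargs[name] = results[src][field]
        results[step["id"]] = BLOCKS[step["block"]](**kwargs)
    return results

print(run(pipeline)["notify"]["status"])  # → sent to sales@example.com
```

The user-programmer only ever edits the `pipeline` data (via forms or drag-and-drop); the 20% problem discussed below shows up the moment they need a block that isn’t in `BLOCKS`.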

This idea still has a lot of potential today. But it tends to run into a version of the 80/20 problem familiar to software engineers: The low-code/no-code tool can allow a typical user-programmer to build 80% of certain kinds of business applications, but it hits the wall when it comes to the last 20%. Furthermore, that last 20% tends to be exactly the functionality that makes your business or idea unique, things that the tool vendor never could have anticipated. At that point, the user-programmer needs to call the IT department to get a (wait for it) programmer to push their solution over the finish line.

Honorable Mention: Spreadsheets

It’s an interesting piece of academic trivia that anything that can be “computed” can theoretically be computed with Excel. By that standard, everybody who uses Excel is a programmer, and Excel is the most widely used programming language in the world.

In spite of that, there are obvious reasons why nobody will be building general-purpose applications in Excel, which brings us to…

2020’s: AI

With the rise of ChatGPT and other LLMs trained on large bodies of programming knowledge, the assertion made by Jensen Huang and others is that now everyone can program by describing their requirements in natural language, iteratively revising, and a working application comes out the other end. It’s a good story, but…

The current generation of AI coding suffers from at least two significant issues:

  1. They don’t generate end-to-end code that “just” compiles and runs out of the gate for most non-trivial applications, and therefore require a professional programmer to debug and tweak.

  2. They work best when prompted with terminology that is most familiar to traditional software developers (and moreover, the more highly-skilled the software developer, the better the results).

I’m willing to assume that both of these issues will eventually be resolved. Where would that leave us?

I would suggest that it leaves us pretty much where we were with low-code/no-code tools: 80% of the application gets cranked out by users talking to the AI, but the last 20% still requires some traditional software engineering to hash out.

(Aside: The best news here is that I believe both the 80% and the 20% are going to get done a whole lot faster with AI assistance, but that’s the subject of another blog post.)

What is a “Programmer” Anyway?

Let’s end on a slightly philosophical note. Thinking back to the Excel example, if the definition of a “programmer” is somebody who gives instructions to a computer to get it to complete a task, then maybe everybody has been a “programmer” for quite some time now. If that’s true, then maybe a more useful distinction is between two different kinds of “programmer” — those whose primary responsibility lies in the business, and those who are mostly focused on computer technology.
