Programming: How It Got Here

      To talk about the history of the computer, we could go back…really far. Like, the invention of the abacus far. But we won't, because…we won't. Instead, let's talk about Charles Babbage.

      Babbage (his friends called him Babby, probably) was absolutely brilliant, but had a little trouble expressing his ideas to others. When he designed the first general-purpose mechanical computing device (called the Analytical Engine), Lady Augusta Ada King, popularly known as Lady Ada Lovelace, translated a paper explaining it, written by the Italian engineer Luigi Menabrea, and ended up adding annotations that were longer than the paper itself. In particular, she included a detailed note on how you could compute Bernoulli numbers using Babbage's Analytical Engine. With that note, she basically invented the field of programming.

      Some might argue that Lovelace's algorithm wasn't the first program, but everyone can agree that she did something phenomenal. She realized that the Analytical Engine, the predecessor to modern computers, could follow a set of instructions, take some inputs, and produce an output. We wouldn't be exaggerating if we said that Lovelace started the whole programming game.

      That was back in the 1840s, though, so let's jump ahead a few years (okay, about a century), shall we?

      The year was 1945. Digital computers were just becoming popular (and taking over warehouses everywhere), but they were still pretty basic. For every new program, a computer had to be manually rewired. John von Neumann thought we could do better. He championed a way to store a program's instructions in the computer's memory, right alongside its data, so you could change what the machine did without touching a single wire.

      That wasn't enough for him. He also pushed the idea of splitting code into sub-blocks (subroutines) that could be called from anywhere, instead of just marching through the lines in order. Now that we could actually call code instead of rewriting it, programs became way more reusable, and the subroutine became a fundamental piece of programming.
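
      Just to make that concrete, here's a tiny sketch of the idea in C (a language we'll get to in a minute, so yes, we're cheating a little chronologically). The subroutine name, print_greeting, is made up purely for illustration: one reusable sub-block, called from two different places instead of being written out twice.

          #include <stdio.h>

          /* One reusable sub-block of code (a subroutine). */
          void print_greeting(const char *name) {
              printf("Hello, %s!\n", name);
          }

          int main(void) {
              /* Called from two places instead of being rewritten in both. */
              print_greeting("Babby");
              print_greeting("Lovelace");
              return 0;
          }

      Change print_greeting once and every caller picks up the change, which is pretty much the whole point.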

      But this was back when programmers were feeding punch cards into computers. If they made a mistake, they had to punch a whole new card. And punch cards hung around in computer programming all the way up until the eighties.

      Yeah. Thank goodness we don't do that anymore. Could you imagine how slow the internet would be?

      One thing led to another, and…Bada bing, bada boom, bada FORTRAN.

      As in FORTRAN, the language created by John Backus and his team at IBM, built around that nifty little device called the compiler, an idea pioneered by Navy Rear Admiral Grace Hopper. Before compilers, anything written in a high-level notation had to be translated into machine code by hand, which took for-stinkin'-ever. Thanks to the compiler, we had a new way of converting it once and using it over and over again.

      And that's the dividing line between historical and modern programming languages. As soon as we started using compilers, we pretty much never went back.
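
      To see what "convert it once and use it over and over" looks like in practice, here's a minimal sketch: a tiny C program (the file name hello.c is just for illustration), with the typical Unix commands for compiling and running it tucked into the comments.

          /* hello.c: the high-level version we actually write. */
          #include <stdio.h>

          int main(void) {
              printf("Hello from the compiled world!\n");
              return 0;
          }

          /* Compile it once:            cc hello.c -o hello   */
          /* Then run it again and again:  ./hello             */

      The translation happens once, up front; running ./hello a thousand times doesn't cost you a single extra translation.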

      A while after FORTRAN, the first big one still in widespread use today, C, was invented, popularizing all the syntax-y keywords (for, if-else, while, switch…), operators, variables, constants, pointers, subroutines, data types, and manual memory management we know and love today. We could probably go on for another day and a half talking about C. Needless to say, C became a major trendsetter in the coding world.
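
      Here's a quick, made-up C snippet just to wave at a few of those pieces in one place: a constant, a variable, a for loop, a while loop, an if, a pointer, and a bit of manual memory management.

          #include <stdio.h>
          #include <stdlib.h>

          int main(void) {
              const int limit = 5;                             /* a constant */
              int *squares = malloc(limit * sizeof *squares);  /* manual memory management */
              if (squares == NULL) {                           /* an if, handling failure */
                  return 1;
              }

              for (int i = 0; i < limit; i++) {                /* a for loop filling the array */
                  squares[i] = i * i;
              }

              int *p = squares;                                /* a pointer walking the array */
              while (p < squares + limit) {                    /* a while loop */
                  printf("%d ", *p);
                  p++;
              }
              printf("\n");

              free(squares);                                   /* handing the memory back */
              return 0;
          }

      That free at the end is the "memory management" part; C trusts you to clean up after yourself.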

      Now we have a bunch of different types of programming languages, but the main types are object-oriented, functional, and imperative. Following C (as well as its children C++ and C#), languages like Java, Ruby, and Python were created. Java in particular is really popular because of its "write once, run anywhere" promise: the compiler turns your code into bytecode, and any computer with a Java Virtual Machine can run it.

      So there you have it. We've gone from a computer-less world to one…filled with computers. All thanks to Babby and Lovelace.