Mygento
Jul 29, 2022

Have you ever seen photographs of the early computers and wondered how we went from there to the super-cool, high-speed device on which you are currently reading this article?

No? You should, because it’s really wonderful and important.

Learn how programmers throughout history have imagined new possibilities for programming and coding, and how they have used their (primarily mathematical) skills to make those ideas a reality; the results of their efforts are things we use every day. We'll spotlight a few notable names and programming-related topics:

Heron of Alexandria

Programming dates all the way back to Heron of Alexandria's time, around 60 A.D. This guy was a mathematician and engineer recognized for his incredible inventions and technologies. One gadget in particular may have inspired the creation of computer programming: a mechanical theatre with puppets on it, driven by a system of strings underneath the machine that could be pulled to make the puppets do certain things. By re-routing the strings, effectively reprogramming the machine, you could make the puppets perform different actions. That is one of the first recorded cases of changing the way something worked without having to disassemble and rebuild it.

Joseph Marie Jacquard

Then, in terms of computer programming, everything went quiet for a while. For the next 1,700 years or so, there was little progress. The next notable programmer, Joseph Marie Jacquard, appeared in France in the early 1800s and pushed programming to the next level by developing the "Jacquard loom." If you're unfamiliar with what a loom is, it's a large machine that weaves specific designs into carpets, rugs, and blankets. Essentially, the Frenchman used a chain of stiff punched cards (think early computer programming) that he would lace together and feed through the loom, and the loom would read the cards and weave the corresponding pattern. Consider Jacquard the loom's programmer.

Charles Babbage

Someone in London, UK, in the first half of the nineteenth century began to make ripples in the programming world. He was also a mathematician (surprise, surprise). Charles Babbage, considered by some to be the "father of the computer," designed the first general-purpose mechanical computer, known as the Analytical Engine, which others (who will be mentioned later) would use as inspiration for their more complicated, electrical designs and computers. But before that design, he had a vision of a machine that could execute calculations, which he called the Difference Engine. He abandoned it halfway through, after asking Parliament for a rather significant budget to build the thing, when he got a better idea: the Analytical Engine.

Analytical Engine

So he went back to Parliament to ask for more money for his new invention, to which Parliament responded, in effect, "No, finish what you started with the Difference Engine." In the end, Babbage finished neither machine: the Difference Engine was only partially built, and the Analytical Engine was never constructed at all, existing mainly as designs on paper. But just because it hadn't been built didn't mean you couldn't write "software" for it, which is exactly what one person who worked closely with Babbage did. Augusta Ada Byron, Countess of Lovelace, was her full name, but we'll call her Ada Lovelace.

Ada Lovelace

Ada Lovelace was a mathematician (oh yes), as well as the world's first computer programmer, who was fascinated by what Babbage had devised with his Analytical Engine. So much so that after the Italian engineer Luigi Menabrea published a paper about the Analytical Engine (written in French), she translated it into English and learned everything she could about the machine. So she wasn't only brilliant with languages: in her own notes on the translation she laid out what is widely regarded as the first computer program, an algorithm written for a machine that didn't technically exist.

Then, as before, programming advancement slowed until around the middle of the twentieth century, during World War II. This was the point at which contemporary computer programming began to take off. The Germans used Enigma machines to safeguard their communications during the war, and the British (and the Allies more broadly) had Alan Turing working to break them. You may remember the name if you saw the 2014 film The Imitation Game.

Alan Turing

Remember when Benedict Cumberbatch (yes, Doctor Strange) portrayed Alan Turing in the film "The Imitation Game"? If you haven't seen it yet, we highly recommend it: it's a fascinating story, albeit a tragic one considering the abuse such a great mind endured. The role Turing played in the creation of coding is crucial to this story.

During World War II, the British were determined to decipher German coded transmissions. They concentrated their efforts at Bletchley Park, an English country estate, and set to work figuring out the best way to crack the codes.

Having your city repeatedly bombed appears to be a huge motivator: the team there made rapid advances in automated codebreaking, culminating in a machine called "Colossus" (built by the engineer Tommy Flowers), arguably the world's first programmable, electronic, digital computer. Turing was a crucial member of the Bletchley Park team, and his work on the electromechanical Bombe was directly responsible for breaking many German Enigma messages, which some historians believe shortened the war by at least a couple of years.

Turing went on to design the ACE (Automatic Computing Engine), which distinguished itself from its predecessors as the first design to use "abbreviated computer instructions", in effect a programming language. Coded messages prompted code-breaking, which demanded speed and efficiency; those demands called for machines, which called for even more speed and efficiency, which called for an abbreviated language that could drive a program, which called for… code.

Modern computing was born.

John von Neumann

Coming full circle in terms of developmental demands and advancements, computer technology took off from there. No small part of this was due to John von Neumann, a mathematician, physicist, and general polymath. His work on the Manhattan Project prompted ideas that he carried forward into algorithm design, problem-solving with pseudorandom number generators, and a computer architecture that is still in use today: the stored-program design that grew out of his consulting work on the famed ENIAC and its successor, the EDVAC, and that shaped later machines such as the IBM 704.

We could write about von Neumann for the next 50,000 pages and still only scrape the surface of his brilliance. Suffice it to say that he was wicked-smart, extremely influential, and worth learning more about on a variety of levels, particularly on the subject of computational progress. Von Neumann's theories and their practical applications fueled considerable growth in computer programming, particularly in how a program operates inside a machine's architecture. The now-standard description of how both programs and data are stored in and fetched from memory is directly attributable to him, and it has allowed several lines of progress to be pursued since he set it out in 1945.

Grace Hopper

We can then use a six-degrees-of-separation approach to make some interesting links. Von Neumann also advised on the EDVAC project, which was led by J. Presper Eckert and John Mauchly. In 1949, the Eckert-Mauchly Computer Corporation hired Grace Hopper to work on the UNIVAC I project, and the compilers she developed there made her the person most responsible for the programming language known as COBOL (Common Business-Oriented Language).

In honor of Hopper, Code Platoon awards a Women in Technology Scholarship to a female veteran. For each cohort, the award covers the full $15,500 tuition for one female veteran.

Her belief was that programming should be written largely in English, because that would make it easier for most people to understand and work with. Despite having the idea rejected at Eckert-Mauchly for three years, she eventually won everyone over and laid the groundwork for what would become one of the most influential programming languages in software development.

What is perhaps most fascinating about Grace is that she accomplished all of this while serving in the Naval Reserve, which she joined during WWII (she wanted to serve on active duty but was too small by Navy standards) and from which she retired as a Rear Admiral, a position that allowed her to incorporate many of her ideas into Defense Department standards of practice. Her insistence on testing computer systems pushed programming languages such as COBOL and FORTRAN (the latter developed by John Backus's team at IBM) toward common standards, and the validation tests she championed were eventually taken over by the National Bureau of Standards, later renamed the National Institute of Standards and Technology (NIST).

Early Computers

Colossus, mentioned above, lived up to its name (it really was colossal). Around eleven units were built and assisted in code-breaking during the war, giving the Allies a significant advantage in key operations such as the D-Day landings in Normandy. Most of them were dismantled after the war. The project was genuinely top secret: no one talked about it, and those who hadn't worked with it didn't learn of its existence until the 1970s.

Aside from Colossus, the other contender for "first" computer was the Manchester "Baby" of 1948, the first machine to run a program stored in its own electronic memory. Its interface was little more than a panel of switches and buttons, with the results displayed as a pattern of lit dots on a small cathode-ray screen.

Coding in the Workplace

Many consider the 1980s to be a golden age of technical breakthroughs, and from the standpoint of coding history it probably was: 1983 saw the inception of C++, a language still in use today (think Adobe products, Google Chrome, and Microsoft Internet Explorer), while 1987 saw the launch of Perl, a language still used by IMDb, Amazon, and Ticketmaster, among others.

Tim Berners-Lee invented the World Wide Web in 1989, which has arguably had the greatest impact on our current working lives. As part of that invention, Berners-Lee created HTML, URLs, and HTTP, three acronyms familiar to anyone who has ever used the web (that would be around 4.2 billion of us).
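To make those three pieces concrete, here is a minimal sketch (in Python, purely for illustration; the URL is just an example) of how they fit together: the URL names a resource, HTTP is the protocol used to request it, and HTML is the markup that comes back for the browser to render.

```python
# Illustrative sketch only: fetching a web page with Python's standard library.
# The URL identifies the resource, urlopen speaks HTTP(S) to the server,
# and the response body is HTML markup that a browser would render.
from urllib.request import urlopen

url = "https://example.com/"  # a URL: scheme, host, and path
with urlopen(url) as response:
    print(response.status)  # HTTP status code, e.g. 200 for success
    html = response.read().decode("utf-8")

print(html[:120])  # the first characters of the HTML document
```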

Some of today's biggest and most recognizable names in coding emerged in the 1990s, including Python, Java, JavaScript, and PHP. We wouldn't have social networking, Android and iOS apps, or streaming services if these didn't exist.

Microsoft's .NET, a framework that supports multiple languages and underpins Office and other business software, has also had a significant impact on the modern workplace. The great majority of office workers are familiar with Office 365, and it is the .NET framework that allows those familiar apps to move beyond the desktop PC and into the cloud. This has had numerous positive consequences, including enabling the rise of remote work.

Coding skills are becoming increasingly common in industries other than computing and software: you no longer need to be a developer to learn basic code, and many of us are more versed in it than we realize. From formulas in Excel to building your own website or app, most of us have dabbled in code at some point, perhaps without realizing it.

Programming Languages

The 1950s were the years when programming languages really took off. Short Code, proposed by John Mauchly and implemented for Eckert-Mauchly's BINAC and UNIVAC machines, is arguably the first language that truly counts as a programming language. Instead of instructing the hardware directly in machine code (as discussed above), the programmer wrote a set of predefined variables and mathematical expressions. In the 1950s, programs called "compilers" (and interpreters) were written to translate those expressions, variables, and operators into machine code. Computer programming jobs have been in high demand ever since.

IBM then took the same idea further with Speedcoding, created by John Backus for the IBM 701. The goal was to make coding faster and more efficient (funny that, in the 1950s, this was considered the fastest, most efficient way of coding). The pitch was simple: programming time should be decreased. After all, isn't taking symbols and translating them into other symbols essentially what computers were built to do, and can't the computers automate that themselves? A-0 (Grace Hopper's compiler) and Autocode (Alick Glennie's, for the Manchester Mark 1), the first automatic compilers, were created in answer to exactly that question.
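To give a feel for what those early compilers automated, here is a tiny, purely illustrative sketch in Python (the instruction names are made up and do not correspond to any historical machine): it translates a simple arithmetic expression into a list of pseudo machine instructions, which is roughly the kind of symbol-to-symbol translation tools like A-0 began to automate.

```python
# A toy "compiler" sketch (illustrative only, not any historical system):
# it turns an arithmetic expression into pseudo stack-machine instructions.
import ast

OPS = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

def compile_expr(node):
    """Recursively emit stack-machine style instructions for an expression."""
    if isinstance(node, ast.Expression):
        return compile_expr(node.body)
    if isinstance(node, ast.Constant):  # a literal number
        return [f"PUSH {node.value}"]
    if isinstance(node, ast.Name):  # a named variable
        return [f"LOAD {node.id}"]
    if isinstance(node, ast.BinOp):  # left <operator> right
        return (compile_expr(node.left)
                + compile_expr(node.right)
                + [OPS[type(node.op)]])
    raise ValueError(f"unsupported expression: {ast.dump(node)}")

program = compile_expr(ast.parse("a * 2 + b", mode="eval"))
print("\n".join(program))
# Output:
#   LOAD a
#   PUSH 2
#   MUL
#   LOAD b
#   ADD
```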

The Future of Coding

Something to consider is the future of coding, particularly in the workplace. While many of us have a working knowledge of programming languages, a shortage of professionals is projected: some estimates suggest only around 400,000 computer science graduates to fill a stunning 1 million open jobs.

This is why school programs all across the world are incorporating coding into their curricula, and who knows how those future programmers might help revolutionize our workplace technology?

One thing is certain: while ever-changing technology is unavoidable, code will stay at the heart of every advancement.