A brief introduction to the history of programming.

In the early days of computers, hardware was expensive and programmers were cheap. In fact, these early programmers rarely even held the title "programmer"; the role was usually filled by mathematicians or electrical engineers. Early computers were built to solve complex mathematical problems quickly, so mathematicians were a natural fit for the work of "programming."

### What is a program?

First, a little background. A computer can't do anything by itself; it has to be programmed. You can think of a program as a very precise recipe that reads an input and produces the corresponding output. The steps in the recipe are made up of instructions that manipulate data. That may sound complicated, but you probably already know what this statement means:

```
1 + 2 = 3
```

Here the plus sign is the "instruction," and the numbers 1 and 2 are the data. In mathematics an equal sign means that both sides of an equation are equivalent, but in most programming languages an equal sign applied to a variable means "assignment." If a computer executed the statement above, it would store the result of the addition (the "3") somewhere in memory.
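To make the assignment idea concrete, here is a one-line sketch in Python (my choice of language for illustration; any language with assignment would do):

```python
# '=' here means assignment, not mathematical equality:
# the right-hand side (1 + 2) is evaluated first, and the
# result, 3, is stored in the memory the name x refers to.
x = 1 + 2
print(x)  # prints 3
```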

Computers know how to perform mathematical operations on numbers and how to move data around in memory. I won't expand on memory here; just know that it generally falls into two categories: "fast and small" and "slow and large." CPU registers are very fast to read and write but hold very little, like a shorthand note. Main memory typically holds a great deal more but is much slower to read and write than registers. As a program runs, the CPU keeps moving the data it needs from main memory into registers, then putting results back into main memory.

### The assembler

Computers were expensive and labor was cheap. Programmers spent a lot of time translating handwritten mathematical expressions into instructions the computer could execute. The earliest computers had terrible user interfaces; some even had toggle switches on the front panel. The switches represented the zeros and ones of a single memory cell. The programmer had to configure a memory cell, select a storage location, and commit the cell to memory. It was a time-consuming and error-prone process.

Programmers Betty Jean Jennings (left) and Fran Bilas (right) operating ENIAC's main control panel

Then an electrical engineer decided his time was valuable and wrote a program that could turn human-readable "recipe" input into a computer-readable version. This was the original "assembler," and it was quite controversial at the time. The owners of these expensive machines didn't want to waste computing resources on a task people could already do (albeit slowly and with errors). Over time, though, people found that using the assembler was faster and more accurate than writing machine code by hand, and the amount of "real work" the computer did went up.

Although the assembler was a big step up from flipping bits on the machine's front panel, it was still highly specialized. In assembly language, the addition example above would look something like this:

```
01 MOV R0, 1
02 MOV R1, 2
03 ADD R2, R0, R1
04 MOV 64, R0
05 STO R2, R0
```

Each line is one computer instruction: first a shorthand name for the instruction, then the data the instruction operates on. This little program first "moves" the value 1 into register R0, then 2 into register R1. Line 03 adds the values in registers R0 and R1 and stores the result in register R2. Finally, lines 04 and 05 decide where in main memory the result should go (address 64 in this case). Managing where data lives in memory is one of the most time-consuming and error-prone parts of programming.

### The compiler

The assembler was already much better than writing computer instructions by hand, but early programmers longed to write programs the way they were used to writing mathematical formulas. That desire drove the development of high-level compiled languages, some of which are now history while others remain in use today. ALGO, for example, is history, but languages like Fortran and C continue to solve real problems.

Pedigree trees of ALGO and Fortran programming languages

These “high-level” languages allow programmers to write programs in simpler ways. In C, our addition program looks like this:

```c
int x;
x = 1 + 2;
```

The first statement describes a block of memory the program will use. In this example, it reserves enough memory for an integer and names it x. The second statement is the addition, although it reads "backwards": a C programmer would say "x is assigned the result of 1 plus 2." Note that the programmer doesn't have to decide where in memory x will live; the compiler takes care of that.

This new program, called a "compiler," converts programs written in a high-level language into assembly language, and the assembler then converts the assembly language into a machine-readable program. This combination of programs is often called a "toolchain," because the output of one program becomes the input of the next.

Compiled languages had an advantage over assembly when moving a program from one computer to a different model or brand. In the early days of computing, companies including IBM, DEC, Texas Instruments, UNIVAC, and Hewlett-Packard built many different kinds of computer hardware. These computers had little in common beyond needing to be plugged into a power source. Their memory and CPU architectures differed so much that it often took years to translate one computer's programs for another.

With a high-level language, we just need to migrate the compiler toolchain to the new platform. As long as a compiler is available, programs written in high-level languages can be recompiled on a new computer with at most minor modifications. The compilation of high-level languages is truly revolutionary.

The IBM PC XT, released in 1983, was an early example of falling hardware prices.

Programmers' lives improved greatly. Expressing the problem they wanted to solve in a high-level language made things much easier. Thanks to advances in semiconductor technology and the invention of the integrated circuit, the price of computer hardware fell sharply. Computers got faster and more capable, and far cheaper too. At some point (probably in the late 1980s) things flipped, and programmers became more valuable than the hardware they worked on.

### The interpreter

Over time, a new way of programming arose. A special program called an interpreter reads a program and converts it into computer instructions for immediate execution. Much like a compiler, the interpreter reads the program and transforms it into an intermediate form. But unlike a compiler, the interpreter then executes that intermediate form directly. An interpreted language goes through this process every time the program runs; a compiled program is compiled once, and the computer simply executes the machine instructions each time.

Incidentally, this is why interpreted programs are perceived as slower. But modern computers are so fast that most people can't tell the difference between a compiled program and an interpreted one.

Interpreted programs (sometimes called “scripts”) are even easier to port to different hardware platforms. Because scripts do not contain any machine-specific instructions, the same version of the program can run directly on many different computers without modification. But of course, the interpreter would have to be ported to a new machine first.

A popular interpreted language is Perl. A complete expression of our addition problem in Perl would look like this:

```perl
$x = 1 + 2;
```

While this program looks similar to the C version and runs much the same way, it lacks the statement declaring the variable. There are other differences (beyond the scope of this article), but notice how close the way we now write computer programs has come to the way a mathematician writes expressions by hand on paper.

### The virtual machine

The latest form of programming is the virtual machine (often abbreviated VM). VMs come in two types: system virtual machines and process virtual machines. Both provide a level of abstraction over "real" computing hardware, but they differ in scope. A system virtual machine is software that stands in for physical hardware, while a process virtual machine is designed to execute a program in a "system-independent" fashion. In this sense the scope of a process virtual machine (which I'll simply call a virtual machine from here on) is similar to that of an interpreter: the program is first compiled into an intermediate form, and then the virtual machine executes that intermediate form.

The main difference between a virtual machine and an interpreter is that the virtual machine implements a virtual CPU and a virtual instruction set. With this layer of abstraction, we can write front-end tools that compile programs written in different languages into programs the virtual machine accepts. Perhaps the most popular and best-known virtual machine is the Java Virtual Machine (JVM). The JVM supported only the Java language in the 1990s, but today it runs many popular programming languages, including Scala, Jython, JRuby, Clojure, and Kotlin. There are other, less common examples I won't mention here. I also only recently learned that Python, my favorite language, is not a purely interpreted language but one that runs on a virtual machine!
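That last claim is easy to check: CPython compiles a function's source into bytecode for its stack-based virtual machine, and the standard-library `dis` module can display those virtual instructions. A minimal sketch:

```python
import dis

def add():
    # The same addition example used throughout the article.
    x = 1 + 2
    return x

# Disassemble the function into the CPython VM's instructions.
# The listing shows opcodes such as STORE_FAST (store a value
# into the local variable x), much like the assembly example above.
dis.dis(add)
```

Running this prints one virtual instruction per line, with an opcode name and its operand, which is a close analogue of the hand-written assembly listing earlier in the article.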

Virtual machines continue a historical trend of requiring less and less knowledge of a particular computing platform for programmers to solve problems using domain-specific programming languages.

### So that’s it

Hope you enjoyed this brief introduction to how software works behind the scenes. Is there anything else you'd like me to talk about next? Let me know in the comments.

Via: opensource.com/article/19/…

By Erik O'Shaughnessy, lujun9972