All About Languages Part 1: The Early Years

Since the advent of computing machines, one requirement has been the ability to record instructions and present them to the machine again, either to repeat the same output or to act on a different set of data. This is not only a 20th-century problem, however; the first working example predates the first electronic computer by over 200 years. In 1725, Basile Bouchon created a paper-tape device for driving a loom, later improving it with a ribbon of punched cards. Improvements followed over the next decades: Jacques de Vaucanson added automation capabilities in 1745, and Joseph-Marie Jacquard invented a new loom built around the automation process in 1804. Charles Babbage drew on Jacquard's punched-card loom when designing his Analytical Engine; his design included what today would be recognized as a CPU, controlled by the placement of pegs in barrels. Babbage's design is today recognized as the first example of Turing completeness (setting aside its finite memory), and in some respects those pegs can be regarded as the first assembly language.

The next major advance in computational processing was Herman Hollerith's use of punched cards for data processing and his invention of unit record equipment to process them. These electromechanical devices used wire boards to control processing, and a device called a "tabulator" summed columns over a deck of cards and produced small, simple reports. For almost 80 years, this was the primary method of business data processing.

Machines such as ENIAC and Colossus, considered the first real computers, used switches and relays to control processing; this could be considered the use of machine language (actual number values, whether decimal or binary). However, as the first von Neumann machines (which store instructions and data in the same memory) were developed, something easier was desired to reduce the possibility of data entry errors.


As stored-program machines were constructed during the late 1940s and early 1950s, assembly language, a mnemonic notation for machine instructions, was developed. Programs that translated this code into actual machine language, called assemblers, soon followed. The first assembly language was used on the British ARC2 (completed in 1949), and as general-purpose stored-program computers (such as the IBM 704) became available, assembly language usage spread. Since each computer's instruction set was different, each assembly language was different, although all share the concepts of instructions, operands, and data definition.

Another issue with assembly language is that it is one-to-one: one mnemonic translates to one machine instruction. Macro assemblers allow shortcuts by generating multiple instructions from one keyword, but early examples were not powerful enough to support complex routines. The desire to reduce this verbosity, and to improve readability and comprehension by others, led to the creation of what are today known as high-level languages. The first examples were simple and specialized; they included the UNIVAC A-2, IBM Speedcoding, and the Laning and Zierler system (MIT). However, most of these were interpreters rather than compilers, which had the dual drawbacks of slower operation and loss of usable data memory. (A notable exception was PACT, a scientific language used on the IBM 701 and 704, which was one of the first compiled languages.) The high-level language concept showed early computer designers and users that programming machines could be made easier, and this helped business accept the computer as a useful tool.

In late 1953, a self-described "lazy" engineer at IBM wrote a proposal that took the macro concept and expanded on it greatly. His goal was a more practical way to program the IBM 704: one keyword could generate 20 machine instructions, and several keywords together could generate hundreds, implementing comparison and loop logic. That engineer was John W. Backus, and using his idea, he and his development team gave the world FORTRAN (FORmula TRANslation), the first widely used third-generation computer language. From this concept arose proposals for other programming languages, such as COBOL (COmmon Business-Oriented Language) and ALGOL (ALGOrithmic Language), which in turn led to almost all of the high-level languages used today.

Early third-generation languages are the topic of our next installment.