History of Computers Pt II

We are back to continue our overview of the history of computing.

In part I we were introduced to the legendary Charles Babbage, Ada Lovelace, Alan Turing and Vannevar Bush, among others. These early computer scientists (and others) laid the foundations of computing. Whilst we now know of the origins of computers, we have neglected to discuss programming, so that is where we resume our attention in part II.

Binary: lots of zeroes and ones

All computers store information and instructions in binary, as 1s and 0s. Each digit (either a 1 or a 0) is referred to as a bit. Think about how your phone may have 32 GB of storage: that is 32 billion bytes, or roughly 256 billion 0s and 1s!! Up until around 1949, programmers had to manually enter these numbers to program their computers. This is obviously a tedious and extremely error prone (and infuriating) process. With the invention of programming languages, instructing computers to perform tasks, and programming itself, became far easier (though still difficult…).
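To make those numbers concrete, here is a quick sketch in Python (my own illustration, not from any historical source) that counts the bits in 32 GB of storage and shows the bit pattern for a single character:

# Each byte is 8 bits; a bit is a single 0 or 1.
BITS_PER_BYTE = 8

# 32 GB of phone storage, using the decimal definition of a gigabyte (10^9 bytes).
gigabytes = 32
total_bytes = gigabytes * 10**9
total_bits = total_bytes * BITS_PER_BYTE

print(f"{gigabytes} GB = {total_bytes:,} bytes = {total_bits:,} individual 0s and 1s")

# Everything stored on the phone is made of bit patterns like this one,
# the 8 bits that encode the letter 'A':
print(format(ord('A'), '08b'))  # prints 01000001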

Assembly language

The first programming language developed was assembly language, which essentially involves writing programs in a form that mirrors the machine's own instructions. The EDSAC is considered to be the first computer to use assembly language, employing symbolic codes rather than raw binary. Whilst an improvement on machine code, assembly language was still not ideal. Firstly, each machine had a slightly different architecture and instruction set, meaning that assembly language was not universal. Secondly, assembly language was not very readable.

Here is an example of a simple program, written in MIPS, an assembly language introduced in 1985, where the user inputs their age and the entered value is printed back:

.data
howdy:     .asciiz "How old are you? \n"
age:       .word 0
age_str_1: .asciiz "You are "
age_str_2: .asciiz " years old. \n"

.text
# prompting for their age
addi $v0, $0, 4
la   $a0, howdy
syscall

# reading the age and storing it in age
addi $v0, $0, 5
syscall
sw   $v0, age

# printing age_str_1
addi $v0, $0, 4
la   $a0, age_str_1
syscall

# printing the entered age
addi $v0, $0, 1
lw   $a0, age
syscall

# printing age_str_2
addi $v0, $0, 4
la   $a0, age_str_2
syscall

# exiting the program
addi $v0, $0, 10
syscall

I had the joy of learning MIPS over the past few weeks in my Introduction to Computer Science for Software Engineers class. Meanwhile, the same code in Python is simply:

age = input('How old are you? ')
print('You are ' + age + ' years old.')

Grace Hopper, one of the team behind the UNIVAC, continued her work on computers by creating the first compiler in 1952. A compiler essentially translates code understandable to humans into code understandable by computers. This invention opened the door to a far greater number and variety of programming languages. Soon high-level programming languages like FORTRAN (1954) and COBOL (1959) evolved, and later (and perhaps more universally known) C (1972), HTML (1990) and JavaScript (1995). To see the development of programming languages, check out this nice timeline by Techno Lush.
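To get a feel for this kind of translation, here is a small sketch using Python's built-in compile() and dis modules (Python actually compiles to bytecode for a virtual machine rather than to raw machine code, but the idea of turning human-readable source into lower-level instructions is the same). It compiles the age example above and prints the instructions the interpreter actually runs:

import dis

# The human-readable source from the Python example above.
source = (
    "age = input('How old are you? ')\n"
    "print('You are ' + age + ' years old.')"
)

# Translate the source into a lower-level code object (bytecode).
code_object = compile(source, filename='<example>', mode='exec')

# dis prints the bytecode instructions (LOAD_NAME, CALL, STORE_NAME, ...),
# which look far closer to the MIPS listing than to the original source.
dis.dis(code_object)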

A fun fact about Hopper: she is often credited with popularising the terms 'bug' and 'debugging', still used (and a constant annoyance to programmers) today. In 1947 she found a moth in the computer she was working on, and by removing the moth she 'debugged' the machine, which then resumed working.

The great breakthrough — from the transistor to integrated circuits

Returning to computers, the next breakthrough came a few years earlier, in 1947, when the first working transistor was created. A transistor can act not only as an on/off switch, conducting once a certain threshold voltage is reached (the building block of logic gates), but it can also amplify a signal. The transistor was superior to the vacuum tube, being more compact and requiring less power.
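As a rough, purely illustrative sketch (a toy model, not real electronics), that switch-like behaviour can be mimicked in Python: treat each transistor as something that conducts only above a threshold voltage, and two of them in series behave like an AND gate:

# Toy model of a transistor as a threshold switch (hypothetical values).
THRESHOLD_VOLTAGE = 0.7  # volts needed at the input before the 'switch' turns on

def transistor_on(input_voltage):
    # The switch view: the transistor conducts only once the threshold is reached.
    return input_voltage >= THRESHOLD_VOLTAGE

def and_gate(voltage_a, voltage_b):
    # Two transistors in series: current flows only if BOTH are switched on.
    return int(transistor_on(voltage_a) and transistor_on(voltage_b))

for a, b in [(0.0, 0.0), (0.0, 5.0), (5.0, 0.0), (5.0, 5.0)]:
    print(f"A = {a}V, B = {b}V  ->  output {and_gate(a, b)}")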

We cannot overstate the evolutionary power of the transistor in electronics and, in particular, computers. From here, we enter the new and current era of computers, one where computer evolution is (still) rapid and the power and speed of computers seems limitless.

The transistor and its close relatives, the resistor and the capacitor, could be connected by awkward wiring to create a circuit. The individual components of the circuit were still moderately large and the wires between them wide. However this soon changed in 1958, with the invention of the integrated circuit in the US by Jack Kilby. Though the idea of the integrated circuit had previously been proposed in the UK by Geoffrey Dummer, it was largely overlooked by the UK military and industry. In an integrated circuit, thin paths of metal are laid down on the same foundation as the circuit components themselves, connecting the various pieces into a much smaller, integrated (all parts together as a whole!) circuit. This allowed thousands of resistors, transistors and capacitors to sit together in a small space, an unrealistic feat when computers relied on flimsy vacuum tubes. Integrated circuits opened up a new level of computing power which previously had simply not been logistically attainable.

The new age of computing

Soon after the invention of the transistor, IBM (International Business Machines) entered the computer scene. Though today IBM may not be considered by the public as a computer giant (having shifted into hosting and consulting), it once had what was essentially a monopoly over the computer market. IBM is responsible for some world-changing inventions including the ATM, the hard disk drive, the floppy disk, magnetic stripe cards and the SQL programming language (used mainly for data management). 2019 marked the 27th consecutive year that IBM held the record for the most patents generated by a U.S. business. IBM's origins can be traced back to the 1890s, when what was to become the Tabulating Machine Company optimised the census process.

“Herman Hollerith’s tabulating system is used in the U.S. Census, reducing a nearly 10-year long process to two and a half years and saving $5 million.”

Now, in the late 50s and 60s, we enter the Space Race. Both the Soviets and the USA were pouring enormous amounts of money and resources into this proxy war for intellectual and political supremacy. Recall our initial definition of the computer: a human who performed long, complex mathematical calculations (think of the movie 'Hidden Figures'). With the advent of electronic computers, NASA and IBM worked together to create machines to help the US land men on the moon and win the title of space supremacy. Again, war, even in the guise of space exploration, promoted the investment in and evolution of computers.

During the Space Race, in 1968, engineer Douglas Engelbart held what was to become known as the 'Mother of All Demos.' Engelbart demonstrated many of the features of today's modern computer, with a working mouse and graphical user interface (GUI). A GUI is the visual component of a computer: what is displayed on the screen. Engelbart sought to

“Contribute significantly to the way humans could handle complexity and urgency, that would be universally helpful.”

He saw computers as offering an untapped potential to help not just governments and large corporations but also the average worker. Engelbart also introduced document editing and copying, hyperlinks, real-time collaboration and video conferencing!! Further, he described the principles of the information-sharing system the ARPANET, which served as the foundation of today's internet.

With Engelbart's demonstration and his recognition that computers could help and improve humanity, we transition from seeing computers as tools solely for large, powerful corporations and the rich and powerful elite, to the possibility that they could be used by the masses. Finally, humans begin to recognise computers' potential in their everyday lives. Still, it would be the better part of a decade before personal computers hit the scene.

From here we see the founding of Intel (also in 1968), and the invention of the floppy disk (1971) and Ethernet (1973), allowing computers, respectively, to process data faster, to share information with one another, and to be connected together. At around the same time, personal computers began to hit the market. It is important to note that these so-called 'personal computers' bear little resemblance to what we think of as computers. For instance, the IBM 5100 (one of the first portable computers) weighed 24 kg, was the size of a small suitcase and came with a transportation case, making it 'portable.' Meanwhile, the price ranged from $9,000 to $20,000 (USD) depending on memory capacity. Note that this is the 1974 valuation; with inflation that would be in the $47,000–100,000 range in 2019. This price makes us reconsider the few thousand we fork out for today's top-of-the-range computers, which have processing power, speed and beautiful design that make the IBM 5100 seem primitive…

We will pause our story here and return soon to learn of the origins of some of today's biggest tech giants — Microsoft and Apple!!
