Computer History

As early as the seventeenth century, mathematicians were trying to create a machine that could perform basic mathematical functions such as addition, subtraction, multiplication and division, and around 1642, Blaise Pascal, a leading French mathematician, constructed the first mechanical adding device. The programming language Pascal, which is widely used today, was named after Blaise Pascal to honour his contribution to the development of the modern computer.
1804 saw the introduction of an automated punched-card machine, which was used to operate weaving looms. This was the first known use of programmed instructions and led the way to the concepts behind today's computers. Around this time, the British inventor Charles Babbage designed his Difference Engine, a calculating machine, and later the Analytical Engine, an all-purpose problem-solving machine with a mechanical memory to store the results of calculations.
Babbage's collaborator, Augusta Ada, suggested using a binary system rather than decimal for data storage; she also refined the design of the Analytical Engine to include the automatic repetition of a series of calculations: the loop, a procedure used routinely in modern computer programs.
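To put that idea in modern terms, the short Python sketch below (purely illustrative; the variable names are invented for this example) repeats the same calculation for a series of values, just as a loop does in any modern program:

    # A loop: repeat the same calculation for a series of values.
    total = 0
    for n in range(1, 11):        # n takes the values 1 to 10 in turn
        total = total + n * n     # add the square of each value
    print(total)                  # prints 385, the sum of the first ten squares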
In the 1850s, the British mathematician George Boole realised that complex mathematical problems could be solved by reducing them to a series of questions that could be answered either positively or negatively, represented by a 1 or a 0; thus the binary numbering system and Boolean logic were founded. This theory of Boolean logic became fundamental to the design of computer circuitry.
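A present-day sketch of the same idea, again in Python and purely for illustration: each yes/no question becomes a value of 1 or 0, and Boolean operators combine those answers much as logic gates do in computer circuitry.

    # Boolean logic: each question is answered 1 (yes/true) or 0 (no/false).
    # The two inputs below are invented for this example.
    a = 1             # "Is the first condition met?"  -> yes
    b = 0             # "Is the second condition met?" -> no
    print(a and b)    # AND: prints 0, true only if both answers are 1
    print(a or b)     # OR:  prints 1, true if either answer is 1
    print(1 - a)      # NOT: prints 0, inverting the first answer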
The early twentieth century saw the next stage in computer history: electromechanical machines capable of solving simple differential equations. In 1937, George Stibitz constructed his complex number calculator from batteries, flashlight bulbs, wire and strips of metal from a tobacco can. This was the first binary adding machine, and it paved the way for all digital computers.
In 1939, John Atanasoff and Clifford Berry built the Atanasoff-Berry Computer, known simply as the ABC, which is now acknowledged as the world's first electronic digital computer. At the time it raised little interest among the scientific community, and when Dr Atanasoff contacted IBM about his machine, the company replied that it would never be interested in an electronic computing machine!
In 1941, at the height of the Second World War, the first operational general-purpose computer, the Z3, was built in Germany by Konrad Zuse. The machine used binary logic and electromechanical relays; a proposed redesign using vacuum tubes would have increased its speed by a factor of 1,000. However, when Zuse applied for funding to build this faster machine and use it to break wartime codes, an estimated two-year project, Hitler refused, insisting that the war would be over before the project was finished.
Alan Turing, an eccentric English genius, took this work further and capitalised on vacuum-tube technology. Turing designed the Automatic Computing Engine (ACE), which could process 25,000 characters per second and which has also been described as the first programmable digital computer, a point still argued by many historians. In 1944, Howard Aiken, a mathematician at Harvard University in the US, completed the first automatic sequence-controlled calculator, the Mark 1. This monster was 51 feet long and 8 feet high and contained over 750,000 parts strung together with 500 miles of wire.
Between 1943 and 1946, at the University of Pennsylvania's Moore School of Engineering, John Mauchly and J. Presper Eckert built a machine to compute artillery firing tables for the American government. This device, weighing 30 tons and containing 100,000 electronic components, including 17,000 vacuum tubes, was called the Electronic Numerical Integrator and Computer (ENIAC). The machine was 80 feet long and 8 feet high and utilised the decimal numbering system. Mauchly and Eckert also claimed that ENIAC was the first electronic digital computer, but in 1973 the matter was settled by a US court, which declared that the Atanasoff-Berry Computer was entitled to that honour.
Improvements continued until 1959, when both Jack Kilby, at Texas Instruments, and Robert Noyce, at Fairchild Semiconductor, discovered that resistors, capacitors and transistors could be made from a semiconductor material and that vast numbers of transistors could be etched onto a single silicon chip. Thus the age of integrated circuits had arrived, and from this point forward computers continuously decreased in size and increased in power and performance.
The IBM System/360 series of mainframe computers, designed by Gene Amdahl, was introduced in the mid-1960s. The System/360 was a family of machines with upward compatibility throughout the range, providing a relatively cheap upgrade path. This was also the era of miniaturisation: in 1960 the Digital Equipment Corporation had produced the first minicomputer, the PDP-1.
By 1970, Intel had produced a memory chip that could store one kilobyte of information, and in the early 1970s the same company managed to integrate the arithmetic and logic functions of several chips onto a single chip, the Intel 4004, the world's first microprocessor, which enabled the development of the first microcomputers. The earliest microcomputer, the Altair 8800, was developed in 1975 by Ed Roberts; this machine used an Intel microprocessor and had less than 1 kilobyte of memory. It was quickly followed by Tandy's TRS-80, Commodore Business Machines' Personal Electronic Transactor (the Commodore PET) and the Apple II, developed by Steve Jobs and Stephen Wozniak.
The market for microcomputer software was also developing at this time, and in 1975 Bill Gates and Paul Allen developed Microsoft BASIC, which was used by all of the early microcomputers. In 1981, a momentous year for Microsoft and its founders, IBM adopted Microsoft BASIC and Microsoft's new microcomputer operating system, MS-DOS, for its own microcomputer, the IBM Personal Computer. By 1984, the IBM PC and the MS-DOS operating system had become the de facto standard, adopted by most microcomputer suppliers. Microprocessor development, led by Intel and Motorola, was rapid; Intel chips set the PC standards, while Motorola chips were adopted by Apple for its Macintosh range. Intel's early 8086 was superseded by the 80286, quickly followed by the 80386, 80486 and the Pentium range.
The Computer Generations
First Generation
1951 to 1959
First generation computers were powered by vacuum tubes; they were extremely large machines, occupying huge rooms and consuming vast amounts of energy.
Second Generation
1959 to 1965
Second generation computer systems took advantage of semiconductor technology: transistors replaced the vacuum tubes, resulting in reduced physical size, faster computing and greater power. The transistor was originally developed at Bell Laboratories, the research arm of a large US corporation.
Third Generation
1965 to 1971
Third generation computers were made from integrated circuits, again reducing physical size while increasing speed and power. Integrated circuits at this time consisted of a piece of silicon about 10 mm square, on which up to one thousand transistors could be placed. Magnetic discs were improved, greatly increasing storage capacity. Input/output devices such as monitors and keyboards were introduced, and the operating system was first adopted. A new concept was also developed: "families" of computers, which allowed for upgrading and expansion.
Fourth Generation
1971 to Present
From integrated circuits to large-scale integration to very large-scale integration: this was the start of the microprocessor age. The microprocessors used continued to improve, from the 8086 and 80286 to the 80386 and 80486, then the Pentium, Pentium II and now the Pentium III.