Wednesday, January 28, 2009

A Brief History of Computing - Operating Systems

1970

Development of the UNIX operating system started. It was later released as C source code to aid portability, and versions subsequently became available for many different computers, including the IBM PC. It and its clones (such as Linux) are still widely used on network and Internet servers. Originally developed by Ken Thompson and Dennis Ritchie.

1975

Unix marketed (see 1970).

1980 - October

Development of MS-DOS/PC-DOS began. Microsoft (known mainly for their programming languages) were commissioned to write the operating system for the PC; Digital Research failed to get the contract (there is much legend as to the real reason for this). DR's operating system, CP/M-86, was shipped later, but it was actually easier to adapt older CP/M programs to DOS than to CP/M-86, and CP/M-86 cost $495. As Microsoft didn't have an operating system to sell, they bought Seattle Computer Products' 86-DOS, which had been written by Tim Paterson earlier that year (86-DOS was also known as Q-DOS, the Quick & Dirty Operating System; it was a more-or-less 16-bit version of CP/M). The rights were actually bought in July 1981. It is reputed that IBM found over 300 bugs in the code when they subjected the operating system to their testing, and re-wrote much of the code.

Tim Paterson's DOS 1.0 was 4000 lines of assembler.

1981 - August 12

MS-DOS 1.0, PC-DOS 1.0
Microsoft (known mainly for their programming languages) were commissioned by IBM to write the operating system. They bought a program called 86-DOS from Tim Paterson, which was loosely based on CP/M-80. The final program from Microsoft was marketed by IBM as PC-DOS and by Microsoft as MS-DOS; collaboration on subsequent versions continued until version 5.0 in 1991.

Compared to modern versions of DOS, version 1 was very basic. The most notable difference was the presence of just one directory, the root directory, on each disk. Subdirectories were not supported until version 2.0 (March 1983).

MS-DOS (and PC-DOS) was the main operating system for all IBM-PC compatible computers until 1995, when Windows '95 began to take over the market and Microsoft turned its back on MS-DOS (leaving MS-DOS 6.22 from 1994 as the last version written - although the DOS shell in Windows '95 calls itself MS-DOS version 7.0, and has some improved features like long filename support). According to Microsoft, in 1994, MS-DOS was running on some 100 million computers world-wide.

1982 - March

MS-DOS 1.25, PC-DOS 1.1

1983 - March

MS-DOS 2.0, PC-DOS 2.0
Introduced with the IBM XT, this version included a UNIX-style hierarchical sub-directory structure, and altered the way in which programs could load and access files on the disk.

1983 - May

MS-DOS 2.01

1983 - October

PC-DOS 2.1 (for PC Jr). Like the PC Jr this was not a great success and quickly disappeared from the market.

1983 - October

MS-DOS 2.11

1984 - August

MS-DOS 3.0, PC-DOS 3.0
Released for the IBM AT, it supported larger hard disks as well as High Density (1.2 MB) 5¼" floppy disks.

1985 - March

MS-DOS 3.1, PC-DOS 3.1
This was the first version of DOS to provide network support, adding some new functions to handle networking.

1985 - October

MS-DOS 2.25
This version included support for foreign character sets, and was marketed in the Far East.

1985 - November

Microsoft Windows Launched. Not really widely used until version 3, released in 1990, Windows required DOS to run and so was not a complete operating system (until Windows '95, released on August 21, 1995). It merely provided a G.U.I. similar to that of the Macintosh - in fact so similar that Apple tried to sue Microsoft for copying the 'look and feel' of their operating system. This court case was not dropped until August 1997.

1985 - December

MS-DOS 3.2, PC-DOS 3.2
This version was the first to support 3½" disks, although only the 720KB ones. Version 3.2 remained the standard version until 1987, when version 3.3 was released with the IBM PS/2.

1987

Microsoft Windows 2 released. It was more popular than the original version, but nothing special; Windows 3 (see 1990) was the first really useful version.

1987 - April

MS-DOS 3.3, PC-DOS 3.3
Released with the IBM PS/2, this version included support for the High Density (1.44MB) 3½" disks. It also supported hard disk partitions, splitting a hard disk into 2 or more logical drives.

1987 - April

OS/2 Launched by Microsoft and IBM. A later enhancement, OS/2 Warp, provided many of the 32-bit enhancements boasted by Windows '95 - but several years earlier. Yet the product failed to dominate the market in the way Windows '95 did 8 years later.

1987 - October/November

Compaq DOS (CPQ-DOS) v3.31 released to cope with disk partitions >32MB. Used by some other OEMs, but not distributed by Microsoft.

1988 - July/August?

PC-DOS 4.0, MS-DOS 4.0

Versions 3.4 - 4.x are confusing due to a lack of correlation between IBM & Microsoft, and also between the USA & Europe. Several 'Internal Use Only' versions were also produced.
This version reflected increases in hardware capabilities: it supported hard drives greater than 32 MB (up to 2 GB) and also EMS memory.

This version was not properly tested and was bug-ridden, causing system crashes and loss of data. The original release was IBM's, but Microsoft's version 4.0 (in October) was no better, and version 4.01 was released (in November) to correct this, then version 4.01a (in April 1989) as a further improvement. However, many people could not trust this and reverted to version 3.3 while they waited for the complete re-write (version 5 - 3 years later). Betas of Microsoft's version 4.0 were apparently shipped as early as '86 & '87.

1988 - November

MS-DOS 4.01, PC-DOS 4.01
This corrected many of the bugs seen in version 4.0, but many users simply switched back to version 3.3 and waited for a properly re-written and fully tested version - which did not come until version 5 in June 1991. Support for disk partitions >32MB.

1990 - May 22

Introduction of Windows 3.0 by Bill Gates & Microsoft. It is a true multitasking system (or pretends to be, on computers less than an 80386, by operating in 'Real' mode). It maintained compatibility with MS-DOS; on an 80386 it even allows such programs to multitask - which they were not designed to do. This created a real threat to the Macintosh and, despite a similar product, IBM's OS/2, it was very successful. Various improvements were made in versions 3.1 and 3.11, but the next major step did not come until Windows '95 in 1995, which relied much more heavily on the features of the 80386 and provided support for 32-bit applications.

1991 - June

MS-DOS 5.0, PC-DOS 5.0
In order to promote OS/2, Bill Gates took every opportunity after its release to say 'DOS is dead'; however, the development of DOS 5.0 led to the permanent dropping of OS/2 development by Microsoft.

This version, after the mess of version 4, was properly tested through the distribution of Beta versions to over 7,500 users. This version included the ability to load device drivers and TSR programs above the 640KB boundary (into UMBs and the HMA), freeing more RAM for programs. This version marked the end of collaboration between Microsoft and IBM on DOS.

1991 - August

Linux is born with the following post to the Usenet newsgroup comp.os.minix:

Hello everybody out there using minix - I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones.

The post was by a Finnish college student, Linus Torvalds, and this hobby grew from these humble beginnings into one of the most widely used UNIX-like operating systems in the world today. It now runs on many different types of computer, including the Sun SPARC and the Compaq Alpha, as well as many ARM, MIPS, PowerPC and Motorola 68000 based computers.

In 1992, the GNU project (http://www.gnu.org/) adopted the Linux kernel for use on GNU systems while they waited for the development of their own (Hurd) kernel to be completed. The GNU project's aim is to provide a complete and free UNIX-like operating system, combining the Linux or Hurd kernel with a complete suite of free software to run on it. In order to allow it to carry the GNU name, the Linux kernel was relicensed under the GNU General Public License (http://www.gnu.org/copyleft/gpl.html) on the 1st of February 1992.

1992 - April

Introduction of Windows 3.1

1993 - July 27

Windows NT 3.1, the first release of the Windows NT series, was released. Its name was chosen to match the then-current version of the 16-bit Microsoft Windows. NT contained a completely new 'kernel' at the core of the operating system; unlike Windows 3.x it was not based on top of MS-DOS. It was designed to be platform independent; original development was targeted at the Intel i860 processor, but it was ported to MIPS and then to Intel's popular 80386 processor. The 'Win32' API was developed for Windows NT, providing a native 32-bit API that programmers used to the 16-bit versions of Microsoft Windows would be at home with.
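As a small, present-day illustration (not part of the original timeline) of what calling into the Win32 API looks like, the sketch below uses Python's ctypes module to invoke the standard MessageBoxW function from user32.dll. It assumes a Windows machine with a normal Python install; the message text and caption are arbitrary.

```python
# Hypothetical illustration: calling a Win32 API function (MessageBoxW in
# user32.dll) via Python's ctypes. Runs only on Windows.
import ctypes

MB_OK = 0x00000000  # MessageBox flag: show a single OK button

# MessageBoxW(hWnd, lpText, lpCaption, uType) -> int (ID of the button pressed)
result = ctypes.windll.user32.MessageBoxW(
    None,                        # no owner window
    "Hello from the Win32 API",  # message text (wide string)
    "Win32 example",             # window caption
    MB_OK,
)
print("MessageBoxW returned", result)  # 1 (IDOK) when OK is clicked
```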

1993 - December

MS-DOS 6.0. This included a hard-disk compression program called DoubleSpace, but a small computing company called 'Stac' claimed that DoubleSpace was partly a copy of their compression program, Stacker. After paying damages, Microsoft withdrew DoubleSpace from MS-DOS 6.2, releasing a new program - DriveSpace - with MS-DOS version 6.22. In operation and programming interface, DriveSpace remains virtually identical to DoubleSpace. MS-DOS 6.22 remains the last version of MS-DOS released, since Microsoft turned its efforts to Windows '95. The Windows '95 (and later) DOS shell reports itself as DOS 7 - and includes a few enhancements, e.g. support for long filenames.

1994 - March 14

Linus Torvalds released version 1.0 of the Linux Kernel.

1994 - September

PC-DOS 6.3
Basically the same as version 5.0, this release by IBM included more bundled software, such as Stacker (the program that caused Microsoft so much embarrassment) and anti-virus software.

1994 - September 21

Microsoft released Windows NT 3.5. This included many features missing from the original 3.1 release, including support for compressed files and NetWare compatibility.

1995 - March

Linus released Linux Kernel v1.2.0 (Linux'95).

1995 - May 30

The main feature of Windows NT 3.51 was support for IBM's PowerPC processor. Delays in the release of the processor meant delays in the release of Windows NT 3.51 (NT 3.51 only exists because the processor wasn't ready in time for NT 3.5). As the development team waited for the release of the processor, they fixed bugs in the existing codebase. This made NT 3.51 reliable, and therefore popular with customers.

1995 - August 21 [poss. 23]

Windows '95 was launched by Bill Gates & Microsoft. Unlike previous versions of Windows, Windows '95 is an entire operating system - it does not rely on MS-DOS (although some remnants of the old operating system still exist). Windows '95 was written specially for the 80386 and compatible computers to make 'full' use of their 32-bit processing and multitasking capabilities, and thus in some respects it is much more similar to Windows NT than to Windows 3.x. Both Windows 95 and Windows NT provide the Win32 API for programmers, and when Windows NT 4 was released it had an almost identical user interface to Windows 95. Unfortunately, in order to maintain backwards compatibility, Windows 95 doesn't impose the same memory protection and security measures that NT does, and so suffers from much worse stability, reliability and security. Despite being remarkably similar in function to OS/2 Warp (produced by IBM and Microsoft several years earlier, but marketed by IBM), Windows '95 has proved very popular.

1996

Windows '95 OSR2 (OEM Service Release 2) was released - partly to fix bugs found in release 1 - but only to computer retailers for sale with new systems. There were actually two separate releases of Windows 95 OSR2 before the introduction of Windows '98, the second of which contained both USB and FAT32 support - the main selling points of Windows '98. FAT32 is a new file system that provides support for disk partitions bigger than 2.1GB and is better at coping with large disks (especially in terms of wasted space).
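To make the 'wasted space' point concrete, here is a rough back-of-the-envelope sketch (not part of the original article): files occupy whole clusters, so smaller clusters waste less room in the partly-filled final cluster of each file. The cluster sizes and the workload of 10,000 small files below are assumed, typical-looking values rather than figures from the text.

```python
# Rough sketch of FAT slack space: each file is stored in whole clusters, so
# the unused tail of its final cluster is wasted. Cluster sizes below are
# assumed typical values for a ~2GB volume, not figures from the article.

def wasted_bytes(file_sizes, cluster_size):
    """Total bytes lost to partly-filled final clusters."""
    waste = 0
    for size in file_sizes:
        remainder = size % cluster_size
        if remainder:
            waste += cluster_size - remainder
    return waste

files = [2_000] * 10_000      # hypothetical workload: 10,000 small files
fat16_cluster = 32 * 1024     # large clusters on a ~2GB FAT16 volume (assumed)
fat32_cluster = 4 * 1024      # much smaller clusters under FAT32 (assumed)

print("FAT16 waste: %.0f MB" % (wasted_bytes(files, fat16_cluster) / 2**20))
print("FAT32 waste: %.0f MB" % (wasted_bytes(files, fat32_cluster) / 2**20))
```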

1996 - June 9

Linux 2.0 released. 2.0 was a significant improvement over the earlier versions: it was the first to support multiple architectures (originally developed for the Intel 386 processor, it now supported the Digital Alpha and would very soon support the Sun SPARC and many others). It was also the first stable kernel to support SMP, kernel modules, and much more.

1996 - July 31

Windows NT 4.0 was released. The main feature was an update of the user interface to match Windows 95.

1998 - June 25

Microsoft released Windows '98. Some U.S. attorneys tried to block its release, since the new O/S interfaces closely with other programs, such as Microsoft Internet Explorer, and so effectively closes the market for such software to other companies. Microsoft fought back with a letter to the White House suggesting that 26 of its industry allies said that a delay in the release of the new O/S could damage the U.S. economy. The main selling points of Windows '98 were its support for USB and its support for disk partitions greater than 2.1GB.

1999 - Jan 25

Linux Kernel 2.2.0 Released. The number of people running Linux is estimated at over 10 million, making it not only an important operating system in the Unix world, but an increasingly important one in the PC world.

2000 - Feb 17

Official launch of Windows 2000 - Microsoft's replacement for Windows 95/98 and Windows NT. Claimed to be faster and more reliable than previous versions of Windows. It is actually a descendant of the NT series, and so the trade-off for increased reliability is that it won't run some old DOS-based games. To keep the home market happy, Microsoft also released Windows ME, the newest member of the 95/98 series.

2001 - Jan 4

Linux kernel 2.4.0 released.

2001 - March 24

Apple released MacOS X. At its heart is 'Darwin', an Open Source operating system based on FreeBSD. Using this, MacOS X finally gives Mac users the stability benefits of a protected memory architecture, along with many other enhancements such as preemptive multitasking. The BSD base also makes porting UNIX applications to MacOS easier and gives Mac users a fully featured command line interface alongside their GUI.

2001 - October 25

Microsoft released Windows XP - the latest version of their Windows operating system. Based on the NT series kernel, it was intended to bring together both the NT/2000 series and the Windows 95/98/ME series into one product. Of course, it was originally hoped that this would happen with Windows 2000, but that failed, largely because of compatibility problems with some older applications - notably, for home users, MS-DOS based games. Windows XP owes its success in part to some improvements in compatibility, and in part to time having passed - rendering much of the incompatible software obsolete anyway.

2003 - April 24

Windows Server 2003 is the latest incarnation of what began life as Windows NT. Windows Server 2003 is, as the name suggests, targeted at servers rather than workstations and home PCs, which are the realm of Windows XP. Security and reliability were key aims during the development and release of Windows Server 2003 - critical if Windows is to replace the UNIX systems that serve many enterprises.

2003 - October 24

MacOS 10.3 continues to improve MacOS X, with major updates to 'Aqua' (the user interface) as well as performance improvements and new features.

2003 - December 17

Linux kernel 2.6.0 released. Many features from uClinux (designed for embedded microcontrollers) have been integrated, along with support for NUMA (used in large, multi-processor systems). An improved scheduler and scalability improvements help ensure Linux will maintain its reputation for running on everything from small embedded devices to large enterprise-class servers and even mainframes. As always, support for new classes of hardware has been significantly improved.

© Copyright 1996-2005, Stephen White

A Brief History of the Internet and Related Networks

Introduction

In 1973, the U.S. Defense Advanced Research Projects Agency (DARPA) initiated a research program to investigate techniques and technologies for interlinking packet networks of various kinds. The objective was to develop communication protocols which would allow networked computers to communicate transparently across multiple, linked packet networks. This was called the Internetting project and the system of networks which emerged from the research was known as the “Internet.” The system of protocols which was developed over the course of this research effort became known as the TCP/IP Protocol Suite, after the two initial protocols developed: Transmission Control Protocol (TCP) and Internet Protocol (IP).
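As a present-day aside (not part of the original history), the sketch below illustrates the division of labour the names suggest: IP addresses and routes packets between hosts, while TCP turns them into a reliable byte stream that applications use. It is a minimal loopback echo exchange using Python's standard socket module; the address and port are arbitrary example values.

```python
# Minimal TCP echo exchange over the loopback interface, using Python's
# standard socket module. The address and port are arbitrary example values.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 9009))    # IP-level address; port chosen arbitrarily
server.listen(1)

def serve_once():
    conn, _addr = server.accept()   # TCP provides the reliable, ordered stream
    with conn:
        conn.sendall(b"echo: " + conn.recv(1024))
    server.close()

threading.Thread(target=serve_once).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect(("127.0.0.1", 9009))
    client.sendall(b"hello, internet")
    print(client.recv(1024).decode())   # -> echo: hello, internet
```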

In 1986, the U.S. National Science Foundation (NSF) initiated the development of the NSFNET which, today, provides a major backbone communication service for the Internet. With its 45 megabit per second facilities, the NSFNET carries on the order of 12 billion packets per month between the networks it links. The National Aeronautics and Space Administration (NASA) and the U.S. Department of Energy contributed additional backbone facilities in the form of the NSINET and ESNET respectively. In Europe, major international backbones such as NORDUNET and others provide connectivity to over one hundred thousand computers on a large number of networks. Commercial network providers in the U.S. and Europe are beginning to offer Internet backbone and access support on a competitive basis to any interested parties.

“Regional” support for the Internet is provided by various consortium networks and “local” support is provided through each of the research and educational institutions. Within the United States, much of this support has come from the federal and state governments, but a considerable contribution has been made by industry. In Europe and elsewhere, support arises from cooperative international efforts and through national research organizations. During the course of its evolution, particularly after 1989, the Internet system began to integrate support for other protocol suites into its basic networking fabric. The present emphasis in the system is on multiprotocol interworking, and in particular, with the integration of the Open Systems Interconnection (OSI) protocols into the architecture.

Both public domain and commercial implementations of the roughly 100 protocols of the TCP/IP protocol suite became available in the 1980's. During the early 1990's, OSI protocol implementations also became available and, by the end of 1991, the Internet had grown to include some 5,000 networks in over three dozen countries, serving over 700,000 host computers used by over 4,000,000 people.

A great deal of support for the Internet community has come from the U.S. Federal Government, since the Internet was originally part of a federally-funded research program and, subsequently, has become a major part of the U.S. research infrastructure. During the late 1980’s, however, the population of Internet users and network constituents expanded internationally and began to include commercial facilities. Indeed, the bulk of the system today is made up of private networking facilities in educational and research institutions, businesses and in government organizations across the globe.

The Coordinating Committee for Intercontinental Research Networking (CCIRN), which was organized by the U.S. Federal Networking Council (FNC) and the European Reseaux Associees pour la Recherche Europeenne (RARE), plays an important role in the coordination of plans for government-sponsored research networking. CCIRN efforts have been a stimulus for the support of international cooperation in the Internet environment.

Internet Technical Evolution

Over its fifteen year history, the Internet has functioned as a collaboration among cooperating parties. Certain key functions have been critical for its operation, not the least of which is the specification of the protocols by which the components of the system operate. These were originally developed in the DARPA research program mentioned above, but in the last five or six years, this work has been undertaken on a wider basis with support from Government agencies in many countries, industry and the academic community. The Internet Activities Board (IAB) was created in 1983 to guide the evolution of the TCP/IP Protocol Suite and to provide research advice to the Internet community.

During the course of its existence, the IAB has reorganized several times. It now has two primary components: the Internet Engineering Task Force and the Internet Research Task Force. The former has primary responsibility for further evolution of the TCP/IP protocol suite, its standardization with the concurrence of the IAB, and the integration of other protocols into Internet operation (e.g. the Open Systems Interconnection protocols). The Internet Research Task Force continues to organize and explore advanced concepts in networking under the guidance of the Internet Activities Board and with support from various government agencies.

A secretariat has been created to manage the day-to-day function of the Internet Activities Board and Internet Engineering Task Force. IETF meets three times a year in plenary and its approximately 50 working groups convene at intermediate times by electronic mail, teleconferencing and at face-to-face meetings. The IAB meets quarterly face-to-face or by videoconference and at intervening times by telephone, electronic mail and computer-mediated conferences.

Two other functions are critical to IAB operation: publication of documents describing the Internet and the assignment and recording of various identifiers needed for protocol operation. Throughout the development of the Internet, its protocols and other aspects of its operation have been documented first in a series of documents called Internet Experiment Notes and, later, in a series of documents called Requests for Comment (RFCs). The latter were used initially to document the protocols of the first packet switching network developed by DARPA, the ARPANET, beginning in 1969, and have become the principal archive of information about the Internet. At present, the publication function is provided by an RFC editor.

The recording of identifiers is provided by the Internet Assigned Numbers Authority (IANA), which has delegated one part of this responsibility to an Internet Registry which acts as a central repository for Internet information and which provides central allocation of network and autonomous system identifiers, in some cases to subsidiary registries located in various countries. The Internet Registry (IR) also provides central maintenance of the Domain Name System (DNS) root database, which points to subsidiary distributed DNS servers replicated throughout the Internet. The DNS distributed database is used, inter alia, to associate host and network names with their Internet addresses and is critical to the operation of the higher level TCP/IP protocols, including electronic mail.
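As a small present-day illustration of the name-to-address mapping described above (a sketch, not part of the original text), the snippet below asks the local resolver - and ultimately the DNS - to translate a host name into Internet addresses. The host name is only an example.

```python
# Resolve a host name to IP addresses via the system resolver / DNS.
# "www.example.com" is just an illustrative name.
import socket

for family, _type, _proto, _canonname, sockaddr in socket.getaddrinfo(
        "www.example.com", 80, proto=socket.IPPROTO_TCP):
    # sockaddr is (address, port) for IPv4, or (address, port, flow, scope) for IPv6
    print(family.name, sockaddr[0])
```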

There are a number of Network Information Centers (NICs) located throughout the Internet to serve its users with documentation, guidance, advice and assistance. As the Internet continues to grow internationally, the need for high quality NIC functions increases. Although the initial community of users of the Internet were drawn from the ranks of computer science and engineering, its users now comprise a wide range of disciplines in the sciences, arts, letters, business, military and government administration.

Related Networks

In 1980-81, two other networking projects, BITNET and CSNET, were initiated. BITNET adopted the IBM RSCS protocol suite and featured direct leased line connections between participating sites. Most of the original BITNET connections linked IBM mainframes in university data centers. This rapidly changed as protocol implementations became available for other machines. From the beginning, BITNET has been multi-disciplinary in nature with users in all academic areas. It has also provided a number of unique services to its users (e.g., LISTSERV). Today, BITNET and its parallel networks in other parts of the world (e.g., EARN in Europe) have several thousand participating sites. In recent years, BITNET has established a backbone which uses the TCP/IP protocols with RSCS-based applications running above TCP.

CSNET was initially funded by the National Science Foundation (NSF) to provide networking for university, industry and government computer science research groups. CSNET used the Phonenet MMDF protocol for telephone-based electronic mail relaying and, in addition, pioneered the first use of TCP/IP over X.25 using commercial public data networks. The CSNET name server provided an early example of a white pages directory service and this software is still in use at numerous sites. At its peak, CSNET had approximately 200 participating sites and international connections to approximately fifteen countries.

In 1987, BITNET and CSNET merged to form the Corporation for Research and Educational Networking (CREN). In the Fall of 1991, CSNET service was discontinued having fulfilled its important early role in the provision of academic networking service. A key feature of CREN is that its operational costs are fully met through dues paid by its member organizations.

Friday, January 23, 2009

A Brief History of Computers and Networks


As early as the 1640's mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.

Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his difference engine is sufficiently developed by 1842 that Ada Lovelace uses it to mechanically translate a short written work. She is generally regarded as the first programmer. Twelve years later George Boole, while professor of Mathematics at Cork University, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine, the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although the first model is hand-powered, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition.

In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable electronic device which he completes in 1938.

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculations. He shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students.

The Enigma, a complex mechanical encoder, is used by the Germans, and they believe it to be unbreakable. Several people involved, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a "Universal Machine" capable of "computing" any algorithm in 1937. That same year George Stibitz creates his Model K(itchen), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs, and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained by this shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the University of Pennsylvania. Designed by John Mauchly and J. Presper Eckert of the Moore School, they get help from John von Neumann and others. In 1944, the Harvard Mark I is introduced. Based on a series of proposals from Howard Aiken in the late 1930's, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper ("Amazing Grace") as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine's development.

Early in 1945, with the Mark I stopped for repairs, Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as "debugging". The same year Von Neumann proposes the concept of a "stored program" in a paper that is never officially published.

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind on technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping pong balls) over the lights. The US patent office will later recognize this as the first computer.

The next year scientists employed by Bell Labs complete work on the transistor (John Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in 1956), and by 1948 teams around the world work on a "stored program" machine. The first, nicknamed "Baby", is a prototype of a much larger machine under construction in Britain and is shown in June 1948.

The impetus over the next 5 years for advances in computers is mostly the government and military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software," code segments that could be extracted and assembled according to instructions in a "higher level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential Election. They do not air the prediction for 3 hours because they do not trust the machine.

IBM introduces the 701 the following year. It is the first commercially successful computer. In 1956 FORTRAN is introduced (proposed in 1954, it takes nearly 3 years to develop the compiler). Two additional languages, LISP and COBOL, are added in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today's languages.

With the introduction of Control Data's CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks is common.

In 1961 Fairchild Semiconductor introduces the integrated circuit. Within ten years all computers use these instead of the transistor. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipelined instruction execution and paging. Designed at the University of Manchester, the Atlas benefits from contributions by some of the people who developed Colossus twenty years earlier.

On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented: IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time Share System), a crude (by today's standards) networking system. It is the first Wide Area Network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network.

Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.

In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANet, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begin discussing forming their own company. This company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word computer in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds.

With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.

In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market PCs, also in kit form, including a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message.

During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PCs. Continuing today, companies strive to reduce the size and price of PCs while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (it's actually IBM's second attempt, but the first failed miserably). Time selects the computer as its Machine of the Year in 1982. Tron, a computer-generated special effects extravaganza, is released the same year.

(Source: the Golden Ink web site)